
Chapter 6 Hive - 10 FAQ


FAQ

Jline version too old during debugging

  Logging initialized using configuration in jar:file:/hive/apache-hive-1.1.0-bin/lib/hive-common-1.1.0.jar!/hive-log4j.properties
  SLF4J: Class path contains multiple SLF4J bindings.
  SLF4J: Found binding in [jar:file:/hadoop-2.5.2/share/hadoop/common/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
  SLF4J: Found binding in [jar:file:/hive/apache-hive-1.1.0-bin/lib/hive-jdbc-1.1.0-standalone.jar!/org/slf4j/impl/StaticLoggerBinder.class]
  SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
  SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
  [ERROR] Terminal initialization failed; falling back to unsupported
  java.lang.IncompatibleClassChangeError: Found class jline.Terminal, but interface was expected
      at jline.TerminalFactory.create(TerminalFactory.java:101)
      at jline.TerminalFactory.get(TerminalFactory.java:158)
      at jline.console.ConsoleReader.<init>(ConsoleReader.java:229)
      at jline.console.ConsoleReader.<init>(ConsoleReader.java:221)
      at jline.console.ConsoleReader.<init>(ConsoleReader.java:209)
      at org.apache.hadoop.hive.cli.CliDriver.getConsoleReader(CliDriver.java:773)
      at org.apache.hadoop.hive.cli.CliDriver.executeDriver(CliDriver.java:715)
      at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:675)
      at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:615)
      at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
      at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
      at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
      at java.lang.reflect.Method.invoke(Method.java:606)
      at org.apache.hadoop.util.RunJar.main(RunJar.java:212)

The cause is that an old version of jline is present under the hadoop directory:

  /hadoop-2.5.2/share/hadoop/yarn/lib:
  -rw-r--r-- 1 root root 87325 Mar 10 18:10 jline-0.9.94.jar

The fix is to copy the newer jline JAR shipped with hive into that hadoop directory, taking the old JAR out of the way so it is no longer picked up (a command sequence is sketched after the listing below):

  cp /hive/apache-hive-1.1.0-bin/lib/jline-2.12.jar ./

  /hadoop-2.5.2/share/hadoop/yarn/lib:
  -rw-r--r-- 1 root root 87325 Mar 10 18:10 jline-0.9.94.jar.bak
  -rw-r--r-- 1 root root 213854 Mar 11 22:22 jline-2.12.jar
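
A minimal sketch of the full sequence, assuming the install paths shown above (rename the old JAR so it is no longer loaded, then copy in the new one):

  cd /hadoop-2.5.2/share/hadoop/yarn/lib
  mv jline-0.9.94.jar jline-0.9.94.jar.bak
  cp /hive/apache-hive-1.1.0-bin/lib/jline-2.12.jar .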

The hive CLI now starts successfully:

  root@ubuntu:/hive# hive
  Logging initialized using configuration in jar:file:/hive/apache-hive-1.1.0-bin/lib/hive-common-1.1.0.jar!/hive-log4j.properties
  SLF4J: Class path contains multiple SLF4J bindings.
  SLF4J: Found binding in [jar:file:/hadoop-2.5.2/share/hadoop/common/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
  SLF4J: Found binding in [jar:file:/hive/apache-hive-1.1.0-bin/lib/hive-jdbc-1.1.0-standalone.jar!/org/slf4j/impl/StaticLoggerBinder.class]
  SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
  SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
  hive>

java.io.tmpdir directory problem during debugging

The exception details are as follows:

  Exception in thread "main" java.lang.RuntimeException: java.lang.IllegalArgumentException: java.net.URISyntaxException: Relative path in absolute URI: ${system:java.io.tmpdir%7D/$%7Bsystem:user.name%7D
      at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:444)
      at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:672)
      at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:616)
      at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
      at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
      at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
      at java.lang.reflect.Method.invoke(Method.java:606)
      at org.apache.hadoop.util.RunJar.main(RunJar.java:160)
  Caused by: java.lang.IllegalArgumentException: java.net.URISyntaxException: Relative path in absolute URI: ${system:java.io.tmpdir%7D/$%7Bsystem:user.name%7D
      at org.apache.hadoop.fs.Path.initialize(Path.java:148)
      at org.apache.hadoop.fs.Path.<init>(Path.java:126)
      at org.apache.hadoop.hive.ql.session.SessionState.createSessionDirs(SessionState.java:487)
      at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:430)
      ... 7 more
  Caused by: java.net.URISyntaxException: Relative path in absolute URI: ${system:java.io.tmpdir%7D/$%7Bsystem:user.name%7D
      at java.net.URI.checkPath(URI.java:1804)
      at java.net.URI.<init>(URI.java:752)
      at org.apache.hadoop.fs.Path.initialize(Path.java:145)
      ... 10 more

The solution:

1. Check hive-site.xml; you will find configuration entries whose values contain "system:java.io.tmpdir".
2. Create a local directory such as /home/grid/hive-0.14.0-bin/iotmp.
3. Change the values of those entries to point to that directory (a sketch follows this list).

Start hive again and it comes up successfully.
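
A sketch of what the edited entries might look like in hive-site.xml. The exact set of properties that reference system:java.io.tmpdir varies by Hive version (hive.exec.local.scratchdir and hive.downloaded.resources.dir are typical); the path below is the directory created in step 2:

  <property>
    <name>hive.exec.local.scratchdir</name>
    <value>/home/grid/hive-0.14.0-bin/iotmp</value>
  </property>
  <property>
    <name>hive.downloaded.resources.dir</name>
    <value>/home/grid/hive-0.14.0-bin/iotmp</value>
  </property>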

Hive running out of memory

  hive> select * from t_test where ds=20150323 limit 2;
  OK
  Exception in thread "main" java.lang.OutOfMemoryError: Java heap space

Cause: Hive's default heap size is 256 MB.

The fix is to edit /usr/lib/hive/bin/hive-config.sh, which contains:

  # Default to use 256MB
  export HADOOP_HEAPSIZE=${HADOOP_HEAPSIZE:-256}

Increase the 256 above to a larger value (see the sketch below).
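
For example, raising the default to 1024 MB (the value 1024 is an assumption; choose whatever fits your machine and workload):

  # Default raised to 1024MB
  export HADOOP_HEAPSIZE=${HADOOP_HEAPSIZE:-1024}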

Wrong-user error

When writing a JDBC client program to connect to Hive, the following error is reported:

  org.apache.hive.service.cli.HiveSQLException: Failed to open new session: java.lang.RuntimeException: org.apache.hadoop.ipc.RemoteException(org.apache.hadoop.security.authorize.AuthorizationException): User: hadoop is not allowed to impersonate anonymous

The detailed error:

  Exception in thread "main" org.apache.hive.service.cli.HiveSQLException: Failed to open new session: java.lang.RuntimeException: org.apache.hadoop.ipc.RemoteException(org.apache.hadoop.security.authorize.AuthorizationException): User: hadoop is not allowed to impersonate anonymous
      at org.apache.hive.jdbc.Utils.verifySuccess(Utils.java:258)
      at org.apache.hive.jdbc.Utils.verifySuccess(Utils.java:249)
      at org.apache.hive.jdbc.HiveConnection.openSession(HiveConnection.java:579)
      at org.apache.hive.jdbc.HiveConnection.<init>(HiveConnection.java:167)
      at org.apache.hive.jdbc.HiveDriver.connect(HiveDriver.java:107)
      at java.sql.DriverManager.getConnection(DriverManager.java:571)
      at java.sql.DriverManager.getConnection(DriverManager.java:215)
      at client.main(client.java:21)
      at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
      at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
      at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
      at java.lang.reflect.Method.invoke(Method.java:606)
      at com.intellij.rt.execution.application.AppMain.main(AppMain.java:140)
  Caused by: org.apache.hive.service.cli.HiveSQLException: Failed to open new session: java.lang.RuntimeException: org.apache.hadoop.ipc.RemoteException(org.apache.hadoop.security.authorize.AuthorizationException): User: hadoop is not allowed to impersonate anonymous
      at org.apache.hive.service.cli.session.SessionManager.openSession(SessionManager.java:324)
      at org.apache.hive.service.cli.CLIService.openSessionWithImpersonation(CLIService.java:187)
      at org.apache.hive.service.cli.thrift.ThriftCLIService.getSessionHandle(ThriftCLIService.java:424)
      at org.apache.hive.service.cli.thrift.ThriftCLIService.OpenSession(ThriftCLIService.java:318)
      at org.apache.hive.service.cli.thrift.TCLIService$Processor$OpenSession.getResult(TCLIService.java:1257)
      at org.apache.hive.service.cli.thrift.TCLIService$Processor$OpenSession.getResult(TCLIService.java:1242)
      at org.apache.thrift.ProcessFunction.process(ProcessFunction.java:39)
      at org.apache.thrift.TBaseProcessor.process(TBaseProcessor.java:39)
      at org.apache.hive.service.auth.TSetIpAddressProcessor.process(TSetIpAddressProcessor.java:56)
      at org.apache.thrift.server.TThreadPoolServer$WorkerProcess.run(TThreadPoolServer.java:286)
      at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
      at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
      at java.lang.Thread.run(Thread.java:745)
  Caused by: java.lang.RuntimeException: java.lang.RuntimeException: org.apache.hadoop.ipc.RemoteException(org.apache.hadoop.security.authorize.AuthorizationException): User: hadoop is not allowed to impersonate anonymous
      at org.apache.hive.service.cli.session.HiveSessionProxy.invoke(HiveSessionProxy.java:89)
      at org.apache.hive.service.cli.session.HiveSessionProxy.access$000(HiveSessionProxy.java:36)
      at org.apache.hive.service.cli.session.HiveSessionProxy$1.run(HiveSessionProxy.java:63)
      at java.security.AccessController.doPrivileged(Native Method)
      at javax.security.auth.Subject.doAs(Subject.java:422)
      at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1657)
      at org.apache.hive.service.cli.session.HiveSessionProxy.invoke(HiveSessionProxy.java:59)
      at com.sun.proxy.$Proxy35.open(Unknown Source)
      at org.apache.hive.service.cli.session.SessionManager.openSession(SessionManager.java:315)
      ... 12 more
  Caused by: java.lang.RuntimeException: org.apache.hadoop.ipc.RemoteException(org.apache.hadoop.security.authorize.AuthorizationException): User: hadoop is not allowed to impersonate anonymous
      at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:554)
      at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:489)
      at org.apache.hive.service.cli.session.HiveSessionImpl.open(HiveSessionImpl.java:156)
      at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
      at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
      at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
      at java.lang.reflect.Method.invoke(Method.java:497)
      at org.apache.hive.service.cli.session.HiveSessionProxy.invoke(HiveSessionProxy.java:78)
      ... 20 more
  Caused by: java.lang.RuntimeException: org.apache.hadoop.ipc.RemoteException:User: hadoop is not allowed to impersonate anonymous
      at org.apache.hadoop.ipc.Client.call(Client.java:1476)
      at org.apache.hadoop.ipc.Client.call(Client.java:1407)
      at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:229)
      at com.sun.proxy.$Proxy30.getFileInfo(Unknown Source)
      at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.getFileInfo(ClientNamenodeProtocolTranslatorPB.java:771)
      at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
      at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
      at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
      at java.lang.reflect.Method.invoke(Method.java:497)
      at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:187)
      at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102)
      at com.sun.proxy.$Proxy31.getFileInfo(Unknown Source)
      at org.apache.hadoop.hdfs.DFSClient.getFileInfo(DFSClient.java:2116)
      at org.apache.hadoop.hdfs.DistributedFileSystem$22.doCall(DistributedFileSystem.java:1305)
      at org.apache.hadoop.hdfs.DistributedFileSystem$22.doCall(DistributedFileSystem.java:1301)
      at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
      at org.apache.hadoop.hdfs.DistributedFileSystem.getFileStatus(DistributedFileSystem.java:1301)
      at org.apache.hadoop.fs.FileSystem.exists(FileSystem.java:1424)
      at org.apache.hadoop.hive.ql.session.SessionState.createRootHDFSDir(SessionState.java:639)
      at org.apache.hadoop.hive.ql.session.SessionState.createSessionDirs(SessionState.java:597)
      at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:526)
      ... 27 more

The final error message, "User: hadoop is not allowed to impersonate anonymous", means that the user hadoop is not allowed to impersonate anonymous (Hive's default user; this can be seen in the default configuration).

Solution

Add the following proxyuser entries to Hadoop's core-site.xml (adjust the user, group, and hosts to your environment), then restart or refresh the affected Hadoop services and restart HiveServer2:

  <property>
    <name>hadoop.proxyuser.hadoop.groups</name>
    <value>hadoop</value>
    <description>Allow the superuser hadoop to impersonate any member of the group hadoop</description>
  </property>
  <property>
    <name>hadoop.proxyuser.hadoop.hosts</name>
    <value>192.168.21.222,127.0.0.1,localhost</value>
    <description>The superuser hadoop can connect only from these hosts to impersonate a user</description>
  </property>
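
With these entries in place, the JDBC client should also pass a concrete user name that is covered by the proxyuser group, rather than connecting anonymously. A minimal sketch of such a client; the host, port, database, query, and the user name hadoop are assumptions for illustration:

  import java.sql.Connection;
  import java.sql.DriverManager;
  import java.sql.ResultSet;
  import java.sql.Statement;

  public class HiveJdbcClient {
      public static void main(String[] args) throws Exception {
          // HiveServer2 JDBC driver; requires the hive-jdbc (standalone) JAR on the classpath.
          Class.forName("org.apache.hive.jdbc.HiveDriver");

          // Host, port, and database are assumptions; adjust to your HiveServer2.
          String url = "jdbc:hive2://192.168.21.222:10000/default";

          // Connect as a real user ("hadoop" here) instead of the anonymous default,
          // so HiveServer2 impersonates a user the proxyuser rules above allow.
          try (Connection conn = DriverManager.getConnection(url, "hadoop", "");
               Statement stmt = conn.createStatement();
               ResultSet rs = stmt.executeQuery("show tables")) {
              while (rs.next()) {
                  System.out.println(rs.getString(1));
              }
          }
      }
  }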