Question:

Importing data from Postgres into Hive with Sqoop

公良运锋
2023-03-14

I want to import data from Postgres into Hive, so I ran the following command:

sqoop import --connect jdbc:postgresql://localhost:5432/ --username postgres --password postgres --table users --hive-import --m 1
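
(Aside: as the log below itself warns, passing --password on the command line is insecure. An otherwise equivalent sketch that prompts for the password via Sqoop's -P option instead:

sqoop import --connect jdbc:postgresql://localhost:5432/ --username postgres -P --table users --hive-import -m 1

This does not address the failure below; it only keeps the password out of the shell history.)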

But the run fails with the following output:

Warning: /usr/local/sqoop/../hbase does not exist! HBase imports will fail.
Please set $HBASE_HOME to the root of your HBase installation.
Warning: /usr/local/sqoop/../hcatalog does not exist! HCatalog jobs will fail.
Please set $HCAT_HOME to the root of your HCatalog installation.
Warning: /usr/local/sqoop/../accumulo does not exist! Accumulo imports will fail.
Please set $ACCUMULO_HOME to the root of your Accumulo installation.
Warning: /usr/local/sqoop/../zookeeper does not exist! Accumulo imports will fail.
Please set $ZOOKEEPER_HOME to the root of your Zookeeper installation.
/usr/local/hadoop/libexec/hadoop-functions.sh: line 2366: HADOOP_ORG.APACHE.SQOOP.SQOOP_USER: invalid variable name
/usr/local/hadoop/libexec/hadoop-functions.sh: line 2461: HADOOP_ORG.APACHE.SQOOP.SQOOP_OPTS: invalid variable name
2020-11-16 09:12:43,658 INFO sqoop.Sqoop: Running Sqoop version: 1.4.7
2020-11-16 09:12:43,711 WARN tool.BaseSqoopTool: Setting your password on the command-line is insecure. Consider using -P instead.
2020-11-16 09:12:43,711 INFO tool.BaseSqoopTool: Using Hive-specific delimiters for output. You can override
2020-11-16 09:12:43,711 INFO tool.BaseSqoopTool: delimiters with --fields-terminated-by, etc.
2020-11-16 09:12:43,779 INFO manager.SqlManager: Using default fetchSize of 1000
2020-11-16 09:12:43,780 INFO tool.CodeGenTool: Beginning code generation
2020-11-16 09:12:43,981 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM "users" AS t LIMIT 1
2020-11-16 09:12:44,009 INFO orm.CompilationManager: HADOOP_MAPRED_HOME is /usr/local/hadoop
Note: /tmp/sqoop-hadoop/compile/1de46ca6c2305faed7095f3728a74afc/users.java uses or overrides a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
2020-11-16 09:12:44,665 INFO orm.CompilationManager: Writing jar file: /tmp/sqoop-hadoop/compile/1de46ca6c2305faed7095f3728a74afc/users.jar
2020-11-16 09:12:44,747 WARN manager.PostgresqlManager: It looks like you are importing from postgresql.
2020-11-16 09:12:44,747 WARN manager.PostgresqlManager: This transfer can be faster! Use the --direct
2020-11-16 09:12:44,747 WARN manager.PostgresqlManager: option to exercise a postgresql-specific fast path.
2020-11-16 09:12:44,751 INFO mapreduce.ImportJobBase: Beginning import of users
2020-11-16 09:12:44,751 INFO Configuration.deprecation: mapred.job.tracker is deprecated. Instead, use mapreduce.jobtracker.address
2020-11-16 09:12:44,820 INFO Configuration.deprecation: mapred.jar is deprecated. Instead, use mapreduce.job.jar
2020-11-16 09:12:45,145 INFO Configuration.deprecation: mapred.map.tasks is deprecated. Instead, use mapreduce.job.maps
2020-11-16 09:12:45,205 INFO client.RMProxy: Connecting to ResourceManager at /0.0.0.0:8032
2020-11-16 09:12:45,923 INFO mapreduce.JobResourceUploader: Disabling Erasure Coding for path: /tmp/hadoop-yarn/staging/hadoop/.staging/job_1605504371417_0002
2020-11-16 09:12:46,471 INFO sasl.SaslDataTransferClient: SASL encryption trust check: localHostTrusted = false, remoteHostTrusted = false
2020-11-16 09:12:46,978 INFO sasl.SaslDataTransferClient: SASL encryption trust check: localHostTrusted = false, remoteHostTrusted = false
2020-11-16 09:12:47,266 INFO sasl.SaslDataTransferClient: SASL encryption trust check: localHostTrusted = false, remoteHostTrusted = false
2020-11-16 09:12:48,045 INFO sasl.SaslDataTransferClient: SASL encryption trust check: localHostTrusted = false, remoteHostTrusted = false
2020-11-16 09:12:48,440 INFO sasl.SaslDataTransferClient: SASL encryption trust check: localHostTrusted = false, remoteHostTrusted = false
2020-11-16 09:12:48,830 INFO sasl.SaslDataTransferClient: SASL encryption trust check: localHostTrusted = false, remoteHostTrusted = false
2020-11-16 09:12:49,190 INFO sasl.SaslDataTransferClient: SASL encryption trust check: localHostTrusted = false, remoteHostTrusted = false
2020-11-16 09:12:49,522 INFO sasl.SaslDataTransferClient: SASL encryption trust check: localHostTrusted = false, remoteHostTrusted = false
2020-11-16 09:12:49,903 INFO sasl.SaslDataTransferClient: SASL encryption trust check: localHostTrusted = false, remoteHostTrusted = false
2020-11-16 09:12:50,726 INFO sasl.SaslDataTransferClient: SASL encryption trust check: localHostTrusted = false, remoteHostTrusted = false
2020-11-16 09:12:51,060 INFO sasl.SaslDataTransferClient: SASL encryption trust check: localHostTrusted = false, remoteHostTrusted = false
2020-11-16 09:12:51,449 INFO sasl.SaslDataTransferClient: SASL encryption trust check: localHostTrusted = false, remoteHostTrusted = false
2020-11-16 09:12:51,816 INFO sasl.SaslDataTransferClient: SASL encryption trust check: localHostTrusted = false, remoteHostTrusted = false
2020-11-16 09:12:52,186 INFO sasl.SaslDataTransferClient: SASL encryption trust check: localHostTrusted = false, remoteHostTrusted = false
2020-11-16 09:12:52,974 INFO sasl.SaslDataTransferClient: SASL encryption trust check: localHostTrusted = false, remoteHostTrusted = false
2020-11-16 09:12:53,362 INFO sasl.SaslDataTransferClient: SASL encryption trust check: localHostTrusted = false, remoteHostTrusted = false
2020-11-16 09:12:53,651 INFO sasl.SaslDataTransferClient: SASL encryption trust check: localHostTrusted = false, remoteHostTrusted = false
2020-11-16 09:12:54,063 INFO sasl.SaslDataTransferClient: SASL encryption trust check: localHostTrusted = false, remoteHostTrusted = false
2020-11-16 09:12:54,419 INFO sasl.SaslDataTransferClient: SASL encryption trust check: localHostTrusted = false, remoteHostTrusted = false
2020-11-16 09:12:54,820 INFO sasl.SaslDataTransferClient: SASL encryption trust check: localHostTrusted = false, remoteHostTrusted = false
2020-11-16 09:12:55,873 INFO sasl.SaslDataTransferClient: SASL encryption trust check: localHostTrusted = false, remoteHostTrusted = false
2020-11-16 09:12:56,231 INFO sasl.SaslDataTransferClient: SASL encryption trust check: localHostTrusted = false, remoteHostTrusted = false
2020-11-16 09:12:56,643 INFO sasl.SaslDataTransferClient: SASL encryption trust check: localHostTrusted = false, remoteHostTrusted = false
2020-11-16 09:12:56,921 INFO sasl.SaslDataTransferClient: SASL encryption trust check: localHostTrusted = false, remoteHostTrusted = false
2020-11-16 09:12:57,722 INFO sasl.SaslDataTransferClient: SASL encryption trust check: localHostTrusted = false, remoteHostTrusted = false
2020-11-16 09:12:58,122 INFO sasl.SaslDataTransferClient: SASL encryption trust check: localHostTrusted = false, remoteHostTrusted = false
2020-11-16 09:12:58,911 INFO sasl.SaslDataTransferClient: SASL encryption trust check: localHostTrusted = false, remoteHostTrusted = false
2020-11-16 09:12:59,690 INFO sasl.SaslDataTransferClient: SASL encryption trust check: localHostTrusted = false, remoteHostTrusted = false
2020-11-16 09:13:00,045 INFO sasl.SaslDataTransferClient: SASL encryption trust check: localHostTrusted = false, remoteHostTrusted = false
2020-11-16 09:13:00,435 INFO sasl.SaslDataTransferClient: SASL encryption trust check: localHostTrusted = false, remoteHostTrusted = false
2020-11-16 09:13:00,890 INFO sasl.SaslDataTransferClient: SASL encryption trust check: localHostTrusted = false, remoteHostTrusted = false
2020-11-16 09:13:01,202 INFO sasl.SaslDataTransferClient: SASL encryption trust check: localHostTrusted = false, remoteHostTrusted = false
2020-11-16 09:13:01,569 INFO sasl.SaslDataTransferClient: SASL encryption trust check: localHostTrusted = false, remoteHostTrusted = false
2020-11-16 09:13:01,937 INFO sasl.SaslDataTransferClient: SASL encryption trust check: localHostTrusted = false, remoteHostTrusted = false
2020-11-16 09:13:02,327 INFO sasl.SaslDataTransferClient: SASL encryption trust check: localHostTrusted = false, remoteHostTrusted = false
2020-11-16 09:13:02,617 INFO sasl.SaslDataTransferClient: SASL encryption trust check: localHostTrusted = false, remoteHostTrusted = false
2020-11-16 09:13:02,973 INFO sasl.SaslDataTransferClient: SASL encryption trust check: localHostTrusted = false, remoteHostTrusted = false
2020-11-16 09:13:03,350 INFO sasl.SaslDataTransferClient: SASL encryption trust check: localHostTrusted = false, remoteHostTrusted = false
2020-11-16 09:13:03,717 INFO sasl.SaslDataTransferClient: SASL encryption trust check: localHostTrusted = false, remoteHostTrusted = false
2020-11-16 09:13:04,540 INFO sasl.SaslDataTransferClient: SASL encryption trust check: localHostTrusted = false, remoteHostTrusted = false
2020-11-16 09:13:04,917 INFO sasl.SaslDataTransferClient: SASL encryption trust check: localHostTrusted = false, remoteHostTrusted = false
2020-11-16 09:13:05,203 INFO db.DBInputFormat: Using read commited transaction isolation
2020-11-16 09:13:06,286 INFO sasl.SaslDataTransferClient: SASL encryption trust check: localHostTrusted = false, remoteHostTrusted = false
2020-11-16 09:13:06,675 INFO sasl.SaslDataTransferClient: SASL encryption trust check: localHostTrusted = false, remoteHostTrusted = false
2020-11-16 09:13:07,163 INFO mapreduce.JobSubmitter: number of splits:1
2020-11-16 09:13:07,565 INFO sasl.SaslDataTransferClient: SASL encryption trust check: localHostTrusted = false, remoteHostTrusted = false
2020-11-16 09:13:07,661 INFO mapreduce.JobSubmitter: Submitting tokens for job: job_1605504371417_0002
2020-11-16 09:13:07,661 INFO mapreduce.JobSubmitter: Executing with tokens: []
2020-11-16 09:13:07,858 INFO conf.Configuration: resource-types.xml not found
2020-11-16 09:13:07,858 INFO resource.ResourceUtils: Unable to find 'resource-types.xml'.
2020-11-16 09:13:07,926 INFO impl.YarnClientImpl: Submitted application application_1605504371417_0002
2020-11-16 09:13:07,968 INFO mapreduce.Job: The url to track the job: http://alim-VirtualBox:8088/proxy/application_1605504371417_0002/
2020-11-16 09:13:07,968 INFO mapreduce.Job: Running job: job_1605504371417_0002
2020-11-16 09:13:12,079 INFO mapreduce.Job: Job job_1605504371417_0002 running in uber mode : false
2020-11-16 09:13:12,082 INFO mapreduce.Job:  map 0% reduce 0%
2020-11-16 09:13:16,147 INFO mapreduce.Job:  map 100% reduce 0%
2020-11-16 09:13:19,246 INFO mapreduce.Job: Job job_1605504371417_0002 completed successfully
2020-11-16 09:13:19,306 INFO mapreduce.Job: Counters: 33
    File System Counters
        FILE: Number of bytes read=0
        FILE: Number of bytes written=234905
        FILE: Number of read operations=0
        FILE: Number of large read operations=0
        FILE: Number of write operations=0
        HDFS: Number of bytes read=87
        HDFS: Number of bytes written=54
        HDFS: Number of read operations=6
        HDFS: Number of large read operations=0
        HDFS: Number of write operations=2
        HDFS: Number of bytes read erasure-coded=0
    Job Counters 
        Launched map tasks=1
        Other local map tasks=1
        Total time spent by all maps in occupied slots (ms)=2231
        Total time spent by all reduces in occupied slots (ms)=0
        Total time spent by all map tasks (ms)=2231
        Total vcore-milliseconds taken by all map tasks=2231
        Total megabyte-milliseconds taken by all map tasks=2284544
    Map-Reduce Framework
        Map input records=3
        Map output records=3
        Input split bytes=87
        Spilled Records=0
        Failed Shuffles=0
        Merged Map outputs=0
        GC time elapsed (ms)=26
        CPU time spent (ms)=770
        Physical memory (bytes) snapshot=215732224
        Virtual memory (bytes) snapshot=2561839104
        Total committed heap usage (bytes)=200802304
        Peak Map Physical memory (bytes)=215732224
        Peak Map Virtual memory (bytes)=2561839104
    File Input Format Counters 
        Bytes Read=0
    File Output Format Counters 
        Bytes Written=54
2020-11-16 09:13:19,309 INFO mapreduce.ImportJobBase: Transferred 54 bytes in 34.1584 seconds (1.5809 bytes/sec)
2020-11-16 09:13:19,316 INFO mapreduce.ImportJobBase: Retrieved 3 records.
2020-11-16 09:13:19,316 INFO mapreduce.ImportJobBase: Publishing Hive/Hcat import job data to Listeners for table users
2020-11-16 09:13:19,343 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM "users" AS t LIMIT 1
2020-11-16 09:13:19,353 INFO hive.HiveImport: Loading uploaded data into Hive
2020-11-16 09:13:19,360 INFO conf.HiveConf: Found configuration file file:/usr/local/hive/conf/hive-site.xml
2020-11-16 09:13:20,274 INFO hive.HiveImport: SLF4J: Class path contains multiple SLF4J bindings.
2020-11-16 09:13:20,274 INFO hive.HiveImport: SLF4J: Found binding in [jar:file:/usr/local/hive/lib/log4j-slf4j-impl-2.10.0.jar!/org/slf4j/impl/StaticLoggerBinder.class]
2020-11-16 09:13:20,274 INFO hive.HiveImport: SLF4J: Found binding in [jar:file:/usr/local/hadoop/share/hadoop/common/lib/slf4j-log4j12-1.7.25.jar!/org/slf4j/impl/StaticLoggerBinder.class]
2020-11-16 09:13:20,274 INFO hive.HiveImport: SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
2020-11-16 09:13:20,277 INFO hive.HiveImport: SLF4J: Actual binding is of type [org.apache.logging.slf4j.Log4jLoggerFactory]
2020-11-16 09:13:21,685 INFO hive.HiveImport: Hive Session ID = c35a4fbf-8b8b-488c-838f-68711d017e49
2020-11-16 09:13:21,726 INFO hive.HiveImport: 
2020-11-16 09:13:21,727 INFO hive.HiveImport: Logging initialized using configuration in jar:file:/usr/local/hive/lib/hive-common-3.1.2.jar!/hive-log4j2.properties Async: true
2020-11-16 09:15:05,415 INFO hive.HiveImport: FAILED: HiveException java.lang.RuntimeException: Unable to instantiate org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient
2020-11-16 09:15:58,418 ERROR tool.ImportTool: Import failed: java.io.IOException: Hive exited with status 64
    at org.apache.sqoop.hive.HiveImport.executeExternalHiveScript(HiveImport.java:384)
    at org.apache.sqoop.hive.HiveImport.executeScript(HiveImport.java:337)
    at org.apache.sqoop.hive.HiveImport.importTable(HiveImport.java:241)
    at org.apache.sqoop.tool.ImportTool.importTable(ImportTool.java:537)
    at org.apache.sqoop.tool.ImportTool.run(ImportTool.java:628)
    at org.apache.sqoop.Sqoop.run(Sqoop.java:147)
    at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:76)
    at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:183)
    at org.apache.sqoop.Sqoop.runTool(Sqoop.java:234)
    at org.apache.sqoop.Sqoop.runTool(Sqoop.java:243)
    at org.apache.sqoop.Sqoop.main(Sqoop.java:252)

What is the problem, and how can I fix this failure?

1 Answer

裴泰平
2023-03-14

I checked the PostgreSQL data types and changed the ones that Hive does not support.
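
For reference: instead of altering the source columns, Sqoop can also override the Hive-side type for individual columns with its --map-column-hive option. A minimal sketch, assuming a hypothetical problem column named created_at:

# Hypothetical sketch: tell Sqoop to load the Postgres column "created_at"
# into Hive as STRING (substitute the column names/types that actually fail).
sqoop import \
  --connect jdbc:postgresql://localhost:5432/ \
  --username postgres -P \
  --table users \
  --hive-import \
  --map-column-hive created_at=STRING \
  -m 1

Multiple overrides can be given as a comma-separated list, e.g. --map-column-hive created_at=STRING,balance=DOUBLE.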
