I want to use Sqoop to extract data from a Postgres database, and I am running Sqoop on Google Dataproc. However, I get an error when I submit the Sqoop job.
I use the following commands:
Create a cluster with the 1.3.24-deb9 image version
gcloud dataproc clusters create <CLUSTER_NAME> \
--region=asia-southeast1 --zone=asia-southeast1-a \
--image-version=1.3.24-deb9 \
--properties=hive:hive.metastore.warehouse.dir=gs://<BUCKET>/hive-warehouse \
--master-boot-disk-size=100
Submit a job
gcloud dataproc jobs submit hadoop --cluster=<CLUSTER_NAME> \
--region=asia-southeast1 \
--class=org.apache.sqoop.Sqoop \
--jars=gs://<BUCKET>/sqoop-1.4.7-hadoop260.jar,gs://<BUCKET>/avro-tools-1.8.2.jar,gs://<BUCKET>/postgresql-42.2.5.jar \
-- \
import -Dmapreduce.job.user.classpath.first=true \
--connect=jdbc:postgresql://<HOST>:5432/<DATABASE> \
--username=<USER> \
--password-file=gs://<BUCKET>/pass.txt \
--target-dir=gs://<BUCKET>/<OUTPUT> \
--table=<TABLE> \
--as-avrodatafile
Error
19/02/26 04:52:38 INFO mapreduce.Job: Running job: job_1551156514661_0001
19/02/26 04:52:48 INFO mapreduce.Job: Job job_xxx_0001 running in uber mode : false
19/02/26 04:52:48 INFO mapreduce.Job: map 0% reduce 0%
19/02/26 04:52:48 INFO mapreduce.Job: Job job_xxx_0001 failed with state FAILED due to: Application application_xxx_0001 failed 2 times due to AM Container for appattempt_xxx_0001_000002 exited with exitCode: 1
Failing this attempt.Diagnostics: [2019-02-26 04:52:47.771]Exception from container-launch.
Container id: container_xxx_0001_02_000001
Exit code: 1
[2019-02-26 04:52:47.779]Container exited with a non-zero exit code 1. Error file: prelaunch.err.
Last 4096 bytes of prelaunch.err :
Last 4096 bytes of stderr :
log4j:WARN No such property [containerLogFile] in org.apache.hadoop.yarn.ContainerLogAppender.
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/hadoop/yarn/nm-local-dir/usercache/root/filecache/10/libjars/avro-tools-1.8.2.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/usr/lib/hadoop/lib/slf4j-log4j12-1.7.25.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
log4j:WARN No appenders could be found for logger (org.apache.hadoop.mapreduce.v2.app.MRAppMaster).
[2019-02-26 04:52:47.780]Container exited with a non-zero exit code 1. Error file: prelaunch.err.
Last 4096 bytes of prelaunch.err :
Last 4096 bytes of stderr :
log4j:WARN No such property [containerLogFile] in org.apache.hadoop.yarn.ContainerLogAppender.
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/hadoop/yarn/nm-local-dir/usercache/root/filecache/10/libjars/avro-tools-1.8.2.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/usr/lib/hadoop/lib/slf4j-log4j12-1.7.25.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
The problem is likely the mismatch between the Avro version bundled with Dataproc's Hadoop (Avro 1.7.7) and the one Sqoop 1.4.7 depends on (Avro 1.8.1).
You may want to try downgrading Sqoop to 1.4.6, which depends on Avro 1.7, and using avro-tools-1.7.7.jar during job submission.
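As a rough sketch (assuming the Sqoop 1.4.6 build in your bucket is named sqoop-1.4.6-hadoop260.jar; the exact jar name depends on how it was packaged), the adjusted job submission would look like:
gcloud dataproc jobs submit hadoop --cluster=<CLUSTER_NAME> \
--region=asia-southeast1 \
--class=org.apache.sqoop.Sqoop \
--jars=gs://<BUCKET>/sqoop-1.4.6-hadoop260.jar,gs://<BUCKET>/avro-tools-1.7.7.jar,gs://<BUCKET>/postgresql-42.2.5.jar \
-- \
import -Dmapreduce.job.user.classpath.first=true \
--connect=jdbc:postgresql://<HOST>:5432/<DATABASE> \
--username=<USER> \
--password-file=gs://<BUCKET>/pass.txt \
--target-dir=gs://<BUCKET>/<OUTPUT> \
--table=<TABLE> \
--as-avrodatafile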