Sqoop export job failed
Unable to export HDFS content to an Oracle DB.
Oracle:
create table DB1.T1 (
id1 number,
id2 number
);
Hive:
create table DB1.T1 (
id1 int,
id2 int
);
insert into table DB1.T1 values (0,0);
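To see exactly what Hive wrote for that row (and which field delimiter sqoop will have to parse), the warehouse file can be dumped with non-printing characters made visible. A quick check, assuming the same warehouse path used in the export below:
# cat -A prints control characters; a text table created without ROW FORMAT
# typically shows 0^A0$, i.e. Ctrl-A (\001) as the field separator
$ hdfs dfs -cat /user/hive/warehouse/DB1.db/T1/* | cat -A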
Sqoop:
$ sqoop export \
--connect driver:@ip:port:DB \
--username=DB --password 'bad_practice_pwd' \
-m 1 \
--export-dir "/user/hive/warehouse/DB1.db/T1/file" \
--table DB1.T1 \
--direct
Error:
18/04/16 17:11:00 INFO mapreduce.Job: Job job_1520336080249_0240 failed with state FAILED due to: Task failed task_1520336080249_0240_m_000000
Job failed as tasks failed. failedMaps:1 failedReduces:0
18/04/16 17:11:00 INFO mapreduce.Job: Counters: 8
Job Counters
Failed map tasks=1
Launched map tasks=1
Rack-local map tasks=1
Total time spent by all maps in occupied slots (ms)=4872
Total time spent by all reduces in occupied slots (ms)=0
Total time spent by all map tasks (ms)=4872
Total vcore-milliseconds taken by all map tasks=4872
Total megabyte-milliseconds taken by all map tasks=4988928
18/04/16 17:11:00 WARN mapreduce.Counters: Group FileSystemCounters is deprecated. Use org.apache.hadoop.mapreduce.FileSystemCounter instead
18/04/16 17:11:00 INFO mapreduce.ExportJobBase: Transferred 0 bytes in 16.9653 seconds (0 bytes/sec)
18/04/16 17:11:00 WARN mapreduce.Counters: Group org.apache.hadoop.mapred.Task$Counter is deprecated. Use org.apache.hadoop.mapreduce.TaskCounter instead
18/04/16 17:11:00 INFO mapreduce.ExportJobBase: Exported 0 records.
18/04/16 17:11:00 ERROR tool.ExportTool: Error during export: Export job failed!
at org.apache.sqoop.mapreduce.ExportJobBase.runExport(ExportJobBase.java:439)
at org.apache.sqoop.manager.OracleManager.exportTable(OracleManager.java:465)
at org.apache.sqoop.tool.ExportTool.exportTable(ExportTool.java:80)
at org.apache.sqoop.tool.ExportTool.run(ExportTool.java:99)
at org.apache.sqoop.Sqoop.run(Sqoop.java:147)
at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:183)
at org.apache.sqoop.Sqoop.runTool(Sqoop.java:234)
at org.apache.sqoop.Sqoop.runTool(Sqoop.java:243)
at org.apache.sqoop.Sqoop.main(Sqoop.java:252)
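The ERROR line only says "Export job failed!"; the actual exception (usually a parse or JDBC error) is in the log of the failed map task. One way to pull it, assuming YARN log aggregation is enabled and that the application ID carries the same numeric suffix as the job ID above:
$ yarn logs -applicationId application_1520336080249_0240 | grep -B 2 -A 20 Exception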
I have redefined the Hive table as:
create table DB1.T1 (
id1 int,
id2 int
)
ROW FORMAT DELIMITED FIELDS TERMINATED BY ','
LINES TERMINATED BY '\n';
The export now completes:
18/04/18 13:09:11 INFO mapreduce.Job: Counters: 30
File System Counters
FILE: Number of bytes read=0
FILE: Number of bytes written=175430
FILE: Number of read operations=0
FILE: Number of large read operations=0
FILE: Number of write operations=0
HDFS: Number of bytes read=189
HDFS: Number of bytes written=0
HDFS: Number of read operations=4
HDFS: Number of large read operations=0
HDFS: Number of write operations=0
Job Counters
Launched map tasks=1
Rack-local map tasks=1
Total time spent by all maps in occupied slots (ms)=2747
Total time spent by all reduces in occupied slots (ms)=0
Total time spent by all map tasks (ms)=2747
Total vcore-milliseconds taken by all map tasks=2747
Total megabyte-milliseconds taken by all map tasks=2812928
Map-Reduce Framework
Map input records=1
Map output records=1
Input split bytes=182
Spilled Records=0
Failed Shuffles=0
Merged Map outputs=0
GC time elapsed (ms)=47
CPU time spent (ms)=1620
Physical memory (bytes) snapshot=359587840
Virtual memory (bytes) snapshot=2823344128
Total committed heap usage (bytes)=619184128
File Input Format Counters
Bytes Read=0
File Output Format Counters
Bytes Written=0
18/04/18 13:09:11 INFO mapreduce.ExportJobBase: Transferred 189 bytes in 13.8196 seconds (13.6762 bytes/sec)
18/04/18 13:09:11 INFO mapreduce.ExportJobBase: Exported 1 records.
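For what it's worth, the likely reason the first attempt exported nothing: a Hive text table created without a ROW FORMAT clause separates fields with Ctrl-A (\001), while sqoop export assumes ',' by default, so every input line fails to parse. Recreating the table with FIELDS TERMINATED BY ',' (as above) is one fix; another is to keep the original table and declare its delimiters to sqoop. A sketch, with <host>, <port> and <SID> standing in for the redacted connection details:
# -P prompts for the password instead of putting it on the command line
$ sqoop export \
    --connect jdbc:oracle:thin:@<host>:<port>:<SID> \
    --username DB -P \
    --table DB1.T1 \
    --export-dir /user/hive/warehouse/DB1.db/T1/file \
    --input-fields-terminated-by '\001' \
    --input-lines-terminated-by '\n' \
    -m 1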
I had a similar issue. When I drilled down to the location of the Hive storage file, I found that null values had been converted to '\n', but the target MySQL column expected int values. I had to delete those values from the table, and then the problem was solved.
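If the nulls themselves are the problem, deleting those rows is not the only option: sqoop export can be told how nulls are encoded in the input files. A sketch, assuming Hive's default text representation of NULL (\N) and the same placeholder connection as above:
$ sqoop export \
    --connect jdbc:oracle:thin:@<host>:<port>:<SID> \
    --username DB -P \
    --table DB1.T1 \
    --export-dir /user/hive/warehouse/DB1.db/T1/file \
    --input-null-string '\\N' \
    --input-null-non-string '\\N' \
    -m 1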