Sqoop export from Hive to MySQL fails with: ERROR mapreduce.ExportJobBase: Export job failed!
Views: 3,958
Published: 2019-05-24

This post is about 2,497 characters; estimated reading time: 8 minutes.

1. The key part of the error log

19/05/17 11:48:15 INFO mapreduce.Job: Running job: job_1558105459102_0007
19/05/17 11:48:42 INFO mapreduce.Job: Job job_1558105459102_0007 running in uber mode : false
19/05/17 11:48:42 INFO mapreduce.Job:  map 0% reduce 0%
19/05/17 11:49:25 INFO mapreduce.Job:  map 80% reduce 0%
19/05/17 11:49:27 INFO mapreduce.Job:  map 100% reduce 0%
19/05/17 11:49:28 INFO mapreduce.Job: Job job_1558105459102_0007 failed with state FAILED due to: Task failed task_1558105459102_0007_m_000003
Job failed as tasks failed. failedMaps:1 failedReduces:0
19/05/17 11:49:28 INFO mapreduce.Job: Counters: 12
        Job Counters
                Failed map tasks=4
                Killed map tasks=1
                Launched map tasks=5
                Rack-local map tasks=5
                Total time spent by all maps in occupied slots (ms)=201601
                Total time spent by all reduces in occupied slots (ms)=0
                Total time spent by all map tasks (ms)=201601
                Total vcore-seconds taken by all map tasks=201601
                Total megabyte-seconds taken by all map tasks=206439424
        Map-Reduce Framework
                CPU time spent (ms)=0
                Physical memory (bytes) snapshot=0
                Virtual memory (bytes) snapshot=0
19/05/17 11:49:28 WARN mapreduce.Counters: Group FileSystemCounters is deprecated. Use org.apache.hadoop.mapreduce.FileSystemCounter instead
19/05/17 11:49:28 INFO mapreduce.ExportJobBase: Transferred 0 bytes in 79.8861 seconds (0 bytes/sec)
19/05/17 11:49:28 INFO mapreduce.ExportJobBase: Exported 0 records.
19/05/17 11:49:28 ERROR mapreduce.ExportJobBase: Export job failed!
19/05/17 11:49:28 ERROR tool.ExportTool: Error during export: Export job failed!
        at org.apache.sqoop.mapreduce.ExportJobBase.runExport(ExportJobBase.java:445)
        at org.apache.sqoop.manager.SqlManager.exportTable(SqlManager.java:931)
        at org.apache.sqoop.tool.ExportTool.exportTable(ExportTool.java:80)
        at org.apache.sqoop.tool.ExportTool.run(ExportTool.java:99)
        at org.apache.sqoop.Sqoop.run(Sqoop.java:147)
        at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
        at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:183)
        at org.apache.sqoop.Sqoop.runTool(Sqoop.java:234)
        at org.apache.sqoop.Sqoop.runTool(Sqoop.java:243)
        at org.apache.sqoop.Sqoop.main(Sqoop.java:252)
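For context, a log like the one above comes from a `sqoop export` invocation of roughly the following shape. The connection string, credentials, table name, HDFS path, and field delimiter below are placeholders, not details from the original post:

```shell
# Hypothetical sqoop export command; every value here is a placeholder.
# --export-dir points at the Hive table's warehouse directory, and
# --input-fields-terminated-by must match Hive's field delimiter
# (Hive's default is \001).
sqoop export \
  --connect jdbc:mysql://mysql-host:3306/mydb \
  --username exporter -P \
  --table user_profile \
  --export-dir /user/hive/warehouse/mydb.db/user_profile \
  --input-fields-terminated-by '\001' \
  -m 4
```

Note that the driver log shown above only reports that map tasks failed; the actual cause (here, a data-too-long error from MySQL) is typically only visible in the logs of the individual failed map task.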

2. All the map tasks appeared to finish, yet the job kept failing at the end, and the driver log contained no clear root cause. After a long round of troubleshooting, the cause turned out to be that the corresponding column in the target MySQL table was defined too short.

An occupation-name field in the Hive table unexpectedly held values up to 47 characters long.
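To confirm which column is overflowing, you can measure the longest value on the Hive side before comparing it against the MySQL column definition. The table and column names below are illustrative, not from the original post:

```sql
-- Run in Hive: find the longest value in the suspect column.
-- (Table/column names are hypothetical.)
SELECT MAX(LENGTH(occupation_name)) FROM user_profile;

-- If the result (47 in this case) exceeds the VARCHAR length defined
-- in the MySQL target table, the export fails with only the generic
-- "Export job failed!" message in the driver log.
```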
3. Solution: increase the length of the corresponding column in the MySQL table. After that, the export completed successfully.
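The fix amounts to widening the MySQL column so it can hold the longest Hive value. A sketch, with hypothetical table/column names and a new length chosen with some headroom:

```sql
-- Widen the target column in MySQL so it fits the longest Hive value
-- (names and the new length here are illustrative).
ALTER TABLE user_profile MODIFY COLUMN occupation_name VARCHAR(64);
```

After altering the column, simply rerun the same `sqoop export` command.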

Reposted from: http://faazi.baihongyu.com/
