SQOOP Import Fails, File Not Found Exception

I am new to the Hadoop ecosystem and installed the components by following guides I found online. I have installed Hadoop, Sqoop, and Hive. This is the directory structure of my installations (on my local Ubuntu machine, not a virtual machine; each component is installed in its own directory):-

Looking at the error, I tried to resolve it by copying the sqoop folder (on my local machine, /usr/local/sqoop) into the HDFS directory (hdfs://localhost:54310/usr/local/sqoop). That made the problem go away. There are a few things I would like to understand from this:-

16/07/02 13:22:15 ERROR tool.ImportTool: Encountered IOException running import job: java.io.FileNotFoundException: File does not exist: hdfs://localhost:54310/usr/local/sqoop/lib/avro-mapred-1.7.5-hadoop2.jar
    at org.apache.hadoop.hdfs.DistributedFileSystem.doCall(DistributedFileSystem.java:1122)
    at org.apache.hadoop.hdfs.DistributedFileSystem.doCall(DistributedFileSystem.java:1114)
    at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
    at org.apache.hadoop.hdfs.DistributedFileSystem.getFileStatus(DistributedFileSystem.java:1114)
    at org.apache.hadoop.mapreduce.filecache.ClientDistributedCacheManager.getFileStatus(ClientDistributedCacheManager.java:288)
    at org.apache.hadoop.mapreduce.filecache.ClientDistributedCacheManager.getFileStatus(ClientDistributedCacheManager.java:224)
    at org.apache.hadoop.mapreduce.filecache.ClientDistributedCacheManager.determineTimestamps(ClientDistributedCacheManager.java:93)
    at org.apache.hadoop.mapreduce.filecache.ClientDistributedCacheManager.determineTimestampsAndCacheVisibilities(ClientDistributedCacheManager.java:57)
    at org.apache.hadoop.mapreduce.JobSubmitter.copyAndConfigureFiles(JobSubmitter.java:269)
    at org.apache.hadoop.mapreduce.JobSubmitter.copyAndConfigureFiles(JobSubmitter.java:390)
    at org.apache.hadoop.mapreduce.JobSubmitter.submitJobInternal(JobSubmitter.java:483)
    at org.apache.hadoop.mapreduce.Job.run(Job.java:1296)
    at org.apache.hadoop.mapreduce.Job.run(Job.java:1293)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:415)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1628)
    at org.apache.hadoop.mapreduce.Job.submit(Job.java:1293)
    at org.apache.hadoop.mapreduce.Job.waitForCompletion(Job.java:1314)
    at org.apache.sqoop.mapreduce.ImportJobBase.doSubmitJob(ImportJobBase.java:196)
    at org.apache.sqoop.mapreduce.ImportJobBase.runJob(ImportJobBase.java:169)
    at org.apache.sqoop.mapreduce.ImportJobBase.runImport(ImportJobBase.java:266)
    at org.apache.sqoop.manager.SqlManager.importTable(SqlManager.java:673)
    at org.apache.sqoop.manager.MySQLManager.importTable(MySQLManager.java:118)
    at org.apache.sqoop.tool.ImportTool.importTable(ImportTool.java:497)
    at org.apache.sqoop.tool.ImportTool.run(ImportTool.java:605)
    at org.apache.sqoop.Sqoop.run(Sqoop.java:143)
    at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
    at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:179)
    at org.apache.sqoop.Sqoop.runTool(Sqoop.java:218)
    at org.apache.sqoop.Sqoop.runTool(Sqoop.java:227)
    at org.apache.sqoop.Sqoop.main(Sqoop.java:236)
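For reference, the workaround described above (mirroring the local Sqoop installation into HDFS) boils down to commands along these lines. The post does not show the exact commands, so treat this as an assumed sketch using the paths from the question:

    hdfs dfs -mkdir -p /usr/local/sqoop
    hdfs dfs -put /usr/local/sqoop/* /usr/local/sqoop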

Your installation is fine. You do not need to copy everything under the sqoop directory; only the sqoop library files need to be copied into HDFS.

  • Create the same directory structure in HDFS as $SQOOP_HOME/lib.

    Example: hdfs dfs -mkdir -p /usr/lib/sqoop/lib

  • Copy all of the sqoop library files from $SQOOP_HOME/lib into the HDFS lib directory.

    Example: hdfs dfs -put /usr/lib/sqoop/lib/* /usr/lib/sqoop/lib
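As the stack trace shows (ClientDistributedCacheManager during job submission), the Sqoop library jars are resolved against the default filesystem, which is why they must exist in HDFS. Once the jars are copied, a quick sanity check before re-running the import might look like this. This is only a sketch: the HDFS paths are assumed to match the examples above, the jar name comes from the stack trace in the question, and you should substitute your own $SQOOP_HOME (which is /usr/local/sqoop in the question):

    # confirm the jar named in the FileNotFoundException is now on HDFS
    hdfs dfs -ls /usr/lib/sqoop/lib/avro-mapred-1.7.5-hadoop2.jar
    # list everything that will be staged for the job
    hdfs dfs -ls /usr/lib/sqoop/lib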