HDFS + can't copy file from HDFS to local folder
We are trying to copy the file /hdp/apps/2.6.5.0-292/hive/hive.tar.gz to the local folder /var/tmp.
As can be seen below, we get hdfs.DFSClient: Could not obtain and No live nodes contain current block Block locations: Dead nodes: . Throwing a BlockMissingException, and in the end the file is not copied to the local folder /var/tmp.
We also tried to copy other files under /hdp/apps/2.6.5.0-292 to the local folder /var/tmp, but we got the same error.
Any idea what could be causing this problem?
Note - we checked the HDFS health check in Ambari, and HDFS looks fine there.
hdfs dfs -copyToLocal /hdp/apps/2.6.5.0-292/hive/hive.tar.gz /var/tmp
20/08/04 09:07:12 INFO hdfs.DFSClient: No node available for BP-551390946-23.1.22.254-1596451810664:blk_1073741831_1007 file=/hdp/apps/2.6.5.0-292/hive/hive.tar.gz
20/08/04 09:07:12 INFO hdfs.DFSClient: Could not obtain BP-551390946-23.1.22.254-1596451810664:blk_1073741831_1007 from any node: java.io.IOException: No live nodes contain block BP-551390946-23.1.22.254-1596451810664:blk_1073741831_1007 after checking nodes = [], ignoredNodes = null No live nodes contain current block Block locations: Dead nodes: . Will get new block locations from namenode and retry...
20/08/04 09:07:12 WARN hdfs.DFSClient: DFS chooseDataNode: got # 1 IOException, will wait for 916.7101213444472 msec.
20/08/04 09:07:12 INFO hdfs.DFSClient: No node available for BP-551390946-23.1.22.254-1596451810664:blk_1073741831_1007 file=/hdp/apps/2.6.5.0-292/hive/hive.tar.gz
20/08/04 09:07:12 INFO hdfs.DFSClient: Could not obtain BP-551390946-23.1.22.254-1596451810664:blk_1073741831_1007 from any node: java.io.IOException: No live nodes contain block BP-551390946-23.1.22.254-1596451810664:blk_1073741831_1007 after checking nodes = [], ignoredNodes = null No live nodes contain current block Block locations: Dead nodes: . Will get new block locations from namenode and retry...
20/08/04 09:07:12 WARN hdfs.DFSClient: DFS chooseDataNode: got # 2 IOException, will wait for 8364.841990287568 msec.
20/08/04 09:07:21 INFO hdfs.DFSClient: No node available for BP-551390946-23.1.22.254-1596451810664:blk_1073741831_1007 file=/hdp/apps/2.6.5.0-292/hive/hive.tar.gz
20/08/04 09:07:21 INFO hdfs.DFSClient: Could not obtain BP-551390946-23.1.22.254-1596451810664:blk_1073741831_1007 from any node: java.io.IOException: No live nodes contain block BP-551390946-23.1.22.254-1596451810664:blk_1073741831_1007 after checking nodes = [], ignoredNodes = null No live nodes contain current block Block locations: Dead nodes: . Will get new block locations from namenode and retry...
20/08/04 09:07:21 WARN hdfs.DFSClient: DFS chooseDataNode: got # 3 IOException, will wait for 14554.977191829808 msec.
20/08/04 09:07:35 WARN hdfs.DFSClient: Could not obtain block: BP-551390946-23.1.22.254-1596451810664:blk_1073741831_1007 file=/hdp/apps/2.6.5.0-292/hive/hive.tar.gz No live nodes contain current block Block locations: Dead nodes: . Throwing a BlockMissingException
20/08/04 09:07:35 WARN hdfs.DFSClient: Could not obtain block: BP-551390946-23.1.22.254-1596451810664:blk_1073741831_1007 file=/hdp/apps/2.6.5.0-292/hive/hive.tar.gz No live nodes contain current block Block locations: Dead nodes: . Throwing a BlockMissingException
20/08/04 09:07:35 WARN hdfs.DFSClient: DFS Read
org.apache.hadoop.hdfs.BlockMissingException: Could not obtain block: BP-551390946-23.1.22.254-1596451810664:blk_1073741831_1007 file=/hdp/apps/2.6.5.0-292/hive/hive.tar.gz
at org.apache.hadoop.hdfs.DFSInputStream.chooseDataNode(DFSInputStream.java:995)
at org.apache.hadoop.hdfs.DFSInputStream.blockSeekTo(DFSInputStream.java:638)
at org.apache.hadoop.hdfs.DFSInputStream.readWithStrategy(DFSInputStream.java:888)
at org.apache.hadoop.hdfs.DFSInputStream.read(DFSInputStream.java:945)
at java.io.DataInputStream.read(DataInputStream.java:100)
at org.apache.hadoop.io.IOUtils.copyBytes(IOUtils.java:88)
at org.apache.hadoop.io.IOUtils.copyBytes(IOUtils.java:62)
at org.apache.hadoop.io.IOUtils.copyBytes(IOUtils.java:122)
at org.apache.hadoop.fs.shell.CommandWithDestination$TargetFileSystem.writeStreamToFile(CommandWithDestination.java:467)
at org.apache.hadoop.fs.shell.CommandWithDestination.copyStreamToTarget(CommandWithDestination.java:392)
at org.apache.hadoop.fs.shell.CommandWithDestination.copyFileToTarget(CommandWithDestination.java:329)
at org.apache.hadoop.fs.shell.CommandWithDestination.processPath(CommandWithDestination.java:264)
at org.apache.hadoop.fs.shell.CommandWithDestination.processPath(CommandWithDestination.java:249)
at org.apache.hadoop.fs.shell.Command.processPaths(Command.java:317)
at org.apache.hadoop.fs.shell.Command.processPathArgument(Command.java:289)
at org.apache.hadoop.fs.shell.CommandWithDestination.processPathArgument(CommandWithDestination.java:244)
at org.apache.hadoop.fs.shell.Command.processArgument(Command.java:271)
at org.apache.hadoop.fs.shell.Command.processArguments(Command.java:255)
at org.apache.hadoop.fs.shell.CommandWithDestination.processArguments(CommandWithDestination.java:221)
at org.apache.hadoop.fs.shell.FsCommand.processRawArguments(FsCommand.java:119)
at org.apache.hadoop.fs.shell.Command.run(Command.java:165)
at org.apache.hadoop.fs.FsShell.run(FsShell.java:297)
at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:76)
at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:90)
at org.apache.hadoop.fs.FsShell.main(FsShell.java:356)
copyToLocal: Could not obtain block: BP-551390946-23.1.22.254-1596451810664:blk_1073741831_1007 file=/hdp/apps/2.6.5.0-292/hive/hive.tar.gz
Please run the following command and check whether any blocks are corrupt:
hdfs fsck /
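You can also point fsck at the affected file itself and list all files with missing or corrupt blocks, to see which DataNode locations the NameNode actually knows about (the HDFS path below is the one from the question):

# Report block-level details for the affected file, including which DataNodes hold each block
hdfs fsck /hdp/apps/2.6.5.0-292/hive/hive.tar.gz -files -blocks -locations

# List every file in HDFS that currently has a missing/corrupt block
hdfs fsck / -list-corruptfileblocks

# Cross-check the Ambari health view: show live/dead DataNodes as the NameNode sees them
hdfs dfsadmin -report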
If there are corrupt blocks, then you may need to go through a recovery process.
For recovery you can follow this link:
https://blog.cloudera.com/understanding-hdfs-recovery-processes-part-1/
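If fsck confirms that no DataNode holds the block any more, one pragmatic option for this particular file is to remove the corrupt entry and re-upload the tarball from a node where the HDP packages are installed. This is only a sketch: the local source path below is an assumption and may differ on your cluster, so verify it before running anything.

# Delete the HDFS file whose block can no longer be served (fsck -delete removes corrupt files)
hdfs fsck /hdp/apps/2.6.5.0-292/hive/hive.tar.gz -delete

# Re-upload the tarball; /usr/hdp/2.6.5.0-292/hive/hive.tar.gz is an assumed local copy - check the path first
hdfs dfs -put /usr/hdp/2.6.5.0-292/hive/hive.tar.gz /hdp/apps/2.6.5.0-292/hive/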