write.df failing in SparkR

I am trying to write a SparkDataFrame using SparkR:

write.df(spark_df, "/mypartition/enablers/Prod Data/data2/tempdata/tempdata_l2/", "csv")

But it fails with the following error -

InsertIntoHadoopFsRelationCommand: Aborting job.
java.io.IOException: Failed to rename DeprecatedRawLocalFileStatus{path=file:/mypartition/enablers/Prod Data/data2/tempdata/tempdata_l2/_temporary/0/task_201610040736_0200_m_000112/part-r-00112-c4c5f30e-343d-4b02-a0f2-e9e5582047e5.snappy.parquet; isDirectory=false; length=331279; replication=1; blocksize=33554432; modification_time=1475566611000; access_time=0; owner=; group=; permission=rw-rw-rw-; isSymlink=false} to file:/mypartition/enablers/Prod Data/data2/tempdata/tempdata_l2/part-r-00112-c4c5f30e-343d-4b02-a0f2-e9e5582047e5.snappy.parquet
    at org.apache.hadoop.mapreduce.lib.output.FileOutputCommitter.mergePaths(FileOutputCommitter.java:371)

In addition, the following warning is also reported -

WARN FileUtil: Failed to delete file or dir [/mypartition/enablers/Prod Data/data2/tempdata/tempdata_l2/_temporary/0/task_201610040736_0200_m_000110/.part-r-00110-c4c5f30e-343d-4b02-a0f2-e9e5582047e5.snappy.parquet.crc]: it still exists.

Thanks in advance for your valuable insights.

The checksum file was not deleted properly. Could you try renaming the checksum (.crc) file and re-running?

cd "/mypartition/enablers/Prod Data/data2/tempdata/tempdata_l2/_temporary/0/task_201610040736_0200_m_000110/"

mv .part-r-00110-c4c5f30e-343d-4b02-a0f2-e9e5582047e5.snappy.parquet.crc .part-r-00110-c4c5f30e-343d-4b02-a0f2-e9e5582047e5.snappy.parquet.crc_backup
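If more than one task left checksum files behind, renaming them one by one gets tedious. A minimal sketch of backing up every leftover `.crc` file in one pass — the `/tmp/tempdata_l2_demo` directory below is a stand-in created only for illustration; substitute the job's real output directory:

```shell
# Illustrative only: recreate the kind of leftover layout the failed job
# leaves behind, then back up every .crc file under it.
OUT_DIR="/tmp/tempdata_l2_demo"
mkdir -p "$OUT_DIR/_temporary/0/task_201610040736_0200_m_000110"
touch "$OUT_DIR/_temporary/0/task_201610040736_0200_m_000110/.part-r-00110.snappy.parquet.crc"

# Rename every checksum file to *.crc_backup so the committer can retry
# without tripping over stale checksums.
find "$OUT_DIR" -name '*.crc' -exec mv {} {}_backup \;
```

Note the path is quoted throughout because the real directory ("Prod Data") contains a space.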

Resolved the issue by using the root user. Initially Spark was writing as root, but when deleting the temporary files it was running as the login user; changing the login user to root resolved the problem.
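The fix above amounts to making the write and the temporary-file cleanup run as the same user. An alternative to switching users is aligning ownership of the output directory instead; a minimal sketch using a hypothetical `/tmp` directory rather than the real output path:

```shell
# Illustrative only: stand-in for the job's output directory.
OUT="/tmp/tempdata_l2_owner_demo"
mkdir -p "$OUT/_temporary/0"
touch "$OUT/_temporary/0/part-r-00000.snappy.parquet"

# Make everything under the directory owned by the current user. On a
# real cluster this would typically need sudo, since the stale files
# belong to another account (root vs. the login user in this case).
chown -R "$(id -un)" "$OUT"

# Verify the temporary file now belongs to the user running the job.
stat -c '%U' "$OUT/_temporary/0/part-r-00000.snappy.parquet"
```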