How to write a CSV file in Apache Spark using SparkR?

I can load the data successfully using the following commands:

# Initialize SparkR with the spark-csv package (Scala 2.11 build)
sc <- sparkR.init(master = 'local', sparkPackages = 'com.databricks:spark-csv_2.11:1.4.0')
sqlContext <- sparkRSQL.init(sc)
ss <- read.df(sqlContext, '/home/anmol/Downloads/Rgraphics/dataSets/states.csv',
              source = 'com.databricks.spark.csv', inferSchema = 'true')
head(ss)

I have tried the following command to write the data back out:

write.df(df, '/home/anmol/faithfull.csv',
         source = 'com.databricks.spark.csv', mode = 'overwrite')

but I get the following error:

16/06/10 18:28:26 ERROR RBackendHandler: save on 261 failed
Error in invokeJava(isStatic = FALSE, objId$id, methodName, ...) :
  java.lang.NoClassDefFoundError: Could not initialize class com.databricks.spark.csv.util.CompressionCodecs$
    at com.databricks.spark.csv.DefaultSource.createRelation(DefaultSource.scala:189)
    at org.apache.spark.sql.execution.datasources.ResolvedDataSource$.apply(ResolvedDataSource.scala:222)
    at org.apache.spark.sql.DataFrameWriter.save(DataFrameWriter.scala:148)
    at org.apache.spark.sql.DataFrame.save(DataFrame.scala:2027)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:606)
    at org.apache.spark.api.r.RBackendHandler.handleMethodCall(RBackendHandler.scala:141)
    at org.apache.spark.api.r.RBackendHandler.channelRead0(RBackendHandler.scala:86)
    at org.apache.spark.api.r.RBackendHandler.channelRead0(RBackendHandler.scala:38)
    at io.netty.channel.

The problem was the Scala version my Apache Spark build was compiled against: it was 2.10, not 2.11, so I used:

sc <- sparkR.init(master = 'local', sparkPackages = 'com.databricks:spark-csv_2.10:1.4.0')
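
For completeness, here is a minimal end-to-end sketch with the matching Scala 2.10 package. The output path states_out.csv is a hypothetical name, not from the original question, and note that write.df produces a directory of part files rather than a single CSV file:

library(SparkR)

# Initialize with the spark-csv artifact built for Scala 2.10,
# matching the Scala version of the Spark build
sc <- sparkR.init(master = 'local', sparkPackages = 'com.databricks:spark-csv_2.10:1.4.0')
sqlContext <- sparkRSQL.init(sc)

# Reload the data, then write it back out as CSV
ss <- read.df(sqlContext, '/home/anmol/Downloads/Rgraphics/dataSets/states.csv',
              source = 'com.databricks.spark.csv', inferSchema = 'true')
write.df(ss, '/home/anmol/states_out.csv',  # hypothetical output path
         source = 'com.databricks.spark.csv', mode = 'overwrite')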

You can check your version by launching spark-shell; it prints the Scala version in its startup banner.
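
Alternatively (an extra check, not from the original answer; the exact banner wording can vary across Spark releases), the version banner can be printed without entering the shell:

$ spark-submit --version

Look for a line of the form "Using Scala version 2.10.x" and pick the spark-csv artifact whose suffix matches (_2.10 vs _2.11).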