spark jobserver failing to build with Spark 2.0

I am trying to run spark-jobserver with Spark 2.0. I cloned the spark-2.0-preview branch from the GitHub repository and followed the deployment guide, but when I try to deploy the server with bin/server_deploy.sh I get compilation errors:

Error:
[error] /spark-jobserver/job-server-extras/src/main/java/spark/jobserver/JHiveTestLoaderJob.java:4: cannot find symbol
[error] symbol: class DataFrame
[error] location: package org.apache.spark.sql
[error] import org.apache.spark.sql.DataFrame;
[error] /spark-jobserver/job-server-extras/src/main/java/spark/jobserver/JHiveTestJob.java:13: java.lang.Object cannot be converted to org.apache.spark.sql.Row[]
[error] return sc.sql(data.getString("sql")).collect();
[error] /spark-jobserver/job-server-extras/src/main/java/spark/jobserver/JHiveTestLoaderJob.java:25: cannot find symbol
[error] symbol: class DataFrame
[error] location: class spark.jobserver.JHiveTestLoaderJob
[error] final DataFrame addrRdd = sc.sql("SELECT * FROM default.test_addresses");
[error] /spark-jobserver/job-server-extras/src/main/java/spark/jobserver/JSqlTestJob.java:13: array required, but java.lang.Object found
[error] Row row = sc.sql("select 1+1").take(1)[0];
[info] /spark-jobserver/job-server-extras/src/main/java/spark/jobserver/JHiveTestJob.java: Some input files use or override a deprecated API.
[info] /spark-jobserver/job-server-extras/src/main/java/spark/jobserver/JHiveTestJob.java: Recompile with -Xlint:deprecation for details.
[error] (job-server-extras/compile:compileIncremental) javac returned nonzero exit code

Am I forgetting to add some dependencies?

I ran into a similar problem. I found that it is a bug caused by the Spark API change from 1.x to 2.x. You can find the open issue on GitHub: https://github.com/spark-jobserver/spark-jobserver/issues/760
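Concretely, the errors match two Java-facing API changes in Spark 2.x: the `DataFrame` class no longer exists in the Java API (queries return `Dataset<Row>`), and `collect()`/`take()` on a `Dataset` erase to `Object` in Java, so the Java-friendly `collectAsList()`/`takeAsList()` methods must be used instead. A hedged sketch of the kind of change needed (illustrative only, assuming an `SQLContext`-like `sc` as in the failing files; not the exact patch from the pull request):

```java
// Sketch of migrating job-server-extras code from Spark 1.x to 2.x
// (illustrative; names and structure are assumptions, not the actual patch).
import java.util.List;

import org.apache.spark.sql.Dataset; // Spark 2.x: DataFrame is gone from the Java API
import org.apache.spark.sql.Row;
import org.apache.spark.sql.SQLContext;

public class Spark2MigrationSketch {

    static Row[] runSql(SQLContext sc, String sql) {
        // 1.x: DataFrame df = sc.sql(sql);
        // 2.x: sql() returns Dataset<Row>
        Dataset<Row> df = sc.sql(sql);

        // 1.x: df.collect() returned Row[] in Java.
        // 2.x: collect() erases to Object in Java; use collectAsList().
        List<Row> rows = df.collectAsList();
        return rows.toArray(new Row[0]);
    }

    static Row firstRow(SQLContext sc, String sql) {
        // 1.x: Row row = sc.sql(sql).take(1)[0];
        // 2.x: take() also erases to Object; takeAsList() is the Java-friendly form.
        return sc.sql(sql).takeAsList(1).get(0);
    }
}
```

Applying this pattern to JHiveTestLoaderJob, JHiveTestJob, and JSqlTestJob resolves the `cannot find symbol: class DataFrame` and `array required, but java.lang.Object found` errors above.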

I made some quick fixes that solved the problem for me, and I was able to deploy the jobserver. I submitted a pull request for it: https://github.com/spark-jobserver/spark-jobserver/pull/762