Scala program isn't seeing dependencies downloaded via SBT
I am writing a script to try to get Cassandra and Spark working together, but I can't even get the program to compile. I am using SBT as the build tool, and I have all the dependencies the program needs declared. The first time I ran sbt run it downloaded the dependencies, but when it starts compiling the Scala code shown below I get the following errors:
[info] Compiling 1 Scala source to /home/vagrant/ScalaTest/target/scala-2.10/classes...
[error] /home/vagrant/ScalaTest/src/main/scala/ScalaTest.scala:6: not found: type SparkConf
[error] val conf = new SparkConf(true)
[error] ^
[error] /home/vagrant/ScalaTest/src/main/scala/ScalaTest.scala:9: not found: type SparkContext
[error] val sc = new SparkContext("spark://192.168.10.11:7077", "test", conf)
[error] ^
[error] two errors found
[error] (compile:compileIncremental) Compilation failed
[error] Total time: 3 s, completed Jun 5, 2015 2:40:09 PM
Here is the SBT build file:
lazy val root = (project in file(".")).
  settings(
    name := "ScalaTest",
    version := "1.0"
  )

libraryDependencies += "com.datastax.spark" %% "spark-cassandra-connector" % "1.3.0-M1"
Here is the actual Scala program:
import com.datastax.spark.connector._

object ScalaTest {
  def main(args: Array[String]) {
    val conf = new SparkConf(true)
      .set("spark.cassandra.connection.host", "127.0.0.1")
    val sc = new SparkContext("spark://192.168.10.11:7077", "test", conf)
  }
}
And here is my directory structure:
- ScalaTest
  - build.sbt
  - project
  - src
    - main
      - scala
        - ScalaTest.scala
  - target
I don't know whether this is the whole problem, but you are not importing the SparkConf and SparkContext class definitions. So try adding the following to your Scala file:
import org.apache.spark.SparkConf
import org.apache.spark.SparkContext
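
For reference, a minimal sketch of what the full ScalaTest.scala would look like with those imports in place (this is just the code from the question with the two import lines added, nothing else changed):

// ScalaTest.scala - sketch with the missing imports added
import org.apache.spark.SparkConf
import org.apache.spark.SparkContext
import com.datastax.spark.connector._

object ScalaTest {
  def main(args: Array[String]) {
    // Spark configuration, pointing the Cassandra connector at localhost
    val conf = new SparkConf(true)
      .set("spark.cassandra.connection.host", "127.0.0.1")
    // Spark context connected to the standalone master
    val sc = new SparkContext("spark://192.168.10.11:7077", "test", conf)
  }
}

This assumes spark-core is pulled in transitively by the spark-cassandra-connector dependency (which the successful download step suggests); if it is not, it would also need to be added to libraryDependencies in build.sbt.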