java.lang.NoClassDefFoundError in scalatest
I have a test class like this:
import org.apache.spark.SparkContext
import org.scalatest.{BeforeAndAfterAll, ConfigMap, FunSuite}

class MyTrainingSuiteIT extends FunSuite with BeforeAndAfterAll {

  private[this] var _sc: SparkContext = null
  private[this] val defaultCoresNumber = 1
  private[this] val defaultMaster = s"local[$defaultCoresNumber]"
  private[this] val defaultName = "some-spark-integration-test"

  // The config map is populated from the -D arguments passed to the ScalaTest Runner.
  override def beforeAll(configMap: ConfigMap): Unit = {
    super.beforeAll()
    val mode = configMap.get("mode").get
    mode match {
      case "local" =>
        val coresNumber = configMap.get("cores").get
        _sc = new SparkContext(s"local[$coresNumber]", defaultName)
      case "docker" =>
        println("Docker was chosen")
        _sc = new SparkContext(defaultMaster, defaultName)
      case "cluster" =>
        val clusterType = configMap.get("clusterType").get
        println(s"Cluster of type [$clusterType] was chosen.")
        _sc = new SparkContext(defaultMaster, defaultName)
      case _ =>
        println("Unknown mode was chosen")
        _sc = new SparkContext(defaultMaster, defaultName)
    }
  }

  override def afterAll(): Unit = {
    _sc.stop()
    _sc = null
    super.afterAll()
  }

  test("Context testing") {
    assert(defaultMaster == s"local[$defaultCoresNumber]")
  }

  test("Fail test") {
    assert(3 === 2)
  }
}
First I compiled it in IntelliJ IDEA, and then I tried to run it from the terminal with a command like this:
scala -classpath /home/Downloads/scalatest_2.10.jar org.scalatest.tools.Runner -R /home/hspark/datasource-tests.jar -s package.name.MyTrainingSuiteIT -Dmode=local -Dcores=2
After I run ScalaTest, a window opens with this message:
Event: Run Aborted
Message: A needed class was not found. This could be due to an error in your runpath. Missing class: org/apache/spark/SparkContext
Summary: Total number of tests run: 0
Suites: completed 0, aborted 0
Tests: succeeded 0, failed 0, canceled 0, ignored 0, pending 0
Exception: java.lang.NoClassDefFoundError
How can I fix this?
Here is the working version of the scala command:
scala -classpath /home/Downloads/scalatest_2.10.jar:/home/spark/core-1.2.19.jar org.scalatest.tools.Runner -R /home/hspark/datasource-tests.jar -s package.name.MyTrainingSuiteIT -Dmode=local -Dcores=2
The exception was caused by the Spark library missing from the classpath.
As @Ben suggested, a build tool such as SBT can make running your tests easier.
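For reference, a minimal build.sbt sketch along these lines would let sbt resolve both dependencies so you don't have to assemble the classpath by hand. The version numbers here are assumptions; adjust them to match the jars you actually use:

name := "datasource-tests"

scalaVersion := "2.10.4"  // assumed, since you use scalatest_2.10.jar

// Placeholder versions; pick the ones matching your environment.
libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core" % "1.2.1",
  "org.scalatest"    %% "scalatest"  % "2.2.4" % "test"
)

With that in place, sbt test runs the suite with the right classpath. You can still pass the ScalaTest config map entries through sbt by forwarding runner arguments after --, e.g. sbt "testOnly package.name.MyTrainingSuiteIT -- -Dmode=local -Dcores=2".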