Running Spark interactively inside IntelliJ: `akka.version` not found

I am trying to run Spark in an IntelliJ Scala worksheet, but I get the error message `No configuration setting found for key 'akka.version'`.

Worksheet contents:

import org.apache.spark.SparkContext
val sc1 = new SparkContext("local[8]", "sc1")

Full stack trace:

import org.apache.spark.SparkContext
15/01/06 16:30:32 INFO spark.SecurityManager: Changing view acls to: tobber
15/01/06 16:30:32 INFO spark.SecurityManager: Changing modify acls to: tobber
15/01/06 16:30:32 INFO spark.SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(tobber); users with modify permissions: Set(tobber)
com.typesafe.config.ConfigException$Missing: No configuration setting found for key 'akka.version'
    at com.typesafe.config.impl.SimpleConfig.findKey(spark.sc0.tmp:111)
    at com.typesafe.config.impl.SimpleConfig.find(spark.sc0.tmp:132)
    at com.typesafe.config.impl.SimpleConfig.find(spark.sc0.tmp:138)
    at com.typesafe.config.impl.SimpleConfig.find(spark.sc0.tmp:146)
    at com.typesafe.config.impl.SimpleConfig.find(spark.sc0.tmp:151)
    at com.typesafe.config.impl.SimpleConfig.getString(spark.sc0.tmp:193)
    at akka.actor.ActorSystem$Settings.<init>(spark.sc0.tmp:132)
    at akka.actor.ActorSystemImpl.<init>(spark.sc0.tmp:466)
    at akka.actor.ActorSystem$.apply(spark.sc0.tmp:107)
    at akka.actor.ActorSystem$.apply(spark.sc0.tmp:100)
    at org.apache.spark.util.AkkaUtils$.org$apache$spark$util$AkkaUtils$doCreateActorSystem(spark.sc0.tmp:117)
    at org.apache.spark.util.AkkaUtils$anonfun.apply(spark.sc0.tmp:50)
    at org.apache.spark.util.AkkaUtils$anonfun.apply(spark.sc0.tmp:49)
    at org.apache.spark.util.Utils$anonfun$startServiceOnPort.apply$mcVI$sp(spark.sc0.tmp:1500)
    at scala.collection.immutable.Range.foreach$mVc$sp(spark.sc0.tmp:137)
    at org.apache.spark.util.Utils$.startServiceOnPort(spark.sc0.tmp:1491)
    at org.apache.spark.util.AkkaUtils$.createActorSystem(spark.sc0.tmp:52)
    at org.apache.spark.SparkEnv$.create(spark.sc0.tmp:149)
    at org.apache.spark.SparkContext.<init>(spark.sc0.tmp:200)
    at org.apache.spark.SparkContext.<init>(spark.sc0.tmp:115)
    at apps.A$A1$A$A1.sc$lzycompute(spark.sc0.tmp:2)
    at apps.A$A1$A$A1.sc(spark.sc0.tmp:2)
    at apps.A$A1$A$A1.get$instance$sc(spark.sc0.tmp:2)
    at #worksheet#.#worksheet#(spark.sc0.tmp:9)

The workaround is to use the Scala Console instead of the worksheet.

In your Spark project, open a Scala file and press Ctrl + Shift + D (Cmd + Shift + D on macOS) to launch the Scala Console. Paste your code into the console and run it with Ctrl + Enter (Cmd + Enter).
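For reference, here is a minimal sketch of what you might paste into that console. It reuses the `local[8]` master and app name from the worksheet above; the small `parallelize`/`reduce` sanity check and the final `sc.stop()` are additions so that repeated runs don't fail on an already-running SparkContext.

import org.apache.spark.SparkContext

// Same setup as the worksheet: a local SparkContext using 8 threads.
val sc = new SparkContext("local[8]", "sc1")

// Quick sanity check that the context actually works.
val sum = sc.parallelize(1 to 100).reduce(_ + _)
println(s"sum = $sum")

// Stop the context so pasting the snippet again does not collide
// with the still-running SparkContext from the previous run.
sc.stop()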