Spark unable to create SparkContext

I am trying to create or get a new/existing SparkContext using this method:

    private final val externalConfig = ConfigFactory.parseFile(new File("/mnt/extra.conf"))

    private final val conf = new SparkConf(true)
      .set("spark.cassandra.connection.host", externalConfig.getString("CASSANDRA_DB_IP"))
      .set("spark.cassandra.input.split.size_in_mb", "67108864")
      .set("spark.cassandra.output.throughput_mb_per_sec", "67108864")
      .set("spark.cassandra.output.consistency.level", "LOCAL_ONE")
      .set("spark.cassandra.input.consistency.level", "LOCAL_ONE")
      .set("spark.cassandra.auth.username", externalConfig.getString("CASSANDRA_USER"))
      .set("spark.cassandra.auth.password", externalConfig.getString("CASSANDRA_PASSWORD"))
      .set("spark.sql.crossJoin.enabled", "true")
      .set("spark.executor.memory", "4g")
      .set("spark.driver.memory", "2g")
      .set("spark.logConf", "false")
      .setAppName(externalConfig.getString("APPLICATION_NAME"))

    conf.setMaster("local[*]")

    val sc = SparkContext.getOrCreate(conf)

I am using Spark 2.0.2 with Scala 2.11. These are my Maven dependencies:

    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-core_2.11</artifactId>
        <version>2.0.2</version>
    </dependency>

    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-sql_2.11</artifactId>
        <version>2.0.2</version>
    </dependency>

    <!--
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-streaming_2.11</artifactId>
        <version>2.0.2</version>
    </dependency>
    -->

    <dependency>
        <groupId>com.datastax.spark</groupId>
        <artifactId>spark-cassandra-connector-unshaded_2.11</artifactId>
        <version>2.0.0-M3</version>
    </dependency>

    <dependency>
        <groupId>org.apache.commons</groupId>
        <artifactId>commons-lang3</artifactId>
        <version>3.4</version>
    </dependency>

But it gives me this error:

    io.netty.channel.ChannelException: Unable to create Channel from class class io.netty.channel.socket.nio.NioServerSocketChannel
       at io.netty.bootstrap.AbstractBootstrap$BootstrapChannelFactory.newChannel(AbstractBootstrap.java:455)
       at io.netty.bootstrap.AbstractBootstrap.initAndRegister(AbstractBootstrap.java:306)
       at io.netty.bootstrap.AbstractBootstrap.doBind(AbstractBootstrap.java:271)
       at io.netty.bootstrap.AbstractBootstrap.bind(AbstractBootstrap.java:267)
       at org.apache.spark.network.server.TransportServer.init(TransportServer.java:129)
       at org.apache.spark.network.server.TransportServer.<init>(TransportServer.java:74)
       at org.apache.spark.network.TransportContext.createServer(TransportContext.java:104)
       at org.apache.spark.rpc.netty.NettyRpcEnv.startServer(NettyRpcEnv.scala:118)
       at org.apache.spark.rpc.netty.NettyRpcEnvFactory$$anonfun.apply(NettyRpcEnv.scala:447)
       at org.apache.spark.rpc.netty.NettyRpcEnvFactory$$anonfun.apply(NettyRpcEnv.scala:446)
       at org.apache.spark.util.Utils$$anonfun$startServiceOnPort.apply$mcVI$sp(Utils.scala:2171)
       at scala.collection.immutable.Range.foreach$mVc$sp(Range.scala:160)
       at org.apache.spark.util.Utils$.startServiceOnPort(Utils.scala:2162)
       at org.apache.spark.rpc.netty.NettyRpcEnvFactory.create(NettyRpcEnv.scala:451)
       at org.apache.spark.rpc.RpcEnv$.create(RpcEnv.scala:44)
       at org.apache.spark.SparkEnv$.create(SparkEnv.scala:224)
       at org.apache.spark.SparkEnv$.createDriverEnv(SparkEnv.scala:165)
       at org.apache.spark.SparkContext.createSparkEnv(SparkContext.scala:256)
       at org.apache.spark.SparkContext.<init>(SparkContext.scala:420)
       at com.coreanalytics.connection.SparkConnection$.<init>(SparkConnection.scala:151)
       at com.coreanalytics.connection.SparkConnection$.<clinit>(SparkConnection.scala)
       at com.coreanalytics.connection.SparkConnection.getSc(SparkConnection.scala:41)
       at com.system.user.UserManagement.<init>(UserManagement.scala:51)
       at com.mobiledeviceapi.servlets.UserManagement.Authentication.doGet(Authentication.java:79)
       at com.mobiledeviceapi.servlets.UserManagement.Authentication.doPost(Authentication.java:122)
       at javax.servlet.http.HttpServlet.service(HttpServlet.java:648)
       at javax.servlet.http.HttpServlet.service(HttpServlet.java:729)
       at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:291)
       at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:206)
       at org.apache.tomcat.websocket.server.WsFilter.doFilter(WsFilter.java:52)
       at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:239)
       at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:206)
       at org.apache.catalina.core.StandardWrapperValve.invoke(StandardWrapperValve.java:219)
       at org.apache.catalina.core.StandardContextValve.invoke(StandardContextValve.java:106)
       at org.apache.catalina.authenticator.AuthenticatorBase.invoke(AuthenticatorBase.java:502)
       at org.apache.catalina.core.StandardHostValve.invoke(StandardHostValve.java:142)
       at org.apache.catalina.valves.ErrorReportValve.invoke(ErrorReportValve.java:79)
       at org.apache.catalina.valves.AbstractAccessLogValve.invoke(AbstractAccessLogValve.java:617)
       at org.apache.catalina.core.StandardEngineValve.invoke(StandardEngineValve.java:88)
       at org.apache.catalina.connector.CoyoteAdapter.service(CoyoteAdapter.java:518)
       at org.apache.coyote.http11.AbstractHttp11Processor.process(AbstractHttp11Processor.java:1091)
       at org.apache.coyote.AbstractProtocol$AbstractConnectionHandler.process(AbstractProtocol.java:668)
       at org.apache.tomcat.util.net.NioEndpoint$SocketProcessor.doRun(NioEndpoint.java:1521)
       at org.apache.tomcat.util.net.NioEndpoint$SocketProcessor.run(NioEndpoint.java:1478)
       at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
       at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
       at org.apache.tomcat.util.threads.TaskThread$WrappingRunnable.run(TaskThread.java:61)
       at java.lang.Thread.run(Thread.java:745)
Caused by: java.lang.NoSuchMethodError: io.netty.util.CharsetUtil.encoder(Ljava/nio/charset/Charset;)Ljava/nio/charset/CharsetEncoder;
       at io.netty.buffer.ByteBufUtil.<clinit>(ByteBufUtil.java:60)
       at io.netty.buffer.ByteBufAllocator.<clinit>(ByteBufAllocator.java:24)
       at io.netty.channel.DefaultChannelConfig.<init>(DefaultChannelConfig.java:53)
       at io.netty.channel.socket.DefaultServerSocketChannelConfig.<init>(DefaultServerSocketChannelConfig.java:45)
       at io.netty.channel.socket.nio.NioServerSocketChannel$NioServerSocketChannelConfig.<init>(NioServerSocketChannel.java:189)
       at io.netty.channel.socket.nio.NioServerSocketChannel$NioServerSocketChannelConfig.<init>(NioServerSocketChannel.java:187)
       at io.netty.channel.socket.nio.NioServerSocketChannel.<init>(NioServerSocketChannel.java:85)
       at io.netty.channel.socket.nio.NioServerSocketChannel.<init>(NioServerSocketChannel.java:70)
       at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
       at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
       at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
       at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
       at java.lang.Class.newInstance(Class.java:442)
       at io.netty.bootstrap.AbstractBootstrap$BootstrapChannelFactory.newChannel(AbstractBootstrap.java:453)
       ... 47 more

My configuration sets all the required options, and I don't understand what the actual problem is. Please help me.

Use this code to create a new SparkContext, and make sure you are using the correct dependencies:

    import org.apache.spark.{SparkConf, SparkContext}

    val conf = new SparkConf().setAppName("ABC").setMaster("local[*]")
    val sc = new SparkContext(conf)
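
Note that on Spark 2.x the usual entry point is `SparkSession` rather than a bare `SparkContext`. A minimal sketch (the app name "ABC" is just a placeholder):

    import org.apache.spark.sql.SparkSession

    // Builds a new session or reuses an existing one; the underlying
    // SparkContext is then available as spark.sparkContext.
    val spark = SparkSession.builder()
      .appName("ABC")
      .master("local[*]")
      .getOrCreate()

    val sc = spark.sparkContext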

You have a dependency conflict: two different versions of Netty end up on the classpath, which is what triggers the `NoSuchMethodError` on `io.netty.util.CharsetUtil` in your stack trace:

    io.netty netty-all 4.0.33.Final (from spark-cassandra-connector-unshaded_2.11)
    io.netty netty-all 4.0.29.Final (from spark-core_2.11)
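
You can confirm which versions each dependency pulls in with Maven's dependency tree, filtered to the io.netty group:

    mvn dependency:tree -Dincludes=io.netty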

Try the shaded Datastax library instead:

    <dependency>
        <groupId>com.datastax.spark</groupId>
        <artifactId>spark-cassandra-connector_2.11</artifactId>
        <version>2.0.0-RC1</version> <!-- or the latest version: 2.0.0 -->
    </dependency>
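
If you need to stay on the unshaded artifact, an alternative is to exclude its Netty dependency so that Spark's 4.0.29.Final wins. This is only a sketch: the exclusion coordinates assume the conflicting artifact is io.netty:netty-all as listed above, and you would still have to verify that the Cassandra driver works against Spark's older Netty (the shaded connector above is the safer fix):

    <dependency>
        <groupId>com.datastax.spark</groupId>
        <artifactId>spark-cassandra-connector-unshaded_2.11</artifactId>
        <version>2.0.0-M3</version>
        <!-- Assumed exclusion: drop the connector's Netty so Spark's version is used -->
        <exclusions>
            <exclusion>
                <groupId>io.netty</groupId>
                <artifactId>netty-all</artifactId>
            </exclusion>
        </exclusions>
    </dependency>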