"illegal cyclic reference involving" error with the Spark + Scala combination
I am facing an "illegal cyclic reference involving ..." error with the Spark + Scala combination.
Error Ocured during job for '1473170880000000' and Error Message is scala.reflect.internal.Symbols$CyclicReference: illegal cyclic reference involving method srcip
at scala.reflect.internal.Symbols$Symbol$$anonfun$info.apply(Symbols.scala:1220)
at scala.reflect.internal.Symbols$Symbol$$anonfun$info.apply(Symbols.scala:1218)
at scala.Function0$class.apply$mcV$sp(Function0.scala:40)
at scala.runtime.AbstractFunction0.apply$mcV$sp(AbstractFunction0.scala:12)
at scala.reflect.internal.Symbols$Symbol.lock(Symbols.scala:482)
at scala.reflect.internal.Symbols$Symbol.info(Symbols.scala:1218)
at scala.reflect.internal.Symbols$Symbol.initialize(Symbols.scala:1374)
at scala.reflect.internal.Symbols$Symbol.privateWithin(Symbols.scala:1169)
at scala.reflect.internal.Symbols$Symbol.hasAccessBoundary(Symbols.scala:1176)
at scala.reflect.internal.HasFlags$class.isPublic(HasFlags.scala:111)
at scala.reflect.internal.Symbols$Symbol.isPublic(Symbols.scala:112)
at com.datastax.spark.connector.util.ReflectionUtil$$anonfun.apply(ReflectionUtil.scala:77)
Error Ocured during job for '1453743420000000' and Error Message is scala.MatchError: <error> (of class scala.reflect.internal.Types$ErrorType$)
at com.datastax.spark.connector.util.ReflectionUtil$.returnType(ReflectionUtil.scala:113)
This error occurs when I try to execute more than one job simultaneously. It smells like a multithreading issue, doesn't it? I hit it either while loading data from Cassandra (when my first action executes) or while saving data to Cassandra with rdd.saveToCassandra(...). A minimal sketch of the scenario follows below.
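For context, this is roughly the shape of the concurrent submission that triggers it. The keyspace, table, and column names (ks, events, events_out, id, ts) and the connection host are hypothetical placeholders, not my actual schema:

    import com.datastax.spark.connector._
    import org.apache.spark.{SparkConf, SparkContext}

    object ConcurrentJobsRepro {
      def main(args: Array[String]): Unit = {
        val conf = new SparkConf()
          .setAppName("concurrent-cassandra-jobs")
          .set("spark.cassandra.connection.host", "127.0.0.1") // placeholder host
        val sc = new SparkContext(conf)

        // Two jobs submitted from separate threads. With Scala 2.10 binaries,
        // the connector's reflection calls (ReflectionUtil in the trace above)
        // can collide across threads and fail with CyclicReference/MatchError.
        val loadAndSave = new Thread(new Runnable {
          def run(): Unit = {
            val rows = sc.cassandraTable("ks", "events")        // load (first action)
              .map(r => (r.getString("id"), r.getLong("ts")))
            rows.saveToCassandra("ks", "events_out", SomeColumns("id", "ts"))
          }
        })
        val countOnly = new Thread(new Runnable {
          def run(): Unit = {
            println(sc.cassandraTable("ks", "events").count())  // second concurrent job
          }
        })
        loadAndSave.start(); countOnly.start()
        loadAndSave.join(); countOnly.join()
        sc.stop()
      }
    }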
My dependency details:
<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-core_2.10</artifactId>
    <version>1.6.2</version>
</dependency>
<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-sql_2.10</artifactId>
    <version>1.6.2</version>
</dependency>
<dependency>
    <groupId>com.datastax.spark</groupId>
    <artifactId>spark-cassandra-connector_2.10</artifactId>
    <version>1.6.0</version>
</dependency>
<dependency>
    <groupId>com.datastax.cassandra</groupId>
    <artifactId>dse-driver</artifactId>
    <version>1.1.0</version>
</dependency>
<dependency>
    <groupId>com.datastax.cassandra</groupId>
    <artifactId>cassandra-driver-core</artifactId>
    <version>3.0.2</version>
</dependency>
Driver error log: (see the stack traces above)
Any suggestions/help is highly appreciated. Has anyone faced this issue before?
The issue is finally resolved. My application and the Spark binaries were built against Scala 2.10. It seems Scala 2.10 has reflection/multithreading issues: runtime reflection in 2.10 is not thread-safe and only became thread-safe in Scala 2.11, which is also what was suggested on some forums I browsed.
The fix was to build my application with Scala 2.11 and use Spark libraries built against 2.11, and the problem went away.
Updated dependencies:
<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-core_2.11</artifactId>
    <version>1.6.2</version>
</dependency>
<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-sql_2.11</artifactId>
    <version>1.6.2</version>
</dependency>
<dependency>
    <groupId>com.datastax.spark</groupId>
    <artifactId>spark-cassandra-connector_2.11</artifactId>
    <version>1.6.0</version>
</dependency>
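Note that every Scala-based artifact (the Spark modules and the connector) must carry the same _2.11 suffix, while the Java drivers (dse-driver, cassandra-driver-core) have no Scala suffix and need no change. To confirm the switch took effect at runtime, a quick sanity check (a sketch, printable from anywhere in the application):

    // Prints the Scala version the application actually runs on, e.g. "version 2.11.8".
    // It should agree with the _2.11 suffix of the Spark/connector artifacts.
    println(scala.util.Properties.versionString)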
Hope this helps someone.