SPARK 2.0: spark-infotheoretic-feature-selection java.lang.NoSuchMethodError: breeze.linalg.DenseMatrix
I am trying to use the MRMR functionality of the third-party InfoGain package for Spark (https://github.com/sramirez/spark-infotheoretic-feature-selection). My cluster runs Spark 2.0, and I get the exception below even though I added all the required JAR files to the Spark classpath. It still does not work on the cluster, although the same code works fine on my local machine.
Exception:
18/03/29 01:16:43 WARN TaskSetManager: Lost task 3.0 in stage 14.0 (TID 47, EUREDWORKER3): java.lang.NoSuchMethodError: breeze.linalg.DenseMatrix$.canMapValues(Lscala/reflect/ClassTag;)Lbreeze/generic/UFunc$UImpl2;
at org.apache.spark.mllib.feature.InfoTheorySparse$$anonfun.apply(InfoTheory.scala:172)
at org.apache.spark.mllib.feature.InfoTheorySparse$$anonfun.apply(InfoTheory.scala:172)
at org.apache.spark.rdd.PairRDDFunctions$$anonfun$mapValues$$anonfun$apply$$anonfun$apply.apply(PairRDDFunctions.scala:759)
at org.apache.spark.rdd.PairRDDFunctions$$anonfun$mapValues$$anonfun$apply$$anonfun$apply.apply(PairRDDFunctions.scala:759)
at scala.collection.Iterator$$anon.next(Iterator.scala:409)
at org.apache.spark.storage.memory.MemoryStore.putIteratorAsValues(MemoryStore.scala:214)
at org.apache.spark.storage.BlockManager$$anonfun$doPutIterator.apply(BlockManager.scala:935)
at org.apache.spark.storage.BlockManager$$anonfun$doPutIterator.apply(BlockManager.scala:926)
at org.apache.spark.storage.BlockManager.doPut(BlockManager.scala:866)
at org.apache.spark.storage.BlockManager.doPutIterator(BlockManager.scala:926)
at org.apache.spark.storage.BlockManager.getOrElseUpdate(BlockManager.scala:670)
at org.apache.spark.rdd.RDD.getOrCompute(RDD.scala:330)
at org.apache.spark.rdd.RDD.iterator(RDD.scala:281)
at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:38)
at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:319)
at org.apache.spark.rdd.RDD$$anonfun.apply(RDD.scala:332)
at org.apache.spark.rdd.RDD$$anonfun.apply(RDD.scala:330)
at org.apache.spark.storage.BlockManager$$anonfun$doPutIterator.apply(BlockManager.scala:935)
at org.apache.spark.storage.BlockManager$$anonfun$doPutIterator.apply(BlockManager.scala:926)
at org.apache.spark.storage.BlockManager.doPut(BlockManager.scala:866)
at org.apache.spark.storage.BlockManager.doPutIterator(BlockManager.scala:926)
at org.apache.spark.storage.BlockManager.getOrElseUpdate(BlockManager.scala:670)
at org.apache.spark.rdd.RDD.getOrCompute(RDD.scala:330)
at org.apache.spark.rdd.RDD.iterator(RDD.scala:281)
at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:38)
at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:319)
at org.apache.spark.rdd.RDD.iterator(RDD.scala:283)
at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:38)
at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:319)
at org.apache.spark.rdd.RDD.iterator(RDD.scala:283)
at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:70)
at org.apache.spark.scheduler.Task.run(Task.scala:86)
at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:274)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:745)
Reference for the Spark classpath:
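For illustration, adding the jars to a Spark 2.x application typically looks like the sketch below; the jar paths are placeholders, spark.jars is the standard configuration key, and spark-submit --jars is the command-line equivalent.

import org.apache.spark.SparkConf
import org.apache.spark.sql.SparkSession

// Sketch only: placeholder paths for the feature-selection package jar and
// its dependencies. spark.jars ships the listed jars to the driver and to
// every executor, which is what matters in cluster mode.
val conf = new SparkConf()
  .setAppName("mrmr-feature-selection")
  .set("spark.jars",
    "/path/to/spark-infotheoretic-feature-selection.jar,/path/to/dependency.jar")

val spark = SparkSession.builder().config(conf).getOrCreate()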
It was a Breeze version problem. I had added an older version, breeze_2.11_0.11, and replacing it with breeze_2.11-0.13.2.jar solved the problem.
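In build terms, a minimal sketch of that fix, assuming an sbt project on Scala 2.11 (the coordinates are the standard Breeze artifact; 0.13.2 is the version that resolved the error above):

// build.sbt sketch: depend on the newer Breeze instead of the old 0.11 jar,
// so the executors resolve the breeze.linalg.DenseMatrix.canMapValues
// signature that the feature-selection package was compiled against.
scalaVersion := "2.11.8"

libraryDependencies += "org.scalanlp" %% "breeze" % "0.13.2"

Whichever way the jar is delivered (built into the application, or shipped with spark-submit --jars), the driver and every executor need to pick up breeze_2.11-0.13.2.jar rather than the older Breeze jar; otherwise the NoSuchMethodError comes back on the workers.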