How can I use nscala_time inside spark-shell?
I'm trying to test some code in spark-shell, and I need to set some time fields. We use nscala_time for DateTime functionality. When I run
$ scala -cp `ls -1 | tr "\n" ":"`
from the directory where I've staged my jars, everything works fine and I can run
scala> import com.github.nscala_time.time.Imports._
import com.github.nscala_time.time.Imports._
scala> val current = DateTime.now
current: org.joda.time.DateTime = 2015-04-23T10:44:35.984-07:00
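For context, the kind of field-setting we need looks roughly like the sketch below; the hour/minute/second setter shorthands are nscala_time conveniences as I understand them, so treat the exact method names as illustrative:
scala> // build a DateTime with specific time fields set (each call returns a new immutable DateTime)
scala> val meeting = DateTime.now.hour(19).minute(30).second(0)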
However, when I try the same thing with spark-shell 1.3.0:
$ spark-shell -cp `ls -1 | tr "\n" ":"`
I end up with errors when performing the same operations that worked in the scala console:
scala> import com.github.nscala_time.time.Imports._
import com.github.nscala_time.time.Imports._
scala> val current = DateTime.now
java.lang.NoSuchMethodError: scala.Predef$.$conforms()Lscala/Predef$$less$colon$less;
at com.github.nscala_time.time.LowPriorityOrderingImplicits$class.ReadableInstantOrdering(Implicits.scala:64)
at com.github.nscala_time.time.Imports$.ReadableInstantOrdering(Imports.scala:20)
at com.github.nscala_time.time.OrderingImplicits$class.$init$(Implicits.scala:56)
at com.github.nscala_time.time.Imports$.<init>(Imports.scala:20)
at com.github.nscala_time.time.Imports$.<clinit>(Imports.scala)
at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:22)
at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:27)
at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:29)
at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:31)
at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:33)
at $iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:35)
at $iwC$$iwC$$iwC$$iwC.<init>(<console>:37)
at $iwC$$iwC$$iwC.<init>(<console>:39)
at $iwC$$iwC.<init>(<console>:41)
at $iwC.<init>(<console>:43)
at <init>(<console>:45)
at .<init>(<console>:49)
at .<clinit>(<console>)
at .<init>(<console>:7)
at .<clinit>(<console>)
at $print(<console>)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.apache.spark.repl.SparkIMain$ReadEvalPrint.call(SparkIMain.scala:1065)
at org.apache.spark.repl.SparkIMain$Request.loadAndRun(SparkIMain.scala:1338)
at org.apache.spark.repl.SparkIMain.loadAndRunReq(SparkIMain.scala:840)
at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:871)
at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:819)
at org.apache.spark.repl.SparkILoop.reallyInterpret(SparkILoop.scala:856)
at org.apache.spark.repl.SparkILoop.interpretStartingWith(SparkILoop.scala:901)
at org.apache.spark.repl.SparkILoop.command(SparkILoop.scala:813)
at org.apache.spark.repl.SparkILoop.processLine(SparkILoop.scala:656)
at org.apache.spark.repl.SparkILoop.innerLoop(SparkILoop.scala:664)
at org.apache.spark.repl.SparkILoop.org$apache$spark$repl$SparkILoop$$loop(SparkILoop.scala:669)
at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process.apply$mcZ$sp(SparkILoop.scala:996)
at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process.apply(SparkILoop.scala:944)
at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process.apply(SparkILoop.scala:944)
at scala.tools.nsc.util.ScalaClassLoader$.savingContextLoader(ScalaClassLoader.scala:135)
at org.apache.spark.repl.SparkILoop.org$apache$spark$repl$SparkILoop$$process(SparkILoop.scala:944)
at org.apache.spark.repl.SparkILoop.process(SparkILoop.scala:1058)
at org.apache.spark.repl.Main$.main(Main.scala:31)
at org.apache.spark.repl.Main.main(Main.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:569)
at org.apache.spark.deploy.SparkSubmit$.doRunMain(SparkSubmit.scala:166)
at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:189)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:110)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Does anyone know why I can't use nscala_time in spark-shell?
I'm using spark 1.3.0, which has better error output. It told me:
scala.reflect.internal.Types$TypeError: bad symbolic reference. A signature in BuilderImplicits.class refers to term time in value org.joda which is not available.
So I downloaded the missing org.joda jar and ran spark-shell with it:
~ spark-shell --jars nscala-time_2.10-0.2.0.jar,joda-time-2.7.jar
...
scala> import com.github.nscala_time.time.Imports._
import com.github.nscala_time.time.Imports._
scala> val current = DateTime.now
current: org.joda.time.DateTime = 2015-04-23T22:35:50.344+03:00
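An alternative that avoids downloading jars by hand is spark-shell's --packages flag (added in Spark 1.3), which resolves the artifact and its joda-time dependency from Maven Central; the nscala-time version below is just illustrative:
$ spark-shell --packages com.github.nscala-time:nscala-time_2.10:1.8.0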
My other guess is that you should try running your code in paste mode (scala> :paste). The stack trace looks very close to this issue. Maybe the problem is in spark-shell itself.
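For reference, paste mode compiles the whole snippet as one unit instead of line by line, which can sidestep the REPL's per-line wrapping:
scala> :paste
// Entering paste mode (ctrl-D to finish)
import com.github.nscala_time.time.Imports._
val current = DateTime.now
// press ctrl-D here
// Exiting paste mode, now interpreting.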