How to import own scala package using spark-shell?

I have written a class for the spark-ml library that uses another of its classes. To be clear, my class is a wrapper around RandomForestClassifier. Now I would like to be able to import this class from the spark-shell.

So the question is: how do I make the package containing my own class importable from the spark-shell? Thanks a lot!

From the documentation:

In the Spark shell, a special interpreter-aware SparkContext is already created for you, in the variable called sc. Making your own SparkContext will not work. You can set which master the context connects to using the --master argument, and you can add JARs to the classpath by passing a comma-separated list to the --jars argument. You can also add dependencies (e.g. Spark Packages) to your shell session by supplying a comma-separated list of maven coordinates to the --packages argument. Any additional repositories where dependencies might exist (e.g. SonaType) can be passed to the --repositories argument.
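So in practice: compile your wrapper class into a jar and pass it via --jars when starting the shell, then import it as usual. A rough sketch (the jar path, package name, and class name below are placeholders for whatever you actually built, e.g. with sbt package):

$ spark-shell --jars /path/to/my-wrapper.jar

scala> import com.example.ml.MyRandomForestWrapper
scala> val clf = new MyRandomForestWrapper()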

If you want to import an uncompiled file such as Hello.scala, do the following in the spark shell:

scala> :load ./src/main/scala/Hello.scala
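Once the file has been loaded, anything it defines is available directly in the shell. For instance, assuming Hello.scala defines a class named Hello with a no-argument constructor, you could then do:

scala> val wrapper = new Hello()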