SBT cannot import Kafka encoder/decoder classes
Project setup:
- 1 producer - serializes objects and sends the bytes to Kafka
- 1 Spark consumer - should use DefaultDecoder from the kafka.serializer package to consume the bytes
Problem:
- SBT pulls in the correct libraries (kafka-clients + kafka_2.10), but no classes from the kafka_2.10 jar can be found.
- It seems to be searching under the wrong path (org.apache.spark.streaming.kafka instead of org.apache.kafka).
Error message:
[error] object serializer is not a member of package org.apache.spark.streaming.kafka
[error] import kafka.serializer.DefaultDecoder
sbt-tree
[info] +-org.apache.spark:spark-streaming-kafka_2.10:1.6.1
[info] | +-org.apache.kafka:kafka_2.10:0.8.2.1 [S] <-- **DefaultDecoder is in here but SBT can't find it (org.apache.kafka.serialization.DefaultDecoder)**
[info] | | +-org.apache.kafka:kafka-clients:0.8.2.1
build.sbt:
lazy val commonSettings = Seq(
organization := "org.RssReaderDemo",
version := "0.1.0",
scalaVersion := "2.10.6"
)
resolvers += "Artima Maven Repository" at "http://repo.artima.com/releases"
val spark = "org.apache.spark" % "spark-core_2.10" % "1.6.1"
val sparkStreaming = "org.apache.spark" % "spark-streaming_2.10" % "1.6.1"
val sparkStreamKafka = "org.apache.spark" % "spark-streaming-kafka_2.10" % "1.6.1"
// Needed to be able to parse the generated avro JSON schema
val jacksonMapperAsl = "org.codehaus.jackson" % "jackson-mapper-asl" % "1.9.13"
val scalactic = "org.scalactic" %% "scalactic" % "2.2.6"
val scalatest = "org.scalatest" %% "scalatest" % "2.2.6" % "test"
val avro = "org.apache.avro" % "avro" % "1.8.0"
lazy val root = (project in file(".")).
settings(commonSettings: _*).
settings(
libraryDependencies += spark,
libraryDependencies += sparkStreaming,
libraryDependencies += sparkStreamKafka,
libraryDependencies += jacksonMapperAsl,
libraryDependencies += scalactic,
libraryDependencies += scalatest,
libraryDependencies += avro
)
This has nothing to do with SBT. You probably have something like
import org.apache.spark.streaming._
import kafka.serializer.DefaultDecoder
Because the org.apache.spark.streaming.kafka package exists, this import resolves to org.apache.spark.streaming.kafka.serializer.DefaultDecoder. You can import the correct class like this: import _root_.kafka.serializer.DefaultDecoder. For more details on Scala imports, see https://wiki.scala-lang.org/display/SYGN/Language+FAQs#LanguageFAQs-HowdoIimport.
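As a sketch of how the fixed consumer could look with the `_root_` prefix (the app name, topic name, and broker address below are illustrative, not from the question):

```scala
// The _root_ prefix forces resolution from the top-level package,
// bypassing the relative lookup that lands in org.apache.spark.streaming.kafka.
import _root_.kafka.serializer.{DefaultDecoder, StringDecoder}
import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Seconds, StreamingContext}
import org.apache.spark.streaming.kafka.KafkaUtils

object ByteConsumer {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setAppName("RssReaderDemo").setMaster("local[2]")
    val ssc  = new StreamingContext(conf, Seconds(5))

    val kafkaParams = Map("metadata.broker.list" -> "localhost:9092")

    // Key decoded as a String, value kept as raw bytes via DefaultDecoder.
    val stream = KafkaUtils.createDirectStream[String, Array[Byte], StringDecoder, DefaultDecoder](
      ssc, kafkaParams, Set("rss-topic"))

    stream.foreachRDD(rdd => println(s"batch size: ${rdd.count()}"))

    ssc.start()
    ssc.awaitTermination()
  }
}
```

With `_root_.kafka.serializer` in place, it no longer matters whether the Spark streaming wildcard import comes before or after it.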
You need to "import kafka.serializer.StringDecoder" before "import org.apache.spark.streaming._". The import order fixes the problem.
Works -
import kafka.serializer.StringDecoder
import org.apache.spark.streaming._
Fails -
import org.apache.spark.streaming._
import kafka.serializer.StringDecoder
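The shadowing can be reproduced without Spark at all; a minimal sketch with made-up package names (spark, kafka, demo are all illustrative here):

```scala
// A nested `kafka` package inside `spark` is enough to shadow the
// top-level `kafka` package for later relative imports.
package spark {
  package kafka {
    object Placeholder
  }
}

package kafka {
  package serializer {
    class DefaultDecoder
  }
}

package demo {
  object Main {
    import spark._
    // After the wildcard import, a relative `kafka.serializer` lookup
    // would resolve against spark.kafka and fail to compile:
    //   import kafka.serializer.DefaultDecoder  // error
    // The _root_ prefix (or swapping the import order) avoids this:
    import _root_.kafka.serializer.DefaultDecoder
    val d = new DefaultDecoder
  }
}
```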