Error while deploying topology on storm cluster
I am trying to deploy a simple word count topology on a Storm cluster. I am using Kafka as the input (Kafka spout). This is the error I am getting:
java.lang.NoClassDefFoundError: Could not initialize class org.apache.log4j.Log4jLoggerFactory
    at org.apache.log4j.Logger.getLogger(Logger.java:39)
    at kafka.utils.Logging$class.logger(Logging.scala:24)
    at kafka.consumer.SimpleConsumer.logger$lzycompute(SimpleConsumer.scala:30)
    at kafka.consumer.SimpleConsumer.logger(SimpleConsumer.scala:30)
    at kafka.utils.Logging$class.info(Logging.scala:67)
    at kafka.consumer.SimpleConsumer.info(SimpleConsumer.scala:30)
    at kafka.consumer.SimpleConsumer.liftedTree1(SimpleConsumer.scala:74)
    at kafka.consumer.SimpleConsumer.kafka$consumer$SimpleConsumer$$sendRequest(SimpleConsumer.scala:68)
    at kafka.consumer.SimpleConsumer.getOffsetsBefore(SimpleConsumer.scala:127)
    at kafka.javaapi.consumer.SimpleConsumer.getOffsetsBefore(SimpleConsumer.scala:79)
    at storm.kafka.KafkaUtils.getOffset(KafkaUtils.java:77)
    at storm.kafka.KafkaUtils.getOffset(KafkaUtils.java:67)
    at storm.kafka.PartitionManager.(PartitionManager.java:83)
    at storm.kafka.ZkCoordinator.refresh(ZkCoordinator.java:98)
    at storm.kafka.ZkCoordinator.getMyManagedPartitions(ZkCoordinator.java:69)
    at storm.kafka.KafkaSpout.nextTuple(KafkaSpout.java:135)
    at backtype.storm.daemon.executor$fn__4654$fn__4669$fn__4698.invoke(executor.clj:565)
    at backtype.storm.util$async_loop$fn__458.invoke(util.clj:463)
    at clojure.lang.AFn.run(AFn.java:24)
    at java.lang.Thread.run(Thread.java:745)
Solved this problem by adding log4j-over-slf4j-1.6.6.jar to my Storm lib directory.
I am using:
- Storm 0.9.6
- Kafka 0.9.0
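For reference, the jar mentioned above corresponds to the standard org.slf4j:log4j-over-slf4j artifact. If you manage it through Maven rather than copying the jar by hand, the dependency would look roughly like this (the 1.6.6 version is taken from the jar name above; adjust it to whatever your cluster ships):

<dependency>
    <groupId>org.slf4j</groupId>
    <artifactId>log4j-over-slf4j</artifactId>
    <version>1.6.6</version>
</dependency>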
Storm 0.9.6 uses slf4j, so you need to exclude the slf4j-log4j12 binding from your Kafka dependency and keep log4j included. Below is my pom.xml:
<dependency>
    <groupId>org.apache.kafka</groupId>
    <artifactId>kafka_2.10</artifactId>
    <version>0.9.0.0</version>
    <exclusions>
        <exclusion>
            <groupId>org.apache.zookeeper</groupId>
            <artifactId>zookeeper</artifactId>
        </exclusion>
        <exclusion>
            <groupId>org.slf4j</groupId>
            <artifactId>slf4j-log4j12</artifactId>
        </exclusion>
        <!--<exclusion>-->
        <!--<groupId>log4j</groupId>-->
        <!--<artifactId>log4j</artifactId>-->
        <!--</exclusion>-->
    </exclusions>
</dependency>
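For completeness, the matching Storm dependencies in the same pom would typically look like the sketch below; storm-core is marked provided so the cluster's own slf4j jars (including log4j-over-slf4j) are used at runtime. The org.apache.storm coordinates are standard, but the 0.9.6 version is an assumption matching the setup above, so align it with your cluster:

<dependency>
    <groupId>org.apache.storm</groupId>
    <artifactId>storm-core</artifactId>
    <version>0.9.6</version>
    <!-- provided: the cluster already ships storm-core and its logging jars -->
    <scope>provided</scope>
</dependency>
<dependency>
    <groupId>org.apache.storm</groupId>
    <artifactId>storm-kafka</artifactId>
    <version>0.9.6</version>
</dependency>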