java.lang.ClassCastException: org.apache.hadoop.conf.Configuration cannot be cast to org.apache.hadoop.yarn.conf.YarnConfiguration
I am running a Spark application that uses YARN on Cloudera.
Spark version: 2.1
I am getting the following error:
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/data/yarn/nm/filecache/13/jars/slf4j-log4j12-1.7.16.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/opt/cloudera/parcels/CDH-5.10.2-1.cdh5.10.2.p0.5/jars/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
18/04/14 22:20:57 INFO util.SignalUtils: Registered signal handler for TERM
18/04/14 22:20:57 INFO util.SignalUtils: Registered signal handler for HUP
18/04/14 22:20:57 INFO util.SignalUtils: Registered signal handler for INT
Exception in thread "main" java.lang.ClassCastException: org.apache.hadoop.conf.Configuration cannot be cast to org.apache.hadoop.yarn.conf.YarnConfiguration
    at org.apache.spark.deploy.yarn.ApplicationMaster.<init>(ApplicationMaster.scala:60)
    at org.apache.spark.deploy.yarn.ApplicationMaster$$anonfun$main.apply$mcV$sp(ApplicationMaster.scala:764)
    at org.apache.spark.deploy.SparkHadoopUtil$$anon.run(SparkHadoopUtil.scala:67)
    at org.apache.spark.deploy.SparkHadoopUtil$$anon.run(SparkHadoopUtil.scala:66)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:415)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1656)
    at org.apache.spark.deploy.SparkHadoopUtil.runAsSparkUser(SparkHadoopUtil.scala:66)
    at org.apache.spark.deploy.yarn.ApplicationMaster$.main(ApplicationMaster.scala:763)
    at org.apache.spark.deploy.yarn.ApplicationMaster.main(ApplicationMaster.scala)
I solved this by verifying that the Spark version pointed to by the SPARK_HOME variable matches the Hadoop version installed in Cloudera. From https://spark.apache.org/downloads.html you can download the Spark build that matches the Hadoop version you need. The Hadoop version in Cloudera can be found with:
$ hadoop version
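On the cluster in this question that would print something like the following (the exact CDH build string is an assumption; a "Hadoop 2.6.0-cdh..." prefix means you need a Spark build for the Hadoop 2.6 line):

Hadoop 2.6.0-cdh5.10.2

With that, a matching SPARK_HOME can be set up roughly like this (package name and paths are illustrative):

# Spark 2.1 pre-built for Apache Hadoop 2.6, matching the CDH 5.10 Hadoop line
$ wget https://archive.apache.org/dist/spark/spark-2.1.0/spark-2.1.0-bin-hadoop2.6.tgz
$ tar -xzf spark-2.1.0-bin-hadoop2.6.tgz -C /opt
$ export SPARK_HOME=/opt/spark-2.1.0-bin-hadoop2.6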
I ran into the same problem when trying to launch a Spark job through the YARN REST API. The cause was the missing environment variable SPARK_YARN_MODE. After adding this environment variable, everything worked:
export SPARK_YARN_MODE=true
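For context, this matches the stack trace above: in Spark 2.1, SparkHadoopUtil.get returns the YARN-aware YarnSparkHadoopUtil, whose newConfiguration produces a YarnConfiguration, only when SPARK_YARN_MODE is set (as an environment variable or JVM system property). spark-submit sets it for you; a raw YARN REST API submission does not, so the ApplicationMaster sees a plain Configuration and the cast at ApplicationMaster.scala:60 fails.

When submitting through the REST API, the variable therefore has to reach the ApplicationMaster's container environment. A minimal sketch of the relevant fragment of the request (host, port, and the trimmed body are illustrative; a real submission also needs application-id, commands, local-resources, and so on):

$ curl -X POST -H "Content-Type: application/json" \
    http://<resource-manager>:8088/ws/v1/cluster/apps \
    -d '{
          "am-container-spec": {
            "environment": {
              "entry": [ { "key": "SPARK_YARN_MODE", "value": "true" } ]
            }
          }
        }'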