Why does start-all.sh from root cause "failed to launch org.apache.spark.deploy.master.Master: JAVA_HOME is not set"?

I am trying to execute a Spark application built with Scala IDE through my standalone Spark service running on the Cloudera QuickStart VM 5.3.0.

The JAVA_HOME of my cloudera account is /usr/java/default.

However, I get the following error messages when executing the start-all.sh command from the cloudera user:

[cloudera@localhost sbin]$ pwd
/opt/cloudera/parcels/CDH-5.3.0-1.cdh5.3.0.p0.30/lib/spark/sbin
[cloudera@localhost sbin]$ ./start-all.sh
chown: changing ownership of `/opt/cloudera/parcels/CDH-5.3.0-1.cdh5.3.0.p0.30/lib/spark/sbin/../logs': Operation not permitted
starting org.apache.spark.deploy.master.Master, logging to /opt/cloudera/parcels/CDH-5.3.0-1.cdh5.3.0.p0.30/lib/spark/sbin/../logs/spark-cloudera-org.apache.spark.deploy.master.Master-1-localhost.localdomain.out
/opt/cloudera/parcels/CDH-5.3.0-1.cdh5.3.0.p0.30/lib/spark/sbin/spark-daemon.sh: line 151: /opt/cloudera/parcels/CDH-5.3.0-1.cdh5.3.0.p0.30/lib/spark/sbin/../logs/spark-cloudera-org.apache.spark.deploy.master.Master-1-localhost.localdomain.out: Permission denied
failed to launch org.apache.spark.deploy.master.Master:
tail: cannot open `/opt/cloudera/parcels/CDH-5.3.0-1.cdh5.3.0.p0.30/lib/spark/sbin/../logs/spark-cloudera-org.apache.spark.deploy.master.Master-1-localhost.localdomain.out' for reading: No such file or directory
full log in /opt/cloudera/parcels/CDH-5.3.0-1.cdh5.3.0.p0.30/lib/spark/sbin/../logs/spark-cloudera-org.apache.spark.deploy.master.Master-1-localhost.localdomain.out
cloudera@localhost's password: 
localhost: chown: changing ownership of `/opt/cloudera/parcels/CDH-5.3.0-1.cdh5.3.0.p0.30/lib/spark/logs': Operation not permitted
localhost: starting org.apache.spark.deploy.worker.Worker, logging to /opt/cloudera/parcels/CDH-5.3.0-1.cdh5.3.0.p0.30/lib/spark/logs/spark-cloudera-org.apache.spark.deploy.worker.Worker-1-localhost.localdomain.out
localhost: /opt/cloudera/parcels/CDH-5.3.0-1.cdh5.3.0.p0.30/lib/spark/sbin/spark-daemon.sh: line 151: /opt/cloudera/parcels/CDH-5.3.0-1.cdh5.3.0.p0.30/lib/spark/logs/spark-cloudera-org.apache.spark.deploy.worker.Worker-1-localhost.localdomain.out: Permission denied
localhost: failed to launch org.apache.spark.deploy.worker.Worker:
localhost: tail: cannot open `/opt/cloudera/parcels/CDH-5.3.0-1.cdh5.3.0.p0.30/lib/spark/logs/spark-cloudera-org.apache.spark.deploy.worker.Worker-1-localhost.localdomain.out' for reading: No such file or directory
localhost: full log in /opt/cloudera/parcels/CDH-5.3.0-1.cdh5.3.0.p0.30/lib/spark/logs/spark-cloudera-org.apache.spark.deploy.worker.Worker-1-localhost.localdomain.out

I added export CMF_AGENT_JAVA_HOME=/usr/java/default to /etc/default/cloudera-scm-agent and ran sudo service cloudera-scm-agent restart. See How to set CMF_AGENT_JAVA_HOME.
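For reference, that step amounts to roughly the following; the exact placement of the line inside /etc/default/cloudera-scm-agent may differ on your VM:

# append the JDK path for the Cloudera Manager agent
echo 'export CMF_AGENT_JAVA_HOME=/usr/java/default' | sudo tee -a /etc/default/cloudera-scm-agent

# restart the agent so it picks up the new variable
sudo service cloudera-scm-agent restart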

I also added export JAVA_HOME=/usr/java/default inside the locate_java_home function definition in the file /usr/share/cmf/bin/cmf-server and restarted the cluster and the standalone Spark service.
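The shape of that edit was roughly as follows; the function body shown here is only a placeholder, not the actual contents of the Cloudera script:

locate_java_home() {
  # added line: pin the JDK location instead of relying on auto-detection
  export JAVA_HOME=/usr/java/default
  # ... original detection logic of the script continues here ...
}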

However, the following error keeps appearing when starting the Spark service from the root user:

[root@localhost spark]# sbin/start-all.sh
starting org.apache.spark.deploy.master.Master, logging to /opt/cloudera/parcels/CDH-5.3.0-1.cdh5.3.0.p0.30/lib/spark/sbin/../logs/spark-root-org.apache.spark.deploy.master.Master-1-localhost.localdomain.out
failed to launch org.apache.spark.deploy.master.Master:
  JAVA_HOME is not set
full log in /opt/cloudera/parcels/CDH-5.3.0-1.cdh5.3.0.p0.30/lib/spark/sbin/../logs/spark-root-org.apache.spark.deploy.master.Master-1-localhost.localdomain.out
root@localhost's password: 
localhost: Connection closed by UNKNOWN

Can someone please suggest how to set JAVA_HOME so that the Spark standalone service can be started on Cloudera Manager?

The solution was quite simple and straightforward. I just added export JAVA_HOME=/usr/java/default to /root/.bashrc, and it successfully started the Spark services from the root user without the JAVA_HOME is not set error. Hope it helps someone facing the same problem.
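Concretely, the fix amounts to something like the lines below, assuming the JDK really is installed at /usr/java/default as in the question:

# as root, append the variable to /root/.bashrc
echo 'export JAVA_HOME=/usr/java/default' >> /root/.bashrc

# reload the file and check the value before re-running the scripts
source /root/.bashrc
echo $JAVA_HOME

# then, from the Spark directory
sbin/start-all.sh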

Set the JAVA_HOME variable in ~/.bashrc as follows:

sudo gedit ~/.bashrc

Add this line to the file (using the path where your JDK is installed):

JAVA_HOME="/usr/lib/jvm/java-11-openjdk-amd64"

Then run:

source ~/.bashrc
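
You can then check that the shell which launches Spark actually sees the variable, for example:

echo $JAVA_HOME
"$JAVA_HOME/bin/java" -version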