Spark failed to launch org.apache.spark.deploy.worker.Worker on master
I have set up a Spark standalone cluster on two Ubuntu servers (a master and one slave).
My /conf/spark-env.sh (copied from spark-env.sh.template) contains the following:
SPARK_MASTER_HOST="master"
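As an aside, spark-env.sh is sourced as a shell script, so the "SPARK_LOCAL_IP: command not found" message further down in the log usually means that line 28 of the file has a space around the = (or is otherwise not a plain assignment). A minimal sketch of the two settings, with the local IP as a placeholder to replace with your node's own address:
export SPARK_MASTER_HOST="master"
export SPARK_LOCAL_IP="192.168.1.10"   # placeholder: this node's bind address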
I successfully started the Spark master on master with the command below.
sudo /opt/spark/sbin/start-master.sh
But starting the Spark slave (worker) on master results in an error:
hadoop@master:/opt/spark$ sudo ./sbin/start-slave.sh --master spark://master:7077 --cores 2
/opt/spark/conf/spark-env.sh: line 28: SPARK_LOCAL_IP: command not found
starting org.apache.spark.deploy.worker.Worker, logging to /opt/spark/logs/spark-root-org.apache.spark.deploy.worker.Worker-1-master.out
failed to launch: nice -n 0 /opt/spark/bin/spark-class org.apache.spark.deploy.worker.Worker --webui-port 8081 --master spark://master:7077 --cores 2
Options:
-c CORES, --cores CORES Number of cores to use
-m MEM, --memory MEM Amount of memory to use (e.g. 1000M, 2G)
-d DIR, --work-dir DIR Directory to run apps in (default: SPARK_HOME/work)
-i HOST, --ip IP Hostname to listen on (deprecated, please use --host or -h)
-h HOST, --host HOST Hostname to listen on
-p PORT, --port PORT Port to listen on (default: random)
--webui-port PORT Port for web UI (default: 8081)
--properties-file FILE Path to a custom Spark properties file.
Default is conf/spark-defaults.conf.
full log in /opt/spark/logs/spark-root-org.apache.spark.deploy.worker.Worker-1-master.out
I have searched a lot, but I couldn't figure out what my problem is.
I found my mistake: I should not have used the --master keyword, and should have just run the command
hadoop@master:/opt/spark$ sudo ./sbin/start-slave.sh spark://master:7077
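If you still want to cap the worker at two cores, my reading of the usage output above is that start-slave.sh takes the master URL as the first positional argument and the worker options (-c, -m, etc.) after it; the exact flag placement here is an assumption based on that usage text:
sudo ./sbin/start-slave.sh spark://master:7077 -c 2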
I followed the steps in this tutorial: https://phoenixnap.com/kb/install-spark-on-ubuntu
Tip: make sure all dependencies are installed beforehand:
sudo apt install scala git -y
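Once the worker launches without the usage error, it should register with the master; a quick sanity check (assuming a standard JDK install) is to list the running JVMs:
jps    # should list both Master and Worker on this host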