How do I install Spark as a daemon?

I used this guide to set up Spark as a master and a slave on two machines:
https://www.tutorialkart.com/apache-spark/how-to-setup-an-apache-spark-cluster/
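Roughly, the manual startup from that guide boils down to the following (the install path and master address here match my setup, as shown in the unit file below):

# on the master machine (172.16.3.90):
/usr/lib/spark/sbin/start-master.sh

# on the slave machine, pointing at the master's spark:// URL:
/usr/lib/spark/sbin/start-slave.sh spark://172.16.3.90:7077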
I then created a systemd .service file for each of them, but when I start them as services they do not stay running. Here is my systemctl status:

● sparkslave.service - Spark Slave
   Loaded: loaded (/etc/systemd/system/sparkslave.service; enabled; vendor preset: enabled)
   Active: inactive (dead) since Mon 2019-12-09 07:30:22 EST; 55s ago
  Process: 31680 ExecStart=/usr/lib/spark/sbin/start-slave.sh spark://172.16.3.90:7077 (code=exited, status=0/SUCCESS)
 Main PID: 31680 (code=exited, status=0/SUCCESS)

Dec 09 07:30:19 SparkSlave1 systemd[1]: Started Spark Slave.
Dec 09 07:30:19 SparkSlave1 start-slave.sh[31680]: starting org.apache.spark.deploy.worker.Worker, logging to /usr/lib/spark/logs/spark-spark-user-org.apache.spark.deploy.worker.Worker-1-SparkSlave1.out

Here is my sparkslave.service:

[Unit]
Description=Spark Slave
After=network.target

[Service]
User=spark-user
WorkingDirectory=/usr/lib/spark/sbin
ExecStart=/usr/lib/spark/sbin/start-slave.sh spark://172.16.3.90:7077
Restart=on-failure
RestartSec=10s

[Install]
WantedBy=multi-user.target

What is the problem?

The service type must be changed from simple to forking. start-slave.sh does not stay in the foreground: it spawns the Worker as a background daemon and exits, which is why your log shows status=0/SUCCESS followed by inactive (dead). With the default Type=simple, systemd treats the ExecStart process itself as the service, so the unit is considered stopped as soon as the script returns. Type=forking tells systemd that the start command is expected to fork and exit, and that the forked child is the real service:

[Service]
Type=forking
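With that change, the complete unit file looks like this (same paths and master URL as in your question; the ExecStop line is an optional extra that assumes the stock stop-slave.sh script shipped in Spark's sbin directory):

[Unit]
Description=Spark Slave
After=network.target

[Service]
# start-slave.sh forks the Worker and exits, so let systemd track the child
Type=forking
User=spark-user
WorkingDirectory=/usr/lib/spark/sbin
ExecStart=/usr/lib/spark/sbin/start-slave.sh spark://172.16.3.90:7077
ExecStop=/usr/lib/spark/sbin/stop-slave.sh
Restart=on-failure
RestartSec=10s

[Install]
WantedBy=multi-user.target

After editing the file, reload systemd and restart the unit:

sudo systemctl daemon-reload
sudo systemctl restart sparkslave.service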