Failed to find 'spark-submit2.cmd'

> library('BBmisc')
> library('sparklyr')
> sc <- spark_connect(master = 'local')
Error in start_shell(master = master, spark_home = spark_home, spark_version = version,  : 
  Failed to find 'spark-submit2.cmd' under 'C:\Users\Owner\AppData\Local\spark\spark-3.0.0-bin-hadoop2.7', please verify - SPARK_HOME.
> spark_home_dir()
[1] "C:\Users\Owner\AppData\Local/spark/spark-3.0.0-bin-hadoop2.7"
> spark_installed_versions()
  spark hadoop                                                              dir
1 3.0.0    2.7 C:\Users\Owner\AppData\Local/spark/spark-3.0.0-bin-hadoop2.7
> spark_home_set()
Setting SPARK_HOME environment variable to C:\Users\Owner\AppData\Local/spark/spark-3.0.0-bin-hadoop2.7
> sc <- spark_connect(master = 'local')
Error in start_shell(master = master, spark_home = spark_home, spark_version = version,  : 
  Failed to find 'spark-submit2.cmd' under 'C:\Users\Owner\AppData\Local\spark\spark-3.0.0-bin-hadoop2.7', please verify - SPARK_HOME.

Source: https://github.com/englianhu/binary.com-interview-question/issues/1#issue-733943885

How do I fix "Failed to find 'spark-submit2.cmd' under 'C:\Users\Owner\AppData\Local\spark\spark-3.0.0-bin-hadoop2.7', please verify - SPARK_HOME."?

Reference: Need help getting started with Spark and sparklyr

Solved!!!

Steps:

  1. Download Spark from https://spark.apache.org/downloads.html
  2. Unzip the downloaded archive to 'C:/Users/scibr/AppData/Local/spark/spark-3.0.1-bin-hadoop3.2'
  3. Manually point SPARK_HOME at the new version: spark_home_set('C:/Users/scibr/AppData/Local/spark/spark-3.0.1-bin-hadoop3.2')
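The steps above amount to the R session below; this is a sketch assuming a fresh Spark 3.0.1 / Hadoop 3.2 distribution has already been downloaded and unzipped to the path shown (adjust the path for your machine):

```r
library(sparklyr)

# Point SPARK_HOME at the freshly unzipped distribution instead of the
# stale install under AppData that is missing spark-submit2.cmd.
spark_home_set('C:/Users/scibr/AppData/Local/spark/spark-3.0.1-bin-hadoop3.2')

# Confirm sparklyr now sees the new version.
spark_installed_versions()

# With SPARK_HOME pointing at a complete distribution, connecting works.
sc <- spark_connect(master = 'local')
```

If the old, broken install is still listed, removing its directory (or calling spark_uninstall() for that version) keeps sparklyr from picking it up again.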

GitHub source: https://github.com/englianhu/binary.com-interview-question/issues/1#event-3968919946