Error installing Spark with the sparklyr package
I am trying to install Spark via sparklyr on a Mac (macOS Catalina). When I run spark_install(), it starts downloading the package and then fails. See the session below to reproduce.
> library(sparklyr)
> packageVersion("sparklyr")
[1] ‘1.5.2’
> system("java -version")
java version "15.0.2" 2021-01-19
Java(TM) SE Runtime Environment (build 15.0.2+7-27)
Java HotSpot(TM) 64-Bit Server VM (build 15.0.2+7-27, mixed mode, sharing)
> spark_install("3.0")
Installing Spark 3.0.1 for Hadoop 3.2 or later.
Downloading from:
- 'https://archive.apache.org/dist/spark/spark-3.0.1/spark-3.0.1-bin-hadoop3.2.tgz'
Installing to:
- '~/spark/spark-3.0.1-bin-hadoop3.2'
trying URL 'https://archive.apache.org/dist/spark/spark-3.0.1/spark-3.0.1-bin-hadoop3.2.tgz'
Content type 'application/x-gzip' length 224062525 bytes (213.7 MB)
===========
downloaded 50.0 MB
Error in download.file(installInfo$packageRemotePath, destfile = installInfo$packageLocalPath, :
download from 'https://archive.apache.org/dist/spark/spark-3.0.1/spark-3.0.1-bin-hadoop3.2.tgz' failed
I also posted this question on the sparklyr GitHub page, where Yitao Li provided the following answer:
https://github.com/sparklyr/sparklyr/issues/2936
I am repeating the answer here in case it helps others:
run options(timeout=300)
and then install the package again.
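The fix works because R's download.file() aborts any download that exceeds the timeout option, which defaults to 60 seconds; the ~214 MB Spark archive can easily take longer than that on a slower connection. A minimal sketch of the retry (300 is just an example value, not a hard requirement):

```r
# Raise R's download timeout from the default 60 seconds so the
# large Spark archive has enough time to finish downloading.
options(timeout = 300)  # in seconds; increase further if it still fails

library(sparklyr)

# Retry the installation with the longer timeout in effect.
spark_install("3.0")
```

If the download still times out, a larger value such as options(timeout=600) should behave the same way, since the option simply caps how long download.file() is allowed to run.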