Creating sparkContext on Google Colab gives: `RuntimeError: Java gateway process exited before sending its port number`
Here are the dependencies; they installed successfully.
!apt-get install openjdk-8-jre
!apt-get install scala
!pip install py4j
!wget -q https://downloads.apache.org/spark/spark-2.4.8/spark-2.4.8-bin-hadoop2.7.tgz
!tar xf spark-2.4.8-bin-hadoop2.7.tgz
!pip install -q findspark
Now create the spark context:
# Setting up environment variables
import os
os.environ["JAVA_HOME"] = "/usr/lib/jvm/java-8-openjdk-amd64"
os.environ["SPARK_HOME"] = "/content/spark-2.4.8-bin-hadoop2.7"
# export PYSPARK_SUBMIT_ARGS ="--master local[2]"
# Importing and initiating spark
import findspark
findspark.init()
from pyspark.sql import SparkSession
spark = SparkSession.builder.master("local[*]").appName("Test Setup").getOrCreate()
sc = spark.sparkContext
I get this error:
RuntimeError: Java gateway process exited before sending its port number
Note that this is a Colab notebook. Any kind of help would be great.
As an alternative, you can install PySpark from PyPI:
For Python users, PySpark also provides pip installation from PyPI. This is usually for local usage or as a client to connect to a cluster instead of setting up a cluster itself.
Install pyspark + openjdk:
%pip install pyspark==2.4.8
!apt-get install openjdk-8-jdk-headless -qq > /dev/null
Create the spark session:
from pyspark.sql import SparkSession
spark = SparkSession.builder\
.master("local[*]")\
.appName("Test Setup")\
.getOrCreate()
Tested in a Google Colab notebook.