Change the default pyspark context

I am running Cassandra on the host 10.0.0.60. When I start pyspark, I get a default context sc. However, I believe it is pointing at Cassandra on 127.0.0.1.

How do I change it so that it points to 10.0.0.60?

[idf@node1 python]$ pyspark
Python 2.7.11 |Anaconda custom (64-bit)| (default, Dec  6 2015, 18:08:32) 
Type "copyright", "credits" or "license" for more information.

IPython 4.1.2 -- An enhanced Interactive Python.
?         -> Introduction and overview of IPython's features.
%quickref -> Quick reference.
help      -> Python's own help system.
object?   -> Details about 'object', use 'object??' for extra details.
16/05/18 10:40:46 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Welcome to
      ____              __
     / __/__  ___ _____/ /__
    _\ \/ _ \/ _ `/ __/  '_/
   /__ / .__/\_,_/_/ /_/\_\   version 1.6.1
      /_/

Using Python version 2.7.11 (default, Dec  6 2015 18:08:32)
SparkContext available as sc, HiveContext available as sqlContext.

In [1]: 

You can pass the required configuration when launching pyspark:

pyspark --conf spark.cassandra.connection.host=10.0.0.60
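Once the shell starts, you can verify that the setting took effect by reading it back from the live context (a quick check; getConf() returns a copy of the SparkConf the context was built with):

sc.getConf().get("spark.cassandra.connection.host")  # expected to print u'10.0.0.60'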

Or add it to SPARK_HOME/conf/spark-defaults.conf:
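For example, a minimal entry (same property as in the --conf flag above; the file takes whitespace-separated key/value pairs):

spark.cassandra.connection.host    10.0.0.60

With that in place, every pyspark (or spark-submit) session picks up the Cassandra host without needing the --conf flag.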