How to add 2 conf in SparkContext() function?

I tried the following:

from pyspark import SparkConf, SparkContext

conf = (
    SparkConf().set("spark.driver.maxResultSize", "0"),
    SparkConf().set("spark.sql.autoBroadcastJoinThreshold", "-1")
)

sc = SparkContext(conf=conf)

However, I get the following error: AttributeError: 'tuple' object has no attribute 'get'
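The parentheses plus the comma build a tuple of two separate SparkConf objects rather than a single SparkConf, and SparkContext fails as soon as it tries to read a setting from it via .get(...). A minimal sketch showing the type (nothing here starts a context):

from pyspark import SparkConf

conf = (
    SparkConf().set("spark.driver.maxResultSize", "0"),
    SparkConf().set("spark.sql.autoBroadcastJoinThreshold", "-1"),
)

# conf is a tuple of two SparkConf objects, not a SparkConf,
# so it has no get() method for SparkContext to call
print(type(conf))            # <class 'tuple'>
print(hasattr(conf, "get"))  # False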

This works:

conf = (
    SparkConf().set("spark.driver.maxResultSize", "0")
)

sc = SparkContext(conf=conf)

OK, I think this works:

import pyspark

conf = pyspark.SparkConf().setAll([
    ("spark.driver.maxResultSize", "0"),
    ("spark.sql.autoBroadcastJoinThreshold", "-1"),
])

sc = SparkContext(conf=conf)
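As a quick sanity check (a sketch that assumes the sc created just above; getConf() and getAll() are standard PySpark methods), the effective settings can be read back from the running context:

# read the effective configuration back from the context
# to confirm that both values from setAll() were applied
for key, value in sc.getConf().getAll():
    if key in ("spark.driver.maxResultSize",
               "spark.sql.autoBroadcastJoinThreshold"):
        print(key, "=", value)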
conf = SparkConf()
conf = conf.set("spark.driver.maxResultSize", "0")
conf = conf.set("spark.sql.autoBroadcastJoinThreshold", "-1")
sc = SparkContext(conf=conf)

conf = (
    SparkConf()
    .set("spark.driver.maxResultSize", "0")
    .set("spark.sql.autoBroadcastJoinThreshold", "-1")
)
sc = SparkContext(conf=conf)

Both approaches are possible (and are in fact equivalent).
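They are equivalent because SparkConf.set mutates the conf in place and returns the same object, which is what makes the chained form work. A minimal sketch:

from pyspark import SparkConf

conf = SparkConf()
# set() returns the SparkConf itself, so chaining works and the
# reassignment in the first variant is redundant
print(conf.set("spark.driver.maxResultSize", "0") is conf)  # True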

You can also call set without reassigning the conf variable:

conf = SparkConf()
conf.set("spark.driver.maxResultSize", "0")
conf.set("spark.sql.autoBroadcastJoinThreshold", "-1")
sc = SparkContext(conf=conf)