Scala from_json function is throwing an error when I use options

Below is my code:

val opts = Map("allowUnquotedFieldNames" -> "true")
    
val df_withSchema = df.withColumn("Data", from_json(col("Item.Data.S"),schema, opts))

The error is as follows:

<console>:198: error: overloaded method value from_json with alternatives:
  (e: org.apache.spark.sql.Column,schema: org.apache.spark.sql.Column,options: java.util.Map[String,String])org.apache.spark.sql.Column <and>
  (e: org.apache.spark.sql.Column,schema: String,options: scala.collection.immutable.Map[String,String])org.apache.spark.sql.Column <and>
  (e: org.apache.spark.sql.Column,schema: String,options: java.util.Map[String,String])org.apache.spark.sql.Column <and>
  (e: org.apache.spark.sql.Column,schema: org.apache.spark.sql.types.DataType,options: java.util.Map[String,String])org.apache.spark.sql.Column <and>
  (e: org.apache.spark.sql.Column,schema: org.apache.spark.sql.types.StructType,options: java.util.Map[String,String])org.apache.spark.sql.Column <and>
  (e: org.apache.spark.sql.Column,schema: org.apache.spark.sql.types.DataType,options: scala.collection.immutable.Map[String,String])org.apache.spark.sql.Column <and>
  (e: org.apache.spark.sql.Column,schema: org.apache.spark.sql.types.StructType,options: scala.collection.immutable.Map[String,String])org.apache.spark.sql.Column
 cannot be applied to (org.apache.spark.sql.Column, org.apache.spark.sql.types.StructType, scala.collection.Map[String,String])
           val df_withSchema = df.withColumn("Data", from_json(col("Item.Data.S"),schema, opts))
                                                     ^

Can someone help me understand why this code throws an error and how to fix it?

Background:

The Data element is nested JSON stored as a string, so I defined a schema for it and used `from_json` to convert it. In addition, the field names inside Data are not quoted, which is why I am trying to pass the option (Map("allowUnquotedFieldNames" -> "true")) here.

It wants a `scala.collection.immutable.Map`, while you have a `scala.collection.Map`...

I know, one would naively expect to be able to pass any kind of Map to code that promises not to modify it, but that's not how it works. It is actually the other way around: `immutable.Map` is a subclass of `Map`.
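To illustrate the direction of the subtyping: in plain Scala (standard collections only, no Spark required), an `immutable.Map` can always be used where a general `scala.collection.Map` is expected, but not vice versa, and `.toMap` performs the narrowing conversion:

```scala
object MapVarianceDemo {
  def main(args: Array[String]): Unit = {
    // A general Map reference may point at a mutable implementation,
    // so the compiler cannot accept it where immutable.Map is required.
    val general: scala.collection.Map[String, String] =
      scala.collection.mutable.Map("allowUnquotedFieldNames" -> "true")

    // This direction would NOT compile:
    // val im: scala.collection.immutable.Map[String, String] = general

    // .toMap builds a genuinely immutable copy, satisfying the stricter type.
    val im: scala.collection.immutable.Map[String, String] = general.toMap

    // The widening direction always works: immutable.Map <: Map.
    val widened: scala.collection.Map[String, String] = im

    println(im("allowUnquotedFieldNames"))
    println(widened.size)
  }
}
```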

Try `opts.toMap`; that should fix the problem.
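A sketch of the corrected call, reusing `df`, `schema`, and the column path from the question (all assumed to exist in the caller's scope as defined there):

```scala
import org.apache.spark.sql.functions.{col, from_json}

// opts may have been inferred as the general scala.collection.Map elsewhere;
// .toMap converts it to scala.collection.immutable.Map[String, String],
// which matches the from_json(Column, StructType, immutable.Map) overload.
val opts = Map("allowUnquotedFieldNames" -> "true")

val df_withSchema =
  df.withColumn("Data", from_json(col("Item.Data.S"), schema, opts.toMap))
```

Alternatively, annotating `opts` as `scala.collection.immutable.Map[String, String]` at its definition avoids the conversion entirely.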