How to Access a WrappedArray from a DataFrame's Map
I have a DataFrame like this:
+------+------------------------------------------------------------------------------+
|myKeys|myMaps |
+------+------------------------------------------------------------------------------+
|b |Map(b -> WrappedArray([1,o], [4,xxx]), a -> WrappedArray([1,o], [1,n], [1,n]))|
|a |Map(b -> WrappedArray([1,o], [4,n]), a -> WrappedArray([4,c], [1,n], [1,n])) |
|a |Map(b -> WrappedArray([4,o], [3,n]), a -> WrappedArray([4,o], [1,n], [1,n])) |
|b |Map(b -> WrappedArray([4,a], [3,n]), a -> WrappedArray([1,o], [1,n], [1,n])) |
+------+------------------------------------------------------------------------------+
with this schema:
root
|-- myKeys: string (nullable = false)
|-- myMaps: map (nullable = true)
| |-- key: string
| |-- value: array (valueContainsNull = true)
| | |-- element: struct (containsNull = true)
| | | |-- _1: string (nullable = true)
| | | |-- _2: string (nullable = true)
Here is the code that creates it:
import scala.collection.mutable
import scala.collection.mutable.WrappedArray
import org.apache.spark.sql.Row
import org.apache.spark.sql.functions.{lit, udf}
// testSchema is defined elsewhere (not shown here); it wraps each array of triples
val x = sc.parallelize(Seq(
  Array(("a", "1", "o"), ("a", "1", "n"), ("b", "1", "o"), ("a", "1", "n"), ("b", "4", "xxx")),
  Array(("a", "1", "o"), ("a", "1", "n"), ("b", "1", "o"), ("a", "1", "n"), ("b", "4", "n")),
  Array(("a", "1", "o"), ("a", "1", "n"), ("b", "4", "o"), ("a", "1", "n"), ("b", "3", "n")),
  Array(("a", "1", "o"), ("a", "1", "n"), ("b", "4", "o"), ("a", "1", "n"), ("b", "3", "n"))
)).map(x => testSchema(x)).toDF("myArrays")
val y = x.withColumn("myKeys", lit("b"))
// group each row's (key, v1, v2) triples by their first field; the map values become (v1, v2) pairs
val getMap = udf((mouvements: mutable.WrappedArray[Row]) => {
  val test = mouvements.toArray
    .map(line => (line(0).toString, line(1).toString, line(2).toString))
    .groupBy(_._1)
    .map { case (k, values) => k -> values.map(x => (x._2, x._3)) }
  test
})
val df_with_map = y.select($"myKeys", getMap($"myArrays") as "myMaps")
df_with_map.show(false)
df_with_map.printSchema
Now, for each row, I want to look up the map value for the key given by myKeys and, among its array entries, take the second element of every entry whose first element equals 4. I should get this result:
+---+
|val|
+---+
|xxx|
|c |
|o |
|a |
+---+
I have tried to do this with the following udf:
val getMyValue = udf { (myKey: String, myMaps: Map[String, WrappedArray[Row]]) =>
  val first_val = "4"
  val myArrays = myMaps.get(myKey)
  val res = myArrays.get.toArray.filter { x => x.getString(0) == first_val }
  res
}
val df_value = df_with_map.select(getMyValue($"myKey", $"myMaps") as "myValue")
df_value.show(false)
df_value.printSchema
but it returns the error
java.lang.UnsupportedOperationException: Schema for type org.apache.spark.sql.Row is not supported
on the line:
val getMyValue = udf{(myKey: String, myMaps: Map[String, WrappedArray[Row]]) =>
Do you have any idea?
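The failure most likely comes from the UDF's return type: the filter yields an Array[Row], and Spark cannot derive a schema for org.apache.spark.sql.Row. Under that assumption, a minimal sketch of a workaround is to return plain strings instead of Rows (getSecondValue is just an illustrative name; the lookup and filter logic match the attempt above):
val getSecondValue = udf { (myKey: String, myMaps: Map[String, Seq[Row]]) =>
  val first_val = "4"
  myMaps.get(myKey).toSeq.flatten           // Seq[Row]; empty when the key is missing
    .filter(_.getString(0) == first_val)    // keep entries whose first field is "4"
    .map(_.getString(1))                    // return only the second field, as plain strings
}
df_with_map.select(getSecondValue($"myKeys", $"myMaps") as "myValue").show(false)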
Use:
val first_val = "4"
val df = Seq(
("b", Map("b" -> Seq(("1", "o"), ("4", "xxx"))))
).toDF("myKeys", "myMaps")
root
|-- myKeys: string (nullable = true)
|-- myMaps: map (nullable = true)
| |-- key: string
| |-- value: array (valueContainsNull = true)
| | |-- element: struct (containsNull = true)
| | | |-- _1: string (nullable = true)
| | | |-- _2: string (nullable = true)
df.select($"myMaps".getItem("b"))
  .as[Seq[(String, String)]]
  .flatMap(xs => xs.filter(_._1 == first_val).map(_._2))
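On the one-row sample df above, this yields a Dataset[String] with a single element; calling .show(false) on the result should print (the default column name for a Dataset of strings is value):
+-----+
|value|
+-----+
|xxx  |
+-----+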
Edit:
df.as[(String, Map[String, Seq[(String, String)]])].flatMap {
  case (key, map) =>
    map.getOrElse(key, Seq[(String, String)]()).filter(_._1 == first_val).map(_._2)
}
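The same approach applies directly to the df_with_map built in the question (its columns are myKeys then myMaps, so the tuple encoder maps them by position) and should give the four expected values:
df_with_map
  .as[(String, Map[String, Seq[(String, String)]])]
  .flatMap { case (key, map) =>
    map.getOrElse(key, Seq[(String, String)]()).filter(_._1 == first_val).map(_._2)
  }
  .show(false)
+-----+
|value|
+-----+
|xxx  |
|c    |
|o    |
|a    |
+-----+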