Generate a single row dataframe for lookup

This is a follow-up to a question I posted earlier.

Step 1:

scala> spark.sql("select map('s1', 'p1', 's2', 'p2', 's3', 'p3') as lookup").show()
+--------------------+
|              lookup|
+--------------------+
|[s1 -> p1, s2 -> ...|
+--------------------+
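If you prefer the DataFrame API over a SQL string, the same one-row literal map can be built with `functions.map` and `lit` (a sketch assuming the same spark-shell session):

import org.apache.spark.sql.functions.{lit, map}

// One-row DataFrame whose only column is a literal map built from alternating keys and values
spark.range(1)
  .select(map(lit("s1"), lit("p1"), lit("s2"), lit("p2"), lit("s3"), lit("p3")).as("lookup"))
  .show(false)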

Step 2:

scala> val df = Seq(("s1", "p1"), ("s2", "p2"), ("s3", "p3")).toDF("s", "p")
df: org.apache.spark.sql.DataFrame = [s: string, p: string]

scala> df.show()
+---+---+
|  s|  p|
+---+---+
| s1| p1|
| s2| p2|
| s3| p3|
+---+---+
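As a side note, `toDF` on a Seq comes from the session implicits; spark-shell imports them automatically, but in a standalone application you would need something like:

import spark.implicits._   // spark is your SparkSession; enables Seq(...).toDF

val df = Seq(("s1", "p1"), ("s2", "p2"), ("s3", "p3")).toDF("s", "p")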

Step 3:

scala> val df1 = df.selectExpr("map(s,p) lookup")
df1: org.apache.spark.sql.DataFrame = [lookup: map<string,string>]

scala> df1.show()
+----------+
|    lookup|
+----------+
|[s1 -> p1]|
|[s2 -> p2]|
|[s3 -> p3]|
+----------+

What I expect from Step 3 is the result I got in Step 1. How can I achieve that?

The key and value columns should be aggregated into arrays before merging them into a map:

import org.apache.spark.sql.functions._

// Aggregate all keys and all values into two arrays (a single row),
// then zip them into one map; map_from_arrays requires Spark 2.4+
df.agg(collect_list("s").as("s"), collect_list("p").as("p"))
  .select(map_from_arrays(col("s"), col("p")).as("lookup"))
  .show(false)

Output:

+------------------------------+
|lookup                        |
+------------------------------+
|[s1 -> p1, s2 -> p2, s3 -> p3]|
+------------------------------+

Without the collect_list calls, each row would be converted into its own single-entry map, which is exactly what you saw in Step 3.
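An equivalent alternative (also Spark 2.4+) is to collect the key/value pairs as structs and build the map with map_from_entries. The sketch below reuses the same df and shell session; the keys DataFrame is only illustrative, to show how the one-row lookup map could then be used:

import org.apache.spark.sql.functions.{col, collect_list, element_at, map_from_entries, struct}

// Alternative: collect (s, p) pairs as structs, then turn the array of pairs into a single map
val lookupDf = df.agg(map_from_entries(collect_list(struct(col("s"), col("p")))).as("lookup"))
lookupDf.show(false)

// Illustrative usage: resolve keys against the one-row lookup map via a cross join
val keys = Seq("s2", "s3").toDF("key")
keys.crossJoin(lookupDf)
  .select(col("key"), element_at(col("lookup"), col("key")).as("value"))
  .show(false)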