How to aggregate dates and data belonging to one customer into a single row in PySpark?

I want to aggregate some dates (for example, one month per customer) and their data into a single row in PySpark.

Here is a simple example table:

Customer_Id  Date        Data
id1          2021-01-01  2
id1          2021-01-02  3
id1          2021-01-03  4

and I want to transform it into:

Customer_Id  Date                       col1  col2  col3
id1          [2021-01-01 - 2021-01-03]  2     3     4

@matin You can try the code below to reproduce the output.

from pyspark.sql.functions import collect_list

schema = ["Customer_Id", "Date", "Data"]
data = [["id1", "2021-01-01", 2], ["id1", "2021-01-02", 3], ["id1", "2021-01-03", 4]]
df = spark.createDataFrame(data, schema)

# Collect the dates and values into one array each per customer.
# Note: collect_list does not guarantee element order after a shuffle.
df2 = df.groupBy("Customer_Id").agg(
    collect_list("Date").alias("list_date"),
    collect_list("Data").alias("list_data"),
)

# Split the collected values out into separate columns.
df3 = (df2
       .withColumn("col1", df2.list_data[0])
       .withColumn("col2", df2.list_data[1])
       .withColumn("col3", df2.list_data[2])
       .drop("list_data"))
df3.show(truncate=False)
df3.printSchema()
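
The code above keeps list_date as an array rather than the [start - end] range you described, and collect_list by itself does not guarantee element order. Here is a minimal sketch that addresses both, assuming Spark 2.4+ (for array_sort and element_at) and reusing the df created above:

from pyspark.sql.functions import array_sort, collect_list, concat_ws, element_at, struct

# Sort (Date, Data) pairs together so col1..col3 always follow date order;
# array_sort on an array of structs orders by the first struct field (Date).
pairs = df.groupBy("Customer_Id").agg(
    array_sort(collect_list(struct("Date", "Data"))).alias("p")
)
result = (pairs
          .withColumn("Date", concat_ws(" - ",
                                        element_at("p", 1)["Date"],
                                        element_at("p", -1)["Date"]))
          .withColumn("col1", element_at("p", 1)["Data"])
          .withColumn("col2", element_at("p", 2)["Data"])
          .withColumn("col3", element_at("p", 3)["Data"])
          .drop("p"))
result.show(truncate=False)

This still hard-codes three value columns; if the number of dates per customer varies, you would need to generate the colN columns dynamically from the array length.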

Let me know if you need any further modifications.