Fill null values with next incrementing number | PySpark | Python

I have the following dataframe. I want to fill the id column with the next consecutive id; it must be unique and strictly incrementing.

+----------------+----+--------------------+
|local_student_id|  id|        last_updated|
+----------------+----+--------------------+
|          610931|null|                null|
|          599768|   3|2020-02-26 15:47:...|
|          633719|null|                null|
|          612949|   2|2020-02-26 15:47:...|
|          591819|   1|2020-02-26 15:47:...|
|          595539|   4|2020-02-26 15:47:...|
|          423287|null|                null|
|          641322|   5|2020-02-26 15:47:...|
+----------------+----+--------------------+

I want the expected output below. Can someone help me? I am new to PySpark. I also want to set the current timestamp in the last_updated column for the newly filled rows.

+----------------+----+--------------------+
|local_student_id|  id|        last_updated|
+----------------+----+--------------------+
|          610931|   6|2020-02-26 16:00:...|
|          599768|   3|2020-02-26 15:47:...|
|          633719|   7|2020-02-26 16:00:...|
|          612949|   2|2020-02-26 15:47:...|
|          591819|   1|2020-02-26 15:47:...|
|          595539|   4|2020-02-26 15:47:...|
|          423287|   8|2020-02-26 16:00:...|
|          641322|   5|2020-02-26 15:47:...|
+----------------+----+--------------------+

Actually, I tried:

final_data = final_data.withColumn(
        'id', when(col('id').isNull(), row_number() + max(col('id'))).otherwise(col('id')))

but it gives the following error:

: org.apache.spark.sql.AnalysisException: grouping expressions sequence is empty, and '`local_student_id`' is not an aggregate function. Wrap '(CASE WHEN (`id` IS NULL) THEN (CAST(row_number() AS BIGINT) + max(`id`)) ELSE `id` END AS `id`)' in windowing function(s) or wrap '`local_student_id`' in first() (or first_value) if you don't care which value you get.;;

Here is the code you need. The error occurs because row_number() is a window function and needs an over(...) clause, while max() is an aggregate and must be computed separately:

from pyspark.sql import functions as F, Window

# Current maximum id (5 in the example); max() ignores null ids.
max_id = final_data.groupBy().max("id").collect()[0][0]

final_data = final_data.withColumn(
    "id",
    # Nulls sort first when ordering by id, so row_number() assigns the
    # null rows 1, 2, 3, ... and adding max_id yields 6, 7, 8, ...
    # coalesce() keeps the existing id where it is not null.
    F.coalesce(
        F.col("id"),
        F.row_number().over(Window.orderBy("id")) + F.lit(max_id)
    )
).withColumn(
    "last_updated",
    # Stamp only the rows that had no timestamp yet.
    F.coalesce(
        F.col("last_updated"),
        F.current_timestamp()
    )
)
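To make the intended logic concrete, here is the same "fill nulls with max + running counter, and stamp the time" idea sketched in plain Python; `rows` and `fill_ids` are hypothetical stand-ins for the DataFrame and the transformation, not part of the PySpark answer above:

```python
from datetime import datetime

def fill_ids(rows):
    """Fill null ids with incrementing values past the current max id,
    stamping last_updated on the rows that were filled.

    rows: list of dicts with keys 'local_student_id', 'id', 'last_updated'.
    """
    # Highest existing id; nulls (None) are skipped, 0 if all are null.
    max_id = max((r["id"] for r in rows if r["id"] is not None), default=0)
    next_id = max_id + 1
    now = datetime.now()
    for r in rows:
        if r["id"] is None:
            r["id"] = next_id          # next consecutive unique id
            next_id += 1
            r["last_updated"] = now    # stamp only the filled rows
    return rows

rows = [
    {"local_student_id": 610931, "id": None, "last_updated": None},
    {"local_student_id": 599768, "id": 3, "last_updated": "2020-02-26 15:47"},
    {"local_student_id": 633719, "id": None, "last_updated": None},
]
filled = fill_ids(rows)
# null ids become 4 and 5 (max was 3); the existing id 3 is untouched
```

Note that unlike the Spark version, which orders by id, this sketch fills nulls in their original row order.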