PySpark window function to get last row with date column value equal to date

I am trying to have a window function go back and fetch a previous row by a specific date, and I am not quite sure what is going wrong, but it is giving me the previous row rather than the row for the specified date. To calculate this, I take the current row's date and find the Monday of that week:

    from pyspark.sql.functions import date_sub, next_day

    def previous_day(date, dayOfWeek):
        # next_day returns the first dayOfWeek strictly after `date`, so
        # stepping back 7 days lands on that week's own dayOfWeek
        return date_sub(next_day(date, dayOfWeek), 7)

    spark_df = spark_df.withColumn("last_monday", previous_day(spark_df['calendarday'], "monday"))
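
As a sanity check, here is a minimal, hypothetical sketch (a toy DataFrame named `toy`, not the asker's data) showing why this lands on the week's Monday even when the date itself is a Monday: next_day returns the first matching day strictly after its input.

    from pyspark.sql import SparkSession
    from pyspark.sql.functions import col, to_date, next_day, date_sub

    spark = SparkSession.builder.getOrCreate()

    # hypothetical sample: a Monday, a Wednesday, and a Sunday
    toy = spark.createDataFrame(
        [("2015-01-05",), ("2015-01-07",), ("2015-01-11",)], ["calendarday"])

    # next_day is strictly "after", so stepping back 7 days maps every
    # day of the week, Monday included, to the Monday of its own week
    toy.withColumn("last_monday",
                   date_sub(next_day(to_date(col("calendarday")), "monday"), 7)).show()
    # all three rows give last_monday = 2015-01-05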

Then I calculate the difference in days between the current date and that most recent previous Monday:

    d = F.datediff(spark_df['calendarday'], spark_df['last_monday'])
    spark_df = spark_df.withColumn("daysSinceMonday",d)
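
Continuing the toy sketch above, datediff returns the gap as a plain integer column, 0 on the Monday itself up to 6 on the following Sunday:

    from pyspark.sql import functions as F

    # recompute last_monday so the snippet stands alone, then diff the dates
    toy.withColumn("last_monday",
                   F.date_sub(F.next_day(F.to_date("calendarday"), "monday"), 7))\
       .withColumn("daysSinceMonday",
                   F.datediff(F.col("calendarday"), F.col("last_monday")))\
       .show()
    # 2015-01-05 -> 0, 2015-01-07 -> 2, 2015-01-11 -> 6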

I can see from the daysSinceMonday value that it is correct for every row. Next I want to create a window, order it by the d value I just set, and select the first row of that window, but for some reason it is not working:

    days = lambda i: i * 86400 
    w = (Window.partitionBy(column_list).orderBy(col('calendarday').cast("timestamp").cast("long")).rangeBetween(-days(d), 0))
    spark_df = spark_df.withColumn('PreviousYearUnique', first("indexCP").over(w))
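
For reference, the bounds passed to rangeBetween are expected to be plain integers (or the Window.unbounded* sentinels), so a per-row Column such as d cannot drive the frame size, which is the root of the problem here. A hypothetical sketch with a fixed 7-day bound (using "id" in place of the undefined column_list) does construct, though it is not the dynamic frame the question is after:

    from pyspark.sql import functions as F
    from pyspark.sql.window import Window

    days = lambda i: i * 86400  # seconds per day

    # -days(7) is a plain int, so this window spec is valid;
    # -days(d) would be a Column, which rangeBetween cannot take
    w_fixed = (Window.partitionBy("id")  # "id" stands in for column_list here
               .orderBy(F.col("calendarday").cast("timestamp").cast("long"))
               .rangeBetween(-days(7), 0))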

    Starting Data Frame
    ## +---+-----------+-----------+--------+       
    ## | id|calendarday|last_monday| indexCP|
    ## +---+-----------+-----------+--------+
    ## |  1|2015-01-05 | 2015-01-05|  0.0076|
    ## |  1|2015-01-06 | 2015-01-05|  0.0026|
    ## |  1|2015-01-07 | 2015-01-05|  0.0016|
    ## |  1|2015-01-08 | 2015-01-05|  0.0006|
    ## |  2|2015-01-09 | 2015-01-05|  0.0012|
    ## |  2|2015-01-10 | 2015-01-05|  0.0014|
    ## |  1|2015-01-12 | 2015-01-12|  0.0026|
    ## |  1|2015-01-13 | 2015-01-12|  0.0086|
    ## |  1|2015-01-14 | 2015-01-12|  0.0046|
    ## |  1|2015-01-15 | 2015-01-12|  0.0021|
    ## |  2|2015-01-16 | 2015-01-12|  0.0042|
    ## |  2|2015-01-17 | 2015-01-12|  0.0099|
    ## +---+-----------+-----------+--------+

    Desired Data Frame: add the indexCP from each week's last_monday row as PreviousYearUnique
    ## +---+-----------+-----------+--------+--------------------+       
    ## | id|calendarday|last_monday| indexCP| PreviousYearUnique |
    ## +---+-----------+-----------+--------+--------------------+
    ## |  1|2015-01-05 | 2015-01-05|  0.0076|              0.0076|
    ## |  1|2015-01-06 | 2015-01-05|  0.0026|              0.0076|
    ## |  1|2015-01-07 | 2015-01-05|  0.0016|              0.0076|
    ## |  1|2015-01-08 | 2015-01-05|  0.0006|              0.0076|
    ## |  2|2015-01-09 | 2015-01-05|  0.0012|              0.0076|
    ## |  2|2015-01-10 | 2015-01-05|  0.0014|              0.0076|
    ## |  1|2015-01-12 | 2015-01-12|  0.0026|              0.0026|
    ## |  1|2015-01-13 | 2015-01-12|  0.0086|              0.0026|
    ## |  1|2015-01-14 | 2015-01-12|  0.0046|              0.0026|
    ## |  1|2015-01-15 | 2015-01-12|  0.0021|              0.0026|
    ## |  2|2015-01-16 | 2015-01-12|  0.0042|              0.0026|
    ## |  2|2015-01-17 | 2015-01-12|  0.0099|              0.0026|
    ## +---+-----------+-----------+--------+--------------------+

Any idea what is going wrong?

You can partitionBy last_monday, orderBy calendarday over a window running from unboundedPreceding to the current row, and then use first:

    from pyspark.sql import functions as F
    from pyspark.sql.window import Window

    # partition by each week's Monday; within the week, order by day and
    # take the first indexCP, i.e. the value from the week's earliest row
    w = Window.partitionBy("last_monday")\
              .orderBy(F.to_date("calendarday", "yyyy-MM-dd"))\
              .rowsBetween(Window.unboundedPreceding, Window.currentRow)

    df.withColumn("PreviousYearUnique", F.first("indexCP").over(w)).show()


    #+---+-----------+-----------+-------+------------------+
    #| id|calendarday|last_monday|indexCP|PreviousYearUnique|
    #+---+-----------+-----------+-------+------------------+
    #|  1| 2015-01-05| 2015-01-05| 0.0076|            0.0076|
    #|  1| 2015-01-06| 2015-01-05| 0.0026|            0.0076|
    #|  1| 2015-01-07| 2015-01-05| 0.0016|            0.0076|
    #|  1| 2015-01-08| 2015-01-05| 6.0E-4|            0.0076|
    #|  2| 2015-01-09| 2015-01-05| 0.0012|            0.0076|
    #|  2| 2015-01-10| 2015-01-05| 0.0014|            0.0076|
    #|  1| 2015-01-12| 2015-01-12| 0.0026|            0.0026|
    #|  1| 2015-01-13| 2015-01-12| 0.0086|            0.0026|
    #|  1| 2015-01-14| 2015-01-12| 0.0046|            0.0026|
    #|  1| 2015-01-15| 2015-01-12| 0.0021|            0.0026|
    #|  2| 2015-01-16| 2015-01-12| 0.0042|            0.0026|
    #|  2| 2015-01-17| 2015-01-12| 0.0099|            0.0026|
    #+---+-----------+-----------+-------+------------------+
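
A note on the design: because the window is ordered ascending within each last_monday partition, first picks the week's earliest row, which is the Monday row whenever one exists; the explicit frame is not strictly needed for first, since an ordered window already starts at unboundedPreceding by default. An equivalent join-based sketch (continuing with the df and F from the snippet above, and assuming, as the sample data does, that every week contains a row dated exactly on its Monday):

    # pull out each week's Monday row, then attach its indexCP to every row
    monday_rows = df.filter(F.col("calendarday") == F.col("last_monday"))\
                    .select("last_monday",
                            F.col("indexCP").alias("PreviousYearUnique"))

    df.join(monday_rows, on="last_monday", how="left").show()

Unlike the window version, this yields null for a week with no Monday row, whereas first falls back to the earliest row available, so the window form is the safer default.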