How to get week of month in Spark 3.0+?

I can't find any datetime formatting pattern to get the week of month in Spark 3.0+.

With 'W' deprecated, is there a way to get the week of month without using the legacy options?

The following code does not work on Spark 3.2.1:

df = df.withColumn("weekofmonth",f.date_format(f.col("Date"),"W"))
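For context: in Spark 3.x the `'W'` pattern in `date_format` raises an error by default; it only works if the legacy (pre-3.0) datetime parser is re-enabled via a session config, which is exactly the legacy route the question wants to avoid:

```python
# Restores pre-Spark-3.0 (SimpleDateFormat) pattern behaviour, including 'W'.
# This is the legacy option the question is trying to avoid.
spark.conf.set("spark.sql.legacy.timeParserPolicy", "LEGACY")
```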

You can try using a udf:

from calendar import monthcalendar
from pyspark.sql.functions import col, year, month, dayofmonth, udf

df = spark.createDataFrame(
    [(1, "2022-04-22"), (2, "2022-05-12")], ("id", "date"))

def get_week_of_month(year, month, day):
    # monthcalendar() returns one list of day numbers per week
    # (Monday-first); days outside the month appear as 0.
    return next(
        (
            week_number
            for week_number, days_of_week in enumerate(monthcalendar(year, month), start=1)
            if day in days_of_week
        ),
        None,
    )

fn1 = udf(get_week_of_month)
df = df.withColumn(
    "week_of_mon",
    fn1(year(col("date")), month(col("date")), dayofmonth(col("date"))),
)
display(df)
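If you'd rather avoid udf overhead, the same Monday-first week numbering can be computed arithmetically: shift the day of month by the weekday of the 1st (Monday = 0), then bucket by 7. That arithmetic translates directly to built-in Spark functions (`dayofmonth`, `trunc`, `dayofweek`). A minimal pure-Python sketch of the formula, cross-checked against the `monthcalendar`-based logic used in the udf above:

```python
from calendar import monthcalendar, weekday

def week_of_month(y, m, d):
    # Monday-first week-of-month, matching calendar.monthcalendar():
    # offset the day by the weekday of the 1st (Mon=0), bucket by 7.
    return (d + weekday(y, m, 1) - 1) // 7 + 1

# Cross-check against the monthcalendar lookup from the udf.
for y, m, d in [(2022, 4, 22), (2022, 5, 12), (2022, 1, 1), (2020, 2, 29)]:
    expected = next(
        wk for wk, days in enumerate(monthcalendar(y, m), start=1) if d in days
    )
    assert week_of_month(y, m, d) == expected
```

In Spark the same offset can be derived without a udf, e.g. from `dayofweek(trunc(col("date"), "month"))` (note Spark's `dayofweek` counts Sunday as 1, so it needs shifting to Monday-first before applying the formula).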