Why is my Code Repo warning me about using withColumn in a for/while loop?

I noticed that my Code Repository warns me that using withColumn inside a for/while loop is an anti-pattern. Why is this discouraged? Isn't this normal usage of the PySpark API?

We have observed in practice that using withColumn in a for/while loop leads to poorly performing query plans, as discussed here. This isn't obvious the first time you write code in Foundry, so we built a feature that warns you about this behavior.

We recommend you follow the recommendation in the Scala docs:

withColumn(colName: String, col: Column): DataFrame
Returns a new Dataset by adding a column or replacing the existing column that has the same name.

Since
2.0.0

Note
this method introduces a projection internally. Therefore, calling it multiple times, for instance, via loops in order to add multiple columns can generate big plans which can cause performance issues and even StackOverflowException. To avoid this, use select with the multiple columns at once.

from pyspark.sql import functions as F

my_other_columns = [...]

df = df.select(
  *[col_name for col_name in df.columns if col_name not in my_other_columns],
  *[F.col(col_name).alias(col_name + "_suffix") for col_name in my_other_columns]
)

is preferred over

my_other_columns = [...]

for col_name in my_other_columns:
  df = df.withColumn(
    col_name + "_suffix",
    F.col(col_name)
  )

While this is technically normal usage of the PySpark API, calling withColumn too many times in your job will produce a poorly performing query plan, so we want to help you avoid the problem entirely.
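The select-based pattern generalizes to a small helper that builds the whole projection once, however many columns need renaming. A minimal sketch, testable without a Spark session — the helper name `build_suffix_select` and its `suffix` parameter are our own inventions for illustration, while `selectExpr` is the standard PySpark method the resulting strings are meant for:

```python
def build_suffix_select(all_columns, columns_to_suffix, suffix="_suffix"):
    """Build selectExpr strings that pass untouched columns through
    and rename the rest, all in a single projection."""
    # Columns that keep their name, backtick-quoted to be safe.
    passthrough = [f"`{c}`" for c in all_columns if c not in columns_to_suffix]
    # Columns renamed with the suffix via SQL AS aliases.
    suffixed = [f"`{c}` AS `{c}{suffix}`" for c in columns_to_suffix]
    return passthrough + suffixed

# Example: three columns, two of which get the suffix.
exprs = build_suffix_select(["id", "a", "b"], ["a", "b"])
# exprs == ["`id`", "`a` AS `a_suffix`", "`b` AS `b_suffix`"]
```

On a DataFrame this becomes one call, and therefore one projection, regardless of column count: `df.selectExpr(*build_suffix_select(df.columns, my_other_columns))`. If you are on Spark 3.3 or later, the built-in `DataFrame.withColumns` (plural) similarly accepts a whole dict of columns and adds them in a single projection.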