PySpark: Compare two Dataframes based on common String column and generate Result Boolean withColumn()

I have two dataframes, ddf_1 and ddf_2, that share a common string ID column. My goal is to create a new boolean column is_fine in ddf_1 that contains True if the ID appears in both ddf_1 and ddf_2, and False if the ID does not appear in ddf_2.

Consider this example data:

import pandas as pd

# example data (assumes an active SparkSession named `spark`)
data_1 = { 
    'fruits': ["apples", "banana", "cherry"],
    'myid': ['1-12', '2-12', '3-13'],
    'meat': ["pig", "cow", "chicken"]}

data_2 = { 
    'furniture': ["table", "chair", "lamp"],
    'myid': ['1-12', '0-11', '2-12'],
    'clothing': ["pants", "shoes", "socks"]}

df_1 = pd.DataFrame(data_1)
ddf_1 = spark.createDataFrame(df_1)

df_2 = pd.DataFrame(data_2)
ddf_2 = spark.createDataFrame(df_2)

I would like a function along these lines:

# pseudocode – this does not run, but illustrates the intent
def func(df_1, df_2, column_1, column_2):
    if df_1.column_1 != df_2.column_2:
        return df_1.withColumn('is_fine', False)
    else:
        return df_1.withColumn('is_fine', True)
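
A working version of such a helper could look roughly like this (a minimal sketch; the helper name add_membership_flag and its flag_col parameter are my own, not from the question). It left-joins on the shared column and flags rows that found a match:

import pyspark.sql.functions as F

# hypothetical helper: flag rows of df_1 whose join column also appears in df_2
def add_membership_flag(df_1, df_2, column_1, column_2, flag_col='is_fine'):
    # keep only the (deduplicated) join column from df_2 to avoid extra columns/rows
    right = df_2.select(F.col(column_2).alias('_match_key')).dropDuplicates()
    joined = df_1.join(right, df_1[column_1] == right['_match_key'], 'left')
    # a match was found exactly when _match_key is not null after the left join
    return joined.withColumn(flag_col, F.col('_match_key').isNotNull()).drop('_match_key')

# usage: add_membership_flag(ddf_1, ddf_2, 'myid', 'myid').show()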

The desired output should look like this:

+------+----+-------+-------+
|fruits|myid|   meat|is_fine|
+------+----+-------+-------+
|apples|1-12|    pig|   true|
|banana|2-12|    cow|   true|
|cherry|3-13|chicken|  false|
+------+----+-------+-------+

You can solve this kind of problem with Spark SQL:

query = """
select ddf_1.*,
case 
    when ddf_1.myid = ddf_2.myid  then True
    else False 
end as is_fine
from ddf_1 left outer join ddf_2 
on ddf_1.myid = ddf_2.myid
"""

display(spark.sql(query))

Here is the output; apart from row order it matches the desired result shown above.

You can do a left outer join between the two dataframes on the myid column and derive the is_fine column with a simple case/when expression, as shown below:

import pyspark.sql.functions as F

ddf_1.join(ddf_2, ddf_1.myid == ddf_2.myid, 'left')\
.withColumn('is_fine', F.when(ddf_2.myid.isNull(), False).otherwise(True))\
.select(ddf_1['fruits'], ddf_1['myid'], ddf_1['meat'], 'is_fine').show()

Output:

+------+----+-------+-------+
|fruits|myid|   meat|is_fine|
+------+----+-------+-------+
|cherry|3-13|chicken|  false|
|apples|1-12|    pig|   true|
|banana|2-12|    cow|   true|
+------+----+-------+-------+
The same solution with the steps commented, written as a single expression:

# left join ddf_2 onto ddf_1
result = (
    ddf_1.join(ddf_2, ddf_1.myid == ddf_2.myid, how='left')
    # create the is_fine column: False when there was no match in ddf_2
    .withColumn('is_fine', F.when(ddf_2.myid.isNull(), False).otherwise(True))
    # select all columns from ddf_1 plus the new is_fine column
    .select(ddf_1["*"], "is_fine")
)
result.show()