Adding a nullable column in Spark dataframe
In Spark, a literal column is not nullable when it is added:
from pyspark.sql import SparkSession, functions as F
spark = SparkSession.builder.getOrCreate()
df = spark.createDataFrame([(1,)], ['c1'])
df = df.withColumn('c2', F.lit('a'))
df.printSchema()
# root
# |-- c1: long (nullable = true)
# |-- c2: string (nullable = false)
How can I create a nullable column?
The shortest way I have found is to use when (an otherwise clause does not seem to be needed; since when without otherwise yields null for non-matching rows, Spark marks the resulting column as nullable):
df = df.withColumn('c2', F.when(F.lit(True), F.lit('a')))
In Scala: .withColumn("c2", when(lit(true), lit("a")))
Full test:
from pyspark.sql import SparkSession, functions as F
spark = SparkSession.builder.getOrCreate()
df = spark.createDataFrame([(1,)], ['c1'])
df = df.withColumn('c2', F.when(F.lit(True), F.lit('a')))
df.show()
# +---+---+
# | c1| c2|
# +---+---+
# | 1| a|
# +---+---+
df.printSchema()
# root
# |-- c1: long (nullable = true)
# |-- c2: string (nullable = true)
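For comparison, here is a minimal alternative sketch (my own assumption, not part of the answer above): re-apply a schema that explicitly declares the column as nullable, at the cost of a round trip through the RDD.
from pyspark.sql import SparkSession, functions as F
from pyspark.sql.types import StructType, StructField, StringType, LongType
spark = SparkSession.builder.getOrCreate()
df = spark.createDataFrame([(1,)], ['c1']).withColumn('c2', F.lit('a'))
# Rebuild the DataFrame with a schema that forces nullable = True on c2
# (hypothetical variable name nullable_schema, chosen for illustration).
nullable_schema = StructType([
    StructField('c1', LongType(), True),
    StructField('c2', StringType(), True),
])
df = spark.createDataFrame(df.rdd, nullable_schema)
df.printSchema()
# root
# |-- c1: long (nullable = true)
# |-- c2: string (nullable = true)
This is heavier than the when trick because it goes through df.rdd, but it makes the intended nullability explicit in the schema.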