Scala (Spark): concatenate columns in a DataFrame
How does the code below concatenate multiple values from a list?
concat(myList.map(fld => col(fld)): _*)
According to the Spark documentation, the signature of the concat function is concat(col1, col2, ..., colN). Given that your list contains the column names, i.e. c1, c2, ..., cN, map converts each of these names into a Column object via the col function. Finally, : _* unpacks the (converted) list items into individual arguments for concat, similarly to how Python's * operator works.
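The same varargs mechanism can be seen in plain Scala without Spark. Below is a minimal sketch: concatAll and its String parameters are hypothetical stand-ins for Spark's concat and its Column arguments, chosen only to illustrate how map plus : _* feeds a list into a varargs function.

```scala
// A varargs function, analogous in shape to Spark's concat(cols: Column*).
def concatAll(parts: String*): String = parts.mkString

val myList = List("c1", "c2", "c3")

// map transforms each element (upper-casing here stands in for col(fld)),
// and `: _*` expands the resulting list into separate varargs arguments.
val result = concatAll(myList.map(fld => fld.toUpperCase): _*)
// result == "C1C2C3"
```

Without : _*, the call would not compile, because concatAll expects individual String arguments rather than a single List[String].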