How to convert a column of string dates into a column of Unix epochs in Spark
I am new to Spark and Scala, and I want to convert a column of string dates into Unix epochs. My dataframe looks like this:
+----------+-------+
|     Dates|Reports|
+----------+-------+
|2020-07-20|     34|
|2020-07-21|     86|
|2020-07-22|    129|
|2020-07-23|     98|
+----------+-------+
The output should be
+----------+-------+
|     Dates|Reports|
+----------+-------+
|1595203200|     34|
|1595289600|     86|
|1595376000|    129|
|1595462400|     98|
+----------+-------+
Use unix_timestamp:
val df = Seq(("2020-07-20")).toDF("date")
df.show
df.withColumn("unix_time", unix_timestamp('date, "yyyy-MM-dd")).show
+----------+
|      date|
+----------+
|2020-07-20|
+----------+

+----------+----------+
|      date| unix_time|
+----------+----------+
|2020-07-20|1595203200|
+----------+----------+
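
To get the exact output shown in the question, the same call can overwrite the Dates column of the original dataframe. Here is a minimal sketch that rebuilds the question's data locally; the name reportsDf is made up for illustration, and the expected epoch values assume a UTC session time zone:

import org.apache.spark.sql.functions.unix_timestamp
import spark.implicits._

// Rebuild the question's dataframe locally (reportsDf is a hypothetical name)
val reportsDf = Seq(
  ("2020-07-20", 34),
  ("2020-07-21", 86),
  ("2020-07-22", 129),
  ("2020-07-23", 98)
).toDF("Dates", "Reports")

// Replace the string Dates column with epoch seconds parsed from the yyyy-MM-dd pattern
val epochDf = reportsDf.withColumn("Dates", unix_timestamp('Dates, "yyyy-MM-dd"))
epochDf.show()

Note that unix_timestamp interprets the string in the session time zone (spark.sql.session.timeZone), so values like 1595203200 and 1595289600 from the expected output appear only when that zone is UTC.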