Pyspark convert string to timestamp
I need to convert a string column in the format "12/1/2010 8:26" to a timestamp.
I tried the following code:
F.to_timestamp(dataset.InvoiceDate,'MM/dd/yyyy HH:mm')
but it fails with this error:
Py4JJavaError: An error occurred while calling o640.showString.
: org.apache.spark.SparkException: Job aborted due to stage failure: Task 0 in stage 123.0 failed 1 times, most recent failure: Lost task 0.0 in stage 123.0 (TID 119) (13c59da6fb19 executor driver): org.apache.spark.SparkUpgradeException: You may get a different result due to the upgrading of Spark 3.0: Fail to parse '12/1/2010 8:26' in the new parser. You can set spark.sql.legacy.timeParserPolicy to LEGACY to restore the behavior before Spark 3.0, or set to CORRECTED and treat it as an invalid datetime string.
How can I convert the string to a timestamp in this case?
Try:
F.to_timestamp(dataset.InvoiceDate,'M/d/y H:m')
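The original pattern fails because Spark 3.0's new parser is stricter: two-letter fields such as dd and HH expect two digits, while '12/1/2010 8:26' has a single-digit day and hour. Single-letter fields (M, d, H, m) accept one or two digits. A minimal runnable sketch, assuming a SparkSession and a toy DataFrame built only for illustration (the dataset and InvoiceDate names follow the question):

from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()

# Toy data mirroring the question's column; replace with your real dataset
dataset = spark.createDataFrame([("12/1/2010 8:26",)], ["InvoiceDate"])

# Single-letter pattern fields parse both one- and two-digit values
parsed = dataset.withColumn(
    "InvoiceTs", F.to_timestamp(dataset.InvoiceDate, "M/d/y H:m")
)
parsed.show(truncate=False)

Alternatively, as the error message itself suggests, you can restore the pre-3.0 behavior with spark.conf.set("spark.sql.legacy.timeParserPolicy", "LEGACY"), but adjusting the format string is the cleaner fix.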