Creating a datetime from a string column in PySpark

Posted 2024-04-29 22:27:31


Suppose I have the datetime column shown below. I want to convert the column from string to a datetime type so that I can extract the month, day, year, and so on.

+---+------------+
|agg|    datetime|
+---+------------+
|  A|1/2/17 12:00|
|  B|        null|
|  C|1/4/17 15:00|
+---+------------+

I have tried the code below, but the values returned in the datetime column are all null, and I don't understand why.

from pyspark.sql.types import DateType

df.select(df['datetime'].cast(DateType())).show()

I also tried this code:

df = df.withColumn('datetime2', from_unixtime(unix_timestamp(df['datetime']), 'dd/MM/yy HH:mm'))

However, both produce this dataframe:

+---+------------+---------+
|agg|    datetime|datetime2|
+---+------------+---------+
|  A|1/2/17 12:00|     null|
|  B|        null|     null|
|  C|1/4/17 15:00|     null|
+---+------------+---------+
I have also read and tried the solution given in this post, with no success: PySpark dataframe convert unusual string format to Timestamp


1 Answer

#1 · Posted 2024-04-29 22:27:31
// imports
import org.apache.spark.sql.functions.{dayofmonth, from_unixtime, month, unix_timestamp, year}

// The column may be a string rather than a datetime; assume it is a string
// and convert it. The key fix is that the parse pattern must be passed to
// unix_timestamp (which reads the input), not to from_unixtime (which only
// formats the output). Note that from_unixtime returns a *string* in the
// standard "yyyy-MM-dd HH:mm:ss" form, which the date functions below accept.
val df2 = df.withColumn("datetime2", from_unixtime(unix_timestamp(df("datetime"), "dd/MM/yy HH:mm")))

+---+------------+-------------------+
|agg|    datetime|          datetime2|
+---+------------+-------------------+
|  A|1/2/17 12:00|2017-02-01 12:00:00|
|  B|        null|               null|
|  C|1/4/17 15:00|2017-04-01 15:00:00|
+---+------------+-------------------+


// Extract the month, year, and day information
val df3 = df2.withColumn("month", month(df2("datetime2")))
  .withColumn("year", year(df2("datetime2")))
  .withColumn("day", dayofmonth(df2("datetime2")))
+---+------------+-------------------+-----+----+----+
|agg|    datetime|          datetime2|month|year| day|
+---+------------+-------------------+-----+----+----+
|  A|1/2/17 12:00|2017-02-01 12:00:00|    2|2017|   1|
|  B|        null|               null| null|null|null|
|  C|1/4/17 15:00|2017-04-01 15:00:00|    4|2017|   1|
+---+------------+-------------------+-----+----+----+

Thanks.
