Details
- Type: Bug
- Status: Resolved
- Priority: Major
- Resolution: Not A Problem
- Affects Version/s: 2.2.0
- Fix Version/s: None
- Component/s: None
- Environment: Windows 10, United Kingdom
Description
org.apache.spark.sql.functions.unix_timestamp returns null for some valid-looking dates.
The affected timestamps fall at the start of Daylight Saving Time (in the UK, and possibly elsewhere).
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions.unix_timestamp

val spark = SparkSession.builder.getOrCreate()
import spark.implicits._

spark.sparkContext.parallelize(
    Seq("25/03/2012 00:59", "25/03/2012 01:00", "25/03/2012 01:59", "25/03/2012 02:01"))
  .toDF("date")
  .select(unix_timestamp($"date", "dd/MM/yyyy HH:mm"))
  .show(false)

// results:
// 1332637140, null, null, 1332637260
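A likely explanation for the "Not A Problem" resolution: in the Europe/London zone, clocks jumped from 01:00 to 02:00 on 25 March 2012, so wall-clock times between 01:00 and 01:59 that morning simply do not exist, and parsing them in that zone fails. This can be checked directly with the JVM's own timezone rules (a sketch using plain `java.time`, not Spark itself; the assumption is that Spark's parser defers to the same JVM zone data):

```java
import java.time.LocalDateTime;
import java.time.ZoneId;
import java.time.zone.ZoneRules;

public class DstGap {
    public static void main(String[] args) {
        ZoneRules london = ZoneId.of("Europe/London").getRules();

        // 01:30 local time on 25 March 2012 falls inside the spring-forward gap:
        // no UTC offset makes this wall-clock time valid.
        LocalDateTime inGap = LocalDateTime.of(2012, 3, 25, 1, 30);
        System.out.println(london.getValidOffsets(inGap).isEmpty());   // true

        // 00:59 the same morning is still valid (exactly one offset applies).
        LocalDateTime before = LocalDateTime.of(2012, 3, 25, 0, 59);
        System.out.println(london.getValidOffsets(before).size());     // 1
    }
}
```

So the nulls for 01:00 and 01:59 reflect nonexistent local times in the session timezone rather than a parsing bug; running the same repro with the JVM default timezone set to UTC would be one way to confirm this.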