Description
Currently, it looks like we can omit the timestamp format, as below:
import org.apache.spark.sql.functions._
Seq("2016-12-31 00:12:00.00").toDF("a").select(to_timestamp(col("a"))).show()
+----------------------------------------+
|to_timestamp(`a`, 'yyyy-MM-dd HH:mm:ss')|
+----------------------------------------+
| 2016-12-31 00:12:00|
+----------------------------------------+
whereas this does not work in SQL as below:
spark-sql> SELECT to_timestamp('2016-12-31 00:12:00.00');
Error in query: Invalid number of arguments for function to_timestamp; line 1 pos 7
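Until the single-argument form is supported in SQL, a workaround sketch is to pass the default format explicitly (this assumes the two-argument `to_timestamp(str, fmt)` SQL form, which the error message above suggests exists; the result is not shown here):

```sql
-- Workaround sketch: supply the default timestamp format explicitly
SELECT to_timestamp('2016-12-31 00:12:00.00', 'yyyy-MM-dd HH:mm:ss');
```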
It looks like we could support this too. For to_date, the single-argument form already works in SQL as well as in the other language APIs:
scala> Seq("2016-12-31").toDF("a").select(to_date(col("a"))).show()
+----------+
|to_date(a)|
+----------+
|2016-12-31|
+----------+
spark-sql> SELECT to_date('2016-12-31');
2016-12-31