Details
- Type: Documentation
- Status: Resolved
- Priority: Trivial
- Resolution: Fixed
- Affects Version/s: 3.2.0
- Fix Version/s: None
Description
The description in the documentation reads:
In Spark 3.2, the unit list interval literals can not mix year-month fields (YEAR and MONTH) and day-time fields (WEEK, DAY, …, MICROSECOND). For example, INTERVAL 1 day 1 hour is invalid in Spark 3.2. In Spark 3.1 and earlier, there is no such limitation and the literal returns value of CalendarIntervalType. To restore the behavior before Spark 3.2, you can set spark.sql.legacy.interval.enabled to true.
"INTERVAL 1 day 1 hour is invalid in Spark 3.2."
Is this example correct? According to the description of DayTimeIntervalType, INTERVAL 1 day 1 hour is valid in Spark 3.2: DAY and HOUR are both day-time fields, so the literal does not mix the two field groups and simply returns a value of DayTimeIntervalType. A literal that is actually invalid in Spark 3.2 would be one that mixes the groups, such as INTERVAL 1 month 1 day.
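To make the distinction concrete, here is a minimal Scala sketch for spark-shell (assuming a Spark 3.2 session bound to the usual name spark): the first literal mixes only day-time fields and should parse, while the second mixes a year-month field with a day-time field and should be rejected unless the legacy flag is set.

// DAY and HOUR are both day-time fields, so mixing them is allowed in
// Spark 3.2 and the literal is typed as a day-time interval:
spark.sql("SELECT INTERVAL 1 day 1 hour").printSchema()
// expected: one column of type "interval day to hour" (DayTimeIntervalType)

// Mixing a year-month field (MONTH) with a day-time field (DAY) is what
// Spark 3.2 actually rejects; uncommenting this is expected to throw a
// parse error:
// spark.sql("SELECT INTERVAL 1 month 1 day")

// Setting the legacy flag restores the pre-3.2 behavior, where the mixed
// literal is accepted and returns a value of CalendarIntervalType:
spark.conf.set("spark.sql.legacy.interval.enabled", "true")
spark.sql("SELECT INTERVAL 1 month 1 day").printSchema()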