Description
https://spark.apache.org/docs/latest/sql-ref-datatypes.html
- SECOND, seconds within minutes and possibly fractions of a second [0..59.999999]
The doc describes SECOND as seconds within minutes, so its integer part should be limited to the range [0, 59].
But testing shows that 99 seconds is accepted:
>>> spark.sql("select INTERVAL '10 01:01:99' DAY TO SECOND")
DataFrame[INTERVAL '10 01:02:39' DAY TO SECOND: interval day to second]
Meanwhile, the minute field is range-checked as expected:
>>> spark.sql("select INTERVAL '10 01:60:01' DAY TO SECOND")
requirement failed: minute 60 outside range [0, 59](line 1, pos 16)
== SQL ==
select INTERVAL '10 01:60:01' DAY TO SECOND
----------------^^^
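The output above suggests that, instead of rejecting out-of-range seconds, Spark silently normalizes them by carrying the overflow into the minute field (99 s = 1 min 39 s, turning 01:01:99 into 01:02:39). A minimal plain-Python sketch of that carry behavior (this is an illustration of the observed output, not Spark's actual parser code):

```python
def normalize_day_time(days, hours, minutes, seconds):
    """Carry overflowing fields upward, as the observed output implies."""
    minutes += seconds // 60   # 99 s -> carry 1 into minutes
    seconds %= 60              # 99 s -> 39 s
    hours += minutes // 60
    minutes %= 60
    days += hours // 24
    hours %= 24
    return days, hours, minutes, seconds

# Matches the REPL output: '10 01:01:99' is reported as '10 01:02:39'
print(normalize_day_time(10, 1, 1, 99))  # (10, 1, 2, 39)
```

If the documented range is authoritative, the parser should instead raise the same "outside range [0, 59]" error it already produces for minutes.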