Spark / SPARK-37692

sql-migration-guide wrong description


Details

    • Type: Documentation
    • Status: Resolved
    • Priority: Trivial
    • Resolution: Fixed
    • Affects Version/s: 3.2.0
    • Fix Version/s: 3.2.1, 3.3.0
    • Component/s: Documentation
    • Labels: None

    Description

      Description in the documentation:

      In Spark 3.2, the unit list interval literals can not mix year-month fields (YEAR and MONTH) and day-time fields (WEEK, DAY, …, MICROSECOND). For example, INTERVAL 1 day 1 hour is invalid in Spark 3.2. In Spark 3.1 and earlier, there is no such limitation and the literal returns value of CalendarIntervalType. To restore the behavior before Spark 3.2, you can set spark.sql.legacy.interval.enabled to true. 

      "INTERVAL 1 day 1 hour is invalid in Spark 3.2."

      Is this example correct? According to the description of DayTimeIntervalType, INTERVAL 1 day 1 hour is valid in Spark 3.2, since DAY and HOUR are both day-time fields and the restriction only applies to mixing year-month fields with day-time fields.
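      The rule the migration guide is trying to describe can be sketched as a small validity check: a unit-list interval literal in Spark 3.2 is rejected only when it mixes the year-month family with the day-time family. This is a minimal illustration of that rule, not Spark's actual parser; the function and set names here are hypothetical.

```python
# Sketch of the Spark 3.2 unit-list interval rule described in the
# migration guide (illustrative only, not Spark's implementation).

YEAR_MONTH_FIELDS = {"year", "month"}
DAY_TIME_FIELDS = {"week", "day", "hour", "minute", "second",
                   "millisecond", "microsecond"}

def units_are_valid(units):
    """Return True if the unit list stays within one field family."""
    units = {u.lower() for u in units}
    has_year_month = bool(units & YEAR_MONTH_FIELDS)
    has_day_time = bool(units & DAY_TIME_FIELDS)
    # Only the combination of both families is disallowed.
    return not (has_year_month and has_day_time)

# INTERVAL 1 day 1 hour: DAY and HOUR are both day-time fields -> valid
print(units_are_valid(["day", "hour"]))    # True
# INTERVAL 1 month 1 day mixes the two families -> invalid in Spark 3.2
print(units_are_valid(["month", "day"]))   # False
```

      Under this reading, a correct example of an invalid literal for the guide would be one that mixes the families, such as INTERVAL 1 month 1 day, rather than INTERVAL 1 day 1 hour.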

          People

            Assignee: Max Gekk (maxgekk)
            Reporter: JacobZheng
            Votes: 0
            Watchers: 2

            Dates

              Created:
              Updated:
              Resolved: