[SPARK-49311] Large INTERVAL SECOND values cannot be cast to decimal


Details

    • Type: Bug
    • Status: Resolved
    • Priority: Major
    • Resolution: Fixed
    • Affects Version: 4.0.0
    • Fix Version: 4.0.0
    • Component: SQL

    Description

      Spark fails when trying to cast an `interval second` value to decimal where the number of microseconds requires 19 digits.

      scala> sql("select 1000000000000.000000::interval second").show(false)
      +---------------------------------------------+
      |CAST(1000000000000.000000 AS INTERVAL SECOND)|
      +---------------------------------------------+
      |INTERVAL '1000000000000' SECOND              |
      +---------------------------------------------+

      scala> sql("select 1000000000000.000000::interval second::decimal(38, 10)").show(false)
      org.apache.spark.SparkArithmeticException: [NUMERIC_VALUE_OUT_OF_RANGE.WITH_SUGGESTION]  0 cannot be represented as Decimal(18, 6). If necessary set "spark.sql.ansi.enabled" to "false" to bypass this error, and return NULL instead. SQLSTATE: 22003
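      For context (an explanatory sketch, not part of the original report): Spark stores day-time interval values as a `Long` count of microseconds, so the description's "19 digits" follows from simple arithmetic. The snippet below, which assumes that microsecond representation, shows why 10^12 seconds cannot fit in a `Decimal(18, 6)`:

      ```scala
      object IntervalDigits {
        def main(args: Array[String]): Unit = {
          // 1,000,000,000,000 seconds expressed in microseconds
          // (assuming Spark's Long-microseconds interval encoding).
          val micros = BigInt("1000000000000") * BigInt(1000000)

          // 10^18 microseconds has 19 significant digits, but a
          // Decimal(18, 6) can hold at most 18 digits in total,
          // which is why the cast overflows.
          println(micros)                 // 1000000000000000000
          println(micros.toString.length) // 19
        }
      }
      ```

      A `Decimal(19, 6)` or wider intermediate type would be needed to represent this value losslessly.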


People

    Assignee: harshmotw-db (Harsh Motwani)
    Reporter: harshmotw-db (Harsh Motwani)
