  Spark / SPARK-28724

Throw an error message when casting an out-of-range decimal to long


Details

    • Type: Bug
    • Status: Resolved
    • Priority: Minor
    • Resolution: Incomplete
    • Affects Version/s: 2.3.0
    • Fix Version/s: None
    • Component/s: SQL

    Description

      This may be a bug in `Scala` when converting `BigDecimal` to `Long`; either way, Spark should return correct results for the queries below:

      spark-sql> select cast(20190801002382000052000000017638 as int);
      -1493203738
      
      spark-sql> select cast(20190801002382000052000000017638 as bigint);
      4671677505944388838
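
      For reference, the wrapped values come straight from the underlying JVM narrowing conversion: `Scala`'s `BigDecimal.toLong`/`toInt` delegate to `java.math.BigDecimal`'s `longValue()`/`intValue()`, which keep only the low-order 64 or 32 bits instead of failing. A minimal sketch, runnable in a plain Scala REPL without Spark:

      // Narrowing conversions on BigDecimal silently keep only the
      // low-order bits, which is where the wrapped values above come from.
      val d = BigDecimal("20190801002382000052000000017638")
      println(d.toLong) // 4671677505944388838, matching the bigint cast above
      println(d.toInt)  // -1493203738, matching the int cast above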
      
      

      After this patch, such a query raises an `AnalysisException`:

      spark-sql> select cast(20190801002382000052000000017638 as bigint);
      Error in query: Decimal 20190801002382000052000000017638 does not fit in range [-9223372036854775808, 9223372036854775807] for type Long;
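
      The proposed fix boils down to a bounds check before the narrowing conversion. Below is a minimal standalone sketch of the idea in Scala; the helper name `toLongChecked` is hypothetical, not Spark's actual API, and its message just mirrors the error above:

      // Hypothetical helper illustrating the range check: values outside
      // [Long.MinValue, Long.MaxValue] are rejected instead of wrapped.
      def toLongChecked(d: BigDecimal): Long = {
        if (d < BigDecimal(Long.MinValue) || d > BigDecimal(Long.MaxValue)) {
          throw new ArithmeticException(
            s"Decimal $d does not fit in range " +
              s"[${Long.MinValue}, ${Long.MaxValue}] for type Long")
        }
        d.toLong
      }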


      For `toFloat`/`toDouble`, the existing behavior is already reasonable, since the query fails outright instead of returning a wrapped value:

      spark-sql> select cast(201908010023820000520000000176380000000000000000.0 as double);
      Error in query: DecimalType can only support precision up to 38
      == SQL ==
      select cast(201908010023820000520000000176380000000000000000.0 as double)

      spark-sql> select cast(201908010023820000520000000176380000000000000000.0 as float);
      Error in query: DecimalType can only support precision up to 38
      == SQL ==
      select cast(201908010023820000520000000176380000000000000000.0 as float)
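
      These two casts appear to fail earlier, while the literal itself is analyzed: it has 48 digits, more than `DecimalType`'s maximum precision of 38 (as the error message says), so the query is rejected before any `toFloat`/`toDouble` conversion runs. A quick check in plain Scala:

      // The literal has more digits than DecimalType's precision cap of 38,
      // so the query fails before the cast is ever attempted.
      val literal = "201908010023820000520000000176380000000000000000"
      println(literal.length) // 48, well beyond the maximum precision of 38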
      


            People

              Assignee: Unassigned
              Reporter: ShuMing Li (lishuming)
              Votes: 0
              Watchers: 1
