Spark / SPARK-9264 (sub-task of SPARK-9046: Decimal type support improvement and bug fix)

When dealing with Union/Intersect/Except, we cast FloatType/DoubleType to the wrong DecimalType


Details

    • Type: Sub-task
    • Status: Resolved
    • Priority: Blocker
    • Resolution: Fixed
    • Affects Version: 1.4.0
    • Fix Version: 1.5.0
    • Component: SQL
    • Labels: None
    • Epic Link: Spark 1.5 release

    Description

      The problem is at https://github.com/apache/spark/blob/master/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/analysis/HiveTypeCoercion.scala#L361-L362. When we union/intersect/except two tables, if a column has a fixed-precision decimal type in one table and FloatType/DoubleType in the other, we cast FloatType to Decimal(7, 7) and DoubleType to Decimal(15, 15), respectively. This is wrong: a Decimal(p, p) puts all of its digits after the decimal point, so it cannot represent any value >= 1, and such values overflow to null.

      I tried the following in 1.4

      sqlContext.sql("select a from (select cast(200.101 as double) as a union all select cast(200.101 as decimal(6,3)) as a) tmp").show
      

      I got

      +-------------------+
      |                  a|
      +-------------------+
      |               null|
      |200.101000000000000|
      +-------------------+
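The null in the double branch follows directly from the Decimal(15, 15) cast: with precision 15 and scale 15, all fifteen digits sit after the decimal point, so 200.101 needs more integer digits than the type allows. A minimal sketch of that overflow check using plain java.math.BigDecimal (class and variable names are illustrative, not from the Spark code):

```java
import java.math.BigDecimal;

public class DecimalOverflowDemo {
    public static void main(String[] args) {
        // DECIMAL(15, 15): 15 total digits, all 15 after the decimal point,
        // so the largest representable value is 0.999999999999999.
        BigDecimal maxDecimal15_15 = new BigDecimal("0.999999999999999");
        BigDecimal value = new BigDecimal("200.101");

        // Integer digits needed = precision - scale = 6 - 3 = 3.
        // DECIMAL(15, 15) allows zero integer digits, so the value overflows,
        // which Spark 1.4 surfaced as null for the double branch of the union.
        System.out.println(value.precision() - value.scale()); // prints 3
        System.out.println(value.compareTo(maxDecimal15_15) > 0); // prints true
    }
}
```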
      

      People

        Assignee: davies Davies Liu
        Reporter: yhuai Yin Huai
        Votes: 0
        Watchers: 2
