Spark / SPARK-25454

Division between operands with negative scale can cause precision loss


Details

    • Type: Bug
    • Status: Resolved
    • Priority: Major
    • Resolution: Incomplete
    • Affects Version/s: 2.3.0, 2.3.1
    • Fix Version/s: None
    • Component/s: SQL

    Description

      The issue was originally reported by bersprockets here: https://issues.apache.org/jira/browse/SPARK-22036?focusedCommentId=16618104&page=com.atlassian.jira.plugin.system.issuetabpanels%3Acomment-tabpanel#comment-16618104.

      The problem is a precision loss that occurs when the second operand of a division is a decimal with a negative scale. It was present before 2.3 as well, but it was harder to reproduce: you had to do something like lit(BigDecimal(100e6)), whereas now it can happen much more frequently with SQL constants.
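      For context, a "negative scale" means the stored digits carry implicit trailing zeros: a value such as 1E+8 keeps the single digit 1 with scale -8. A minimal sketch using Python's decimal module, used here only as an analogy to java.math.BigDecimal (whose notion of scale Spark's Decimal type follows):

```python
from decimal import Decimal

# Exact conversion of the double 1.0E8: all eight zeros are stored,
# so the exponent (negated scale) is 0.
d = Decimal(100e6)
print(d)                 # 100000000

# Stripping trailing zeros (like BigDecimal.stripTrailingZeros) yields
# a compact form: unscaled value 1, exponent +8 -- i.e. scale -8.
compact = d.normalize()
print(compact)           # 1E+8
print(compact.as_tuple().exponent)   # 8
```

The same compact, negative-scale representation is what a big numeric SQL constant can end up with, which is why the bug became easier to hit.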

      The problem is that our logic is taken from Hive and SQL Server, where decimals with negative scales are not allowed. We might eventually consider enforcing the same restriction in 3.0; meanwhile, we can fix the logic for computing the result type of a division.
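      Concretely, the Hive-style divide rule computes resultScale = max(6, s1 + p2 + 1) and resultPrecision = p1 - s1 + s2 + resultScale; since the rule assumes s2 >= 0, a negative s2 shrinks the computed precision. A sketch of the arithmetic (the cap at the maximum precision of 38 is omitted, and the example operand types are hypothetical):

```python
def div_result_type(p1: int, s1: int, p2: int, s2: int) -> tuple[int, int]:
    # Hive-style result type for Decimal(p1, s1) / Decimal(p2, s2);
    # the final cap at a maximum precision of 38 is omitted for brevity.
    scale = max(6, s1 + p2 + 1)
    precision = p1 - s1 + s2 + scale
    return precision, scale

# Ordinary operands: Decimal(5, 2) / Decimal(10, 0) -> Decimal(16, 13).
print(div_result_type(5, 2, 10, 0))   # (16, 13)

# Divisor with negative scale, e.g. the literal 1E+8 as Decimal(1, -8):
# Decimal(5, 2) / Decimal(1, -8) -> Decimal(1, 6), which cannot represent
# a quotient like 123.45 / 1E+8 = 0.0000012345 without losing digits.
print(div_result_type(5, 2, 1, -8))   # (1, 6)
```

The second case shows the failure mode: the negative s2 subtracts from the result precision, so the declared type is too narrow for the actual quotient.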


          People

            Assignee: Unassigned
            Reporter: Marco Gaido (mgaido)
            Votes: 0
            Watchers: 7
