SPARK-40351: Spark Sum increases the precision of DecimalType arguments by 10


Details

    • Type: Question
    • Status: Open
    • Priority: Minor
    • Resolution: Unresolved
    • Affects Version/s: 3.2.0
    • Fix Version/s: None
    • Component/s: Optimizer
    • Labels: None

    Description

      Spark currently increases the precision of a DecimalType field by 10 (a hard-coded value) after a SUM aggregate operation: https://github.com/apache/spark/blob/branch-3.2/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/optimizer/Optimizer.scala#L1877.
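
      As a minimal sketch of the observable behavior (assuming a local Spark 3.2 session; the column and app names here are made up):

{code:scala}
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions.sum
import org.apache.spark.sql.types.DecimalType

val spark = SparkSession.builder()
  .master("local[*]")
  .appName("decimal-sum-demo") // hypothetical app name
  .getOrCreate()
import spark.implicits._

// A column explicitly typed as decimal(10,2).
val df = Seq("123.45", "67.89").toDF("amount")
  .select($"amount".cast(DecimalType(10, 2)).as("amount"))

df.printSchema()
// |-- amount: decimal(10,2) (nullable = true)

// After SUM, the result type widens to decimal(20,2): precision 10 + 10.
df.agg(sum($"amount")).printSchema()
// |-- sum(amount): decimal(20,2) (nullable = true)
{code}

      If the widened type is unwanted, the result can be cast back today, e.g. {{df.agg(sum($"amount").cast(DecimalType(12, 2)))}}.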

      There are a couple of questions:

      1. Why was 10 chosen as the default?
      2. Does it make sense to allow the user to override this value via configuration? 
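
      For question 2, a purely hypothetical sketch of what a user-facing knob could look like if one were added to org.apache.spark.sql.internal.SQLConf (the conf name "spark.sql.decimal.sumPrecisionIncrease" is invented here and does not exist in Spark; the snippet reuses the real ConfigBuilder pattern and would only compile inside SQLConf):

{code:scala}
// Hypothetical entry -- illustrates the shape a configurable precision
// increase could take, using the same builder pattern as existing confs.
val DECIMAL_SUM_PRECISION_INCREASE =
  buildConf("spark.sql.decimal.sumPrecisionIncrease")
    .doc("Extra digits of precision reserved for the result of SUM over a " +
      "DecimalType column. Spark currently hard-codes this value to 10.")
    .intConf
    .checkValue(_ >= 0, "The precision increase must be non-negative.")
    .createWithDefault(10)
{code}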


          People

            Assignee: Unassigned
            Reporter: Tymofii (tkhomichuk)
            Votes: 0
            Watchers: 4
