Spark / SPARK-26664

Make DecimalType's minimum adjusted scale configurable


Details

    • Type: Improvement
    • Status: In Progress
    • Priority: Minor
    • Resolution: Unresolved
    • Affects Version/s: 3.1.0
    • Fix Version/s: None
    • Component/s: SQL
    • Labels: None

    Description

      Introduce a new conf flag that allows users to set the value of DecimalType.MINIMUM_ADJUSTED_SCALE, currently a hard-coded constant of 6, to match their workloads' needs.

      The new flag will be spark.sql.decimalOperations.minimumAdjustedScale.
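
      As an illustrative sketch only: if this proposal were adopted, the flag would presumably be set like any other runtime SQL conf. The key name below is the proposed one (it is not available in released Spark), the value 12 and the example query are arbitrary, and the object name MinimumAdjustedScaleExample is made up for this sketch.

      import org.apache.spark.sql.SparkSession

      object MinimumAdjustedScaleExample {
        def main(args: Array[String]): Unit = {
          val spark = SparkSession.builder()
            .appName("minimum-adjusted-scale-example")
            .master("local[*]")
            .getOrCreate()

          // Existing flag from SPARK-22036: allow precision loss on decimal multiply/divide.
          spark.conf.set("spark.sql.decimalOperations.allowPrecisionLoss", "true")

          // Proposed flag from this issue (hypothetical until merged): raise the floor on
          // the adjusted scale from the hard-coded 6 to, say, 12.
          spark.conf.set("spark.sql.decimalOperations.minimumAdjustedScale", "12")

          // With a larger minimum adjusted scale, wide decimal divisions keep more
          // fractional digits, at the cost of overflowing (to null) more often.
          spark.sql(
            "SELECT CAST(1 AS DECIMAL(38, 18)) / CAST(3 AS DECIMAL(38, 18)) AS q"
          ).show(truncate = false)

          spark.stop()
        }
      }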

      SPARK-22036 introduced a new conf flag, spark.sql.decimalOperations.allowPrecisionLoss, to match SQL Server's and newer Hive's behavior of allowing precision loss when multiplying or dividing large and small decimal numbers.
      Along with this feature, a fixed MINIMUM_ADJUSTED_SCALE of 6 was introduced for cases where precision loss is allowed.
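
      For context, here is a simplified sketch (not the exact Spark implementation) of how this floor enters the result-type adjustment when a decimal operation's natural result exceeds the 38-digit maximum. It is loosely modeled on DecimalType.adjustPrecisionScale; the object name AdjustedScaleSketch is made up for illustration.

      // Simplified sketch of how the minimum adjusted scale caps precision loss when a
      // result type would exceed the 38-digit maximum (loosely modeled on
      // DecimalType.adjustPrecisionScale; not the exact Spark implementation).
      object AdjustedScaleSketch {
        val MaxPrecision = 38
        val MinimumAdjustedScale = 6 // the constant this issue proposes to make configurable

        def adjustPrecisionScale(precision: Int, scale: Int): (Int, Int) = {
          if (precision <= MaxPrecision) {
            (precision, scale) // already representable, nothing to adjust
          } else {
            // Preserve the integral digits, but never shrink the scale below the
            // minimum adjusted scale (or the requested scale, if that is smaller).
            val intDigits = precision - scale
            val minScale = math.min(scale, MinimumAdjustedScale)
            val adjustedScale = math.max(MaxPrecision - intDigits, minScale)
            (MaxPrecision, adjustedScale)
          }
        }

        def main(args: Array[String]): Unit = {
          // Under the SPARK-22036 division rule, decimal(38,10) / decimal(38,10)
          // naively needs decimal(87,49); with the floor of 6 it is capped to decimal(38,6).
          println(adjustPrecisionScale(87, 49)) // prints (38,6)
        }
      }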

      Some customer workloads may need a larger adjusted scale to match their business needs, and in exchange they may be willing to tolerate more calculations overflowing the maximum precision and returning null. They would therefore like the minimum adjusted scale to be configurable, hence the need for a new conf.

      The default behavior is unchanged after this conf flag is introduced.


            People

              Assignee: Unassigned
              Reporter: Kris Mok (rednaxelafx)
              Votes: 0
              Watchers: 2
