Details
- Type: Question
- Status: Open
- Priority: Minor
- Resolution: Unresolved
- Affects Version/s: 3.2.0
- Fix Version/s: None
- Component/s: None
Description
Currently, Spark automatically increases the precision of a Decimal field by 10 (a hard-coded value) after a SUM aggregate operation: https://github.com/apache/spark/blob/branch-3.2/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/optimizer/Optimizer.scala#L1877.
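For context, the precision bump is visible directly in the result schema. A minimal sketch (Spark 3.2, local mode; the column name and sample values are illustrative, not from the report): summing a decimal(10, 2) column yields a decimal(20, 2) result, i.e. precision grows by the hard-coded 10.

{code:scala}
import org.apache.spark.sql.{Row, SparkSession}
import org.apache.spark.sql.functions.sum
import org.apache.spark.sql.types.{DecimalType, StructField, StructType}

val spark = SparkSession.builder()
  .master("local[*]")
  .appName("decimal-sum-precision")
  .getOrCreate()

// Hypothetical column name and values, just to expose the schema change.
val schema = StructType(Seq(StructField("amount", DecimalType(10, 2))))
val rows = spark.sparkContext.parallelize(Seq(
  Row(BigDecimal("12.34")),
  Row(BigDecimal("56.78"))))
val df = spark.createDataFrame(rows, schema)

// The SUM result is typed decimal(20, 2): input precision 10 plus the fixed 10.
df.select(sum("amount")).printSchema()
// root
//  |-- sum(amount): decimal(20,2) (nullable = true)
{code}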
There are a couple of questions:
- Why was 10 chosen as the default?
- Does it make sense to allow the user to override this value via configuration?