Spark › SPARK-3929 Support for fixed-precision decimal › SPARK-3933

Optimize decimal type in Spark SQL for those with small precision


Details

    • Type: Sub-task
    • Status: Resolved
    • Priority: Major
    • Resolution: Fixed
    • Affects Version/s: None
    • Fix Version/s: None
    • Component/s: SQL
    • Labels: None

    Description

      With fixed-precision decimals, many decimal values will fit in a Long, so we can use a Decimal class with a mutable Long field to represent the unscaled value, rather than allocating a BigDecimal. We can then do some operations directly on these Long fields.
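The idea above can be sketched as a small mutable class. This is a hypothetical illustration, not Spark's actual `Decimal` implementation: when the precision is at most 18 digits the unscaled value fits in a `long` (a signed 64-bit integer holds up to 19 decimal digits), so no `BigDecimal` is allocated, and same-scale addition operates directly on the `long` fields.

```java
import java.math.BigDecimal;

// Hypothetical sketch of a fixed-precision decimal that stores the
// unscaled value in a mutable long when precision <= 18, and falls
// back to BigDecimal otherwise.
class Decimal {
    private long longVal;          // unscaled value when it fits in a long
    private BigDecimal decimalVal; // fallback for large precision, else null
    private int scale;

    // Initialize from an unscaled value, e.g. set(12345, 10, 2) == 123.45.
    Decimal set(long unscaled, int precision, int scale) {
        if (precision <= 18) {
            this.longVal = unscaled;
            this.decimalVal = null;
        } else {
            this.decimalVal = BigDecimal.valueOf(unscaled, scale);
        }
        this.scale = scale;
        return this;
    }

    // Add in place: when both operands hold longs with the same scale,
    // a single long addition suffices; otherwise fall back to BigDecimal.
    Decimal addInPlace(Decimal other) {
        if (decimalVal == null && other.decimalVal == null
                && scale == other.scale) {
            longVal += other.longVal;
        } else {
            decimalVal = toBigDecimal().add(other.toBigDecimal());
        }
        return this;
    }

    BigDecimal toBigDecimal() {
        return decimalVal != null ? decimalVal
                                  : BigDecimal.valueOf(longVal, scale);
    }
}
```

Because the object is mutable and reused via `set(...)`, aggregations over many rows can avoid per-row allocation entirely in the common small-precision case.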

      Attachments

        Activity


          People

            Assignee: Matei Alexandru Zaharia (matei)
            Reporter: Matei Alexandru Zaharia (matei)
            Votes: 0
            Watchers: 2

            Dates

              Created:
              Updated:
              Resolved:
