SPARK-24606

Decimals multiplication and division may be null due to the result precision overflow


Details

    • Type: Bug
    • Status: Resolved
    • Priority: Major
    • Resolution: Duplicate
    • Affects Version/s: 2.2.1
    • Fix Version/s: None
    • Component/s: SQL
    • Labels: None

    Description

      Spark performs multiplication and division on Decimals via Java's BigDecimal, whose scale may be greater than its precision, while Spark SQL's decimal type is limited to a precision of 38.

      If the resulting BigDecimal's precision is 38 and its scale is greater than 38 (e.g. 39), the converted decimal (in Spark SQL) ends up with a precision of 40 (= 39 + 1, which is > 38), so the operation overflows and the result becomes null.
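
      For context, scale exceeding precision is normal behaviour for java.math.BigDecimal and can be seen outside Spark; a minimal Scala sketch (the value 0.01 is only an illustration, not taken from this issue):

      import java.math.{BigDecimal => JBigDecimal}

      // In java.math.BigDecimal, precision counts significant digits only,
      // so 0.01 has precision 1 but scale 2 -- scale exceeds precision.
      // Spark SQL's decimal type, by contrast, expects precision >= scale
      // and caps precision at 38.
      val d = new JBigDecimal("0.01")
      println(s"precision = ${d.precision}, scale = ${d.scale}")  // precision = 1, scale = 2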

       

      Run the following SQL statements to reproduce this:

      select (cast (1.0 as decimal(38,37))) * 1.8;
      select (cast (0.00000777776666655555444443333387654321 as decimal(38,37))) / 99;
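
      The overflow can also be observed outside Spark with plain java.math.BigDecimal; the sketch below roughly mirrors the two queries above (the cast to decimal(38,37) is omitted for simplicity, and MathContext(38) is only an assumption used here to mimic the 38-digit limit, not necessarily what Spark does internally):

      import java.math.{BigDecimal => JBigDecimal, MathContext}

      // Case 1: 1.0 at scale 37 (as in decimal(38,37)) multiplied by 1.8.
      // The exact product has scale 37 + 1 = 38 and precision 39,
      // which no longer fits into decimal(38, _).
      val a = new JBigDecimal("1.0").setScale(37)
      val product = a.multiply(new JBigDecimal("1.8"))
      println(s"precision = ${product.precision}, scale = ${product.scale}")    // 39, 38

      // Case 2: the literal from the second query divided by 99 under a
      // 38-digit MathContext: the quotient has precision 38 but scale 45 (> 38),
      // the situation described in this ticket.
      val c = new JBigDecimal("0.00000777776666655555444443333387654321")
      val quotient = c.divide(new JBigDecimal("99"), new MathContext(38))
      println(s"precision = ${quotient.precision}, scale = ${quotient.scale}")  // 38, 45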
      

      Attachments

        Issue Links

        Activity


          People

            Assignee: Unassigned
            Reporter: Yan Jian (adrianjian)
            Votes: 0
            Watchers: 2

            Dates

              Created:
              Updated:
              Resolved:
