SPARK-28224: Check overflow in decimal Sum aggregate


Details

    • Type: Bug
    • Status: Resolved
    • Priority: Major
    • Resolution: Fixed
    • Affects Version/s: 3.0.0
    • Fix Version/s: 3.0.0
    • Component/s: SQL
    • Labels: None

    Description

      To reproduce (shouldEqual is a ScalaTest matcher):

      import spark.implicits._
      import org.apache.spark.sql.functions.sum

      val ds = spark
        .createDataset(Seq(BigDecimal("1" * 20), BigDecimal("9" * 20)))
        .agg(sum("value"))
        .as[BigDecimal]

      // The sum overflows, but the aggregate silently yields null instead of failing.
      ds.collect shouldEqual Seq(null)
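
      For context, the overflow comes from the precision arithmetic alone. A minimal plain-Scala sketch, assuming the default encoder stores scala BigDecimal as DecimalType(38, 18), i.e. 38 total digits with 18 after the decimal point (this mapping is an assumption, not stated above):

      val a = BigDecimal("1" * 20)   // 20 integer digits: 20 + 18 = 38, fits (38, 18)
      val b = BigDecimal("9" * 20)   // 20 integer digits: 20 + 18 = 38, fits (38, 18)
      val total = a + b              // 21 integer digits: 21 + 18 = 39 > 38, overflows
      println(total.precision)       // 21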

      Even with the option to throw an exception on overflow turned on, the sum aggregation of overflowing BigDecimal values still returns null. The DecimalAggregates rule is only invoked when the sum expression itself (not the elements being aggregated) has sufficiently small precision, so it does not apply here. The fix therefore seems to belong in the Sum expression itself.
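
      As a rough illustration of the kind of check the Sum expression could perform on its final value (a sketch only, not the actual patch; checkOverflow and its nullOnOverflow flag are hypothetical names):

      import scala.math.BigDecimal.RoundingMode

      def checkOverflow(value: BigDecimal,
                        precision: Int,
                        scale: Int,
                        nullOnOverflow: Boolean): Option[BigDecimal] = {
        // Round to the declared scale, then verify the digit count still fits.
        val rounded = value.setScale(scale, RoundingMode.HALF_UP)
        if (rounded.precision <= precision) Some(rounded)
        else if (nullOnOverflow) None // mirrors today's silent null
        else throw new ArithmeticException(
          s"Decimal precision ${rounded.precision} exceeds max precision $precision")
      }

      // The reproduction's sum does not fit a (38, 18) result type, so it maps to null.
      val overflowed = BigDecimal("1" * 20) + BigDecimal("9" * 20)
      assert(checkOverflow(overflowed, precision = 38, scale = 18, nullOnOverflow = true).isEmpty)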

       


            People

              Assignee: Mick Jermsurawong (mickjermsurawong-stripe)
              Reporter: Mick Jermsurawong (mickjermsurawong-stripe)
              Votes: 0
              Watchers: 2
