Spark / SPARK-31695

BigDecimal setScale is not working in Spark UDF


    Details

    • Type: Bug
    • Status: Resolved
    • Priority: Major
    • Resolution: Not A Problem
    • Affects Version/s: 2.3.4
    • Fix Version/s: None
    • Component/s: Spark Core, SQL
    • Labels:
      None

      Description

      I was trying to convert a JSON column to a map using a UDF, but it is not working as expected.
       

      import org.json4s.native.JsonMethods.parse  // missing import needed for parse()

      val df1 = Seq("{\"k\":10.004}").toDF("json")
      def udfJsonStrToMapDecimal = udf((jsonStr: String) => {
        // Parse the JSON string into a Map[String, Any]
        val jsonMap = parse(jsonStr).values.asInstanceOf[Map[String, Any]]
        // Convert each value to a BigDecimal with scale 6
        jsonMap.map { case (k, v) => (k, BigDecimal.decimal(v.asInstanceOf[Double]).setScale(6)) }
      })
      val f = df1.withColumn("map", udfJsonStrToMapDecimal($"json"))
      scala> f.printSchema
      root
       |-- json: string (nullable = true)
       |-- map: map (nullable = true)
       |    |-- key: string
       |    |-- value: decimal(38,18) (valueContainsNull = true)
      

       

      Instead of decimal(38,6), the value is typed as decimal(38,18).
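      This behavior is expected rather than a bug (hence the "Not A Problem" resolution): setScale(6) does take effect on the JVM value itself, but when Spark infers the UDF's return schema by reflection, scala.math.BigDecimal is always mapped to the system default DecimalType(38,18), regardless of the runtime scale of any particular value. A minimal sketch in plain Scala (no Spark required) showing that the scale is intact on the value itself:

      ```scala
      // setScale(6) changes the JVM value's scale; the scale is only
      // "lost" in the DataFrame schema, which Spark derives from the
      // static type BigDecimal, not from the runtime value.
      val v = BigDecimal.decimal(10.004).setScale(6)
      println(v)        // 10.004000
      println(v.scale)  // 6
      ```

      As a workaround, the resulting column can be cast to the desired type after the UDF is applied, e.g. `udfJsonStrToMapDecimal($"json").cast("map<string,decimal(38,6)>")` (a sketch using Column.cast with a DDL-formatted type string; adjust to your schema).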

        Attachments

          Activity

            People

            • Assignee:
              Unassigned
            • Reporter:
              Sarraju89 Saravanan Raju
            • Votes:
              0
            • Watchers:
              2

              Dates

              • Created:
              • Updated:
              • Resolved: