Hi, I am using the MySQL JDBC driver along with Spark to run some SQL queries.
When multiplying a LongType column with a decimal literal in scientific notation,
the literal 2.34E10 is treated as decimal(3,-8), and some_int (a LongType) is cast to decimal(20,0).
So, according to the precision/scale rules described in the comments,
their product is typed as decimal(3+20+1, -8+0), i.e. decimal(24,-8), which violates the assert assumption (scale >= 0) at DecimalType.scala:166.
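To make the failure concrete, the multiplication typing rule from the comments can be sketched as follows (a minimal illustration, not Spark's actual implementation; the function name is mine):

```python
def multiply_decimal_type(p1, s1, p2, s2):
    """Sketch of Spark's typing rule for decimal multiplication:
    decimal(p1, s1) * decimal(p2, s2) -> decimal(p1 + p2 + 1, s1 + s2)."""
    return (p1 + p2 + 1, s1 + s2)

# 2.34E10 is parsed as decimal(3, -8); a LongType is widened to decimal(20, 0).
precision, scale = multiply_decimal_type(3, -8, 20, 0)
print(precision, scale)  # 24 -8 -- the negative scale is what trips the assert
```

With scale = -8 < 0, the assert(scale >= 0) in DecimalType fails before any precision-loss adjustment can apply.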
My current workaround is to set spark.sql.decimalOperations.allowPrecisionLoss to false.
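For reference, the same workaround expressed as a runtime SQL config (a sketch; set it before running the query):

```
SET spark.sql.decimalOperations.allowPrecisionLoss = false;
```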