Details
- Type: Bug
- Status: Resolved
- Priority: Major
- Resolution: Fixed
- Fix Version: 4.0.0
Description
Spark fails when casting an `interval second` value to decimal if the underlying number of microseconds requires 19 digits, which exceeds the `Decimal(18, 6)` intermediate used for the conversion.
```
scala> sql("select 1000000000000.000000::interval second").show(false)
+---------------------------------------------+
|CAST(1000000000000.000000 AS INTERVAL SECOND)|
+---------------------------------------------+
|INTERVAL '1000000000000' SECOND              |
+---------------------------------------------+

scala> sql("select 1000000000000.000000::interval second::decimal(38, 10)").show(false)
org.apache.spark.SparkArithmeticException: [NUMERIC_VALUE_OUT_OF_RANGE.WITH_SUGGESTION]
0 cannot be represented as Decimal(18, 6). If necessary set "spark.sql.ansi.enabled"
to "false" to bypass this error, and return NULL instead. SQLSTATE: 22003
```
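The arithmetic behind the failure can be sketched in plain Scala, without a Spark session. This is a minimal sketch, assuming (per the error message) that the interval's microsecond count is funneled through `Decimal(18, 6)`, which holds at most 18 significant digits; the reported value needs 19:

```scala
object IntervalOverflowSketch {
  def main(args: Array[String]): Unit = {
    // 1e12 seconds, the value from the report, expressed as microseconds
    // (Spark stores day-time intervals as a Long count of microseconds).
    val seconds = 1000000000000L
    val micros  = seconds * 1000000L           // = 1000000000000000000 (1e18)

    // 1e18 has 19 digits, one more than Decimal(18, 6) can represent,
    // hence the NUMERIC_VALUE_OUT_OF_RANGE error.
    println(s"digits required: ${micros.toString.length}")

    // Interpreting the same unscaled Long with scale 6 recovers the value,
    // which is why a wider decimal (19 or more digits of precision) suffices.
    println(BigDecimal(micros, 6))             // 1000000000000.000000
  }
}
```

The fix direction follows from the sketch: the cast has to allow enough precision for the full 19-digit microsecond magnitude instead of the narrower `Decimal(18, 6)`.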