Description
The example below demonstrates the issue:
spark-sql> select cast(interval '10.123' second as decimal(1, 0));
[NUMERIC_VALUE_OUT_OF_RANGE] 0.000010 cannot be represented as Decimal(1, 0). If necessary set "spark.sql.ansi.enabled" to "false" to bypass this error.
The value 0.000010 reported in the error message is unrelated to the input value 10.123; the message should reference the value that actually failed to fit into Decimal(1, 0).
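
For reference, a minimal sketch of the same reproduction through the DataFrame API, assuming a local SparkSession with ANSI mode enabled; the object name IntervalCastRepro and the session configuration are illustrative, not part of the original report:

import org.apache.spark.sql.SparkSession

object IntervalCastRepro {
  def main(args: Array[String]): Unit = {
    // ANSI mode makes the out-of-range cast raise NUMERIC_VALUE_OUT_OF_RANGE
    // instead of returning NULL (as the error message itself suggests).
    val spark = SparkSession.builder()
      .master("local[*]")
      .config("spark.sql.ansi.enabled", "true")
      .getOrCreate()

    // interval '10.123' second is a day-time interval stored internally as
    // microseconds; one would expect the reported out-of-range value to be
    // related to 10.123, not an unrelated number such as 0.000010.
    spark.sql("select cast(interval '10.123' second as decimal(1, 0))").show()

    spark.stop()
  }
}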