Description
I have a decimal database field defined as DECIMAL(10,2) (i.e. ########.##: ten digits of precision, two of which are after the decimal point). When I load it into Spark via sqlContext.jdbc(..), the corresponding DataFrame field is typed as DecimalType with precisionInfo None. Because of that loss of precision information, the limitation tracked in SPARK-4176 is triggered when I try to .saveAsTable(..).
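A minimal reproduction sketch, not the reporter's exact code: it assumes a Spark 1.3-era SQLContext and a hypothetical Postgres table "accounts" with a DECIMAL(10,2) column "balance"; the JDBC URL, table, and column names are placeholders.

```scala
import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.SQLContext

object DecimalPrecisionRepro {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(
      new SparkConf().setAppName("decimal-repro").setMaster("local"))
    val sqlContext = new SQLContext(sc)

    // Load the table over JDBC. The DECIMAL(10,2) column comes back as an
    // unbounded DecimalType (precisionInfo = None) rather than DecimalType(10,2).
    val df = sqlContext.jdbc("jdbc:postgresql://localhost/testdb", "accounts")
    df.schema.fields.foreach(f => println(s"${f.name}: ${f.dataType}"))

    // With no (precision, scale) attached to the decimal column, writing to a
    // Parquet-backed table hits the unlimited-decimal limitation of SPARK-4176.
    df.saveAsTable("accounts_copy")
  }
}
```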
Issue Links
- relates to: SPARK-4176 "Support decimals with precision > 18 in Parquet" (Resolved)