Details
- Type: Bug
- Status: Resolved
- Priority: Major
- Resolution: Fixed
Description
Currently the Avro schema is passed to AvroConversionHelper.createConverterToAvro, which recursively processes the given Spark SQL DataTypes to resolve structs, arrays, etc. The Avro schema is passed down into each recursion, but without selecting the relevant field and therefore without narrowing to that field's schema. This leads to a NullPointerException when decimal types are processed, because in that case the schema of the field is retrieved by calling getField on the root schema, which is not defined when dealing with nested records.
AvroConversionHelper.scala#L291
The proposed solution is to remove the dependency on the Avro schema and, for the decimal converter creation case only, derive the required Avro schema locally.
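A minimal, self-contained sketch of the failure mode and the proposed direction. The classes below are hypothetical stand-ins (not the real Avro or Spark APIs) that mimic the relevant behavior: getField on a record returns null for names that only exist in nested records, and the fix avoids that lookup entirely by building the decimal schema from the DecimalType's precision and scale.

```java
import java.util.LinkedHashMap;
import java.util.Map;

public class DecimalSchemaSketch {
    // Hypothetical, simplified stand-in for an Avro schema node.
    static class AvroSchema {
        final String type;
        final int precision;
        final int scale;
        final Map<String, AvroSchema> fields = new LinkedHashMap<>();

        AvroSchema(String type, int precision, int scale) {
            this.type = type;
            this.precision = precision;
            this.scale = scale;
        }

        static AvroSchema record() {
            return new AvroSchema("record", 0, 0);
        }

        AvroSchema withField(String name, AvroSchema child) {
            fields.put(name, child);
            return this;
        }

        // Like Avro's Schema.getField: returns null when the name is not
        // a direct field of THIS record.
        AvroSchema getField(String name) {
            return fields.get(name);
        }
    }

    // Buggy pattern from the description: the recursion keeps handing down
    // the ROOT schema, so a decimal field inside a nested record is looked
    // up on the wrong level, the lookup returns null, and dereferencing it
    // later throws a NullPointerException.
    static AvroSchema buggyLookup(AvroSchema rootSchema, String fieldName) {
        return rootSchema.getField(fieldName); // null for nested fields
    }

    // Proposed direction: stop depending on the passed-in Avro schema and
    // derive the decimal schema locally from the Spark DecimalType's
    // precision and scale (names here are illustrative only).
    static AvroSchema deriveDecimalSchema(int precision, int scale) {
        return new AvroSchema("bytes-decimal", precision, scale);
    }

    public static void main(String[] args) {
        // Shape: root record { nested: record { amount: decimal(10, 2) } }
        AvroSchema root = AvroSchema.record()
            .withField("nested", AvroSchema.record()
                .withField("amount", deriveDecimalSchema(10, 2)));

        // Lookup on the root misses the nested field entirely.
        System.out.println(buggyLookup(root, "amount") == null); // true

        // The derived schema carries everything the converter needs.
        AvroSchema fixed = deriveDecimalSchema(10, 2);
        System.out.println(fixed.type + " " + fixed.precision + "," + fixed.scale);
    }
}
```

The point of the sketch is that the decimal case never needed the enclosing record schema at all: precision and scale are already present on the Spark side, so the converter can be constructed without any getField call that breaks for nested records.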
Attachments
Issue Links
- is depended upon by
  - HUDI-901 Bug Bash 0.6.0 Tracking Ticket (Resolved)
- links to