The from_json function uses a schema to convert a JSON string into a Spark SQL struct. This schema can contain non-nullable fields. The underlying JsonToStructs expression does not check whether the resulting struct respects the nullability of the schema. This leads to very weird problems in consuming expressions; in our case, writing the result to Parquet produced an invalid Parquet file.
There are roughly two solutions here:
- Assume that every field in the schema passed to from_json is nullable, ignoring the nullability information set in the passed schema.
- Validate the object at runtime, and fail execution if a field is null where the schema does not allow it.
I am currently slightly in favor of option 1, since it is more performant and a lot easier to implement.
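To make the two options concrete, here is a minimal plain-Python sketch (not the actual Spark internals; the schema representation, `as_nullable`, and `validate` are hypothetical names for illustration). Option 1 recursively marks every field nullable up front; option 2 walks the parsed data at runtime and fails on a null in a non-nullable field.

```python
# Hypothetical schema representation: each field name maps to a
# (dtype, nullable) pair, where dtype may itself be a nested schema dict.

def as_nullable(schema):
    """Option 1: ignore declared nullability by marking every field nullable."""
    out = {}
    for name, (dtype, _nullable) in schema.items():
        if isinstance(dtype, dict):
            dtype = as_nullable(dtype)  # recurse into nested structs
        out[name] = (dtype, True)
    return out

def validate(row, schema, path=""):
    """Option 2: fail at runtime when data is null in a non-nullable field."""
    for name, (dtype, nullable) in schema.items():
        value = row.get(name)
        field_path = f"{path}.{name}" if path else name
        if value is None:
            if not nullable:
                raise ValueError(f"null value for non-nullable field '{field_path}'")
        elif isinstance(dtype, dict):
            validate(value, dtype, field_path)

schema = {"id": ("long", False), "name": ("string", True)}
validate({"id": 1, "name": None}, schema)   # ok: name is nullable
print(as_nullable(schema))                  # every field becomes nullable
```

The sketch also shows why option 1 is cheaper: the schema transformation runs once per query plan, while option 2 must traverse every produced row.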