Details
Type: Bug
Status: Open
Priority: Major
Resolution: Unresolved
Description
When using Flink 1.13.6 and Hudi 0.13.0 with a COW table in append mode and clustering enabled, if the field list contains a map type and an async clustering job is scheduled, the clustering job throws the following exception:
The requested schema is not compatible with the file schema. incompatible types: required binary key (STRING) != optional binary key (STRING)
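The scenario can be reproduced with a Flink SQL job along the following lines. This is a minimal sketch, not the exact job from this report: the table name, path, schema, and option values are illustrative, and the clustering option keys may differ slightly across Hudi versions.
{code:java}
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
import org.apache.flink.table.api.bridge.java.StreamTableEnvironment;

public class MapTypeClusteringRepro {
    public static void main(String[] args) throws Exception {
        StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();
        StreamTableEnvironment tEnv = StreamTableEnvironment.create(env);

        // COW + append (insert) write path with async clustering enabled,
        // and a MAP column in the schema. Table name and path are hypothetical.
        tEnv.executeSql(
            "CREATE TABLE hudi_map_tbl (\n"
          + "  id STRING PRIMARY KEY NOT ENFORCED,\n"
          + "  attrs MAP<STRING, STRING>,\n"
          + "  ts TIMESTAMP(3)\n"
          + ") WITH (\n"
          + "  'connector' = 'hudi',\n"
          + "  'path' = 'file:///tmp/hudi_map_tbl',\n"
          + "  'table.type' = 'COPY_ON_WRITE',\n"
          + "  'write.operation' = 'insert',\n"
          + "  'clustering.schedule.enabled' = 'true',\n"
          + "  'clustering.async.enabled' = 'true',\n"
          + "  'clustering.delta_commits' = '1'\n"
          + ")");

        // Write a few rows so a clustering plan is scheduled and executed.
        tEnv.executeSql(
            "INSERT INTO hudi_map_tbl VALUES "
          + "('1', MAP['k', 'v'], TIMESTAMP '2023-01-01 00:00:00')")
            .await();
    }
}
{code}
With such a configuration, the async clustering job that reads back the parquet files containing the MAP column is the step that fails with the exception above.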
The root cause is that HUDI-3378 changed the parquet reader. The new parquet reader is compatible with Spark but not fully compatible with Flink, because the parquet schema written by Flink differs from the one written by Spark (as the exception shows, the map key field is optional in the file schema but required in the requested schema).
We will submit two patches: the first fixes this bug in 0.13.x; the second resolves the schema difference between the Flink and Spark parquet writers.
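For illustration, the incompatibility can be shown directly against parquet-mr, independent of Hudi and Flink. This is a minimal sketch with hand-written schemas that only mirror the shape described above (field names are hypothetical): parquet's ColumnIOFactory rejects a requested field whose repetition is more restrictive than the one in the file schema, which matches the required-vs-optional map key mismatch in the quoted exception.
{code:java}
import org.apache.parquet.io.ColumnIOFactory;
import org.apache.parquet.schema.MessageType;
import org.apache.parquet.schema.MessageTypeParser;

public class MapKeySchemaMismatch {
    public static void main(String[] args) {
        // Requested schema (Avro/Spark-style conversion): map key is required.
        MessageType requestedSchema = MessageTypeParser.parseMessageType(
            "message requested {\n"
          + "  optional group attrs (MAP) {\n"
          + "    repeated group key_value {\n"
          + "      required binary key (UTF8);\n"
          + "      optional binary value (UTF8);\n"
          + "    }\n"
          + "  }\n"
          + "}");

        // File schema as written by the Flink parquet writer: map key is optional.
        MessageType fileSchema = MessageTypeParser.parseMessageType(
            "message file {\n"
          + "  optional group attrs (MAP) {\n"
          + "    repeated group key_value {\n"
          + "      optional binary key (UTF8);\n"
          + "      optional binary value (UTF8);\n"
          + "    }\n"
          + "  }\n"
          + "}");

        // Throws org.apache.parquet.io.InvalidRecordException:
        // "The requested schema is not compatible with the file schema. incompatible types: ..."
        new ColumnIOFactory().getColumnIO(requestedSchema, fileSchema, true);
    }
}
{code}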