[FLINK-29579] Flink parquet reader cannot read fully optional elements in a repeated list



    Description

While trying to read a Parquet file whose schema contains the following field,

      optional group attribute_values (LIST) {
        repeated group list {
          optional group element {
            optional binary attribute_key_id (STRING);
            optional binary attribute_value_id (STRING);
            optional int32 pos;
          }
        }
      }

I encountered the following exception:

      Exception in thread "main" java.lang.UnsupportedOperationException: List field [optional binary attribute_key_id (STRING)] in List [attribute_values] has to be required. 
      	at org.apache.flink.formats.parquet.utils.ParquetSchemaConverter.convertGroupElementToArrayTypeInfo(ParquetSchemaConverter.java:338)
      	at org.apache.flink.formats.parquet.utils.ParquetSchemaConverter.convertParquetTypeToTypeInfo(ParquetSchemaConverter.java:271)
      	at org.apache.flink.formats.parquet.utils.ParquetSchemaConverter.convertFields(ParquetSchemaConverter.java:81)
      	at org.apache.flink.formats.parquet.utils.ParquetSchemaConverter.fromParquetType(ParquetSchemaConverter.java:61)
      	at org.apache.flink.formats.parquet.ParquetInputFormat.<init>(ParquetInputFormat.java:120)
      	at org.apache.flink.formats.parquet.ParquetRowInputFormat.<init>(ParquetRowInputFormat.java:39) 
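
      For reference, this is roughly how the input format gets created in my job. It is a trimmed-down sketch: the file path and the outer message name are placeholders, and I spell the string annotations as UTF8 only so that MessageTypeParser accepts them.

      import org.apache.flink.core.fs.Path;
      import org.apache.flink.formats.parquet.ParquetRowInputFormat;
      import org.apache.parquet.schema.MessageType;
      import org.apache.parquet.schema.MessageTypeParser;

      public class Repro {
          public static void main(String[] args) {
              // Schema fragment as shown above; "record" and the file path are placeholders.
              MessageType schema = MessageTypeParser.parseMessageType(
                      "message record {\n"
                              + "  optional group attribute_values (LIST) {\n"
                              + "    repeated group list {\n"
                              + "      optional group element {\n"
                              + "        optional binary attribute_key_id (UTF8);\n"
                              + "        optional binary attribute_value_id (UTF8);\n"
                              + "        optional int32 pos;\n"
                              + "      }\n"
                              + "    }\n"
                              + "  }\n"
                              + "}");

              // The constructor converts the Parquet schema to Flink type information
              // (ParquetSchemaConverter.fromParquetType) and throws the exception above.
              ParquetRowInputFormat format =
                      new ParquetRowInputFormat(new Path("file:///tmp/sample.parquet"), schema);
          }
      }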

The Flink code that raises the exception is the following, from ParquetSchemaConverter:

      // org.apache.flink.formats.parquet.utils.ParquetSchemaConverter
      private static ObjectArrayTypeInfo convertGroupElementToArrayTypeInfo(
              GroupType arrayFieldType, GroupType elementType) {
          // Every field of the list's element group must be REQUIRED; any OPTIONAL
          // field (as in the schema above) triggers the exception.
          for (Type type : elementType.getFields()) {
              if (!type.isRepetition(Type.Repetition.REQUIRED)) {
                  throw new UnsupportedOperationException(
                          String.format(
                                  "List field [%s] in List [%s] has to be required. ",
                                  type.toString(), arrayFieldType.getName()));
              }
          }
          return ObjectArrayTypeInfo.getInfoFor(convertParquetTypeToTypeInfo(elementType));
      }

I am not very familiar with the internals of Parquet schemas, but it looks to me as if Flink is too restrictive about the repetition types allowed inside certain nested fields. I would love to hear some feedback on this (improvements, corrections, workarounds).
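
      As far as I understand the Parquet LIST structure, the inner element group and its fields are allowed to be optional, so one possible direction would be to simply drop the REQUIRED check and let optional element fields map to their converted type infos. This is just an untested sketch, not a patch; whether the row materialization downstream handles the extra definition levels correctly would still need to be verified.

      private static ObjectArrayTypeInfo convertGroupElementToArrayTypeInfo(
              GroupType arrayFieldType, GroupType elementType) {
          // The REQUIRED check is removed here: OPTIONAL element fields are accepted
          // and their nullability is left to the converted type information / reader.
          return ObjectArrayTypeInfo.getInfoFor(convertParquetTypeToTypeInfo(elementType));
      }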

People

    Assignee: Unassigned
    Reporter: Tiansu Yu