FLINK-20379

Update KafkaRecordDeserializationSchema to enable reuse of DeserializationSchema and KafkaDeserializationSchema

Details

    Description

      The new Kafka Connector defines its own deserialization schema, KafkaRecordDeserializationSchema, and is incompatible with the existing library of deserializers.

      That means users cannot use Flink's existing formats (Avro, JSON, CSV, Protobuf, Confluent Schema Registry, ...) with the new Kafka Connector.

      I think we should change the new Kafka Connector to accept the existing DeserializationSchema implementations, so that all formats can be used and users can reuse their deserializer implementations, for example via a thin adapter like the one sketched below.
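      As an illustration only: a minimal sketch of such an adapter, assuming the new connector's KafkaRecordDeserializationSchema exposes an open() hook, a deserialize(ConsumerRecord, Collector) method, and getProducedType(). The class name and import paths below are assumptions, not existing API; only the record value is handed to the wrapped format.

{code:java}
import java.io.IOException;

import org.apache.flink.api.common.serialization.DeserializationSchema;
import org.apache.flink.api.common.typeinfo.TypeInformation;
import org.apache.flink.connector.kafka.source.reader.deserializer.KafkaRecordDeserializationSchema;
import org.apache.flink.util.Collector;

import org.apache.kafka.clients.consumer.ConsumerRecord;

/**
 * Hypothetical adapter: runs an existing DeserializationSchema (JSON, Avro, CSV,
 * Schema Registry, ...) on the value of each Kafka record.
 */
public class ValueOnlyDeserializationSchemaAdapter<T> implements KafkaRecordDeserializationSchema<T> {

    private static final long serialVersionUID = 1L;

    private final DeserializationSchema<T> valueDeserializer;

    public ValueOnlyDeserializationSchemaAdapter(DeserializationSchema<T> valueDeserializer) {
        this.valueDeserializer = valueDeserializer;
    }

    @Override
    public void open(DeserializationSchema.InitializationContext context) throws Exception {
        // Forward the lifecycle hook so formats that need setup
        // (e.g. a Schema Registry client) are initialized once per reader.
        valueDeserializer.open(context);
    }

    @Override
    public void deserialize(ConsumerRecord<byte[], byte[]> record, Collector<T> out) throws IOException {
        // Only the record value is handed to the wrapped format;
        // key, headers, and metadata are ignored.
        valueDeserializer.deserialize(record.value(), out);
    }

    @Override
    public TypeInformation<T> getProducedType() {
        return valueDeserializer.getProducedType();
    }
}
{code}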

      It would also be good to support the existing KafkaDeserializationSchema; otherwise, all users will need to migrate their sources again. A similar adapter could wrap it, as sketched below.
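      The same delegation would work for KafkaDeserializationSchema, which sees the whole ConsumerRecord. A sketch under the same assumptions follows; the class name is again hypothetical, and the legacy contract's checked Exception is wrapped into an IOException.

{code:java}
import java.io.IOException;

import org.apache.flink.api.common.serialization.DeserializationSchema;
import org.apache.flink.api.common.typeinfo.TypeInformation;
import org.apache.flink.connector.kafka.source.reader.deserializer.KafkaRecordDeserializationSchema;
import org.apache.flink.streaming.connectors.kafka.KafkaDeserializationSchema;
import org.apache.flink.util.Collector;

import org.apache.kafka.clients.consumer.ConsumerRecord;

/**
 * Hypothetical adapter: runs an existing KafkaDeserializationSchema
 * (which has access to the full ConsumerRecord) on the new source.
 */
public class LegacyKafkaDeserializationSchemaAdapter<T> implements KafkaRecordDeserializationSchema<T> {

    private static final long serialVersionUID = 1L;

    private final KafkaDeserializationSchema<T> legacySchema;

    public LegacyKafkaDeserializationSchemaAdapter(KafkaDeserializationSchema<T> legacySchema) {
        this.legacySchema = legacySchema;
    }

    @Override
    public void open(DeserializationSchema.InitializationContext context) throws Exception {
        legacySchema.open(context);
    }

    @Override
    public void deserialize(ConsumerRecord<byte[], byte[]> record, Collector<T> out) throws IOException {
        try {
            // The legacy interface already has a collector-based variant, so deserializers
            // that emit zero or several elements per Kafka record keep working.
            legacySchema.deserialize(record, out);
        } catch (IOException e) {
            throw e;
        } catch (Exception e) {
            throw new IOException("Failed to deserialize record " + record.topic()
                    + "-" + record.partition() + "@" + record.offset(), e);
        }
    }

    @Override
    public TypeInformation<T> getProducedType() {
        return legacySchema.getProducedType();
    }
}
{code}

      Alternatively, instead of standalone adapter classes, the new interface could offer static convenience factories (e.g. something like valueOnly(DeserializationSchema) or of(KafkaDeserializationSchema)). One open question either way: isEndOfStream() from the legacy interfaces has no direct counterpart in the new source model.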

          People

            Assignee: Unassigned
            Reporter: Stephan Ewen (sewen)
            Votes: 0
            Watchers: 15
