Flink / FLINK-16048

Support read/write confluent schema registry avro data from Kafka



    Description

      The background

      I found that the SQL Kafka connector cannot consume Avro data that was serialized by `KafkaAvroSerializer`; it can only consume Row data with an Avro schema, because we use `AvroRowDeserializationSchema`/`AvroRowSerializationSchema` to serialize/deserialize data in `AvroRowFormatFactory`.

      I think we should support this because `KafkaAvroSerializer` is very common in the Kafka ecosystem, and someone hit the same problem on Stack Overflow [1].

      [1]https://stackoverflow.com/questions/56452571/caused-by-org-apache-avro-avroruntimeexception-malformed-data-length-is-negat/56478259
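The error in the linked question ("Malformed data. Length is negative") arises because records written by `KafkaAvroSerializer` are not plain Avro binary: per Confluent's documented wire format, each record is prefixed with a magic byte (0) and a 4-byte big-endian schema registry id before the Avro payload. A minimal sketch (the class and method names here are illustrative, not Flink or Confluent API) of parsing that prefix:

```java
import java.nio.ByteBuffer;

// Illustrative sketch of the Confluent schema registry wire format:
// [magic byte 0x0][4-byte schema id][Avro binary payload].
// A plain Avro deserializer that starts decoding at byte 0 misreads
// the prefix as Avro data, hence errors like "Length is negative".
public class ConfluentWireFormat {
    public static final byte MAGIC_BYTE = 0x0;

    // Reads the prefix and returns the schema id, leaving the buffer
    // positioned at the start of the actual Avro payload.
    public static int readSchemaId(ByteBuffer buf) {
        byte magic = buf.get();
        if (magic != MAGIC_BYTE) {
            throw new IllegalArgumentException("Unknown magic byte: " + magic);
        }
        return buf.getInt(); // 4-byte big-endian schema registry id
    }

    public static void main(String[] args) {
        // Fake record: magic byte, schema id 42, no payload for brevity.
        ByteBuffer record = ByteBuffer.allocate(5).put(MAGIC_BYTE).putInt(42);
        record.flip();
        System.out.println(readSchemaId(record)); // prints 42
    }
}
```

A registry-aware format would look up the schema by this id before handing the remaining bytes to the Avro decoder, which is exactly what `AvroRowDeserializationSchema` does not do today.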

      The format details

      The factory identifier (or format id)

      There are two candidates now:

      • avro-sr: the pattern borrowed from KSQL JSON_SR format [1]
      • avro-confluent: the pattern borrowed from ClickHouse AvroConfluent [2]

      Personally I would prefer avro-sr because it is more concise, and "confluent" is a company name, which I think is not that suitable for a format name.

      The format attributes

      Option                     Required   Remark
      schema-registry.url        true       URL of the schema registry service to connect to
      schema-registry.subject    false      Subject name to write to in the Schema Registry service; required for sinks
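To illustrate how the options above might be wired into a table definition, a hypothetical DDL sketch follows (the format id is one of the two candidates above, the option keys are the proposal's, and the topic, subject, and URL are made-up values, not a finalized API):

```sql
CREATE TABLE kafka_avro_source (
  user_id BIGINT,
  item_id BIGINT
) WITH (
  'connector' = 'kafka',
  'topic' = 'user_behavior',
  -- format id: one of the two candidates proposed above
  'format' = 'avro-confluent',
  -- required: endpoint of the schema registry service
  'format.schema-registry.url' = 'http://localhost:8081',
  -- optional for sources; required for sinks
  'format.schema-registry.subject' = 'user_behavior-value'
);
```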


            People

              danny0405 Danny Chen
              leonard Leonard Xu
