FLINK-19149: Compacted Kafka Topic can be interpreted as Changelog Stream

    Description

      I would like to be able to interpret a compacted Kafka Topic as an upsert stream in Apache Flink. Similarly, I would like to be able to write an upsert stream to Kafka (into a compacted topic).

      In both cases, the (possibly implicit) primary key of the Flink SQL Table would need to correspond to the fields that make up the keys of the Kafka records.

      A message for an existing key (with a higher offset) corresponds to an update.
      A message for an existing key with a null value is interpreted as a delete.
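
      For illustration, here is a sketch of what the DDL for such a table could look like. The connector name ('upsert-kafka'), the option keys, and the table/column names are assumptions for this example, not a finalized design; the PRIMARY KEY declaration is what would map to the fields of the Kafka record key:

        CREATE TABLE users (
          user_id   BIGINT,
          user_name STRING,
          region    STRING,
          PRIMARY KEY (user_id) NOT ENFORCED   -- maps to the fields of the Kafka record key
        ) WITH (
          'connector' = 'upsert-kafka',                        -- assumed connector name
          'topic' = 'users',                                   -- compacted topic
          'properties.bootstrap.servers' = 'localhost:9092',
          'key.format' = 'json',
          'value.format' = 'json'
        );

      Reading from such a table would produce an upsert changelog (insert/update per key, delete on null value), and writing an upsert stream into it would key the Kafka records by user_id.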

      I would like to be able to interpret a compacted Kafka Topic as a versioned table without creating an additional view (similar to Debezium/Canal; see https://cwiki.apache.org/confluence/display/FLINK/FLIP-132+Temporal+Table+DDL+and+Temporal+Table+Join)
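
      As a sketch, assuming the FOR SYSTEM_TIME AS OF syntax from FLIP-132 and the hypothetical users table above, a temporal join against the compacted topic could then be expressed directly, without an intermediate deduplication view:

        SELECT o.order_id, o.amount, u.region
        FROM orders AS o
        JOIN users FOR SYSTEM_TIME AS OF o.order_time AS u
          ON o.user_id = u.user_id;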


              People

                Assignee: Unassigned
                Reporter: Konstantin Knauf (knaufk)
                Votes: 0
                Watchers: 12
