Details
- Type: Improvement
- Status: Closed
- Priority: Critical
- Resolution: Duplicate
Description
I would like to be able to interpret a compacted Kafka topic as an upsert stream in Apache Flink. Similarly, I would like to be able to write an upsert stream to Kafka (into a compacted topic).
In both cases, the (possibly implicit) primary key of the Flink SQL Table would need to correspond to the fields that make up the keys of the Kafka records.
A message for an existing key (at a higher offset) is interpreted as an update.
A message for an existing key with a null value is interpreted as a delete.
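A sketch of what these semantics could look like in Flink SQL, using the 'upsert-kafka' connector that FLINK-19857 (FLIP-149) eventually introduced; the table name, topic, server address, and formats below are placeholders, not part of this issue:

```sql
-- Hypothetical example: the PRIMARY KEY columns map to the fields
-- encoded in the Kafka record key; a null record value is a tombstone
-- (delete), and later offsets for the same key are updates.
CREATE TABLE users (
  user_id STRING,
  user_name STRING,
  region STRING,
  PRIMARY KEY (user_id) NOT ENFORCED
) WITH (
  'connector' = 'upsert-kafka',
  'topic' = 'users',
  'properties.bootstrap.servers' = 'localhost:9092',
  'key.format' = 'json',
  'value.format' = 'json'
);
```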
I would like to be able to interpret a compacted Kafka topic as a versioned table without creating an additional view (similar to Debezium/Canal; see https://cwiki.apache.org/confluence/display/FLINK/FLIP-132+Temporal+Table+DDL+and+Temporal+Table+Join).
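A sketch of the versioned-table use case, assuming hypothetical tables: an append-only `orders` table with an event-time attribute `order_time`, and a `currency_rates` table backed by a compacted topic with a primary key and watermark, so it can act as a versioned table in a temporal join directly, with no intermediate view:

```sql
-- Hypothetical temporal join: each order is joined against the version
-- of the rate that was valid at the order's event time.
SELECT
  o.order_id,
  o.price * r.rate AS converted_price
FROM orders AS o
JOIN currency_rates FOR SYSTEM_TIME AS OF o.order_time AS r
  ON o.currency = r.currency;
```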
Issue Links
- is duplicated by
  - FLINK-19857 FLIP-149: Introduce the upsert-kafka Connector (Closed)
- relates to
  - FLINK-18826 Support to emit and encode upsert messages to Kafka (Closed)