Details
- Type: Bug
- Status: Closed
- Priority: Major
- Resolution: Fixed
Description
A user reported that the Kafka consumer fails when generic types are disabled: https://lists.apache.org/thread.html/r462a854e8a0ab3512e2906b40411624f3164ea3af7cba61ee94cd760%40%3Cuser.flink.apache.org%3E. We should instead use the ListStateDescriptor constructor that takes a TypeSerializer directly, here: https://github.com/apache/flink/blob/68cc21e4af71505efa142110e35a1f8b1c25fe6e/flink-connectors/flink-connector-kafka-base/src/main/java/org/apache/flink/streaming/connectors/kafka/FlinkKafkaConsumerBase.java#L860. This circumvents the "no generic types" check.
My full analysis from the email thread:
Unfortunately, the fact that the Kafka Sources use Kryo for state serialization is a very early design misstep that we cannot get rid of for now. We will get rid of that when the new source interface lands ([1]) and when we have a new Kafka Source based on that.
As a workaround, we should change the Kafka consumer to go through the ListStateDescriptor constructor that directly takes a TypeSerializer instead of a TypeInformation, here: [2]. This sidesteps the "no generic types" check.
[1] https://cwiki.apache.org/confluence/display/FLINK/FLIP-27%3A+Refactor+Source+Interface
[2] https://github.com/apache/flink/blob/68cc21e4af71505efa142110e35a1f8b1c25fe6e/flink-connectors/flink-connector-kafka-base/src/main/java/org/apache/flink/streaming/connectors/kafka/FlinkKafkaConsumerBase.java#L860
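To illustrate why the TypeSerializer-taking constructor sidesteps the check: Flink's ListStateDescriptor can be built from either a TypeInformation (the serializer is then created lazily, which is where the "generic types disabled" check fires for Kryo-backed types) or from a TypeSerializer (the serializer is fixed eagerly, so the check never runs). The sketch below models this mechanism with simplified stand-in classes; these are not Flink's real classes, and the class and method bodies are illustrative assumptions, not the actual FlinkKafkaConsumerBase code.

```java
import java.io.Serializable;
import java.util.List;

/** Stand-in for Flink's TypeSerializer: knows how to (de)serialize T. */
interface TypeSerializer<T> extends Serializable {}

/** Stand-in for TypeInformation: may only resolve to Kryo ("generic") serialization. */
class TypeInformation<T> {
    final boolean isGenericType;
    final TypeSerializer<T> serializer;

    TypeInformation(boolean isGenericType, TypeSerializer<T> serializer) {
        this.isGenericType = isGenericType;
        this.serializer = serializer;
    }

    /** Models the "generic types disabled" check: fails for Kryo-backed types. */
    TypeSerializer<T> createSerializer(boolean genericTypesDisabled) {
        if (genericTypesDisabled && isGenericType) {
            throw new UnsupportedOperationException(
                "Generic types have been disabled; this type would use Kryo");
        }
        return serializer;
    }
}

/** Stand-in for ListStateDescriptor with the two constructor flavors. */
class ListStateDescriptor<T> {
    private TypeInformation<T> typeInfo;   // lazy path: check runs on first use
    private TypeSerializer<T> serializer;  // eager path: check never consulted

    ListStateDescriptor(String name, TypeInformation<T> typeInfo) {
        this.typeInfo = typeInfo;
    }

    ListStateDescriptor(String name, TypeSerializer<T> serializer) {
        this.serializer = serializer;
    }

    TypeSerializer<T> getSerializer(boolean genericTypesDisabled) {
        if (serializer == null) {
            serializer = typeInfo.createSerializer(genericTypesDisabled);
        }
        return serializer;
    }
}

public class Sketch {
    public static void main(String[] args) {
        TypeSerializer<List<String>> kryoLike = new TypeSerializer<List<String>>() {};
        TypeInformation<List<String>> generic = new TypeInformation<>(true, kryoLike);

        // TypeInformation-based descriptor: fails once generic types are disabled.
        boolean failed = false;
        try {
            new ListStateDescriptor<>("offsets", generic).getSerializer(true);
        } catch (UnsupportedOperationException e) {
            failed = true;
        }

        // TypeSerializer-based descriptor: the check is never reached.
        TypeSerializer<List<String>> ok =
            new ListStateDescriptor<>("offsets", kryoLike).getSerializer(true);

        System.out.println(failed && ok == kryoLike); // prints true
    }
}
```

The proposed fix is exactly the second pattern: construct the consumer's union-list state descriptor from the serializer itself, so the descriptor never asks the TypeInformation to create one.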
Issue Links
- duplicates
  - FLINK-12410 KafkaTopicPartition cannot be used as a POJO type because not all fields are valid POJO fields (Closed)
- is related to
  - FLINK-11911 KafkaTopicPartition is not a valid POJO (Closed)
- relates to
  - FLINK-11911 KafkaTopicPartition is not a valid POJO (Closed)