- Type: Improvement
- Status: Resolved
- Priority: P3
- Resolution: Fixed
- Affects Version/s: 0.4.0, 0.5.0
- Fix Version/s: 2.0.0
- Component/s: io-java-kafka
- Labels: None
KafkaIO does not allow overriding the serializer and deserializer settings of the Kafka consumers and producers it uses internally. Instead, it only allows setting a `Coder`, and provides a simple Kafka serializer/deserializer wrapper class that calls the coder.
I appreciate that allowing the use of Beam coders is good and consistent with the rest of the system. However, is there a reason to completely disallow using custom Kafka serializers instead?
This is a limitation when working with an Avro schema registry, for instance, which requires custom serializers. One can write a `Coder` that wraps a custom Kafka serializer, but that means two levels of unnecessary wrapping, as sketched below.
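For illustration, a rough sketch of what such a wrapper `Coder` could look like (class names are hypothetical, and `Coder` method signatures differ between Beam releases, so this is a sketch rather than code targeting a specific version). The Kafka serializer, deserializer, and topic all have to be threaded through the coder, even though KafkaIO then wraps the coder in its own Kafka serializer adapter again:

```java
import java.io.DataInputStream;
import java.io.DataOutputStream;
import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;

import org.apache.beam.sdk.coders.CustomCoder;
import org.apache.kafka.common.serialization.Deserializer;
import org.apache.kafka.common.serialization.Serializer;

/**
 * Workaround sketch: a Beam Coder that wraps a Kafka Serializer/Deserializer
 * (e.g. a schema-registry-aware one) so it can be handed to KafkaIO, which in
 * turn wraps it in its own coder-based Kafka serializer -- two layers of
 * wrapping for one concern.
 *
 * NOTE: Kafka (de)serializers are generally not Serializable, so a real
 * implementation would have to construct them lazily from configuration;
 * that is omitted here for brevity.
 */
public class KafkaSerializerCoder<T> extends CustomCoder<T> {

  private final Serializer<T> serializer;
  private final Deserializer<T> deserializer;
  // The topic must be duplicated here, in addition to the topic configured on KafkaIO.
  private final String topic;

  public KafkaSerializerCoder(
      Serializer<T> serializer, Deserializer<T> deserializer, String topic) {
    this.serializer = serializer;
    this.deserializer = deserializer;
    this.topic = topic;
  }

  @Override
  public void encode(T value, OutputStream outStream) throws IOException {
    byte[] bytes = serializer.serialize(topic, value);
    // Length-prefix the payload so decode() knows how many bytes to read.
    DataOutputStream out = new DataOutputStream(outStream);
    out.writeInt(bytes.length);
    out.write(bytes);
  }

  @Override
  public T decode(InputStream inStream) throws IOException {
    DataInputStream in = new DataInputStream(inStream);
    byte[] bytes = new byte[in.readInt()];
    in.readFully(bytes);
    return deserializer.deserialize(topic, bytes);
  }
}
```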
In addition, the `Coder` abstraction is not equivalent to Kafka's `Serializer`, which receives the topic name as input. Using a `Coder` wrapper would require duplicating the output topic setting, both in the arguments to `KafkaIO` and when building the wrapper, which is inelegant and error-prone.
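For reference, this is roughly the Kafka `Serializer` contract from the client versions these KafkaIO releases are built against; the topic is passed on every call, whereas a Beam `Coder` only ever sees an output stream:

```java
// Abridged from org.apache.kafka.common.serialization.Serializer (0.9/0.10-era clients).
public interface Serializer<T> extends java.io.Closeable {
  // Called once with the producer configuration, e.g. a schema registry URL.
  void configure(java.util.Map<String, ?> configs, boolean isKey);

  // The topic is supplied per record -- Coder.encode() has no counterpart for this.
  byte[] serialize(String topic, T data);

  void close();
}
```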