BEAM-1573: KafkaIO does not allow using Kafka serializers and deserializers

    Details

    • Type: Improvement
    • Status: Closed
    • Priority: Minor
    • Resolution: Fixed
    • Affects Version/s: 0.4.0, 0.5.0
    • Fix Version/s: 2.0.0
    • Component/s: io-java-kafka
    • Labels: None

      Description

      KafkaIO does not allow overriding the serializer and deserializer settings of the Kafka consumers and producers it uses internally. Instead, it allows setting a `Coder`, and it has a simple Kafka serializer/deserializer wrapper class that calls the coder.
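
      For illustration, the wrapper described above is roughly the following (a minimal, hypothetical `CoderBackedDeserializer`, assuming the Beam 2.x `Coder` API): a Kafka `Deserializer` that simply delegates to a Beam `Coder` and has no use for the topic name it is handed.

```java
import java.io.ByteArrayInputStream;
import java.io.IOException;
import java.util.Map;

import org.apache.beam.sdk.coders.Coder;
import org.apache.kafka.common.serialization.Deserializer;

// Hypothetical sketch of the kind of wrapper KafkaIO uses internally:
// a Kafka Deserializer that delegates every call to a Beam Coder.
public class CoderBackedDeserializer<T> implements Deserializer<T> {

  private final Coder<T> coder;

  public CoderBackedDeserializer(Coder<T> coder) {
    this.coder = coder;
  }

  @Override
  public void configure(Map<String, ?> configs, boolean isKey) {
    // Nothing to configure; all decoding logic lives in the Coder.
  }

  @Override
  public T deserialize(String topic, byte[] data) {
    // The topic name is ignored: a Coder has no notion of it.
    try {
      return coder.decode(new ByteArrayInputStream(data));
    } catch (IOException e) {
      throw new RuntimeException("Failed to decode record", e);
    }
  }

  @Override
  public void close() {}
}
```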

      I appreciate that supporting Beam coders is good and consistent with the rest of the system. However, is there a reason to completely disallow custom Kafka serializers as an alternative?

      This is a limitation when working with an Avro schema registry, for instance, which requires custom serializers. One can write a `Coder` that wraps a custom Kafka serializer, but that means two levels of unnecessary wrapping.
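
      For context, with plain Kafka clients a schema registry is wired in entirely through custom (de)serializer classes. The sketch below is only illustrative: it assumes Confluent's `KafkaAvroDeserializer`, a hypothetical registry URL, and a hypothetical `events` topic.

```java
import java.util.Collections;
import java.util.Properties;

import org.apache.kafka.clients.consumer.ConsumerConfig;
import org.apache.kafka.clients.consumer.KafkaConsumer;

public class SchemaRegistryConsumerExample {
  public static void main(String[] args) {
    // With plain Kafka clients, the schema-registry integration lives
    // entirely in the custom value deserializer class.
    Properties props = new Properties();
    props.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, "broker:9092");
    props.put(ConsumerConfig.GROUP_ID_CONFIG, "example-group");
    props.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG,
        "org.apache.kafka.common.serialization.StringDeserializer");
    props.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG,
        "io.confluent.kafka.serializers.KafkaAvroDeserializer");
    props.put("schema.registry.url", "http://schema-registry:8081");

    try (KafkaConsumer<String, Object> consumer = new KafkaConsumer<>(props)) {
      consumer.subscribe(Collections.singletonList("events"));
      // Poll loop omitted; the point is only how the deserializer is configured.
    }
  }
}
```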

      In addition, the `Coder` abstraction is not equivalent to Kafka's `Serializer`, which receives the topic name as an input. Using a `Coder` wrapper would therefore require duplicating the output topic setting, once in the argument to `KafkaIO` and once when building the wrapper, which is inelegant and error-prone.
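
      To make the workaround concrete, below is a minimal, hypothetical sketch of such a wrapping `Coder` (write path only, Beam 2.x `Coder` API assumed). It delegates to a Kafka `Serializer`, for example Confluent's `KafkaAvroSerializer`, and the topic has to be baked into the coder because `Serializer.serialize()` needs it while `Coder.encode()` has no way to receive it.

```java
import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;
import java.util.Map;

import org.apache.beam.sdk.coders.CoderException;
import org.apache.beam.sdk.coders.CustomCoder;
import org.apache.kafka.common.serialization.Serializer;

// Hypothetical workaround: a Beam Coder that delegates to a Kafka Serializer.
public class KafkaSerializerCoder<T> extends CustomCoder<T> {

  private final Class<? extends Serializer<T>> serializerClass;
  private final Map<String, String> config; // e.g. schema.registry.url
  private final String topic;               // must match the topic given to KafkaIO separately

  // Kafka serializers are generally not Serializable, so the instance is
  // created lazily on the worker rather than stored in the (serialized) Coder.
  private transient Serializer<T> serializer;

  public KafkaSerializerCoder(
      Class<? extends Serializer<T>> serializerClass, Map<String, String> config, String topic) {
    this.serializerClass = serializerClass;
    this.config = config;
    this.topic = topic;
  }

  private Serializer<T> serializer() {
    if (serializer == null) {
      try {
        serializer = serializerClass.getDeclaredConstructor().newInstance();
      } catch (ReflectiveOperationException e) {
        throw new RuntimeException("Could not instantiate " + serializerClass, e);
      }
      serializer.configure(config, /* isKey= */ false);
    }
    return serializer;
  }

  @Override
  public void encode(T value, OutputStream outStream) throws IOException {
    // First level of wrapping: this Coder calls the Kafka Serializer.
    // KafkaIO then wraps the Coder in its own Kafka serializer: the second level.
    outStream.write(serializer().serialize(topic, value));
  }

  @Override
  public T decode(InputStream inStream) throws CoderException {
    throw new CoderException("Write-only sketch: only encode() is exercised on the write path");
  }
}
```

      The same topic string thus has to appear twice, once when constructing this coder and once when configuring the `KafkaIO` write transform, which is exactly the duplication described above.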

              People

              • Assignee: Raghu Angadi (rangadi)
              • Reporter: peay
              • Votes: 0
              • Watchers: 6
