KAFKA-8958

Fix Kafka Streams JavaDocs with regard to used Serdes


Details

    • Type: Improvement
    • Status: Open
    • Priority: Minor
    • Resolution: Unresolved
    • Affects Version/s: None
    • Fix Version/s: None
    • Component/s: streams

    Description

      In older releases, Kafka Streams applied operator-specific Serde overwrites as in-place overwrites. In newer releases, Kafka Streams tries to reuse Serdes more "aggressively" by pushing Serde information downstream if the key and/or value did not change.

      However, we never updated the JavaDocs accordingly. For example, the JavaDocs for `KStream#through(String topic)` say:

      Materialize this stream to a topic and creates a new {@code KStream} from the topic using default serializers, deserializers, and producer's {@link DefaultPartitioner}.
      

      The JavaDocs don't take into account that Serdes might have been set further upstream, in which case the defaults from the config would not be used.

      `KStream#through()` is just one example. We should address this throughout the JavaDocs of all operators (i.e., KStream, KGroupedStream, TimeWindowedKStream, SessionWindowedKStream, KTable, and KGroupedTable).
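
      As a minimal sketch of the behavior described above (topic names, the filter predicate, and the String/Long types are illustrative assumptions, not taken from this ticket): Serdes set via `Consumed.with()` are pushed downstream through operations that do not change the key or value type, so `through()` reuses them instead of the default Serdes from the config:

      ```java
      import org.apache.kafka.common.serialization.Serdes;
      import org.apache.kafka.streams.StreamsBuilder;
      import org.apache.kafka.streams.kstream.Consumed;
      import org.apache.kafka.streams.kstream.KStream;

      public class SerdePushDownExample {

          public static void main(final String[] args) {
              final StreamsBuilder builder = new StreamsBuilder();

              // Serdes are set explicitly when consuming the input topic.
              final KStream<String, Long> input = builder.stream(
                  "input-topic",
                  Consumed.with(Serdes.String(), Serdes.Long()));

              // filter() changes neither the key nor the value type, so the
              // String/Long Serdes from above are pushed downstream and reused
              // by through(), rather than the configured default Serdes as the
              // current JavaDocs suggest.
              final KStream<String, Long> repartitioned = input
                  .filter((key, value) -> value != null)
                  .through("intermediate-topic");

              repartitioned.to("output-topic");
          }
      }
      ```

      (`KStream#through()` was later deprecated in favor of `repartition()`, but the same consideration applies across the operators listed above.)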

      Attachments

        Activity

          People

            Assignee: Harsh Agrawal (harsh201)
            Reporter: Matthias J. Sax (mjsax)
            Votes: 0
            Watchers: 5

            Dates

              Created:
              Updated: