Kafka / KAFKA-7971

Producer in Streams environment


    Details

    • Type: Improvement
    • Status: Open
    • Priority: Minor
    • Resolution: Unresolved
    • Affects Version/s: None
    • Fix Version/s: None
    • Component/s: streams
    • Labels:

      Description

      It would be nice to have producers that can emit messages to a topic just like any producer, but that also have access to local state stores from the Streams environment (e.g. when running in Spring).

      Consider this case: I have an event-sourced ordering process like this:
      [EVENTS QUEUE] -> [MERGING PROCESS] -> [ORDERS CHANGELOG/KTABLE]

      The merging process uses a local store, "opened orders", to apply new changes easily.

      Now I want to implement a process that closes abandoned orders (orders that were started but then saw no change for too long and hang in their initial status). The easiest way is to periodically scan the "opened orders" store and produce an "abandon event" for every order that meets the criteria.

      The only way to do that now is to create a Transformer with a punctuator and connect its output to [EVENTS QUEUE]. That much is obvious, but a Transformer must also be connected to some input stream, and those input events must be dropped because we only want the punctuator results. This causes unnecessary overhead in processing input messages (even though they are just dropped), and it is not very elegant.
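The periodic scan that would live inside the Transformer's punctuator can be sketched as below. This is a minimal plain-Java sketch, not Kafka Streams code: a Map stands in for the "opened orders" KeyValueStore, and the returned List stands in for what context.forward() would send toward [EVENTS QUEUE]. The class name, the "BEGINNING" status, and the one-hour threshold are all hypothetical.

```java
import java.time.Duration;
import java.time.Instant;
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

// Models one punctuator tick: scan the "opened orders" store and emit an
// "abandon event" for every order still in its initial status that has
// seen no change for longer than the configured threshold.
class AbandonedOrderScanner {

    // Hypothetical per-order record: current status plus last-modified time.
    record OrderState(String status, Instant lastChange) {}

    private final Map<String, OrderState> openedOrders; // stand-in for the local store
    private final Duration abandonAfter;                // inactivity threshold

    AbandonedOrderScanner(Map<String, OrderState> openedOrders, Duration abandonAfter) {
        this.openedOrders = openedOrders;
        this.abandonAfter = abandonAfter;
    }

    // Returns the abandon events that would be forwarded to [EVENTS QUEUE].
    List<String> scan(Instant now) {
        List<String> abandonEvents = new ArrayList<>();
        for (Map.Entry<String, OrderState> e : openedOrders.entrySet()) {
            OrderState order = e.getValue();
            boolean stillBeginning = "BEGINNING".equals(order.status());
            boolean idleTooLong =
                Duration.between(order.lastChange(), now).compareTo(abandonAfter) > 0;
            if (stillBeginning && idleTooLong) {
                abandonEvents.add("ABANDON:" + e.getKey());
            }
        }
        return abandonEvents;
    }

    public static void main(String[] args) {
        Map<String, OrderState> store = new HashMap<>();
        Instant now = Instant.parse("2019-02-20T12:00:00Z");
        store.put("order-1", new OrderState("BEGINNING", now.minus(Duration.ofHours(3))));
        store.put("order-2", new OrderState("IN_PROGRESS", now.minus(Duration.ofHours(3))));
        store.put("order-3", new OrderState("BEGINNING", now.minus(Duration.ofMinutes(10))));

        // Only order-1 is both in its initial status and idle past the threshold.
        System.out.println(new AbandonedOrderScanner(store, Duration.ofHours(1)).scan(now));
    }
}
```

In a real topology this scan would be registered from Transformer.init() via ProcessorContext.schedule(Duration, PunctuationType.WALL_CLOCK_TIME, punctuator), with results sent downstream via context.forward() — which is exactly the workaround the issue describes, dummy input stream and all.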


            People

            • Assignee: Unassigned
            • Reporter: redguy666 Maciej Lizewski
            • Votes: 0
            • Watchers: 4
