Spark / SPARK-22561

Dynamically update topics list for spark kafka consumer


Details

    • Type: New Feature
    • Status: Resolved
    • Priority: Major
    • Resolution: Incomplete
    • Affects Version/s: 2.1.0, 2.1.1, 2.2.0
    • Fix Version/s: None
    • Component/s: DStreams

    Description

      The Spark Streaming application should allow adding new topics after the streaming context is initialized and the DStream is started. This is a very useful feature, especially when the business operates across multiple geographies or multiple business units.

      For example, initially I have a spark-kafka consumer listening on topics ["topic-1", "topic-2"], and after a couple of days I add new topics ["topic-3", "topic-4"] to Kafka. Is there a way to update the spark-kafka consumer's topic list and have it consume data for the updated topics without stopping the Spark Streaming application or the streaming context?
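      A commonly suggested workaround (a sketch under assumptions, not part of this issue's resolution): the Kafka 0.10 direct stream API offers ConsumerStrategies.SubscribePattern, which subscribes by regex rather than a fixed topic list, so topics created later that match the pattern are picked up once the consumer refreshes its metadata (the refresh interval is the standard Kafka consumer setting metadata.max.age.ms). The bootstrap server, group id, and topic pattern below are placeholders.

      ```scala
      import java.util.regex.Pattern

      import org.apache.kafka.common.serialization.StringDeserializer
      import org.apache.spark.SparkConf
      import org.apache.spark.streaming.{Seconds, StreamingContext}
      import org.apache.spark.streaming.kafka010.ConsumerStrategies.SubscribePattern
      import org.apache.spark.streaming.kafka010.LocationStrategies.PreferConsistent
      import org.apache.spark.streaming.kafka010.KafkaUtils

      object PatternSubscribeExample {
        def main(args: Array[String]): Unit = {
          val conf = new SparkConf().setAppName("kafka-pattern-demo")
          val ssc  = new StreamingContext(conf, Seconds(10))

          // Placeholder connection settings -- replace with real values.
          val kafkaParams = Map[String, Object](
            "bootstrap.servers" -> "localhost:9092",
            "key.deserializer"  -> classOf[StringDeserializer],
            "value.deserializer" -> classOf[StringDeserializer],
            "group.id"          -> "demo-group",
            "auto.offset.reset" -> "latest",
            // How often the consumer refreshes topic metadata, and hence
            // how quickly newly created matching topics are noticed.
            "metadata.max.age.ms" -> "30000"
          )

          // Subscribe by regex instead of a fixed list: topic-3, topic-4, ...
          // created later will match without restarting the context.
          val stream = KafkaUtils.createDirectStream[String, String](
            ssc,
            PreferConsistent,
            SubscribePattern[String, String](Pattern.compile("topic-.*"), kafkaParams)
          )

          stream.map(record => (record.topic, record.value)).print()

          ssc.start()
          ssc.awaitTermination()
        }
      }
      ```

      This covers the case where future topic names follow a known pattern; arbitrarily replacing the subscription with an unrelated topic list at runtime is what this issue actually requests and is not provided by this API.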


          People

            Assignee: Unassigned
            Reporter: Arun (asethia6025)
            Votes: 0
            Watchers: 3

            Dates

              Created:
              Updated:
              Resolved: