  Spark / SPARK-12133

Support dynamic allocation in Spark Streaming

    Details

    • Type: Bug
    • Status: Resolved
    • Priority: Major
    • Resolution: Fixed
    • Affects Version/s: None
    • Fix Version/s: 2.0.0
    • Component/s: DStreams, Spark Core
    • Labels: None
    • Target Version/s:

      Description

      Dynamic allocation is a feature that allows your cluster resources to scale up and down based on the workload. Currently it doesn't work well with Spark Streaming for several reasons:

      (1) Your executors may never be idle, since they run something every N seconds
      (2) At least one receiver must always be running
      (3) The existing heuristics don't take into account the length of the batch queue
      ...

      The goal of this JIRA is to provide better support for using dynamic allocation in Spark Streaming. A design doc will be posted shortly.
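      As a point of reference (not part of this ticket), the sketch below shows how a streaming application would enable today's core dynamic allocation, using only the existing spark.dynamicAllocation.* and spark.shuffle.service.enabled settings; the executor counts, hostname, and port are illustrative placeholders. It is exactly this core mechanism, driven by pending tasks, that runs into the issues listed above.

      {code:scala}
      import org.apache.spark.SparkConf
      import org.apache.spark.streaming.{Seconds, StreamingContext}

      object StreamingWithCoreDynamicAllocation {
        def main(args: Array[String]): Unit = {
          val conf = new SparkConf()
            .setAppName("streaming-dynamic-allocation-sketch")
            // Core (batch-oriented) dynamic allocation: executors are added or
            // removed based on pending tasks and idle time, which does not
            // account for receivers or the length of the batch queue.
            .set("spark.dynamicAllocation.enabled", "true")
            .set("spark.shuffle.service.enabled", "true")
            .set("spark.dynamicAllocation.minExecutors", "1")   // placeholder value
            .set("spark.dynamicAllocation.maxExecutors", "10")  // placeholder value

          // 5-second batches; host and port are placeholders for illustration.
          val ssc = new StreamingContext(conf, Seconds(5))
          val lines = ssc.socketTextStream("localhost", 9999)
          lines.count().print()

          ssc.start()
          ssc.awaitTermination()
        }
      }
      {code}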

      People

      • Assignee: Tathagata Das (tdas)
      • Reporter: Andrew Or (andrewor14)
      • Votes: 0
      • Watchers: 14

      Dates

      • Created:
      • Updated:
      • Resolved: