[SPARK-22900] Remove unnecessary restriction for streaming dynamic allocation


Details

    • Type: Improvement
    • Status: Resolved
    • Priority: Major
    • Resolution: Not A Problem
    • Affects Version/s: 2.3.0
    • Fix Version/s: None
    • Component/s: DStreams
    • Labels: None

    Description

      When I set the conf `spark.streaming.dynamicAllocation.enabled=true`, the conf `num-executors` cannot be set. As a result, the application allocates the default 2 executors, and all receivers run on those 2 executors, so there may be no spare CPU cores left for batch tasks and the job stays stuck indefinitely (see the sketch below).
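      A minimal sketch of that scenario, assuming the application is submitted with spark-submit on YARN (the master is supplied at submit time; hosts, ports, and the receiver count are illustrative):

```scala
import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Seconds, StreamingContext}

// Sketch only: streaming dynamic allocation is enabled and, because of the
// current restriction, no explicit executor count is requested, so the app
// starts with the default 2 executors.
val conf = new SparkConf()
  .setAppName("ReceiverStarvationSketch")
  .set("spark.streaming.dynamicAllocation.enabled", "true")

val ssc = new StreamingContext(conf, Seconds(10))

// Several receiver-based inputs (hosts/ports are illustrative). Each receiver
// pins one core for the lifetime of the application; with only the 2 default
// executors, the receivers can occupy every core and batch tasks never run.
val streams = (1 to 4).map(i => ssc.socketTextStream(s"receiver-host-$i", 9999))
streams.foreach(_.foreachRDD(rdd => println(rdd.count())))

ssc.start()
ssc.awaitTermination()
```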

      In my opinion, we should remove this unnecessary restriction for streaming dynamic allocation: it should be possible to set `num-executors` and `spark.streaming.dynamicAllocation.enabled=true` together, so that when the application starts, each receiver can run on its own executor. A sketch of the proposed configuration follows.
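      A minimal sketch of the combination this issue proposes to allow (the executor count of 4 is illustrative; on YARN, `--num-executors` maps to `spark.executor.instances`):

```scala
import org.apache.spark.SparkConf

// Proposed combination (currently rejected when streaming dynamic allocation
// is enabled): request an initial number of executors so each receiver can
// start on its own executor, while allocation is still adjusted at runtime.
val conf = new SparkConf()
  .setAppName("StreamingDynamicAllocationSketch")
  .set("spark.streaming.dynamicAllocation.enabled", "true")
  // Equivalent to passing --num-executors 4 to spark-submit on YARN.
  .set("spark.executor.instances", "4")
```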

      Attachments

        Activity

          People

            Assignee: Unassigned
            Reporter: sharkd tu
            Votes: 0
            Watchers: 2

            Dates

              Created:
              Updated:
              Resolved: