Spark / SPARK-10582

Using dynamic executor allocation, if the AM fails a new AM is started, but the new AM does not allocate executors to the driver

    Details

    • Type: Bug
    • Status: Resolved
    • Priority: Major
    • Resolution: Fixed
    • Affects Version/s: 1.4.1, 1.5.1
    • Fix Version/s: 2.0.0
    • Component/s: Spark Core
    • Labels: None

      Description

      While tasks are running, the total number of executors can reach the value of spark.dynamicAllocation.maxExecutors. If the AM then fails, a new AM is started. Because the total number of executors tracked in ExecutorAllocationManager has not changed, the driver never sends a RequestExecutors message to the new AM to ask for executors. The new AM therefore falls back to spark.dynamicAllocation.initialExecutors, so the driver and the AM disagree on the total number of executors.
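      The sketch below is not the actual patch that went into 2.0.0; it is a minimal, self-contained illustration of the failure mode and of one way to resynchronize: have the driver re-announce its current target when a new AM registers. All class and method names here (ApplicationMaster, AllocationManager, updateTarget, onAmRestarted) are hypothetical stand-ins, not Spark APIs.

      {code:scala}
      object DynamicAllocationSketch {

        // Stand-in for the AM side: tracks the last target it was told about.
        class ApplicationMaster(initialExecutors: Int) {
          var target: Int = initialExecutors
          def requestTotalExecutors(n: Int): Unit = {
            target = n
            println(s"AM: target set to $n")
          }
        }

        // Stand-in for ExecutorAllocationManager on the driver side.
        class AllocationManager(initialExecutors: Int, var am: ApplicationMaster) {
          private var numExecutorsTarget: Int = initialExecutors

          // Only talks to the AM when the desired target actually changes.
          def updateTarget(desired: Int): Unit = {
            if (desired != numExecutorsTarget) {
              numExecutorsTarget = desired
              am.requestTotalExecutors(desired)
            }
            // else: nothing is sent; this is where the driver and AM diverge
          }

          // Hypothetical resync hook: after an AM restart, push the current
          // driver-side target to the new AM even though it did not change.
          def onAmRestarted(newAm: ApplicationMaster): Unit = {
            am = newAm
            am.requestTotalExecutors(numExecutorsTarget)
          }

          def currentTarget: Int = numExecutorsTarget
        }

        def main(args: Array[String]): Unit = {
          val initial = 2
          val max = 10

          val am1 = new ApplicationMaster(initial)
          val mgr = new AllocationManager(initial, am1)

          mgr.updateTarget(max)                    // load ramps up to maxExecutors
          assert(am1.target == mgr.currentTarget)  // driver and AM agree on 10

          // AM fails; the replacement starts from initialExecutors again.
          val am2 = new ApplicationMaster(initial)

          // Without a resync, the driver's target is unchanged, so updateTarget
          // would send nothing and the two sides would disagree (10 vs 2).
          // The hypothetical hook re-announces the driver's target instead:
          mgr.onAmRestarted(am2)
          assert(am2.target == mgr.currentTarget)
          println(s"driver and new AM agree on ${mgr.currentTarget} executors")
        }
      }
      {code}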


              People

              • Assignee: Saisai Shao (jerryshao)
              • Reporter: KaiXinXIaoLei
              • Votes: 0
              • Watchers: 4
