  Spark / SPARK-10582

Using dynamic executor allocation, if the AM fails, a new AM is started, but the new AM does not allocate executors to the driver


Details

    • Type: Bug
    • Status: Resolved
    • Priority: Major
    • Resolution: Fixed
    • Affects Version/s: 1.4.1, 1.5.1
    • Fix Version/s: 2.0.0
    • Component/s: Spark Core
    • Labels: None

    Description

      While tasks are running, suppose the total number of executors has reached spark.dynamicAllocation.maxExecutors and the AM fails, so a new AM is started. Because the total number of executors tracked in ExecutorAllocationManager has not changed, the driver never sends a RequestExecutors message to the new AM, whose executor target has been reset to spark.dynamicAllocation.initialExecutors. As a result, the driver and the AM disagree on the total number of executors.
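      The desync can be illustrated with a small model. This is a hypothetical Python sketch, not Spark's actual classes: `Driver`, `Am`, and `request_total` are invented names standing in for ExecutorAllocationManager's target bookkeeping and the RequestExecutors RPC. It shows why a change-only notification protocol goes stale when the receiver restarts with different initial state, and sketches the fix of re-sending the current target after the new AM registers.

```python
# Hypothetical model of the bug (invented names, not Spark's real classes).
# The driver only messages the AM when its own target CHANGES, so a freshly
# restarted AM, which starts back at initialExecutors, is never updated.

class Am:
    def __init__(self, initial_executors):
        self.target = initial_executors  # the AM's view of the executor total

class Driver:
    def __init__(self, initial_executors, max_executors):
        self.initial = initial_executors
        self.max = max_executors
        self.target = initial_executors  # the driver's view of the total
        self.am = Am(initial_executors)

    def request_total(self, n):
        n = min(n, self.max)
        if n != self.target:        # no-op when unchanged: the bug trigger
            self.target = n
            self.am.target = n      # stands in for the RequestExecutors RPC

    def am_restarts(self, resync=False):
        self.am = Am(self.initial)  # new AM resets to initialExecutors
        if resync:                  # sketch of the fix: re-send current target
            self.am.target = self.target

driver = Driver(initial_executors=2, max_executors=10)
driver.request_total(10)            # ramp up to maxExecutors
driver.am_restarts()
print(driver.target, driver.am.target)  # 10 2 -> driver and AM disagree
driver.am_restarts(resync=True)
print(driver.target, driver.am.target)  # 10 10 -> re-sync keeps them aligned
```

      The fix adopted for 2.0.0 follows the same idea: after the new AM re-registers, the driver's current executor target is propagated to it rather than assuming the AM already knows it.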


            People

              Assignee: jerryshao Saisai Shao
              Reporter: KaiXinXIaoLei KaiXinXIaoLei
              Votes: 0
              Watchers: 4

              Dates

                Created:
                Updated:
                Resolved: