SPARK-26758

Idle executors are not killed after spark.dynamicAllocation.executorIdleTimeout elapses


Details

    • Type: Bug
    • Status: Resolved
    • Priority: Major
    • Resolution: Fixed
    • Affects Version/s: 2.4.0
    • Fix Version/s: 2.3.4, 2.4.1, 3.0.0
    • Component/s: Spark Core, YARN
    • Labels: None
    • Environment: Spark Version: 2.4

    Description

      Steps:

      1. Submit spark-shell with dynamic allocation enabled and the configuration below: initialExecutors=3, minExecutors=0, and executorIdleTimeout=60s

      bin/spark-shell --master yarn --conf spark.dynamicAllocation.enabled=true \
        --conf spark.dynamicAllocation.initialExecutors=3 \
        --conf spark.dynamicAllocation.minExecutors=0 \
        --conf spark.dynamicAllocation.executorIdleTimeout=60s
      

      2. Open the Spark UI and check the Executors tab (or query the executor list from the shell, as shown below).
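
      The executor list can also be inspected from the spark-shell session itself, without the UI. A minimal check, assuming the default SparkContext `sc` that spark-shell provides:

      // Paste into the spark-shell started in step 1. Lists the executors currently
      // registered with the driver; once the idle executors are removed, this should
      // shrink to just the driver entry.
      sc.statusTracker.getExecutorInfos.foreach { e =>
        println(s"host=${e.host()} runningTasks=${e.numRunningTasks()}")
      }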

      Observation:

      The initial 3 executors are assigned. After 60s (executorIdleTimeout), the number of active executors remains the same.

      Expected:

      Apart from the AM container, all other executors should have been removed.
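
      For clarity, the sketch below illustrates the expected idle-timeout behaviour; it uses hypothetical names and is not the actual ExecutorAllocationManager code, assuming each executor is tracked with the time it last finished a task:

      // Simplified model of the idle-timeout policy described above (illustration only).
      case class TrackedExecutor(id: String, lastTaskFinishedMs: Long)

      def executorsToRemove(
          executors: Seq[TrackedExecutor],
          nowMs: Long,
          idleTimeoutMs: Long,   // spark.dynamicAllocation.executorIdleTimeout
          minExecutors: Int      // spark.dynamicAllocation.minExecutors
      ): Seq[String] = {
        val idle = executors.filter(e => nowMs - e.lastTaskFinishedMs >= idleTimeoutMs)
        // Remove idle executors but never shrink below minExecutors. With
        // minExecutors=0 and no running tasks, all three executors become
        // eligible after 60s, so only the AM container should survive.
        idle.take(math.max(0, executors.size - minExecutors)).map(_.id)
      }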


      Attachments

        1. SPARK-26758.png (53 kB, attached by ABHISHEK KUMAR GUPTA)


            People

              Assignee: Sandeep Katta (sandeep.katta2007)
              Reporter: ABHISHEK KUMAR GUPTA (abhishek.akg)
              Votes: 0
              Watchers: 2
