Details
- Type: Bug
- Status: Resolved
- Priority: Major
- Resolution: Fixed
- Affects Version/s: 2.4.0
- Fix Version/s: None
- Environment: Spark Version: 2.4
Description
Steps:
1. Launch spark-shell on YARN with dynamic allocation enabled, initialExecutors=3, minExecutors=0, and executorIdleTimeout=60s (a programmatic equivalent is sketched after these steps):
bin/spark-shell --master yarn --conf spark.dynamicAllocation.enabled=true \
--conf spark.dynamicAllocation.initialExecutors=3 \
--conf spark.dynamicAllocation.minExecutors=0 \
--conf spark.dynamicAllocation.executorIdleTimeout=60s
2. Open the Spark UI and check the Executors tab.
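For reference, a minimal sketch of the same settings applied programmatically (not part of the original report; the app name is an assumption, and a reachable YARN cluster is assumed):

// Hypothetical illustration only: the flags from step 1 expressed via SparkSession.builder.
import org.apache.spark.sql.SparkSession

val spark = SparkSession.builder()
  .master("yarn")
  .appName("idle-executor-timeout-repro")  // placeholder name, not from the report
  .config("spark.dynamicAllocation.enabled", "true")
  .config("spark.dynamicAllocation.initialExecutors", "3")
  .config("spark.dynamicAllocation.minExecutors", "0")
  .config("spark.dynamicAllocation.executorIdleTimeout", "60s")
  .getOrCreate()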
Observation:
3 executors are assigned initially. After 60s (the executorIdleTimeout), the number of active executors remains the same; the idle executors are not released.
Expected:
Apart from the AM container, all other executors should be removed once they have been idle for 60s.
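A minimal sketch (not from the original report) of one way to check the remaining executors from inside spark-shell after the idle timeout has elapsed, using SparkStatusTracker; the returned list typically also contains an entry for the driver, so a fully downscaled application would report a single entry:

// Run inside spark-shell after waiting past executorIdleTimeout.
// getExecutorInfos lists the driver plus all currently registered executors.
val executorInfos = sc.statusTracker.getExecutorInfos
println(s"Registered entries (driver + executors): ${executorInfos.length}")
// With the expected behavior, only the driver-side entry should remain.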
Issue Links
- is duplicated by: SPARK-26588 Idle executor should properly be killed when no job is submitted (Resolved)
- links to