Details
- Type: Bug
- Status: Resolved
- Priority: Major
- Resolution: Duplicate
- Affects Version/s: 2.2.0
- Fix Version/s: None
- Component/s: None
Description
With dynamic allocation enabled, Spark jobs often get stuck because the Executor Allocation Manager (EAM) does not request any executors even though tasks are pending. Looking at the logic in the EAM that computes the number of running tasks, the calculation can go wrong and the running-task count can become negative.
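As an illustration of how such a counter can go negative, the following is a minimal Python sketch, not Spark's actual code: the class name, fields, and event methods are all hypothetical stand-ins. It models a stage-cleanup event resetting the running-task count while late task-end events for that stage are still in flight, which drives the count below zero and makes the derived executor target too small even though tasks are pending.

```python
import math

class AllocationTracker:
    """Hypothetical sketch of a running-task counter driven by
    out-of-order scheduler events (not Spark's real EAM)."""

    def __init__(self, tasks_per_executor=4):
        self.tasks_per_executor = tasks_per_executor
        self.num_running_tasks = 0
        self.num_pending_tasks = 0

    def on_task_start(self):
        self.num_running_tasks += 1

    def on_task_end(self):
        # No lower-bound guard: a task-end event that arrives after the
        # stage was cleaned up drives the counter negative.
        self.num_running_tasks -= 1

    def on_stage_completed(self):
        # Stage cleanup resets the running count to zero even if some
        # task-end events for the stage have not been delivered yet.
        self.num_running_tasks = 0

    def max_executors_needed(self):
        total = self.num_running_tasks + self.num_pending_tasks
        return math.ceil(total / self.tasks_per_executor)

t = AllocationTracker()
for _ in range(3):
    t.on_task_start()
t.on_stage_completed()   # cleanup event arrives first
for _ in range(3):
    t.on_task_end()      # late events: counter is now -3
t.num_pending_tasks = 2
print(t.num_running_tasks)       # -3
print(t.max_executors_needed())  # 0, although 2 tasks are pending
```

Because the negative running count cancels out the pending count, the computed executor target is zero and no executors are requested, matching the stuck-job symptom described above.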
Attachments
Issue Links
- duplicates: SPARK-11334 numRunningTasks can't be less than 0, or it will affect executor allocation (Resolved)