Details
- Type: Bug
- Status: Resolved
- Priority: Major
- Resolution: Incomplete
- Affects Version/s: 2.4.0
- Fix Version/s: None
- Environment: Spark 2.4
Description
Steps:
- Launch the Spark shell:
- bin/spark-shell --master yarn --conf spark.dynamicAllocation.enabled=true --conf spark.dynamicAllocation.initialExecutors=3 --conf spark.dynamicAllocation.minExecutors=1 --conf spark.dynamicAllocation.executorIdleTimeout=60s --conf spark.dynamicAllocation.maxExecutors=5
- Submit a job: sc.parallelize(1 to 10000, 116000).count()
- Check the Executors tab in the YARN UI for the RUNNING application
- The UI shows "Number of cores" as 4, while the "Active Tasks" column shows 5
Expected:
The number of active tasks should be the same as the number of cores.
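The expected invariant can be sketched as simple arithmetic, assuming the default spark.task.cpus of 1 and the 4 cores per executor shown in the report (the helper name and class below are hypothetical, for illustration only):

```java
// Sketch of the scheduling bound the report expects: an executor should
// never show more concurrent (active) tasks than its cores allow.
public class ActiveTaskBound {
    // Assumed formula: max concurrent tasks = executor cores / task CPUs.
    static int maxActiveTasks(int coresPerExecutor, int taskCpus) {
        return coresPerExecutor / taskCpus;
    }

    public static void main(String[] args) {
        // Values from the report: 4 cores shown, default spark.task.cpus = 1.
        System.out.println(maxActiveTasks(4, 1)); // prints 4
    }
}
```

With this bound, the Executors tab should show at most 4 active tasks for that executor, whereas the report observed 5.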