Description
See the attached screenshot.
Executor 1 ran 7 tasks, but on the Stages page the executor's total task count is shown as 6.
To reproduce, start a shell:
$SPARK_HOME/bin/spark-shell --executor-cores 1 --executor-memory 1g --total-executor-cores 2 --master spark://localhost.localdomain:7077
Run a job as follows:
sc.parallelize(1 to 10000, 3).map{ x => throw new RuntimeException("Bad executor")}.collect()
Go to the Stages page and you will see that Total Tasks is wrong in the
Aggregated Metrics by Executor
table.
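As a hypothetical sketch (not Spark's actual code) of how this kind of off-by-one can arise: if the per-executor total is derived by summing only some task-outcome categories while the executor actually ran an additional attempt in an omitted category, the displayed total undercounts. The `ExecutorSummary` class and its fields below are illustrative assumptions, not Spark internals.

```scala
// Hypothetical per-executor summary, loosely modeled on the
// "Aggregated Metrics by Executor" table. Field names are assumptions.
case class ExecutorSummary(succeeded: Int, failed: Int, killed: Int) {
  // Buggy total: omits killed/resubmitted attempts.
  def totalShownBuggy: Int = succeeded + failed
  // Correct total: every attempt the executor actually ran.
  def totalActual: Int = succeeded + failed + killed
}

// Executor 1 ran 7 attempts (6 failed + 1 killed), but the buggy sum shows 6.
val exec1 = ExecutorSummary(succeeded = 0, failed = 6, killed = 1)
println(s"shown=${exec1.totalShownBuggy} actual=${exec1.totalActual}")
```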