Details
- Type: Bug
- Status: Resolved
- Priority: Major
- Resolution: Fixed
- Fix Version/s: 2.3.2, 2.4.0
- Labels: None
Description
Test steps to reproduce:

bin/spark-shell --master yarn --conf spark.executor.instances=3

sc.parallelize(1 to 10000, 10).map { x => throw new RuntimeException("Bad executor") }.collect()

Every element of the RDD throws a RuntimeException, so all tasks fail and the collect job aborts.
1) Open the application from the History UI (the application must have been run with event logging enabled; see the note after these steps).
2) Go to the Executors tab.
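The report does not show the event-log configuration, but the application is only visible to the History Server when event logging was enabled for the run. A possible invocation, with the log directory as a placeholder for this environment, would be:

    bin/spark-shell --master yarn \
      --conf spark.executor.instances=3 \
      --conf spark.eventLog.enabled=true \
      --conf spark.eventLog.dir=hdfs:///spark-history

(The History Server must point its spark.history.fs.logDirectory at the same location.)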
From History UI: (attachment not reproduced here)
From Live UI: (attachment not reproduced here)
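The attached views are not reproduced in this text. The same executor summary data that backs the Executors tab can also be fetched through Spark's REST API, which gives a scriptable way to compare the two views. The sketch below is only an illustration: the host names are placeholders, ports 4040 (live UI) and 18080 (History Server) are the Spark defaults, and executorSummary is a helper defined here, not a Spark API.

    import scala.io.Source

    // Fetch the ExecutorSummary JSON that backs the Executors tab for one application.
    // baseUrl is either the live UI (driver host, port 4040 by default) or the
    // History Server (port 18080 by default); both hosts are assumptions here.
    def executorSummary(baseUrl: String, appId: String): String =
      Source.fromURL(s"$baseUrl/api/v1/applications/$appId/executors").mkString

    // Usage from the same spark-shell session (host names are placeholders):
    // val appId = sc.applicationId
    // println(executorSummary("http://driver-host:4040", appId))     // live UI
    // println(executorSummary("http://history-host:18080", appId))   // History Server, after the app finishes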