- Type: Bug
- Status: Resolved
- Priority: Major
- Resolution: Fixed
- Affects Version/s: 2.3.2, 2.4.0
- Component/s: Spark Core
- Labels: None
Steps to reproduce:
1) Launch a shell on YARN: bin/spark-shell --master yarn --conf spark.executor.instances=3
2) Run a job that fails in every task: sc.parallelize(1 to 10000, 10).map { x => throw new RuntimeException("Bad executor") }.collect()
3) Open the application from the History UI
4) Go to the Executors tab
From History UI: (see attached screenshot)
From Live UI: (see attached screenshot)
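The reproduction above can be sketched as a single shell session. This is a minimal sketch, not a verified script: it assumes a working Hadoop/YARN cluster, a Spark history server reading the same event-log directory, and that spark.eventLog.enabled is on so the application appears in the History UI at all.

```shell
# Enable event logging so the history server can replay this application
# (paths and the history server port 18080 are Spark defaults, adjust as needed).
bin/spark-shell --master yarn \
  --conf spark.executor.instances=3 \
  --conf spark.eventLog.enabled=true \
  --conf spark.eventLog.dir=hdfs:///spark-logs

# Inside the shell, run a job whose every task throws, so all executors
# accumulate task failures:
#   sc.parallelize(1 to 10000, 10).map { x =>
#     throw new RuntimeException("Bad executor")
#   }.collect()

# While the shell is still running, compare the Executors tab in the
# live UI (driver host, port 4040 by default) against the same tab for
# this application in the History UI (history server, port 18080).
```

The point of the comparison in steps 3) and 4) is that both UIs render the same executor metrics from the same events, so any divergence between the Live UI and the History UI Executors tab indicates a replay bug rather than a cluster problem.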