Description
Exceptions thrown in PySpark applications after the SparkContext has been instantiated do not cause the application to fail, at least on YARN: the application is marked as SUCCEEDED.
Note that an exception raised before the SparkContext is created correctly places the application in the FAILED state.
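For context, an uncaught exception does make the Python driver process exit with a nonzero code; the reported problem is that YARN's application status did not reflect that exit code for PySpark apps. A minimal sketch of the exit-code behavior itself (pure Python, no Spark required; the child script stands in for a failing driver):

```python
import subprocess
import sys

# Simulate a PySpark driver that raises after startup: an uncaught
# exception causes the interpreter to exit with a nonzero status.
proc = subprocess.run(
    [sys.executable, "-c", "raise RuntimeError('boom')"],
    capture_output=True,
    text=True,
)

# The nonzero exit code is the signal YARN should use to mark the
# application FAILED; per this issue, it was reported as SUCCEEDED instead.
print(proc.returncode)
```

A reproduction on a real cluster would be to submit a script that raises immediately after `SparkContext` creation (e.g. via `spark-submit --master yarn --deploy-mode cluster`) and then check the final status in the YARN ResourceManager UI.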
Issue Links
- is duplicated by:
  - SPARK-9416: Yarn logs say that Spark Python job has succeeded even though job has failed in Yarn cluster mode (Resolved)
  - SPARK-8612: Yarn application status is misreported for failed PySpark apps. (Resolved)