Details
- Type: Bug
- Status: Resolved
- Priority: Major
- Resolution: Fixed
- Fix Version/s: 1.2.0
- Labels: None
- Affects Version/s: 1.2.0-rc1
Description
When running spark-submit with a job that fails immediately (say, due to initialization errors in the job code), there is no error output from spark-submit on the console.
When I ran spark-class directly, I did see the error/stack trace on the console.
I believe the issue is that SparkSubmitDriverBootstrapper (I had spark.driver.memory set in spark-defaults.conf) does not wait for its RedirectThreads to flush/complete before exiting.
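For context, this appears to be what routes spark-submit through SparkSubmitDriverBootstrapper in the first place: a driver-side memory setting in spark-defaults.conf. A minimal example (the 2g value is illustrative, not from the report):

```
# conf/spark-defaults.conf
spark.driver.memory  2g
```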
For example, around line 165 or so, stdoutThread.join() and stderrThread.join() calls are necessary to make sure the redirect threads have had a chance to flush process.getInputStream/getErrorStream to System.out/err before the process exits.
I've been tripped up by this in similar RedirectThread/process code before, which is why I suspect this is the problem.
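A minimal, self-contained sketch of the race and the proposed fix, in plain Java rather than Spark's Scala (the class name, the redirect helper, and the fast-failing child command are all hypothetical stand-ins): without the two join() calls before exiting, the JVM can terminate while the redirect threads are still draining the child's streams, silently dropping the stack trace.

```java
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;

public class RedirectJoinDemo {

    // Minimal analogue of a RedirectThread: copies an InputStream to an
    // OutputStream on a background daemon thread until EOF.
    static Thread redirect(InputStream in, OutputStream out) {
        Thread t = new Thread(() -> {
            try {
                byte[] buf = new byte[1024];
                int n;
                while ((n = in.read(buf)) != -1) {
                    out.write(buf, 0, n);
                }
                out.flush();
            } catch (IOException ignored) {
                // The stream closes when the child process exits.
            }
        });
        t.setDaemon(true);
        t.start();
        return t;
    }

    // Launches a child that fails immediately and returns whatever its
    // stderr produced. The shell command is illustrative only.
    static String runAndCaptureStderr() throws Exception {
        ByteArrayOutputStream err = new ByteArrayOutputStream();
        Process p = new ProcessBuilder("sh", "-c", "echo boom >&2; exit 1").start();
        Thread stdoutThread = redirect(p.getInputStream(), System.out);
        Thread stderrThread = redirect(p.getErrorStream(), err);
        p.waitFor();
        // The proposed fix: join the redirect threads so they finish
        // draining the child's streams before this process exits.
        stdoutThread.join();
        stderrThread.join();
        return err.toString("UTF-8");
    }

    public static void main(String[] args) throws Exception {
        System.out.print(runAndCaptureStderr());
    }
}
```

Because the redirect threads are daemons, nothing else keeps the JVM alive on their behalf; the explicit joins are the only thing guaranteeing the child's error output reaches the console.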