Details
Type: Bug
Status: Resolved
Priority: Major
Resolution: Fixed
Fix Version: 2.2.0
Description
Currently in Spark there are two issues when adding jars with an invalid path:
- If the jar path is an empty string ({--jar ",dummy.jar"}), Spark resolves it to the current directory and adds that to the classpath / file server, which is unwanted.
- If the jar path is invalid (the file doesn't exist), the file server doesn't check for this and still adds it; the exception is only thrown once the job is running. A local path could be checked immediately, with no need to wait until the task runs. addFile already performs a similar check, but addJar lacks one.
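The early validation proposed above could look roughly like the following sketch. This is not Spark's actual code; the class and method names (`JarPathCheck`, `validateJarPath`) are hypothetical, and the scheme check is a simplification of how Spark distinguishes local paths from remote URIs.

```java
import java.io.File;

public class JarPathCheck {
    // Hypothetical fail-fast check mirroring the two cases in the report:
    // returns an error message for a bad path, or null if the path is acceptable.
    static String validateJarPath(String path) {
        // Case 1: empty string, e.g. a stray comma in --jar ",dummy.jar".
        // Without this check it would resolve to the current directory.
        if (path == null || path.trim().isEmpty()) {
            return "jar path is an empty string";
        }
        // Case 2: a plain local path (no URI scheme) pointing to a missing file.
        // Checking here avoids deferring the failure until the job runs.
        if (!path.contains(":") && !new File(path).exists()) {
            return "jar file not found: " + path;
        }
        return null; // acceptable; proceed to add it to the file server
    }

    public static void main(String[] args) {
        System.out.println(validateJarPath(""));
        System.out.println(validateJarPath("/no/such/dummy.jar"));
    }
}
```

The same kind of check already guards addFile, so the fix is essentially to apply it symmetrically in addJar before the path reaches the file server.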