Details
- Type: Bug
- Status: Closed
- Priority: Blocker
- Resolution: Fixed
- Fix Version: 1.3.0
- None
Description
In cluster mode, if you use ADD JAR with an HDFS-sourced jar, it fails when trying to source that jar on the worker nodes, with the following error:
15/03/03 04:56:50 ERROR executor.Executor: Exception in task 0.0 in stage 0.0 (TID 0)
java.io.FileNotFoundException: /yarn/nm/usercache/vagrant/appcache/application_1425166832391_0027/-19222735701425358546704_cache (No such file or directory)
	at java.io.FileInputStream.open(Native Method)
	at java.io.FileInputStream.<init>(FileInputStream.java:146)
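For reference, a minimal way to hit this path is to register a jar stored on HDFS and then run anything that forces the executors to fetch it. The HDFS path and jar name below are hypothetical; any HDFS-hosted jar should behave the same way in yarn-cluster mode:

```sql
-- Hypothetical jar location on HDFS; substitute any jar reachable by the cluster
ADD JAR hdfs:///user/vagrant/libs/example-udf.jar;

-- Any subsequent query that needs the jar will fail on the workers
-- with the FileNotFoundException shown above
SELECT * FROM some_table LIMIT 1;
```

The driver accepts the ADD JAR command; the failure only surfaces once executors try to localize the jar from their cache directory.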
Attachments
Issue Links
- is broken by
  - SPARK-4687 SparkContext#addFile doesn't keep file folder information (Resolved)
- links to