Spark / SPARK-6144

When in cluster mode, using ADD JAR with an hdfs:// sourced jar will fail


Details

    • Type: Bug
    • Status: Closed
    • Priority: Blocker
    • Resolution: Fixed
    • Affects Version/s: 1.3.0
    • Fix Version/s: 1.3.0
    • Component/s: Spark Core
    • Labels: None

    Description

  In cluster mode, using ADD JAR with an HDFS-sourced jar fails when the worker nodes try to fetch that jar, with the following error:

      15/03/03 04:56:50 ERROR executor.Executor: Exception in task 0.0 in stage 0.0 (TID 0)
      java.io.FileNotFoundException: /yarn/nm/usercache/vagrant/appcache/application_1425166832391_0027/-19222735701425358546704_cache (No such file or directory)
              at java.io.FileInputStream.open(Native Method)
              at java.io.FileInputStream.<init>(FileInputStream.java:146)
      

      PR https://github.com/apache/spark/pull/4880
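
  The failure described above can be reproduced from a Spark SQL session launched against YARN in cluster mode. The sketch below is illustrative only: the jar path and table name are hypothetical, not taken from the report.

  ```sql
  -- Hypothetical repro (jar path and table name are placeholders).
  -- Session launched via: spark-sql --master yarn (cluster mode).
  ADD JAR hdfs:///user/vagrant/libs/example-udfs.jar;

  -- Any subsequent query that ships the jar to the executors then fails
  -- on the worker nodes with the FileNotFoundException shown above,
  -- because the executor maps the hdfs:// URI to a local cache path
  -- that was never populated.
  SELECT count(*) FROM some_table;
  ```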


            People

              Assignee: Trystan Leftwich (tleftwich)
              Reporter: Trystan Leftwich (tleftwich)
              Votes: 0
              Watchers: 7
