Spark / SPARK-34225

Jar or file paths that contain spaces generate a FileNotFoundException


    Description

   Whenever I try to use jars or files whose paths contain spaces, a FileNotFoundException is thrown.

  Just run a spark-shell command with spaces in the paths.

  See below:

  c:\>spark-shell --files "c:\Program Files\...\myjar.jar"
  or
  c:\>spark-shell --jars "c:\Program Files\...\myjar.jar"
  or
  c:\>spark-shell --conf spark.jars="c:\Program Files\...\myjar.jar"
      
  Any combination produces the same exception:
      
       java.io.FileNotFoundException: Jar c:\Program%20Files\........ not found
              at org.apache.spark.SparkContext.addLocalJarFile$1(SparkContext.scala:1833)
              at org.apache.spark.SparkContext.addJar(SparkContext.scala:1887)
              at org.apache.spark.SparkContext.$anonfun$new$11(SparkContext.scala:490)
              at org.apache.spark.SparkContext.$anonfun$new$11$adapted(SparkContext.scala:490)
              at scala.collection.mutable.ResizableArray.foreach(ResizableArray.scala:62)
              at scala.collection.mutable.ResizableArray.foreach$(ResizableArray.scala:55)
              at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:49)
              at org.apache.spark.SparkContext.<init>(SparkContext.scala:490)
              at org.apache.spark.SparkContext$.getOrCreate(SparkContext.scala:2574)
              at org.apache.spark.sql.SparkSession$Builder.$anonfun$getOrCreate$2(SparkSession.scala:934)
              at scala.Option.getOrElse(Option.scala:189)
              at org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:928)
              at org.apache.spark.repl.Main$.createSparkSession(Main.scala:106)
      

  I have noticed that the following function in SparkContext is causing the issue.

  def addJar(path: String): Unit = {
    ...
    } else {
      val uri = new Path(path).toUri
      ...

  The path as a string is

  "file:///C:/Program%20Files/Nokia/....jar"

  and this call generates the following URI:

  "file:///C:/Program%2520Files/Nokia/....jar"

  The already percent-encoded "%20" is encoded a second time (its "%" becomes "%25"), which results in an invalid file name.

      Using 

      val uri = Utils.resolveURI(path)

      seems to resolve the issue.
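The double encoding can be reproduced outside of Spark with plain java.net.URI, since the multi-argument URI constructors always quote the percent character. The sketch below is illustrative only (it is not Spark or Hadoop code, and the file name is made up):

```scala
import java.net.URI

// Minimal sketch of the double-encoding problem. The multi-argument
// java.net.URI constructors quote '%', so a path that already contains
// percent-escapes is encoded a second time.
object DoubleEncodingDemo {
  def main(args: Array[String]): Unit = {
    val alreadyEncoded = "/C:/Program%20Files/Nokia/my.jar" // hypothetical path

    // Multi-argument constructor re-encodes: "%20" becomes "%2520".
    val reEncoded = new URI("file", null, alreadyEncoded, null)
    println(reEncoded) // file:/C:/Program%2520Files/Nokia/my.jar

    // The single-argument constructor treats the string as already
    // encoded, so the path decodes back to the real file name.
    val parsed = new URI("file://" + alreadyEncoded)
    println(parsed.getPath) // /C:/Program Files/Nokia/my.jar
  }
}
```

This is why resolving the string once, instead of rebuilding a URI from an already-encoded path, avoids the corruption.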

   Until a fix is provided, are there any workarounds for this issue?

       

      Attachments


      People

        sarutak Kousuke Saruta
        lucian.timar Lucian Timar
        Votes: 0
        Watchers: 3
