SPARK-23148

spark.read.csv with multiline=true gives FileNotFoundException if path contains spaces

    Details

    • Type: Bug
    • Status: Resolved
    • Priority: Major
    • Resolution: Fixed
    • Affects Version/s: 2.3.0
    • Fix Version/s: 2.3.0
    • Component/s: SQL
    • Labels: None

      Description

      Repro code:

      spark.range(10).write.csv("/tmp/a b c/a.csv")

      spark.read.option("multiLine", false).csv("/tmp/a b c/a.csv").count
      // returns 10

      spark.read.option("multiLine", true).csv("/tmp/a b c/a.csv").count
      // throws:
      java.io.FileNotFoundException: File file:/tmp/a%20b%20c/a.csv/part-00000-cf84f9b2-5fe6-4f54-a130-a1737689db00-c000.csv does not exist
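
      Judging by the percent-encoded path in the exception, the multiLine code path appears to round-trip the user-supplied path through a java.net.URI and then reuse the encoded string as a literal filesystem path. A minimal, Spark-independent sketch of that failure mode (the URI constructor below is plain JDK behavior; that Spark's multiLine reader hits the same round trip is an inference from the error message, not a reading of the code):

      import java.io.File
      import java.net.URI

      // The multi-argument URI constructor percent-encodes characters that
      // are illegal in URIs, such as spaces.
      val uri = new URI("file", null, "/tmp/a b c/a.csv", null)
      println(uri.toString)  // file:/tmp/a%20b%20c/a.csv

      // Treating the encoded string as a plain path looks for a directory
      // literally named "a%20b%20c", matching the error above.
      println(new File("/tmp/a%20b%20c/a.csv").exists)  // false
      // File(URI) decodes %20 back to a space, so this finds the real path
      // (true once the repro above has written it).
      println(new File(uri).exists)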
      

      Manually percent-encoding the spaces fails in a different place:

      spark.read.option("multiLine", true).csv("/tmp/a%20b%20c/a.csv").count
      org.apache.spark.sql.AnalysisException: Path does not exist: file:/tmp/a%20b%20c/a.csv;
        at org.apache.spark.sql.execution.datasources.DataSource$.org$apache$spark$sql$execution$datasources$DataSource$$checkAndGlobPathIfNecessary(DataSource.scala:683)
        at org.apache.spark.sql.execution.datasources.DataSource$$anonfun$15.apply(DataSource.scala:387)
        at org.apache.spark.sql.execution.datasources.DataSource$$anonfun$15.apply(DataSource.scala:387)
        at scala.collection.TraversableLike$$anonfun$flatMap$1.apply(TraversableLike.scala:241)
        at scala.collection.TraversableLike$$anonfun$flatMap$1.apply(TraversableLike.scala:241)
        at scala.collection.immutable.List.foreach(List.scala:381)
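
      The pre-escaped path fails earlier, at the existence check, presumably because Hadoop path strings are not percent-decoded: "%20" is three literal characters, so the lookup is for a directory literally named "a%20b%20c". A short sketch of that behavior (plain Hadoop API, offered as an illustration rather than the exact check DataSource performs):

      import org.apache.hadoop.fs.Path

      // Path does not interpret percent-encoding in an ordinary path string,
      // so the URI it produces still contains the literal "%20" characters.
      val p = new Path("/tmp/a%20b%20c/a.csv")
      println(p.toUri.getPath)  // /tmp/a%20b%20c/a.csv -- not decoded to spaces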
      

              People

              • Assignee: Henry Robinson (henryr)
              • Reporter: Bogdan Raducanu (bograd)
              • Votes: 0
              • Watchers: 7
