[SPARK-19709] CSV datasource fails to read empty file


Details

    • Type: Bug
    • Status: Resolved
    • Priority: Minor
    • Resolution: Fixed
    • Affects Version/s: 2.2.0
    • Fix Version/s: 2.2.0
    • Component/s: SQL
    • Labels: None

    Description

      I just ran `touch a` to create an empty file and then ran the code below:

      scala> spark.read.csv("a")
      java.util.NoSuchElementException: next on empty iterator
      	at scala.collection.Iterator$$anon$2.next(Iterator.scala:39)
      	at scala.collection.Iterator$$anon$2.next(Iterator.scala:37)
      	at scala.collection.IndexedSeqLike$Elements.next(IndexedSeqLike.
      

      It seems we should produce an empty DataFrame, consistent with the behaviour of `spark.read.json("a")`.
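
      Below is a minimal, self-contained reproduction sketch (not the actual Spark patch). The object name, the local master setting and the use of java.nio to create the empty file are just for illustration:

      import java.nio.file.{Files, Paths}

      import org.apache.spark.sql.SparkSession

      object EmptyCsvRepro {
        def main(args: Array[String]): Unit = {
          val spark = SparkSession.builder()
            .master("local[*]")
            .appName("SPARK-19709 repro")
            .getOrCreate()

          // Create an empty input file, equivalent to `touch a`.
          Files.deleteIfExists(Paths.get("a"))
          Files.createFile(Paths.get("a"))

          // JSON already handles the empty file and yields an empty DataFrame.
          val json = spark.read.json("a")
          println(s"json rows: ${json.count()}")

          // CSV should behave the same way; before the fix this read throws
          // java.util.NoSuchElementException during schema inference.
          val csv = spark.read.csv("a")
          println(s"csv rows: ${csv.count()}")

          spark.stop()
        }
      }

      Running this against a build without the fix reproduces the stack trace above; with the fix both reads should return zero rows.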


          People

            Assignee: wojtek-szymanski Wojciech Szymanski
            Reporter: gurwls223 Hyukjin Kwon
            Votes: 0
            Watchers: 4
