Description
I just ran `touch a` to create an empty file and then ran the code below:
scala> spark.read.csv("a")
java.util.NoSuchElementException: next on empty iterator
at scala.collection.Iterator$$anon$2.next(Iterator.scala:39)
at scala.collection.Iterator$$anon$2.next(Iterator.scala:37)
at scala.collection.IndexedSeqLike$Elements.next(IndexedSeqLike.
It seems `spark.read.csv` should produce an empty DataFrame for an empty file, consistent with what `spark.read.json("a")` already does.
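As a sketch of the workaround (untested against this exact Spark version): the exception comes from CSV schema inference finding no rows, so supplying an explicit schema avoids the failing code path and yields an empty DataFrame. The schema below is purely illustrative.

```scala
// Hypothetical workaround: skip schema inference on the empty file
// by providing a schema up front.
import org.apache.spark.sql.types.{StructType, StructField, StringType}

val schema = StructType(Seq(StructField("value", StringType)))
val df = spark.read.schema(schema).csv("a")  // expected: empty DataFrame, no exception
```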