Spark / SPARK-3536

SELECT on empty parquet table throws exception


Details

    • Type: Bug
    • Status: Resolved
    • Priority: Major
    • Resolution: Fixed
    • Affects Version/s: None
    • Fix Version/s: 1.2.0
    • Component/s: SQL

    Description

      Reported by matei. Reproduce as follows:

      scala> case class Data(i: Int)
      defined class Data
      
      scala> createParquetFile[Data]("testParquet")
      scala> parquetFile("testParquet").count()
      14/09/15 14:34:17 WARN scheduler.DAGScheduler: Creating new stage failed due to exception - job: 0
      java.lang.NullPointerException
      	at org.apache.spark.sql.parquet.FilteringParquetRowInputFormat.getSplits(ParquetTableOperations.scala:438)
      	at parquet.hadoop.ParquetInputFormat.getSplits(ParquetInputFormat.java:344)
      	at org.apache.spark.rdd.NewHadoopRDD.getPartitions(NewHadoopRDD.scala:95)
      	at org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:204)
      	at org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:202)
      

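      The stack trace points at FilteringParquetRowInputFormat.getSplits, which suggests the split computation dereferences Parquet footer/metadata objects that do not exist for a table with no row groups yet. As an illustration only (not the actual patch), the sketch below shows the kind of guard that would avoid the NullPointerException: when the footer list is empty, report zero splits instead of merging metadata. The EmptyParquetGuard object, the safeSplits helper, and its computeSplits parameter are hypothetical names introduced here for clarity.

      import java.util.{Collections, List => JList}

      import org.apache.hadoop.conf.Configuration
      import parquet.hadoop.{Footer, ParquetInputSplit}

      object EmptyParquetGuard {
        // Illustrative guard, not the actual Spark fix: an empty table has
        // no footers, so skip the metadata merge and report zero splits.
        // A count() over the table then returns 0 instead of hitting the NPE.
        def safeSplits(
            configuration: Configuration,
            footers: JList[Footer],
            computeSplits: (Configuration, JList[Footer]) => JList[ParquetInputSplit])
          : JList[ParquetInputSplit] = {
          if (footers == null || footers.isEmpty) {
            Collections.emptyList[ParquetInputSplit]()
          } else {
            // Non-empty table: defer to the normal split computation.
            computeSplits(configuration, footers)
          }
        }
      }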


          People

            Assignee: Ravindra Pesala (ravi.pesala)
            Reporter: Michael Armbrust (marmbrus)
