SPARK-26862: assertion failed in ParquetRowConverter

Details

    • Type: Bug
    • Status: Resolved
    • Priority: Major
    • Resolution: Invalid
    • Affects Version/s: 2.4.0
    • Fix Version/s: None
    • Component/s: SQL
    • Labels: None

    Description

      When I run the following query over an internal table (A and B are of type String, C is Array[String]):

      ```
      import org.apache.spark.sql.expressions.Window
      import org.apache.spark.sql.functions.{col, row_number}

      spark.read.table("table1")
        .select(col("A"), col("B"), col("C"))
        .withColumn("row_num",
          row_number().over(Window.partitionBy(col("A")).orderBy(col("B").desc)))
      ```

      I received the following error:

      ```
      org.apache.spark.SparkException: Job aborted due to stage failure: Task 730 in stage 12.0 failed 4 times, most recent failure: Lost task 730.3 in stage 12.0 (TID 2468, hadoopworker650-dca1.prod.uber.internal, executor 88): java.lang.AssertionError: assertion failed
        at scala.Predef$.assert(Predef.scala:156)
        at org.apache.spark.sql.execution.datasources.parquet.ParquetRowConverter$ParquetArrayConverter.<init>(ParquetRowConverter.scala:514)
        at org.apache.spark.sql.execution.datasources.parquet.ParquetRowConverter.org$apache$spark$sql$execution$datasources$parquet$ParquetRowConverter$$newConverter(ParquetRowConverter.scala:318)
        at org.apache.spark.sql.execution.datasources.parquet.ParquetRowConverter$$anonfun$7.apply(ParquetRowConverter.scala:190)
        at org.apache.spark.sql.execution.datasources.parquet.ParquetRowConverter$$anonfun$7.apply(ParquetRowConverter.scala:185)
        at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:234)
        at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:234)
        at scala.collection.mutable.ResizableArray$class.foreach(ResizableArray.scala:59)
        at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:48)
        at scala.collection.TraversableLike$class.map(TraversableLike.scala:234)
        at scala.collection.AbstractTraversable.map(Traversable.scala:104)
        at org.apache.spark.sql.execution.datasources.parquet.ParquetRowConverter.<init>(ParquetRowConverter.scala:185)
        at org.apache.spark.sql.execution.datasources.parquet.ParquetRowConverter.org$apache$spark$sql$execution$datasources$parquet$ParquetRowConverter$$newConverter(ParquetRowConverter.scala:324)
        at org.apache.spark.sql.execution.datasources.parquet.ParquetRowConverter$$anonfun$7.apply(ParquetRowConverter.scala:190)
        at org.apache.spark.sql.execution.datasources.parquet.ParquetRowConverter$$anonfun$7.apply(ParquetRowConverter.scala:185)
        at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:234)
        at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:234)
        at scala.collection.mutable.ResizableArray$class.foreach(ResizableArray.scala:59)
        at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:48)
        at scala.collection.TraversableLike$class.map(TraversableLike.scala:234)
        at scala.collection.AbstractTraversable.map(Traversable.scala:104)
        at org.apache.spark.sql.execution.datasources.parquet.ParquetRowConverter.<init>(ParquetRowConverter.scala:185)
        at org.apache.spark.sql.execution.datasources.parquet.ParquetRecordMaterializer.<init>(ParquetRecordMaterializer.scala:43)
        at org.apache.spark.sql.execution.datasources.parquet.ParquetReadSupport.prepareForRead(ParquetReadSupport.scala:126)
        at org.apache.parquet.hadoop.InternalParquetRecordReader.initialize(InternalParquetRecordReader.java:204)
        at org.apache.parquet.hadoop.ParquetRecordReader.initializeInternalReader(ParquetRecordReader.java:182)
        at org.apache.parquet.hadoop.ParquetRecordReader.initialize(ParquetRecordReader.java:140)
        at org.apache.spark.sql.execution.datasources.parquet.ParquetFileFormat$$anonfun$buildReaderWithPartitionValues$1.apply(ParquetFileFormat.scala:452)
        at org.apache.spark.sql.execution.datasources.parquet.ParquetFileFormat$$anonfun$buildReaderWithPartitionValues$1.apply(ParquetFileFormat.scala:364)
        at org.apache.spark.sql.execution.datasources.FileScanRDD$$anon$1.org$apache$spark$sql$execution$datasources$FileScanRDD$$anon$$readCurrentFile(FileScanRDD.scala:124)
        at org.apache.spark.sql.execution.datasources.FileScanRDD$$anon$1.nextIterator(FileScanRDD.scala:177)
        at org.apache.spark.sql.execution.datasources.FileScanRDD$$anon$1.hasNext(FileScanRDD.scala:101)
        at org.apache.spark.sql.catalyst.expressions.GeneratedClass$GeneratedIteratorForCodegenStage1.processNext(Unknown Source)
        at org.apache.spark.sql.execution.BufferedRowIterator.hasNext(BufferedRowIterator.java:43)
        at org.apache.spark.sql.execution.WholeStageCodegenExec$$anonfun$11$$anon$1.hasNext(WholeStageCodegenExec.scala:622)
        at scala.collection.Iterator$$anon$11.hasNext(Iterator.scala:409)
        at org.apache.spark.shuffle.sort.BypassMergeSortShuffleWriter.write(BypassMergeSortShuffleWriter.java:125)
        at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:99)
        at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:55)
        at org.apache.spark.scheduler.Task.run(Task.scala:121)
        at org.apache.spark.executor.Executor$TaskRunner$$anonfun$10.apply(Executor.scala:402)
        at org.apache.spark.util.Utils$.tryWithSafeFinally(Utils.scala:1360)
        at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:408)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
        at java.lang.Thread.run(Thread.java:748)
      ```
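
      Per the trace, the assertion fires in ParquetArrayConverter's constructor, i.e. while building the read-side converter for the array column (C here). For anyone trying to reproduce, below is a minimal self-contained sketch of the reported query shape; the sample data and local-mode session are assumptions, and since Spark writes its own standard Parquet list encoding, a table written this way may not trip the assertion by itself; it only mirrors the failing read.

      ```
      // Sketch only: sample data, local mode, and reusing the name "table1"
      // are assumptions; the original failure was against an internal
      // Parquet-backed table whose files Spark 2.4.0 could not convert.
      import org.apache.spark.sql.SparkSession
      import org.apache.spark.sql.expressions.Window
      import org.apache.spark.sql.functions.{col, row_number}

      object Spark26862Sketch {
        def main(args: Array[String]): Unit = {
          val spark = SparkSession.builder()
            .master("local[*]")
            .appName("SPARK-26862 sketch")
            .getOrCreate()
          import spark.implicits._

          // A and B are strings, C is Array[String], matching the report.
          Seq(
            ("a1", "b1", Seq("x", "y")),
            ("a1", "b2", Seq("z")),
            ("a2", "b3", Seq.empty[String])
          ).toDF("A", "B", "C")
            .write.mode("overwrite").format("parquet").saveAsTable("table1")

          // The failing shape: a plain scan plus a row_number window.
          spark.read.table("table1")
            .select(col("A"), col("B"), col("C"))
            .withColumn("row_num",
              row_number().over(
                Window.partitionBy(col("A")).orderBy(col("B").desc)))
            .show(truncate = false)

          spark.stop()
        }
      }
      ```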


      NOTE: if I remove the windowing clause, everything works fine.
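
      For reference, a sketch of the window-free variant that the report says succeeds (same assumptions as the sketch above; the trailing action is added because Dataset operations are lazy and nothing is scanned without one):

      ```
      // Same scan without the window; per the report this reads fine.
      // show() is only here to force the read.
      spark.read.table("table1")
        .select(col("A"), col("B"), col("C"))
        .show(truncate = false)
      ```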


      This error only happens in 2.4.0; switching back to 2.3.2 resolved the issue.
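
      For anyone pinning the dependency as a workaround, a build.sbt sketch with the standard Spark coordinates (the Provided scope assumes a cluster-supplied Spark; drop it for a local run):

      ```
      // build.sbt: pin to 2.3.2, the version the report says works.
      libraryDependencies ++= Seq(
        "org.apache.spark" %% "spark-core" % "2.3.2" % Provided,
        "org.apache.spark" %% "spark-sql"  % "2.3.2" % Provided
      )
      ```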


    Attachments

    Activity

    People

      Assignee: Unassigned
      Reporter: Nan Zhu (codingcat)
      Votes: 0
      Watchers: 3

    Dates

      Created:
      Updated:
      Resolved: