Spark / SPARK-35296

Dataset.observe fails with an assertion


Details

    • Type: Bug
    • Status: Resolved
    • Priority: Major
    • Resolution: Fixed
    • Affects Version/s: 3.1.1, 3.2.0
    • Fix Version/s: 3.0.3, 3.2.0, 3.1.3
    • Component/s: SQL
    • Labels: None

    Description

      I hit this assertion error when using Dataset.observe:

      java.lang.AssertionError: assertion failed
      	at scala.Predef$.assert(Predef.scala:208) ~[scala-library-2.12.10.jar:?]
      	at org.apache.spark.sql.execution.AggregatingAccumulator.setState(AggregatingAccumulator.scala:204) ~[spark-sql_2.12-3.1.1.jar:3.1.1]
      	at org.apache.spark.sql.execution.CollectMetricsExec.$anonfun$doExecute$2(CollectMetricsExec.scala:72) ~[spark-sql_2.12-3.1.1.jar:3.1.1]
      	at org.apache.spark.sql.execution.CollectMetricsExec.$anonfun$doExecute$2$adapted(CollectMetricsExec.scala:71) ~[spark-sql_2.12-3.1.1.jar:3.1.1]
      	at org.apache.spark.TaskContext$$anon$1.onTaskCompletion(TaskContext.scala:125) ~[spark-core_2.12-3.1.1.jar:3.1.1]
      	at org.apache.spark.TaskContextImpl.$anonfun$markTaskCompleted$1(TaskContextImpl.scala:124) ~[spark-core_2.12-3.1.1.jar:3.1.1]
      	at org.apache.spark.TaskContextImpl.$anonfun$markTaskCompleted$1$adapted(TaskContextImpl.scala:124) ~[spark-core_2.12-3.1.1.jar:3.1.1]
      	at org.apache.spark.TaskContextImpl.$anonfun$invokeListeners$1(TaskContextImpl.scala:137) ~[spark-core_2.12-3.1.1.jar:3.1.1]
      	at org.apache.spark.TaskContextImpl.$anonfun$invokeListeners$1$adapted(TaskContextImpl.scala:135) ~[spark-core_2.12-3.1.1.jar:3.1.1]
      	at scala.collection.mutable.ResizableArray.foreach(ResizableArray.scala:62) ~[scala-library-2.12.10.jar:?]
      	at scala.collection.mutable.ResizableArray.foreach$(ResizableArray.scala:55) ~[scala-library-2.12.10.jar:?]
      	at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:49) ~[scala-library-2.12.10.jar:?]
      	at org.apache.spark.TaskContextImpl.invokeListeners(TaskContextImpl.scala:135) ~[spark-core_2.12-3.1.1.jar:3.1.1]
      	at org.apache.spark.TaskContextImpl.markTaskCompleted(TaskContextImpl.scala:124) ~[spark-core_2.12-3.1.1.jar:3.1.1]
      	at org.apache.spark.scheduler.Task.run(Task.scala:147) ~[spark-core_2.12-3.1.1.jar:3.1.1]
      	at org.apache.spark.executor.Executor$TaskRunner.$anonfun$run$3(Executor.scala:497) ~[spark-core_2.12-3.1.1.jar:3.1.1]
      	at org.apache.spark.util.Utils$.tryWithSafeFinally(Utils.scala:1439) [spark-core_2.12-3.1.1.jar:3.1.1]
      	at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:500) [spark-core_2.12-3.1.1.jar:3.1.1]
      	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149) [?:1.8.0_282]
      	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624) [?:1.8.0_282]
      	at java.lang.Thread.run(Thread.java:748) [?:1.8.0_282]
      

      A workaround I used was to add .coalesce(1) before calling this method.
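
      As a sketch of the workaround (the actual failing query is not in the report, so the dataset and metric names below are hypothetical), coalescing to a single partition before calling observe keeps a single task updating the metrics accumulator:

      import org.apache.spark.sql.SparkSession
      import org.apache.spark.sql.functions.{count, lit}

      object ObserveWorkaround {
        def main(args: Array[String]): Unit = {
          val spark = SparkSession.builder().master("local[2]").getOrCreate()
          val ds = spark.range(0, 100)

          // Workaround: coalesce to one partition before observe, so only
          // one task touches the AggregatingAccumulator and the setState
          // assertion in CollectMetricsExec is not hit.
          val observed = ds.coalesce(1).observe("my_metrics", count(lit(1)).as("rows"))
          observed.collect()
          spark.stop()
        }
      }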

      It happens in a quite complex query, and I have not been able to reproduce it with a simpler one.

      Attached a screenshot of the debugger at the moment of the exception.

      Attachments

        1. 2021-05-03_18-34.png (100 kB, Tanel Kiis)


          People

            Assignee: Kousuke Saruta (sarutak)
            Reporter: Tanel Kiis (tanelk)
            Votes: 0
            Watchers: 3

            Dates

              Created:
              Updated:
              Resolved: