Description
Seen on an unrelated PR.
sbt.ForkMain$ForkError: org.scalatest.exceptions.TestFailedException: Assert on query failed: Check total state rows = List(4), updated state rows = List(4): Array(1) did not equal List(4) incorrect updates rows
	org.scalatest.Assertions$class.newAssertionFailedException(Assertions.scala:528)
	org.scalatest.FunSuite.newAssertionFailedException(FunSuite.scala:1560)
	org.scalatest.Assertions$AssertionsHelper.macroAssert(Assertions.scala:501)
	org.apache.spark.sql.streaming.StateStoreMetricsTest$$anonfun$assertNumStateRows$1.apply(StateStoreMetricsTest.scala:28)
	org.apache.spark.sql.streaming.StateStoreMetricsTest$$anonfun$assertNumStateRows$1.apply(StateStoreMetricsTest.scala:23)
	org.apache.spark.sql.streaming.StreamTest$$anonfun$liftedTree1$1$1$$anonfun$apply$14.apply$mcZ$sp(StreamTest.scala:568)
	org.apache.spark.sql.streaming.StreamTest$class.verify$1(StreamTest.scala:371)
	org.apache.spark.sql.streaming.StreamTest$$anonfun$liftedTree1$1$1.apply(StreamTest.scala:568)
	org.apache.spark.sql.streaming.StreamTest$$anonfun$liftedTree1$1$1.apply(StreamTest.scala:432)
	scala.collection.mutable.ResizableArray$class.foreach(ResizableArray.scala:59)

== Progress ==
   AddData to MemoryStream[value#19652]: 3,4,5
   AddData to MemoryStream[value#19662]: 1,2,3
   CheckLastBatch: [3,10,6,9]
=> AssertOnQuery(<condition>, Check total state rows = List(4), updated state rows = List(4))
   AddData to MemoryStream[value#19652]: 20
   AddData to MemoryStream[value#19662]: 21
   CheckLastBatch:
   AddData to MemoryStream[value#19662]: 20
   CheckLastBatch: [20,30,40,60],[4,10,8,null],[5,10,10,null]

== Stream ==
Output Mode: Append
Stream state: {MemoryStream[value#19652]: 0,MemoryStream[value#19662]: 0}
Thread state: alive

Thread stack trace:
	java.lang.Thread.sleep(Native Method)
	org.apache.spark.sql.execution.streaming.MicroBatchExecution$$anonfun$runActivatedStream$1.apply$mcZ$sp(MicroBatchExecution.scala:152)
	org.apache.spark.sql.execution.streaming.ProcessingTimeExecutor.execute(TriggerExecutor.scala:56)
	org.apache.spark.sql.execution.streaming.MicroBatchExecution.runActivatedStream(MicroBatchExecution.scala:120)
	org.apache.spark.sql.execution.streaming.StreamExecution.org$apache$spark$sql$execution$streaming$StreamExecution$$runStream(StreamExecution.scala:279)
	org.apache.spark.sql.execution.streaming.StreamExecution$$anon$1.run(StreamExecution.scala:189)
No other failures in the history, though.
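For context, the assertion that flaked is the assertNumStateRows check from the StateStoreMetricsTest helper used by StreamTest-based suites. Below is a minimal, illustrative sketch of how that DSL is used, assuming the spark-sql test sources are on the classpath. The suite name, the dropDuplicates query, and the row counts are placeholders for illustration only; the actual failing test above runs a stream-stream join with different data and expected counts.

{code:scala}
import org.apache.spark.sql.execution.streaming.MemoryStream
import org.apache.spark.sql.streaming.StateStoreMetricsTest

// Placeholder suite, not the one that failed above.
class ExampleStateRowsSuite extends StateStoreMetricsTest {
  import testImplicits._

  test("state row metrics after one batch (sketch)") {
    val input = MemoryStream[Int]
    // Any stateful query works here; dropDuplicates keeps one state row per distinct key.
    val deduped = input.toDF("value").dropDuplicates("value")

    testStream(deduped)(
      AddData(input, 1, 2, 3, 3),
      CheckLastBatch(1, 2, 3),
      // Compares the numTotalStateRows / numUpdatedStateRows metrics reported by the
      // last completed batch -- the same check that saw Array(1) vs List(4) above.
      assertNumStateRows(total = 3, updated = 3)
    )
  }
}
{code}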