KAFKA-7316

Use of filter method in KTable.scala may result in StackOverflowError

    Details

    • Type: Bug
    • Status: Resolved
    • Priority: Major
    • Resolution: Fixed
    • Affects Version/s: 2.0.0
    • Fix Version/s: 2.0.1, 2.1.0
    • Component/s: streams
    • Labels:

      Description

      In this thread:

      http://search-hadoop.com/m/Kafka/uyzND1dNbYKXzC4F1?subj=Issue+in+Kafka+2+0+0+

      Druhin reported seeing a StackOverflowError when using the filter method from KTable.scala.

      This can be reproduced with the following change:

      diff --git a/streams/streams-scala/src/test/scala/org/apache/kafka/streams/scala/StreamToTableJoinScalaIntegrationTestImplicitSerdes.scala b/streams/streams-scala/src/test/scala
      index 3d1bab5..e0a06f2 100644
      --- a/streams/streams-scala/src/test/scala/org/apache/kafka/streams/scala/StreamToTableJoinScalaIntegrationTestImplicitSerdes.scala
      +++ b/streams/streams-scala/src/test/scala/org/apache/kafka/streams/scala/StreamToTableJoinScalaIntegrationTestImplicitSerdes.scala
      @@ -58,6 +58,7 @@ class StreamToTableJoinScalaIntegrationTestImplicitSerdes extends StreamToTableJ
           val userClicksStream: KStream[String, Long] = builder.stream(userClicksTopic)
      
           val userRegionsTable: KTable[String, String] = builder.table(userRegionsTopic)
      +    userRegionsTable.filter { case (_, count) => true }
      
           // Compute the total per region by summing the individual click counts per region.
           val clicksPerRegion: KTable[String, Long] =
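
      To make the failure mode easier to see, here is a minimal, self-contained
      sketch. It is not the actual Kafka source; JPredicate, JavaTable, ScalaTable
      and wrap are made-up stand-ins for the Java KTable, the Scala wrapper and the
      implicit wrapping conversion. The assumption illustrated is that the wrapper's
      delegating call gets re-routed back onto the wrapper itself through an
      implicit conversion, which recurses until the stack overflows:

      import scala.language.implicitConversions

      object FilterRecursionSketch extends App {

        // Minimal stand-in for the Java KTable and its Predicate-based filter.
        trait JPredicate[K, V] { def test(key: K, value: V): Boolean }

        class JavaTable[K, V] {
          def filter(predicate: JPredicate[K, V]): JavaTable[K, V] = this
        }

        // Implicit conversion playing the role of wrapping a Java table into the
        // Scala wrapper, so Java return values can be used as Scala tables.
        implicit def wrap[K, V](inner: JavaTable[K, V]): ScalaTable[K, V] =
          new ScalaTable(inner)

        // Stand-in for the Scala wrapper around the Java table.
        class ScalaTable[K, V](val inner: JavaTable[K, V]) {
          // The intent is to delegate to inner.filter, but a Scala function value
          // is not a JPredicate, so the direct Java call does not type-check. The
          // compiler instead applies wrap() to inner and resolves the call back to
          // ScalaTable.filter itself, which recurses without bound at runtime.
          def filter(predicate: (K, V) => Boolean): ScalaTable[K, V] =
            inner.filter(predicate)
        }

        val table = new ScalaTable[String, String](new JavaTable[String, String])
        table.filter { case (_, _) => true } // throws StackOverflowError
      }

      In a sketch like this, the recursion disappears once the delegating call binds
      unambiguously to the Java filter, for example by passing an explicit JPredicate
      instance instead of a bare Scala function value.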
      

    People

    • Assignee: Joan Goyeau (joan@goyeau.com)
    • Reporter: Ted Yu (yuzhihong@gmail.com)
    • Votes: 0
    • Watchers: 6

    Dates

    • Created:
    • Updated:
    • Resolved: