Kafka / KAFKA-7066

Make Streams Runtime Error User Friendly in Case of Serialisation exception


    Details

    • Type: Improvement
    • Status: Resolved
    • Priority: Major
    • Resolution: Fixed
    • Affects Version/s: 1.1.0
    • Fix Version/s: 2.0.0
    • Component/s: streams
    • Labels:
      None

      Description

      This kind of exception can be cryptic for beginners:

      ERROR stream-thread [favourite-colors-application-a336770d-6ba6-4bbb-8681-3c8ea91bd12e-StreamThread-1] Failed to process stream task 2_0 due to the following error: (org.apache.kafka.streams.processor.internals.AssignedStreamsTasks:105)
      java.lang.ClassCastException: java.lang.Long cannot be cast to java.lang.String
      at org.apache.kafka.common.serialization.StringSerializer.serialize(StringSerializer.java:28)
      at org.apache.kafka.streams.state.StateSerdes.rawValue(StateSerdes.java:178)
      at org.apache.kafka.streams.state.internals.MeteredKeyValueBytesStore$1.innerValue(MeteredKeyValueBytesStore.java:66)
      at org.apache.kafka.streams.state.internals.MeteredKeyValueBytesStore$1.innerValue(MeteredKeyValueBytesStore.java:57)
      at org.apache.kafka.streams.state.internals.InnerMeteredKeyValueStore.put(InnerMeteredKeyValueStore.java:198)
      at org.apache.kafka.streams.state.internals.MeteredKeyValueBytesStore.put(MeteredKeyValueBytesStore.java:117)
      at org.apache.kafka.streams.kstream.internals.KTableAggregate$KTableAggregateProcessor.process(KTableAggregate.java:95)
      at org.apache.kafka.streams.kstream.internals.KTableAggregate$KTableAggregateProcessor.process(KTableAggregate.java:56)
      at org.apache.kafka.streams.processor.internals.ProcessorNode$1.run(ProcessorNode.java:46)
      at org.apache.kafka.streams.processor.internals.StreamsMetricsImpl.measureLatencyNs(StreamsMetricsImpl.java:208)
      at org.apache.kafka.streams.processor.internals.ProcessorNode.process(ProcessorNode.java:124)
      at org.apache.kafka.streams.processor.internals.AbstractProcessorContext.forward(AbstractProcessorContext.java:174)
      at org.apache.kafka.streams.processor.internals.SourceNode.process(SourceNode.java:80)
      at org.apache.kafka.streams.processor.internals.StreamTask.process(StreamTask.java:224)
      at org.apache.kafka.streams.processor.internals.AssignedStreamsTasks.process(AssignedStreamsTasks.java:94)
      at org.apache.kafka.streams.processor.internals.TaskManager.process(TaskManager.java:411)
      at org.apache.kafka.streams.processor.internals.StreamThread.processAndMaybeCommit(StreamThread.java:918)
      at org.apache.kafka.streams.processor.internals.StreamThread.runOnce(StreamThread.java:798)
      at org.apache.kafka.streams.processor.internals.StreamThread.runLoop(StreamThread.java:750)
      at org.apache.kafka.streams.processor.internals.StreamThread.run(StreamThread.java:720)

      We should surface the more detailed logging already present in SinkNode to help the user diagnose and fix this issue.
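The root cause in traces like the one above is a serde mismatch: the aggregation produces `java.lang.Long` values, but the state store is configured (often implicitly, via the default serde) with a String serde, so `StringSerializer.serialize` performs an unchecked cast that fails at runtime. A minimal plain-Java sketch of the failing cast, with no Kafka dependency; the `serialize` method here is a simplified stand-in for `org.apache.kafka.common.serialization.StringSerializer`, not Kafka's actual code:

```java
import java.nio.charset.StandardCharsets;

public class SerdeMismatch {
    // Simplified stand-in for StringSerializer.serialize: it casts the
    // incoming Object to String, so any non-String value fails at runtime
    // with exactly the ClassCastException seen in the stack trace.
    static byte[] serialize(Object value) {
        return ((String) value).getBytes(StandardCharsets.UTF_8);
    }

    public static void main(String[] args) {
        // Fine: the value really is a String.
        serialize("red");

        try {
            // A count-style aggregation emits java.lang.Long values;
            // feeding one to a String serializer reproduces the error.
            serialize(123L);
        } catch (ClassCastException e) {
            System.out.println("Serde mismatch: " + e.getMessage());
        }
    }
}
```

In an actual Streams topology the fix is to declare the store's value serde explicitly rather than rely on the default, e.g. `Materialized.with(Serdes.String(), Serdes.Long())` for a count-style aggregation; the exact serdes depend on the application's types.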


              People

              • Assignee:
                stephane.maarek@gmail.com Stephane Maarek
                Reporter:
                stephane.maarek@gmail.com Stephane Maarek
              • Votes:
                0
                Watchers:
                4
