Details
- Type: Bug
- Status: Resolved
- Priority: Minor
- Resolution: Duplicate
- Affects Version/s: 2.4.0
- Fix Version/s: None
- Component/s: None
Description
The package org.apache.spark:spark-sql-kafka-0-10_2.11:2.4.0 is no longer compatible with org.apache.kafka:kafka_2.11:0.10.0.1.
When both packages are used in the same project, the following exception occurs:
    java.lang.NoClassDefFoundError: org/apache/kafka/common/protocol/SecurityProtocol
        at kafka.server.Defaults$.<init>(KafkaConfig.scala:125)
        at kafka.server.Defaults$.<clinit>(KafkaConfig.scala)
        at kafka.log.Defaults$.<init>(LogConfig.scala:33)
        at kafka.log.Defaults$.<clinit>(LogConfig.scala)
        at kafka.log.LogConfig$.<init>(LogConfig.scala:152)
        at kafka.log.LogConfig$.<clinit>(LogConfig.scala)
        at kafka.server.KafkaConfig$.<init>(KafkaConfig.scala:265)
        at kafka.server.KafkaConfig$.<clinit>(KafkaConfig.scala)
        at kafka.server.KafkaConfig.<init>(KafkaConfig.scala:759)
        at kafka.server.KafkaConfig.<init>(KafkaConfig.scala:761)
This exception is caused by an incompatible transitive dependency pulled in by Spark: org.apache.kafka:kafka-clients:2.0.0 (note that kafka-clients is a Java artifact and carries no Scala-version suffix). The 0.10.0.1 broker classes reference org.apache.kafka.common.protocol.SecurityProtocol, which no longer exists at that location in kafka-clients 2.0.0.
The following sbt workaround resolved the problem in my project:
dependencyOverrides += "org.apache.kafka" % "kafka-clients" % "0.10.0.1"
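For context, a minimal build.sbt sketch of that override; the artifact coordinates and versions are taken from this report, while the project name and Scala patch version are hypothetical:

```scala
// build.sbt -- minimal sketch, assuming sbt 1.x; only the artifacts above are from the report
name := "spark-kafka-compat"        // hypothetical project name
scalaVersion := "2.11.12"           // assumed 2.11 patch release

libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-sql-kafka-0-10" % "2.4.0",
  "org.apache.kafka" %% "kafka" % "0.10.0.1"
)

// Pin the transitive kafka-clients back to the 0.10.0.1 line so the
// embedded broker classes (kafka.server.KafkaConfig etc.) can resolve
// org.apache.kafka.common.protocol.SecurityProtocol.
dependencyOverrides += "org.apache.kafka" % "kafka-clients" % "0.10.0.1"
```

Note that this keeps the old client on the classpath, so the Spark Kafka source features that depend on kafka-clients 2.0.0 APIs may not work; it is a workaround for projects that must embed a 0.10.0.1 broker (e.g. for tests), not a general fix.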
Issue Links
- is caused by SPARK-18057 "Update structured streaming kafka from 0.10.0.1 to 2.0.0" (Resolved)