Description
This has come up a few times, from user venki-kratos:
http://apache-spark-user-list.1001560.n3.nabble.com/NoSuchMethodError-in-KafkaReciever-td2209.html
and I ran into it a few weeks ago:
http://mail-archives.apache.org/mod_mbox/spark-dev/201405.mbox/%3CCAMAsSdLzS6ihcTxepUsphRyXxA-wp26ZGBxx83sM6niRo0q4Rg@mail.gmail.com%3E
and yesterday user mpieck:
When I use the createStream method from the example class like this:

KafkaUtils.createStream(jssc, "zookeeper:port", "test", topicMap);

everything works fine, but when I explicitly specify the message decoder classes used in this method with another overloaded createStream method:

KafkaUtils.createStream(jssc, String.class, String.class, StringDecoder.class, StringDecoder.class, props, topicMap, StorageLevels.MEMORY_AND_DISK_2);

the application stops with an error:
14/06/10 22:28:06 ERROR kafka.KafkaReceiver: Error receiving data
java.lang.NoSuchMethodException:
java.lang.Object.<init>(kafka.utils.VerifiableProperties)
at java.lang.Class.getConstructor0(Unknown Source)
at java.lang.Class.getConstructor(Unknown Source)
at org.apache.spark.streaming.kafka.KafkaReceiver.onStart(KafkaInputDStream.scala:108)
at org.apache.spark.streaming.dstream.NetworkReceiver.start(NetworkInputDStream.scala:126)
Something is making it try to instantiate java.lang.Object as if it were a Decoder class.
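A minimal sketch of what the failing reflective lookup amounts to. The Props class here is a hypothetical stand-in for kafka.utils.VerifiableProperties, so the sketch runs without the Kafka jars on the classpath:

```scala
// Hypothetical stand-in for kafka.utils.VerifiableProperties, so this
// sketch compiles without the Kafka jars.
class Props

object Repro {
  def main(args: Array[String]): Unit = {
    // If the erased Decoder type is Object, the reflective lookup in
    // KafkaReceiver.onStart amounts to asking java.lang.Object for a
    // one-argument constructor it does not have:
    try {
      classOf[Object].getConstructor(classOf[Props])
    } catch {
      case e: NoSuchMethodException =>
        // java.lang.NoSuchMethodException: java.lang.Object.<init>(Props)
        println(e)
    }
  }
}
```

This matches the stack trace above: the exception comes out of Class.getConstructor, before any decoder is ever constructed.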
I suspect the problem has to do with
https://github.com/apache/spark/blob/master/external/kafka/src/main/scala/org/apache/spark/streaming/kafka/KafkaUtils.scala#L148
implicit val keyCmd: Manifest[U] = implicitly[Manifest[AnyRef]].asInstanceOf[Manifest[U]]
implicit val valueCmd: Manifest[T] = implicitly[Manifest[AnyRef]].asInstanceOf[Manifest[T]]
... where U and T are the key/value Decoder types. I don't know enough Scala to fully understand this, but is it possible this causes the later reflective call to lose the type and try to instantiate Object? The AnyRef made me wonder.
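A small Scala sketch of that suspicion (the decoderClass helper is hypothetical, mirroring how a receiver might resolve the decoder class from an implicit manifest before calling getConstructor): a Manifest[AnyRef] force-cast to Manifest[String] still reports java.lang.Object as its runtime class.

```scala
object ManifestDemo {
  // Hypothetical helper: resolve a class from an implicit manifest,
  // the way reflective instantiation code typically does.
  def decoderClass[U](implicit m: Manifest[U]): Class[_] = m.runtimeClass

  def main(args: Array[String]): Unit = {
    // The suspect pattern from KafkaUtils.scala: a Manifest[AnyRef]
    // force-cast to a more specific manifest type.
    implicit val keyCmd: Manifest[String] =
      implicitly[Manifest[AnyRef]].asInstanceOf[Manifest[String]]
    // The runtime class is java.lang.Object, not java.lang.String, so a
    // later reflective constructor lookup would target Object.
    println(decoderClass[String]) // prints: class java.lang.Object
  }
}
```

If the real code behaves the same way, the receiver would end up calling getConstructor on java.lang.Object, which is exactly the NoSuchMethodException reported above.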
I am sorry to say I don't have a PR to suggest at this point.