Details
- Type: Bug
- Status: Resolved
- Priority: P2
- Resolution: Fixed
Description
The Jackson version pulled in by the latest examples archetype is not compatible with the latest Beam.
Even running the plain WordCount example generated by the archetype:

mvn compile exec:java -Dexec.mainClass=org.apache.beam.examples.WordCount -Dexec.args="--inputFile=pom.xml --output=counts --runner=spark" -Pspark-runner

fails with:
11:58:48.231 [main] INFO org.apache.spark.SparkContext - Successfully stopped SparkContext
Exception in thread "main" java.lang.RuntimeException: java.lang.ExceptionInInitializerError
	at org.apache.beam.runners.spark.SparkPipelineResult.runtimeExceptionFrom(SparkPipelineResult.java:57)
	at org.apache.beam.runners.spark.SparkPipelineResult.beamExceptionFrom(SparkPipelineResult.java:74)
	at org.apache.beam.runners.spark.SparkPipelineResult.waitUntilFinish(SparkPipelineResult.java:110)
	at org.apache.beam.runners.spark.SparkPipelineResult.waitUntilFinish(SparkPipelineResult.java:98)
	at com.hortonworks.hcube.codestream.AsfStat.main(AsfStat.java:54)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at com.intellij.rt.execution.application.AppMain.main(AppMain.java:147)
Caused by: java.lang.ExceptionInInitializerError
	at org.apache.spark.rdd.RDD.withScope(RDD.scala:316)
	at org.apache.spark.rdd.RDD.mapPartitions(RDD.scala:706)
	at org.apache.spark.api.java.JavaRDDLike$class.mapPartitionsToPair(JavaRDDLike.scala:194)
	at org.apache.spark.api.java.AbstractJavaRDDLike.mapPartitionsToPair(JavaRDDLike.scala:46)
	at org.apache.beam.runners.spark.translation.TransformTranslator$6.evaluate(TransformTranslator.java:356)
	at org.apache.beam.runners.spark.translation.TransformTranslator$6.evaluate(TransformTranslator.java:340)
	at org.apache.beam.runners.spark.SparkRunner$Evaluator.doVisitTransform(SparkRunner.java:409)
	at org.apache.beam.runners.spark.SparkRunner$Evaluator.visitPrimitiveTransform(SparkRunner.java:395)
	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:488)
	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:483)
	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:483)
	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:483)
	at org.apache.beam.sdk.runners.TransformHierarchy$Node.access$400(TransformHierarchy.java:232)
	at org.apache.beam.sdk.runners.TransformHierarchy.visit(TransformHierarchy.java:207)
	at org.apache.beam.sdk.Pipeline.traverseTopologically(Pipeline.java:383)
	at org.apache.beam.runners.spark.SparkRunner$2.run(SparkRunner.java:210)
	at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
	at java.util.concurrent.FutureTask.run(FutureTask.java:266)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:745)
Caused by: com.fasterxml.jackson.databind.JsonMappingException: Incompatible Jackson version: 2.8.8
	at com.fasterxml.jackson.module.scala.JacksonModule$class.setupModule(JacksonModule.scala:64)
	at com.fasterxml.jackson.module.scala.DefaultScalaModule.setupModule(DefaultScalaModule.scala:19)
	at com.fasterxml.jackson.databind.ObjectMapper.registerModule(ObjectMapper.java:745)
	at org.apache.spark.rdd.RDDOperationScope$.<init>(RDDOperationScope.scala:81)
	at org.apache.spark.rdd.RDDOperationScope$.<clinit>(RDDOperationScope.scala)
	... 21 more
The long-term solution is to use filtering in the archetype so that the generated pom.xml picks up the Jackson version from the Maven property instead of hard-coding it.
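As a sketch of what the filtered pom.xml would produce (the property name `jackson.version` and the version value shown are assumptions for illustration, not taken from the actual archetype), all Jackson artifacts would resolve from a single Maven property that filtering keeps in sync with Beam:

```xml
<!-- Hypothetical sketch: property name and version value are placeholders -->
<properties>
  <!-- Single source of truth for the Jackson version; filtering would
       substitute the version Beam itself was built against -->
  <jackson.version>2.8.9</jackson.version>
</properties>

<dependencyManagement>
  <dependencies>
    <dependency>
      <groupId>com.fasterxml.jackson.core</groupId>
      <artifactId>jackson-core</artifactId>
      <version>${jackson.version}</version>
    </dependency>
    <dependency>
      <groupId>com.fasterxml.jackson.core</groupId>
      <artifactId>jackson-annotations</artifactId>
      <version>${jackson.version}</version>
    </dependency>
    <dependency>
      <groupId>com.fasterxml.jackson.core</groupId>
      <artifactId>jackson-databind</artifactId>
      <version>${jackson.version}</version>
    </dependency>
  </dependencies>
</dependencyManagement>
```

With all three core Jackson artifacts pinned in dependencyManagement, transitive requirements from Spark's jackson-module-scala and from Beam resolve to one consistent version rather than the mixed versions that trigger the JsonMappingException above.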