Beam / BEAM-11498

Spark integration tests on Go SDK failing.


    Details

    • Type: Bug
    • Status: Open
    • Priority: P2
    • Resolution: Unresolved
    • Affects Version/s: None
    • Fix Version/s: None
    • Component/s: cross-language, sdk-go
    • Labels:
      None

      Description

      Configuration in which the failure occurs:

      • Go Pipelines
      • Spark Runner Job Server (Java)
      • Java Test Expansion Service

      Specifically, it fails when run through the new ValidatesRunner framework.

      Edit: It looks like TestParDoSideInput and TestParDoKVSideInput are also failing with the same error, so this does not appear to be a cross-language issue.
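      For context on the error below: "Unsupported class file major version 55" is what ASM reports when it is handed Java 11 bytecode (class file major version 55) but only understands older versions; the xbean-asm6 ClassReader bundled with this Spark version tops out at Java 10 (major 54), so this typically points at the job server JVM running classes compiled for Java 11. As a diagnostic aid, here is a minimal, illustrative sketch (the helper name and release table are my own, not part of Beam or Spark) of how the class file header encodes that version:

```python
import struct

# A Java .class file starts with the magic number 0xCAFEBABE, followed by
# minor and major version as big-endian u16 values. Major version 55
# corresponds to Java 11, which is what the ASM 6 reader rejects here.
JAVA_RELEASE_BY_MAJOR = {52: "8", 53: "9", 54: "10", 55: "11", 56: "12"}

def class_file_java_release(header: bytes) -> str:
    """Return the Java release that produced a class file, from its header."""
    magic, _minor, major = struct.unpack(">IHH", header[:8])
    if magic != 0xCAFEBABE:
        raise ValueError("not a Java class file")
    return JAVA_RELEASE_BY_MAJOR.get(major, f"unknown (major {major})")

# A header as javac 11 would emit it: magic, minor=0, major=55.
example = struct.pack(">IHH", 0xCAFEBABE, 0, 55)
print(class_file_java_release(example))  # → 11
```

      Reading the first eight bytes of any class on the job server's classpath this way would confirm whether Java 11 output is involved.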

      Error:

      20/12/16 20:46:44 ERROR org.apache.beam.runners.jobsubmission.JobInvocation: Error during job invocation go0job0401608180402898378125-danoliveira-1217044644-6d1b1ec6_da728c2e-3dd7-4420-8fae-1cd3a47094b1.
      java.lang.IllegalArgumentException: Unsupported class file major version 55
              at org.apache.xbean.asm6.ClassReader.<init>(ClassReader.java:166)
              at org.apache.xbean.asm6.ClassReader.<init>(ClassReader.java:148)
              at org.apache.xbean.asm6.ClassReader.<init>(ClassReader.java:136)
              at org.apache.xbean.asm6.ClassReader.<init>(ClassReader.java:237)
              at org.apache.spark.util.ClosureCleaner$.getClassReader(ClosureCleaner.scala:50)
              at org.apache.spark.util.FieldAccessFinder$$anon$4$$anonfun$visitMethodInsn$7.apply(ClosureCleaner.scala:845)
              at org.apache.spark.util.FieldAccessFinder$$anon$4$$anonfun$visitMethodInsn$7.apply(ClosureCleaner.scala:828)
              at scala.collection.TraversableLike$WithFilter$$anonfun$foreach$1.apply(TraversableLike.scala:733)
              at scala.collection.mutable.HashMap$$anon$1$$anonfun$foreach$2.apply(HashMap.scala:134)
              at scala.collection.mutable.HashMap$$anon$1$$anonfun$foreach$2.apply(HashMap.scala:134)
              at scala.collection.mutable.HashTable$class.foreachEntry(HashTable.scala:236)
              at scala.collection.mutable.HashMap.foreachEntry(HashMap.scala:40)
              at scala.collection.mutable.HashMap$$anon$1.foreach(HashMap.scala:134)
              at scala.collection.TraversableLike$WithFilter.foreach(TraversableLike.scala:732)
              at org.apache.spark.util.FieldAccessFinder$$anon$4.visitMethodInsn(ClosureCleaner.scala:828)
              at org.apache.xbean.asm6.ClassReader.readCode(ClassReader.java:2175)
              at org.apache.xbean.asm6.ClassReader.readMethod(ClassReader.java:1238)
              at org.apache.xbean.asm6.ClassReader.accept(ClassReader.java:631)
              at org.apache.xbean.asm6.ClassReader.accept(ClassReader.java:355)
              at org.apache.spark.util.ClosureCleaner$$anonfun$org$apache$spark$util$ClosureCleaner$$clean$14.apply(ClosureCleaner.scala:272)
              at org.apache.spark.util.ClosureCleaner$$anonfun$org$apache$spark$util$ClosureCleaner$$clean$14.apply(ClosureCleaner.scala:271)
              at scala.collection.immutable.List.foreach(List.scala:392)
              at org.apache.spark.util.ClosureCleaner$.org$apache$spark$util$ClosureCleaner$$clean(ClosureCleaner.scala:271)
              at org.apache.spark.util.ClosureCleaner$.clean(ClosureCleaner.scala:163)
              at org.apache.spark.SparkContext.clean(SparkContext.scala:2326)
              at org.apache.spark.SparkContext.runJob(SparkContext.scala:2100)
              at org.apache.spark.SparkContext.runJob(SparkContext.scala:2126)
              at org.apache.spark.rdd.RDD$$anonfun$collect$1.apply(RDD.scala:990)
              at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
              at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:112)
              at org.apache.spark.rdd.RDD.withScope(RDD.scala:385)
              at org.apache.spark.rdd.RDD.collect(RDD.scala:989)
              at org.apache.spark.api.java.JavaRDDLike$class.collect(JavaRDDLike.scala:361)
              at org.apache.spark.api.java.AbstractJavaRDDLike.collect(JavaRDDLike.scala:45)
              at org.apache.beam.runners.spark.translation.BoundedDataset.getBytes(BoundedDataset.java:79)
              at org.apache.beam.runners.spark.translation.SparkBatchPortablePipelineTranslator.broadcastSideInput(SparkBatchPortablePipelineTranslator.java:363)
              at org.apache.beam.runners.spark.translation.SparkBatchPortablePipelineTranslator.broadcastSideInputs(SparkBatchPortablePipelineTranslator.java:347)
              at org.apache.beam.runners.spark.translation.SparkBatchPortablePipelineTranslator.translateExecutableStage(SparkBatchPortablePipelineTranslator.java:225)
              at org.apache.beam.runners.spark.translation.SparkBatchPortablePipelineTranslator.translate(SparkBatchPortablePipelineTranslator.java:147)
      2020/12/16 20:46:44  (): java.lang.IllegalArgumentException: Unsupported class file major version 55
              at org.apache.xbean.asm6.ClassReader.<init>(ClassReader.java:166)
              at org.apache.xbean.asm6.ClassReader.<init>(ClassReader.java:148)
              at org.apache.xbean.asm6.ClassReader.<init>(ClassReader.java:136)
              at org.apache.xbean.asm6.ClassReader.<init>(ClassReader.java:237)
              at org.apache.spark.util.ClosureCleaner$.getClassReader(ClosureCleaner.scala:50)
        at org.apache.spark.util.FieldAccessFinder$$anon$4$$anonfun$visitMethodInsn$7.apply(ClosureCleaner.scala:845)
        at org.apache.spark.util.FieldAccessFinder$$anon$4$$anonfun$visitMethodInsn$7.apply(ClosureCleaner.scala:828)
        at scala.collection.TraversableLike$WithFilter$$anonfun$foreach$1.apply(TraversableLike.scala:733)
              at scala.collection.mutable.HashMap$$anon$1$$anonfun$foreach$2.apply(HashMap.scala:134)
              at scala.collection.mutable.HashMap$$anon$1$$anonfun$foreach$2.apply(HashMap.scala:134)
              at scala.collection.mutable.HashTable$class.foreachEntry(HashTable.scala:236)
              at scala.collection.mutable.HashMap.foreachEntry(HashMap.scala:40)
              at scala.collection.mutable.HashMap$$anon$1.foreach(HashMap.scala:134)
              at scala.collection.TraversableLike$WithFilter.foreach(TraversableLike.scala:732)
              at org.apache.spark.util.FieldAccessFinder$$anon$4.visitMethodInsn(ClosureCleaner.scala:828)
              at org.apache.xbean.asm6.ClassReader.readCode(ClassReader.java:2175)
              at org.apache.xbean.asm6.ClassReader.readMethod(ClassReader.java:1238)
              at org.apache.xbean.asm6.ClassReader.accept(ClassReader.java:631)
              at org.apache.xbean.asm6.ClassReader.accept(ClassReader.java:355)
              at org.apache.spark.util.ClosureCleaner$$anonfun$org$apache$spark$util$ClosureCleaner$$clean$14.apply(ClosureCleaner.scala:272)
              at org.apache.spark.util.ClosureCleaner$$anonfun$org$apache$spark$util$ClosureCleaner$$clean$14.apply(ClosureCleaner.scala:271)
              at scala.collection.immutable.List.foreach(List.scala:392)
              at org.apache.spark.util.ClosureCleaner$.org$apache$spark$util$ClosureCleaner$$clean(ClosureCleaner.scala:271)
              at org.apache.spark.util.ClosureCleaner$.clean(ClosureCleaner.scala:163)
              at org.apache.spark.SparkContext.clean(SparkContext.scala:2326)
              at org.apache.spark.SparkContext.runJob(SparkContext.scala:2100)
              at org.apache.spark.SparkContext.runJob(SparkContext.scala:2126)
              at org.apache.spark.rdd.RDD$$anonfun$collect$1.apply(RDD.scala:990)
              at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
              at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:112)
              at org.apache.spark.rdd.RDD.withScope(RDD.scala:385)
              at org.apache.spark.rdd.RDD.collect(RDD.scala:989)
              at org.apache.spark.api.java.JavaRDDLike$class.collect(JavaRDDLike.scala:361)
              at org.apache.spark.api.java.AbstractJavaRDDLike.collect(JavaRDDLike.scala:45)
              at org.apache.beam.runners.spark.translation.BoundedDataset.getBytes(BoundedDataset.java:79)
              at org.apache.beam.runners.spark.translation.SparkBatchPortablePipelineTranslator.broadcastSideInput(SparkBatchPortablePipelineTranslator.java:363)
              at org.apache.beam.runners.spark.translation.SparkBatchPortablePipelineTranslator.broadcastSideInputs(SparkBatchPortablePipelineTranslator.java:347)
              at org.apache.beam.runners.spark.translation.SparkBatchPortablePipelineTranslator.translateExecutableStage(SparkBatchPortablePipelineTranslator.java:225)
              at org.apache.beam.runners.spark.translation.SparkBatchPortablePipelineTranslator.translate(SparkBatchPortablePipelineTranslator.java:147)
              at org.apache.beam.runners.spark.SparkPipelineRunner.lambda$run$2(SparkPipelineRunner.java:196)
              at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515)
              at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:264)
              at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128)
              at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628)
              at java.base/java.lang.Thread.run(Thread.java:835)
      

              People

              • Assignee: Unassigned
              • Reporter: danoliveira (Daniel Oliveira)
              • Votes: 0
              • Watchers: 2
