Details

    • Type: Sub-task
    • Status: In Progress
    • Priority: Major
    • Resolution: Unresolved
    • Affects Version/s: None
    • Fix Version/s: None
    • Component/s: Spark Core
    • Labels: None

Description

Using https://github.com/JoshRosen/spark/tree/build-for-2.12, I tried running ClosureCleanerSuite with Scala 2.12 and ran into two serious test failures:

      [info] - toplevel return statements in closures are identified at cleaning time *** FAILED *** (32 milliseconds)
      [info]   Expected exception org.apache.spark.util.ReturnStatementInClosureException to be thrown, but no exception was thrown. (ClosureCleanerSuite.scala:57)
      

and

      [info] - user provided closures are actually cleaned *** FAILED *** (56 milliseconds)
      [info]   Expected ReturnStatementInClosureException, but got org.apache.spark.SparkException: Job aborted due to stage failure: Task not serializable: java.io.NotSerializableException: java.lang.Object
      [info]   	- element of array (index: 0)
      [info]   	- array (class "[Ljava.lang.Object;", size: 1)
      [info]   	- field (class "java.lang.invoke.SerializedLambda", name: "capturedArgs", type: "class [Ljava.lang.Object;")
      [info]   	- object (class "java.lang.invoke.SerializedLambda", SerializedLambda[capturingClass=class org.apache.spark.util.TestUserClosuresActuallyCleaned$, functionalInterfaceMethod=scala/runtime/java8/JFunction1$mcII$sp.apply$mcII$sp:(I)I, implementation=invokeStatic org/apache/spark/util/TestUserClosuresActuallyCleaned$.org$apache$spark$util$TestUserClosuresActuallyCleaned$$$anonfun$69:(Ljava/lang/Object;I)I, instantiatedMethodType=(I)I, numCaptured=1])
      [info]   	- element of array (index: 0)
      [info]   	- array (class "[Ljava.lang.Object;", size: 1)
      [info]   	- field (class "java.lang.invoke.SerializedLambda", name: "capturedArgs", type: "class [Ljava.lang.Object;")
      [info]   	- object (class "java.lang.invoke.SerializedLambda", SerializedLambda[capturingClass=class org.apache.spark.rdd.RDD, functionalInterfaceMethod=scala/Function3.apply:(Ljava/lang/Object;Ljava/lang/Object;Ljava/lang/Object;)Ljava/lang/Object;, implementation=invokeStatic org/apache/spark/rdd/RDD.org$apache$spark$rdd$RDD$$$anonfun$20$adapted:(Lscala/Function1;Lorg/apache/spark/TaskContext;Ljava/lang/Object;Lscala/collection/Iterator;)Lscala/collection/Iterator;, instantiatedMethodType=(Lorg/apache/spark/TaskContext;Ljava/lang/Object;Lscala/collection/Iterator;)Lscala/collection/Iterator;, numCaptured=1])
      [info]   	- field (class "org.apache.spark.rdd.MapPartitionsRDD", name: "f", type: "interface scala.Function3")
      [info]   	- object (class "org.apache.spark.rdd.MapPartitionsRDD", MapPartitionsRDD[2] at apply at Transformer.scala:22)
      [info]   	- field (class "scala.Tuple2", name: "_1", type: "class java.lang.Object")
      [info]   	- root object (class "scala.Tuple2", (MapPartitionsRDD[2] at apply at Transformer.scala:22,org.apache.spark.SparkContext$$Lambda$957/431842435@6e803685)).
      [info]   This means the closure provided by user is not actually cleaned. (ClosureCleanerSuite.scala:78)
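
For reference, the first test exercises a closure shaped roughly like the one below. This is a hedged reconstruction, not the exact suite code; the object and method names are stand-ins. A top-level `return` inside a closure compiles to a throw of scala.runtime.NonLocalReturnControl, and on 2.11 the closure is an anonymous inner class, so the cleaner's bytecode scan of that class spots the pattern and raises ReturnStatementInClosureException. On 2.12 the closure body is lifted into a static method on the capturing class and wired up via invokedynamic, so a scan of the lambda object's own class finds nothing, which matches the "no exception was thrown" failure above.

      import org.apache.spark.SparkContext

      object BogusReturnRepro {
        // Hypothetical reproduction: the `return` below is a nonlocal return
        // out of run(), implemented by throwing NonLocalReturnControl from
        // inside the task. The cleaner is supposed to reject this at cleaning
        // time instead of letting it escape on the executors.
        def run(sc: SparkContext): Int = {
          sc.parallelize(1 to 4).map { x =>
            if (x == 3) return 1  // expected: ReturnStatementInClosureException
            x * 2
          }.count().toInt
        }
      }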
      

We'll need to figure out a closure-cleaning strategy that works for 2.12 lambdas.
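
One plausible direction, sketched below under assumptions rather than as a committed design: serializable 2.12 lambdas define a synthetic writeReplace method that returns the java.lang.invoke.SerializedLambda visible in the trace above, and that proxy exposes the capturing class, the static implementation method, and the captured arguments. A 2.12-aware cleaner could start from that proxy instead of looking for $outer fields on an anonymous class. The helper name serializationProxy is hypothetical.

      import java.lang.invoke.SerializedLambda

      object SerializedLambdaProbe {
        // Hypothetical helper: unwrap a closure's serialization proxy, if any.
        // LambdaMetafactory-generated serializable lambdas declare a private
        // writeReplace method returning a SerializedLambda -- the same object
        // whose capturedArgs field shows up in the NotSerializableException
        // trace above.
        def serializationProxy(closure: AnyRef): Option[SerializedLambda] =
          try {
            val m = closure.getClass.getDeclaredMethod("writeReplace")
            m.setAccessible(true)
            m.invoke(closure) match {
              case sl: SerializedLambda => Some(sl)
              case _                    => None
            }
          } catch {
            case _: NoSuchMethodException => None  // not an indylambda closure
          }

        def main(args: Array[String]): Unit = {
          // A lambda capturing a non-serializable Object, mirroring the
          // second failure: on 2.12 the proxy reports one captured argument.
          val outer = new Object
          val f = (x: Int) => x + outer.hashCode
          serializationProxy(f).foreach { sl =>
            println(sl.getImplMethodName)    // compiler-generated $anonfun name
            println(sl.getCapturedArgCount)  // 1 -- the captured java.lang.Object
          }
        }
      }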

People

    • Assignee: Unassigned
    • Reporter: Josh Rosen (joshrosen)
    • Votes: 9
    • Watchers: 27
