Spark / SPARK-9621

Closure inside RDD doesn't properly close over environment


Details

    • Type: Bug
    • Status: Resolved
    • Priority: Major
    • Resolution: Duplicate
    • Affects Version/s: 1.4.1
    • Fix Version/s: None
    • Component/s: None
    • Labels: None
    • Environment: Ubuntu 15.04, spark-1.4.1-bin-hadoop2.6 package

    Description

      I expect the following:

      case class MyTest(i: Int)
      val tv = MyTest(1)
      val res = sc.parallelize(Array((t: MyTest) => t == tv)).first()(tv)

      to be "true." Instead it is "false" when I type this into spark-shell. It seems the closure is changed somehow when it is serialized and deserialized.
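      A hedged illustration of the serialize/deserialize cycle the report points at: Spark ships closures to executors via Java serialization, so the snippet below replays that round trip in plain compiled Scala (the names `ClosureRoundTrip` and `roundTrip` are mine, not from the issue). Outside the REPL the round-tripped closure still returns true, which suggests the failure is specific to how spark-shell compiles REPL-defined case classes rather than to serialization itself.

```scala
import java.io._

case class MyTest(i: Int)

object ClosureRoundTrip {
  // Serialize a value and read it back, as Spark does when it ships a
  // closure from the driver to an executor.
  def roundTrip[T](v: T): T = {
    val buf = new ByteArrayOutputStream()
    val out = new ObjectOutputStream(buf)
    out.writeObject(v)
    out.close()
    val in = new ObjectInputStream(new ByteArrayInputStream(buf.toByteArray))
    try in.readObject().asInstanceOf[T] finally in.close()
  }

  def main(args: Array[String]): Unit = {
    val tv = MyTest(1)
    val f = (t: MyTest) => t == tv // closure capturing a case class instance
    val g = roundTrip(f)           // simulate the driver-to-executor hop
    println(g(tv))                 // prints "true" in compiled code
  }
}
```

      In compiled code the deserialized closure captures the same `MyTest` class, so case-class equality holds; the REPL wraps each line's definitions in freshly generated classes, which is a plausible reason the same comparison yields false there.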


              People

                Assignee: Unassigned
                Reporter: Joe Near (jnear)
                Votes: 0
                Watchers: 4
