Details

    • Type: Sub-task
    • Status: Resolved
    • Priority: Major
    • Resolution: Fixed
    • Affects Version/s: None
    • Fix Version/s: 2.0.1, 2.1.0
    • Component/s: Block Manager, Spark Core
    • Labels: None

    Description

      A simple fix is to erase the classTag when using the default serializer: it is not needed in that case, and it was failing to deserialize on the remote end.
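
      The erasure approach can be sketched as follows. This is a minimal illustration, not the actual patch; the method name and signature are hypothetical:

        ```scala
        import scala.reflect.ClassTag

        // Hypothetical sketch: with the default serializer the element
        // ClassTag is only informational, so it can be replaced with
        // ClassTag.Any before block metadata is shipped to a remote node.
        // This avoids deserializing a ClassTag whose class (e.g. a
        // REPL-defined case class) is not visible to the remote end's
        // default class loader.
        def tagForReplication[T](usingDefaultSerializer: Boolean, tag: ClassTag[T]): ClassTag[_] =
          if (usingDefaultSerializer) ClassTag.Any else tag
        ```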

      The proper fix is to use the right class loader when deserializing the classTags, but that is a much more invasive change for 2.0.
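
      The class-loader approach would look roughly like the standard `ObjectInputStream` override below, assuming the REPL class loader is available at deserialization time (the class name here is hypothetical):

        ```scala
        import java.io.{InputStream, ObjectInputStream, ObjectStreamClass}

        // Hypothetical sketch: an ObjectInputStream that resolves classes
        // through a caller-supplied loader (e.g. the REPL class loader)
        // rather than the JVM default, so REPL-defined classes can be
        // deserialized on executors. Falls back to the default resolution
        // for classes the supplied loader cannot find.
        class LoaderAwareObjectInputStream(in: InputStream, loader: ClassLoader)
            extends ObjectInputStream(in) {
          override def resolveClass(desc: ObjectStreamClass): Class[_] =
            try Class.forName(desc.getName, false, loader)
            catch { case _: ClassNotFoundException => super.resolveClass(desc) }
        }
        ```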

      The following test can be added to ReplSuite to reproduce the bug:

        test("replicating blocks of object with class defined in repl") {
          val output = runInterpreter("local-cluster[2,1,1024]",
            """
              |import org.apache.spark.storage.StorageLevel._
              |case class Foo(i: Int)
              |val ret = sc.parallelize((1 to 100).map(Foo), 10).persist(MEMORY_ONLY_2)
              |ret.count()
              |sc.getExecutorStorageStatus.map(s => s.rddBlocksById(ret.id).size).sum
            """.stripMargin)
          assertDoesNotContain("error:", output)
          assertDoesNotContain("Exception", output)
          assertContains(": Int = 20", output)
        }
      

    People

      Assignee: ekhliang Eric Liang
      Reporter: ekhliang Eric Liang
