SPARK-5102: CompressedMapStatus needs to be registered with Kryo


Details

    • Type: Bug
    • Status: Resolved
    • Priority: Minor
    • Resolution: Fixed
    • Affects Version/s: 1.2.0
    • Fix Version/s: 1.2.1, 1.3.0
    • Component/s: None
    • Labels: None

    Description

      After upgrading from Spark 1.1.0 to 1.2.0 I got this exception:

      Caused by: org.apache.spark.SparkException: Job aborted due to stage failure: Task 0 in stage 0.0 failed 1 times, most recent failure: Lost task 0.0 in stage 0.0 (TID 0, localhost): java.lang.IllegalArgumentException: Class is not registered: org.apache.spark.scheduler.CompressedMapStatus
      Note: To register this class use: kryo.register(org.apache.spark.scheduler.CompressedMapStatus.class);
      	at com.esotericsoftware.kryo.Kryo.getRegistration(Kryo.java:442)
      	at com.esotericsoftware.kryo.util.DefaultClassResolver.writeClass(DefaultClassResolver.java:79)
      	at com.esotericsoftware.kryo.Kryo.writeClass(Kryo.java:472)
      	at com.esotericsoftware.kryo.Kryo.writeClassAndObject(Kryo.java:565)
      	at org.apache.spark.serializer.KryoSerializerInstance.serialize(KryoSerializer.scala:165)
      	at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:206)
      	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
      	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
      	at java.lang.Thread.run(Thread.java:745)
      

      I had to register org.apache.spark.scheduler.CompressedMapStatus with Kryo. I think this should be done in spark/serializer/KryoSerializer.scala, unless instances of this class are not expected to be sent over the wire. (Maybe I'm doing something wrong?)
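      The exception comes from Kryo's registration check: Kryo.getRegistration only throws "Class is not registered" when registration is required, so this presumably surfaces with spark.kryo.registrationRequired=true. Until Spark registers the class itself, a minimal workaround sketch is a custom KryoRegistrator, assuming Spark 1.2.0; the registrator name MapStatusRegistrator is made up for illustration, and the class is looked up by name because CompressedMapStatus is private[spark]:

      import com.esotericsoftware.kryo.Kryo
      import org.apache.spark.SparkConf
      import org.apache.spark.serializer.KryoRegistrator

      // Hypothetical workaround registrator: registers the internal
      // scheduler class that Kryo complains about above.
      class MapStatusRegistrator extends KryoRegistrator {
        override def registerClasses(kryo: Kryo): Unit = {
          // CompressedMapStatus is private[spark], so register it via Class.forName.
          kryo.register(Class.forName("org.apache.spark.scheduler.CompressedMapStatus"))
        }
      }

      val conf = new SparkConf()
        .set("spark.serializer", "org.apache.spark.serializer.KryoSerializer")
        .set("spark.kryo.registrator", "MapStatusRegistrator")

      Alternatively, Spark 1.2.0 added the spark.kryo.classesToRegister setting, which takes a comma-separated list of class names and should accept org.apache.spark.scheduler.CompressedMapStatus directly, without a custom registrator.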


People

    Assignee: Lianhui Wang (lianhuiwang)
    Reporter: Daniel Darabos (darabos)
    Votes: 0
    Watchers: 4
