SPARK-20580: Allow RDD cache with unserializable objects

Details

    • Type: Improvement
    • Status: Resolved
    • Priority: Minor
    • Resolution: Not A Problem
    • Affects Version/s: 1.3.0
    • Fix Version/s: None
    • Component/s: Spark Core
    • Labels: None

    Description

      In my current scenario we load complex Python objects on the worker nodes that are not fully serializable. We then apply map operations to the RDD, which at some point we collect. In this basic usage everything works well.

      However, if we cache() the RDD (which defaults to in-memory storage), the transformations after the caching step suddenly fail to execute. Apparently caching serializes the RDD data and deserializes it whenever further transformations are required.

      It would be nice to avoid serialization of the objects when they are cached to memory, and keep the original objects instead.
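      The failure mode described above can be reproduced without Spark: PySpark stores Python RDD partitions in pickled form, so any object that `pickle` rejects will break at the cache/persist boundary even though plain in-memory use works fine. A minimal sketch (the `Handle` class and its lock attribute are hypothetical stand-ins for the "complex Python objects" mentioned in the report):

```python
import pickle
import threading

class Handle:
    """Stand-in for a complex worker-side object holding an
    unpicklable resource (here, a thread lock)."""
    def __init__(self):
        self.lock = threading.Lock()  # _thread.lock cannot be pickled

h = Handle()

# Direct in-memory use works: no serialization is involved.
assert h.lock.acquire(blocking=False)
h.lock.release()

# Caching forces serialization, and that is where it breaks.
try:
    pickle.dumps(h)
    serializable = True
except TypeError:
    # pickle raises TypeError: cannot pickle '_thread.lock' object
    serializable = False

print(serializable)
```

      In Scala/Java Spark, `persist(StorageLevel.MEMORY_ONLY)` does keep deserialized JVM objects on the heap, but in PySpark the objects live in Python worker processes and must be pickled to be handed to the JVM block store, which is why an unserialized in-memory cache is not available there.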

      Attachments

        Activity

          People

            Assignee: Unassigned
            Reporter: Fernando Pereira (ferdonline)
            Votes: 0
            Watchers: 1

            Dates

              Created:
              Updated:
              Resolved: