SPARK-5112: Expose SizeEstimator as a developer API


Details

    • Type: Improvement
    • Status: Resolved
    • Priority: Minor
    • Resolution: Fixed
    • Affects Version/s: None
    • Fix Version/s: 1.4.0
    • Component/s: Spark Core
    • Labels: None

    Description

      "The best way to size the amount of memory consumption your dataset will require is to create an RDD, put it into cache, and look at the SparkContext logs on your driver program. The logs will tell you how much memory each partition is consuming, which you can aggregate to get the total size of the RDD."
      - the Tuning Spark page

      This is a pain. It would be much nicer to simply expose functionality for understanding the memory footprint of a Java object.
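
      A minimal sketch of what calling such an estimator could look like, assuming the fix surfaces the existing internal class as org.apache.spark.util.SizeEstimator with an estimate method:

          import org.apache.spark.util.SizeEstimator

          object SizeEstimatorExample {
            def main(args: Array[String]): Unit = {
              // Build a sample object graph whose footprint we want to measure.
              val data = Array.fill(1000)("sample string " * 4)

              // Estimate the in-memory size in bytes, without having to cache
              // an RDD and read per-partition sizes out of the driver logs.
              val bytes: Long = SizeEstimator.estimate(data)
              println(s"Estimated footprint: $bytes bytes")
            }
          }

      Since the estimator walks the object graph reflectively, the result reflects JVM heap usage rather than serialized size.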


          People

            Assignee: Sandy Ryza (sandyr)
            Reporter: Sandy Ryza (sandyr)
            Votes: 0
            Watchers: 2
