Spark / SPARK-1392

Local spark-shell Runs Out of Memory With Default Settings


Details

    • Type: Bug
    • Status: Resolved
    • Priority: Major
    • Resolution: Duplicate
    • Affects Version/s: 0.9.0
    • Fix Version/s: None
    • Component/s: Spark Core
    • Labels: None
    • Environment: OS X 10.9.2, Java 1.7.0_51, Scala 2.10.3

    Description

      Using the spark-0.9.0 Hadoop2 binary from the project download page, running spark-shell locally with the out-of-the-box configuration, and attempting to cache all the attached data, Spark OOMs with: java.lang.OutOfMemoryError: GC overhead limit exceeded

      You can work around the issue by either decreasing spark.storage.memoryFraction or increasing SPARK_MEM; see the sketch below.
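
      Since spark-shell creates its SparkContext at startup, both settings have to be supplied before the shell is launched, e.g. SPARK_MEM=2g ./bin/spark-shell, or SPARK_JAVA_OPTS="-Dspark.storage.memoryFraction=0.5" ./bin/spark-shell. In a standalone application the same property can be set on the SparkConf before the context is created. A minimal sketch against the 0.9.0 API (the local master, app name, and 0.5 fraction are illustrative assumptions, not tuned recommendations):

        import org.apache.spark.{SparkConf, SparkContext}

        object CacheWorkaround {
          def main(args: Array[String]): Unit = {
            val conf = new SparkConf()
              .setMaster("local")
              .setAppName("CacheWorkaround")
              // Shrink the block store below the 0.9.0 default of 0.6 so more
              // of the heap is left for task execution and GC headroom.
              .set("spark.storage.memoryFraction", "0.5")
            val sc = new SparkContext(conf)
            // ... load and cache the data here ...
            sc.stop()
          }
        }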

    Attachments

    Issue Links


    People

    • Assignee: Unassigned
    • Reporter: Pat McDonough (cheffpj)
    • Votes: 0
    • Watchers: 3
