Details
- Type: Bug
- Status: Resolved
- Priority: Major
- Resolution: Duplicate
- Affects Version/s: 0.9.0
- Fix Version/s: None
- Component/s: None
- Environment: OS X 10.9.2, Java 1.7.0_51, Scala 2.10.3
Description
Using the spark-0.9.0 Hadoop2 binary from the project download page, running spark-shell locally in its out-of-the-box configuration, and attempting to cache all of the attached data, Spark OOMs with: java.lang.OutOfMemoryError: GC overhead limit exceeded.
You can work around the issue by either decreasing spark.storage.memoryFraction or increasing SPARK_MEM.
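A minimal sketch of the first workaround, assuming a standalone local driver rather than the interactive spark-shell (where the context is already created at startup); the input path data.txt is a hypothetical stand-in for the attached data, and spark.storage.memoryFraction defaults to 0.6 in Spark 0.9:

```scala
import org.apache.spark.{SparkConf, SparkContext}

object CacheWorkaround {
  def main(args: Array[String]): Unit = {
    // Lower the fraction of the heap reserved for cached blocks so that
    // caching the dataset leaves more room for execution and GC headroom.
    val conf = new SparkConf()
      .setMaster("local")
      .setAppName("CacheWorkaround")
      .set("spark.storage.memoryFraction", "0.3") // default is 0.6

    val sc = new SparkContext(conf)

    // Hypothetical input file standing in for the attached data.
    val data = sc.textFile("data.txt")
    data.cache()

    // Force the RDD to be materialized and cached.
    println(data.count())

    sc.stop()
  }
}
```

The second workaround is to launch with a larger heap instead, e.g. starting the shell with something like `SPARK_MEM=4g ./bin/spark-shell` (exact value depends on the dataset size and available memory).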
Attachments
Issue Links
- duplicates SPARK-1777: Pass "cached" blocks directly to disk if memory is not large enough (Resolved)