Details
- Type: Bug
- Status: Resolved
- Priority: Minor
- Resolution: Duplicate
- Affects Version/s: 0.6.0
- Fix Version/s: None
- Component/s: None
- Labels: None
Description
When I use the 0.6 AMI, start the Spark shell, and try to load a dataset from S3, I get an out-of-memory error. It worked after I reduced the SPARK_MEM setting in the config.
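The workaround amounts to lowering the JVM heap cap in Spark's environment config. A minimal sketch, assuming the standard 0.6-era `conf/spark-env.sh` file and `SPARK_MEM` variable; the value shown is illustrative, not the one from this report:

```shell
# conf/spark-env.sh -- illustrative value, not from the report.
# SPARK_MEM caps the heap of Spark's JVM processes. On an EC2 instance,
# leaving headroom below physical RAM avoids out-of-memory failures when
# the JVM fork/execs child processes (the condition tracked in SPARK-671).
export SPARK_MEM=4g
```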
Issue Links
- duplicates SPARK-671 "Spark runs out of memory on fork/exec (affects both pipes and python)" (Resolved)