Details
Type: Bug
Status: Resolved
Priority: Minor
Resolution: Duplicate
Affects Version/s: 1.5.0, 1.6.0
Fix Version/s: None
Component/s: None
Description
Filing so it doesn't get lost (again).
TestHive.scala has this code:
new SparkContext( System.getProperty("spark.sql.test.master", "local[32]"),
On machines with fewer cores, this causes many tests to fail with "unable to allocate memory" errors, because the default page size calculation seems to be based on the machine's core count rather than on the core count specified for the SparkContext.
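For illustration, a rough, self-contained sketch of the suspected interaction (not Spark's actual code): the memory size, safety factor, clamping bounds, and fair-share floor below are all assumptions, chosen only to show how a page size derived from the machine's cores can exceed what each of 32 concurrent tasks can reliably acquire.

// Hypothetical sketch, not Spark's implementation: the default page size is
// derived from the cores the JVM reports, while the number of concurrent task
// slots comes from the master string ("local[32]" in TestHive.scala).
object PageSizeMismatchSketch {

  /** Round up to the next power of two (n must be positive). */
  def nextPowerOfTwo(n: Long): Long = {
    val highest = java.lang.Long.highestOneBit(n)
    if (highest == n) n else highest << 1
  }

  /** Simplified default-page-size heuristic (assumed): memory / cores / safetyFactor,
    * rounded up to a power of two and clamped to [1 MiB, 64 MiB]. */
  def defaultPageSize(maxMemoryBytes: Long, cores: Int): Long = {
    val safetyFactor = 16
    val size = nextPowerOfTwo(math.max(1L, maxMemoryBytes / cores / safetyFactor))
    math.min(64L << 20, math.max(1L << 20, size))
  }

  def main(args: Array[String]): Unit = {
    val maxMemory = 1L << 30      // assume 1 GiB of execution memory
    val machineCores = 4          // a small build machine
    val taskSlots = 32            // from "local[32]"

    val pageSize = defaultPageSize(maxMemory, machineCores)       // based on machine cores
    val slotAwarePageSize = defaultPageSize(maxMemory, taskSlots) // what local[32] would imply
    val minPerTaskShare = maxMemory / (2L * taskSlots)            // assumed fair-share floor per task

    println(s"page size based on machine cores : ${pageSize >> 20} MiB")
    println(s"page size based on task slots    : ${slotAwarePageSize >> 20} MiB")
    println(s"guaranteed share per task (of 32): ${minPerTaskShare >> 20} MiB")
    // With pages sized for 4 cores but 32 tasks competing for the same memory,
    // a task may not be able to acquire more than a page or so, which would
    // surface as "unable to allocate memory" failures in the SQL tests.
  }
}

As the quoted snippet suggests, the master used by TestHive can be overridden through the spark.sql.test.master system property, which may work around the failures on small machines until the page size calculation is fixed.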
Attachments
Issue Links
duplicates: SPARK-11251 Page size calculation is wrong in local mode (Resolved)