[SPARK-11130] TestHive fails on machines with few cores


Details

    • Type: Bug
    • Status: Resolved
    • Priority: Minor
    • Resolution: Duplicate
    • Affects Version/s: 1.5.0, 1.6.0
    • Fix Version/s: None
    • Component/s: SQL
    • Labels: None

    Description

      Filing so it doesn't get lost (again).

      TestHive.scala has this code:

          new SparkContext(
            System.getProperty("spark.sql.test.master", "local[32]"),
            ...

      On machines with fewer cores, that causes many tests to fail with "unable to allocate memory" errors, because the default page size calculation seems to be based on the machine's core count, and not on the core count specified for the SparkContext. With local[32], up to 32 tasks run concurrently and each gets roughly 1/32 of the execution memory, while the default page, sized as if only a few tasks were running, can exceed that per-task share.
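
      For reference, a minimal sketch of a workaround, assuming the "spark.buffer.pageSize" setting from the 1.5/1.6-era unsafe memory code overrides the computed default (the 4m value is illustrative, not the committed fix):

          import org.apache.spark.{SparkConf, SparkContext}

          // Pin the page size explicitly so it no longer depends on the
          // machine's core count. With local[32], up to 32 concurrent tasks
          // share execution memory; a fixed small page keeps any single
          // allocation within a task's share even on a few-core machine.
          val conf = new SparkConf()
            .set("spark.buffer.pageSize", "4m") // assumed config key; bypasses the core-count-based default
          val sc = new SparkContext(
            System.getProperty("spark.sql.test.master", "local[32]"),
            "TestHive",
            conf)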


People

    Assignee: Unassigned
    Reporter: Marcelo Masiero Vanzin (vanzin)
    Votes: 0
    Watchers: 2
