
SPARK-10660: Doc description error in the "Running Spark on YARN" page


Details

    • Type: Documentation
    • Status: Resolved
    • Priority: Trivial
    • Resolution: Fixed
    • Affects Version/s: 1.4.0, 1.4.1, 1.5.0
    • Fix Version/s: 1.4.2, 1.5.1, 1.6.0
    • Component/s: Documentation
    • Labels: None

    Description

      In the Configuration section, the default values of spark.yarn.driver.memoryOverhead and spark.yarn.am.memoryOverhead should be "driverMemory * 0.10, with minimum of 384" and "AM memory * 0.10, with minimum of 384" respectively, because as of Spark 1.4.0 the MEMORY_OVERHEAD_FACTOR is set to 0.10, not 0.07.
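      The sketch below illustrates the corrected formula in Scala (Spark's own language). The constant names mirror MEMORY_OVERHEAD_FACTOR and MEMORY_OVERHEAD_MIN from Spark's YARN client code as of 1.4.0, but the enclosing object and the overhead helper are illustrative, not the exact upstream implementation.

        object MemoryOverhead {
          // 0.10 as of Spark 1.4.0; it was 0.07 in earlier releases.
          val MEMORY_OVERHEAD_FACTOR = 0.10
          // Minimum overhead in MiB.
          val MEMORY_OVERHEAD_MIN = 384

          // Overhead for a container with `memoryMiB` MiB of heap:
          // max(memoryMiB * 0.10, 384).
          def overhead(memoryMiB: Int): Int =
            math.max((MEMORY_OVERHEAD_FACTOR * memoryMiB).toInt, MEMORY_OVERHEAD_MIN)

          def main(args: Array[String]): Unit = {
            println(overhead(1024)) // 384: 10% of 1024 is 102, below the 384 floor
            println(overhead(8192)) // 819: 10% of 8192
          }
        }

      For example, a driver with 8192 MiB of heap gets max(8192 * 0.10, 384) = 819 MiB of overhead, whereas under the old 0.07 factor it would have been max(8192 * 0.07, 384) = 573 MiB.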

    People

      Assignee: 397090770 iteblog
      Reporter: 397090770 iteblog
      Votes: 0
      Watchers: 2
