SPARK-10660: Documentation error in the "Running Spark on YARN" page

    Details

    • Type: Documentation
    • Status: Resolved
    • Priority: Trivial
    • Resolution: Fixed
    • Affects Version/s: 1.4.0, 1.4.1, 1.5.0
    • Fix Version/s: 1.4.2, 1.5.1, 1.6.0
    • Component/s: Documentation
    • Labels: None

      Description

      In the Configuration section, the default values of spark.yarn.driver.memoryOverhead and spark.yarn.am.memoryOverhead should be "driverMemory * 0.10, with minimum of 384" and "AM memory * 0.10, with minimum of 384" respectively, because as of Spark 1.4.0 the MEMORY_OVERHEAD_FACTOR is set to 0.10, not 0.07.
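
      For reference, a minimal Scala sketch of how that default is derived from Spark 1.4.0 onward. The constant names mirror those in Spark's YarnSparkHadoopUtil; the defaultOverhead wrapper here is illustrative, not Spark's actual API:

      {code:scala}
      object MemoryOverheadSketch {
        // As of Spark 1.4.0 the factor is 0.10 (previously 0.07).
        val MEMORY_OVERHEAD_FACTOR = 0.10
        // Hard floor on the overhead, in MiB.
        val MEMORY_OVERHEAD_MIN = 384

        // Default off-heap overhead for a container (driver or AM),
        // given its requested memory in MiB.
        def defaultOverhead(containerMemoryMiB: Int): Int =
          math.max((MEMORY_OVERHEAD_FACTOR * containerMemoryMiB).toInt, MEMORY_OVERHEAD_MIN)
      }

      // e.g. a 4096 MiB driver gets max(409, 384) = 409 MiB of overhead,
      // while a 2048 MiB driver falls back to the 384 MiB minimum.
      {code}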



            People

            • Assignee: 397090770 wyp
            • Reporter: 397090770 wyp
            • Votes: 0
            • Watchers: 2
