SPARK-22727

spark.executor.instances's default value should be 2


Details

    • Type: Improvement
    • Status: Resolved
    • Priority: Minor
    • Resolution: Not A Problem
    • Affects Version/s: 2.2.0
    • Fix Version/s: None
    • Component/s: Spark Core, YARN
    • Labels: None

    Description

      The default value of spark.executor.instances documented in running-on-yarn.md is 2, but in ExecutorAllocationManager.scala and org.apache.spark.util.Utils.scala it is read with a default value of 0. I think the default used in the code should also be 2, so that applications start with executors at initialization.
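
      As a rough illustration of the mismatch (a minimal sketch using only the public SparkConf API; the object name is hypothetical, and the two fallback values are the ones compared in the description above):

          import org.apache.spark.SparkConf

          // Sketch of the mismatch reported in this issue: running-on-yarn.md documents
          // a default of 2 for spark.executor.instances, while the code paths named in
          // the description fall back to 0 when the property is unset.
          object ExecutorInstancesDefaultCheck {
            def main(args: Array[String]): Unit = {
              val conf = new SparkConf(false)  // empty conf; property left unset

              // Fallback of 0, as in the code paths cited in this issue.
              val codeDefault = conf.getInt("spark.executor.instances", 0)

              // Fallback of 2, matching the documented default in running-on-yarn.md.
              val docsDefault = conf.getInt("spark.executor.instances", 2)

              println(s"code default = $codeDefault, documented default = $docsDefault")
            }
          }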

      Attachments

        Activity

          People

            Assignee: Unassigned
            Reporter: liuzhaokun
            Votes: 0
            Watchers: 2

            Dates

              Created:
              Updated:
              Resolved: