Spark / SPARK-2946

Allow specifying * for --num-executors in YARN


Details

    • Type: Improvement
    • Status: Resolved
    • Priority: Minor
    • Resolution: Duplicate
    • Affects Version/s: 1.0.0
    • Fix Version/s: None
    • Component/s: Spark Core
    • Labels: None
    • Environment: Ubuntu precise, on YARN (CDH 5.1.0)

    Description

      It would be useful to allow specifying --num-executors * when submitting jobs to YARN, and to have Spark automatically determine how many total cores are available in the cluster by querying YARN.

      Our scenario is multiple users running research batch jobs. We never want to have a situation where cluster resources aren't being used, so ideally users would specify * and let YARN scheduling and preemption ensure fairness.
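
      As a rough illustration of the "query YARN" step described above (not part of the original report; the object and method names below are hypothetical rather than existing Spark code), a driver-side sketch using the standard Hadoop YarnClient API could sum the vcore capacity reported by the running NodeManagers and derive an executor count from spark.executor.cores:

      import org.apache.hadoop.yarn.api.records.NodeState
      import org.apache.hadoop.yarn.client.api.YarnClient
      import org.apache.hadoop.yarn.conf.YarnConfiguration

      import scala.collection.JavaConverters._

      // Hypothetical sketch, not Spark code: ask the ResourceManager how many
      // vcores the running NodeManagers report in total.
      object ClusterCapacity {
        def totalVirtualCores(): Int = {
          val yarnClient = YarnClient.createYarnClient()
          yarnClient.init(new YarnConfiguration())
          yarnClient.start()
          try {
            yarnClient.getNodeReports(NodeState.RUNNING).asScala
              .map(_.getCapability.getVirtualCores)
              .sum
          } finally {
            yarnClient.stop()
          }
        }

        // "--num-executors *" could then resolve to roughly the total cluster
        // vcores divided by spark.executor.cores, leaving YARN's scheduling and
        // preemption to enforce fairness between concurrent jobs.
        def executorsForWholeCluster(executorCores: Int): Int =
          math.max(1, totalVirtualCores() / executorCores)
      }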


            People

              Assignee: Unassigned
              Reporter: Shay Rojansky (roji)
              Votes: 0
              Watchers: 2

              Dates

                Created:
                Updated:
                Resolved: