Spark / SPARK-41585

The Spark exclude node functionality for YARN should work independently of dynamic allocation


Details

    • Type: Improvement
    • Status: Resolved
    • Priority: Minor
    • Resolution: Fixed
    • Affects Version/s: 3.0.3, 3.1.3, 3.2.2, 3.3.1
    • Fix Version/s: 3.5.0
    • Component/s: YARN
    • Labels: None

    Description

The exclude node functionality for Spark on YARN, introduced in SPARK-26688, allows users to specify a list of node names to be excluded from resource allocation. This is done via the configuration parameter spark.yarn.exclude.nodes.

The feature currently works only for executors allocated via dynamic allocation. To use the feature on Spark 3.3.1, for example, one may set spark.dynamicAllocation.enabled=true, spark.dynamicAllocation.minExecutors=0, and spark.executor.instances=0, so that Spark spawns executors exclusively via dynamic allocation.
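To illustrate, a minimal sketch of a submission that exercises the feature under the constraints described above. The node names and the application file are hypothetical placeholders, not values from this issue:

```shell
# Hypothetical hostnames; replace with the YARN NodeManager hosts to exclude.
# With minExecutors=0 and executor.instances=0, all executors are requested
# through dynamic allocation, which is the only path where the exclusion
# list is honored on Spark 3.3.1.
spark-submit \
  --master yarn \
  --conf spark.yarn.exclude.nodes="node1.example.com,node2.example.com" \
  --conf spark.dynamicAllocation.enabled=true \
  --conf spark.dynamicAllocation.minExecutors=0 \
  --conf spark.executor.instances=0 \
  my_app.py
```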

This issue proposes documenting this behavior for the current Spark release, and also proposes an improvement: extending the scope of the exclude node functionality for YARN beyond dynamic allocation, which would make it more generally useful.


People

    Assignee: lucacanali Luca Canali
    Reporter: lucacanali Luca Canali
    Votes: 0
    Watchers: 2
