SPARK-13007

Document where configuration / properties are read and applied

    Details

    • Type: Improvement
    • Status: Open
    • Priority: Major
    • Resolution: Unresolved
    • Affects Version/s: 1.6.0
    • Fix Version/s: None
    • Component/s: Documentation
    • Labels: None

      Description

      While Spark is well documented for the most part, I often have trouble determining where a configuration applies.

      For example, when setting spark.dynamicAllocation.enabled, does it always apply to the entire cluster manager, or is it possible to configure it on a per-job level?
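
      For instance, here is a minimal sketch of setting the property per application through SparkConf, assuming it is honored at that level (which is exactly the open question); the application name is made up for illustration:

        import org.apache.spark.{SparkConf, SparkContext}

        // Hypothetical per-application configuration (Spark 1.6 API).
        val conf = new SparkConf()
          .setAppName("dynamic-allocation-example")
          .set("spark.dynamicAllocation.enabled", "true")
          // Dynamic allocation also requires the external shuffle service;
          // the service itself must additionally be running on each worker.
          .set("spark.shuffle.service.enabled", "true")

        val sc = new SparkContext(conf)

      The same property could instead be set cluster-wide in conf/spark-defaults.conf, which is part of what makes the scope ambiguous.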

      Different levels I can think of:
      • Application
      • Driver
      • Executor
      • Worker
      • Cluster

      And I'm sure there are more. This could be just another column in the configuration page.
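
      As a rough illustration of what such a column might look like (the scope values below are assumptions for the sake of the example, not documented behavior):

        Property                           Default   Scope
        ---------------------------------------------------------
        spark.dynamicAllocation.enabled    false     Application
        spark.shuffle.service.enabled      false     Worker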

            People

            • Assignee: Unassigned
            • Reporter: Alan Braithwaite (abraithwaite)
            • Votes: 2
            • Watchers: 4
