Affects Version/s: 1.6.0
Fix Version/s: None
While Spark is well documented for the most part, I often have trouble determining at what level a configuration applies.
For example, when setting spark.dynamicAllocation.enabled, does it always apply to the entire cluster manager, or can it be configured on a per-job level?
Different levels I can think of:
And I'm sure there are more. This could simply be another column on the configuration page.
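To make the ambiguity concrete, here is a minimal sketch (my own illustration, not from the docs) of the different places a user might set the same property, assuming a 1.6-era deployment where the external shuffle service is available; the app name is a placeholder. It is not obvious from the configuration page alone at which of these levels the setting is actually honored.

{code:scala}
import org.apache.spark.{SparkConf, SparkContext}

object ConfigLevelExample {
  def main(args: Array[String]): Unit = {
    // Per-application, programmatic: set on the SparkConf before the context starts.
    // Dynamic allocation also expects the external shuffle service on the workers.
    val conf = new SparkConf()
      .setAppName("config-level-example") // placeholder name
      .set("spark.dynamicAllocation.enabled", "true")
      .set("spark.shuffle.service.enabled", "true")

    val sc = new SparkContext(conf)
    // ... run jobs ...
    sc.stop()
  }
}

// The same property can also be supplied per submission:
//   spark-submit --conf spark.dynamicAllocation.enabled=true ...
// or cluster-wide in conf/spark-defaults.conf:
//   spark.dynamicAllocation.enabled   true
{code}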