Details
- Type: Improvement
- Status: Resolved
- Priority: Minor
- Resolution: Duplicate
- Affects Version/s: 0.6.2
- Fix Version/s: None
- Component/s: None
- Labels: None
Description
Currently, SPARK_CONF_DIR is unconditionally overridden in spark-config.sh, and start-slaves.sh does not let the user pass in a -d option to set the work directory. Allowing both is a small change and makes it possible to run multiple clusters at once.
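The two changes could be sketched as below. This is only an illustration of the requested behavior, not the actual Spark scripts: the default path `/opt/spark` and the variable name `SPARK_WORKER_DIR` are assumptions.

```shell
#!/usr/bin/env sh
# Sketch of the proposed behavior (paths and variable names are assumptions).

# 1) In spark-config.sh: respect a SPARK_CONF_DIR already set in the
#    environment instead of overriding it unconditionally.
SPARK_HOME="${SPARK_HOME:-/opt/spark}"
SPARK_CONF_DIR="${SPARK_CONF_DIR:-$SPARK_HOME/conf}"

# 2) In start-slaves.sh: accept a -d option to set the worker work directory.
SPARK_WORKER_DIR=""
while getopts "d:" opt; do
  case "$opt" in
    d) SPARK_WORKER_DIR="$OPTARG" ;;
    *) echo "usage: $0 [-d workdir]" >&2; exit 1 ;;
  esac
done

echo "conf dir: $SPARK_CONF_DIR"
echo "work dir: ${SPARK_WORKER_DIR:-<default>}"
```

With both changes, two clusters could coexist on the same machines, e.g. `SPARK_CONF_DIR=/etc/cluster-a/conf start-slaves.sh -d /data/cluster-a/work` for one cluster and a different conf/work pair for the other.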
Attachments
Issue Links
- duplicates: SPARK-4616 "SPARK_CONF_DIR is not effective in spark-submit" (Resolved)
- relates to: SPARK-3620 "Refactor config option handling code for spark-submit" (Resolved)