Details
- Type: New Feature
- Status: Resolved
- Priority: Minor
- Resolution: Won't Fix
- Affects Version/s: 1.0.0
- Fix Version/s: None
- Component/s: None
Description
Per the discussion here:
I would like to be able to use spark-ec2 to launch new slaves and add them to an existing, running cluster. Similarly, I would like to be able to remove slaves from an existing cluster.
Use cases include:
- Oh snap, I sized my cluster incorrectly. Let me add/remove some slaves.
- During scheduled batch processing, I want to add some new slaves, perhaps on spot instances. When that processing is done, I want to kill them. (Cruel, I know.)
I gather this is not possible at the moment. spark-ec2 appears to be able to launch new slaves for an existing cluster only when the master is stopped, and I do not see any way to remove slaves from a cluster.
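For context, spark-ec2 discovers a cluster's nodes by EC2 security group name (`<cluster-name>-master` and `<cluster-name>-slaves`), which is why instances launched outside the script can go unnoticed (the situation in SPARK-3213). A minimal sketch of that group-based discovery, using hypothetical instance records in place of boto instance objects:

```python
# Sketch of spark-ec2-style node discovery: cluster membership is
# inferred from EC2 security group names. The dict records below are
# hypothetical stand-ins for boto instance objects.

def cluster_nodes(instances, cluster_name):
    """Partition instances into (masters, slaves) by security group."""
    masters, slaves = [], []
    for inst in instances:
        groups = set(inst["security_groups"])
        if cluster_name + "-master" in groups:
            masters.append(inst)
        elif cluster_name + "-slaves" in groups:
            slaves.append(inst)
    return masters, slaves

instances = [
    {"id": "i-1", "security_groups": ["demo-master"]},
    {"id": "i-2", "security_groups": ["demo-slaves"]},
    {"id": "i-3", "security_groups": ["demo-slaves"]},
    {"id": "i-4", "security_groups": ["other-slaves"]},  # different cluster
]
masters, slaves = cluster_nodes(instances, "demo")
```

Under this scheme, a manual workaround for growing a cluster is to launch instances directly into the existing `<cluster-name>-slaves` security group, so that a subsequent spark-ec2 run can find them; shrinking the cluster has no such workaround short of terminating instances by hand.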
Issue Links
- is blocked by SPARK-5189: Reorganize EC2 scripts so that nodes can be provisioned independent of Spark master (Resolved)
- relates to SPARK-3213: spark_ec2.py cannot find slave instances launched with "Launch More Like This" (Resolved)