Details
- Type: Improvement
- Status: Resolved
- Priority: Major
- Resolution: Duplicate
Description
Currently the only supported cluster schedulers are standalone, Mesos, YARN, and SIMR. If users want to build a new one, it must be merged back into the main codebase, which might not be desirable for Spark and makes iteration difficult.
Instead, we should make a plugin architecture possible, so that when users want to integrate a new scheduler it can be plugged in via configuration and runtime loading instead.
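A minimal sketch of the configuration-plus-runtime-loading idea, in Java. The interface and class names (`SchedulerBackend`, `StandaloneBackend`, the `spark.scheduler.backend.class` key) are hypothetical illustrations, not Spark's actual API:

```java
import java.util.Properties;

// Hypothetical plugin interface a third-party scheduler would implement.
interface SchedulerBackend {
    String name();
}

// A built-in default implementation, standing in for the standalone scheduler.
class StandaloneBackend implements SchedulerBackend {
    public String name() { return "standalone"; }
}

public class PluginLoader {
    // Load the backend class named in configuration via reflection, so a new
    // scheduler can be dropped onto the classpath without touching core code.
    static SchedulerBackend load(Properties conf) throws Exception {
        String className = conf.getProperty(
                "spark.scheduler.backend.class", "StandaloneBackend");
        Class<?> cls = Class.forName(className);
        return (SchedulerBackend) cls.getDeclaredConstructor().newInstance();
    }

    public static void main(String[] args) throws Exception {
        Properties conf = new Properties();
        conf.setProperty("spark.scheduler.backend.class", "StandaloneBackend");
        System.out.println(load(conf).name()); // prints "standalone"
    }
}
```

With this shape, swapping in a new scheduler is just a configuration change: set the class-name property to any implementation on the classpath, and no change to the core is required.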
Attachments
Issue Links
- duplicates SPARK-3561 Allow for pluggable execution contexts in Spark (Resolved)