
SPARK-13904: Add support for pluggable cluster manager


Details

    • Type: Improvement
    • Status: Resolved
    • Priority: Major
    • Resolution: Fixed
    • Affects Version/s: None
    • Fix Version/s: 2.0.0
    • Component/s: Scheduler, Spark Core
    • Labels: None

    Description

      Currently, Spark supports only a few cluster managers, viz. YARN, Mesos, and Standalone. But as Spark is being adopted in newer and different use cases, there is a need to allow other cluster managers to manage Spark components. One such use case is embedding Spark components like the executor and driver inside another process, which may be a datastore. This allows colocation of data and processing. Another requirement that stems from such a use case is that the executors/driver should not take the parent process down when they go down, and that the components can be relaunched inside the same process.

      So, this JIRA requests two functionalities:
      1. Support for external (pluggable) cluster managers, as sketched below.
      2. Allow a cluster manager to clean up its tasks without taking the parent process down.
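
      As resolved in 2.0.0, this took the shape of an ExternalClusterManager trait that Spark discovers through Java's ServiceLoader. Below is a minimal sketch of a custom implementation; the DataStoreClusterManager name and the datastore:// master URL scheme are hypothetical, and the sketch simply reuses Spark's TaskSchedulerImpl and in-process LocalSchedulerBackend rather than talking to a real datastore.

      {code:scala}
      // Must live under org.apache.spark: the trait and the classes reused
      // below are private[spark].
      package org.apache.spark.scheduler.cluster

      import org.apache.spark.SparkContext
      import org.apache.spark.scheduler.{ExternalClusterManager, SchedulerBackend, TaskScheduler, TaskSchedulerImpl}
      import org.apache.spark.scheduler.local.LocalSchedulerBackend

      class DataStoreClusterManager extends ExternalClusterManager {

        // Claim master URLs of the (hypothetical) form "datastore://...".
        override def canCreate(masterURL: String): Boolean =
          masterURL.startsWith("datastore://")

        // Reuse Spark's standard task scheduler; a real manager could subclass
        // it so that task cleanup does not bring the parent process down.
        override def createTaskScheduler(sc: SparkContext, masterURL: String): TaskScheduler =
          new TaskSchedulerImpl(sc)

        // For illustration, back the scheduler with the in-process local
        // backend; a real manager would instead launch executors inside the
        // datastore's own processes.
        override def createSchedulerBackend(
            sc: SparkContext,
            masterURL: String,
            scheduler: TaskScheduler): SchedulerBackend =
          new LocalSchedulerBackend(sc.getConf, scheduler.asInstanceOf[TaskSchedulerImpl], 1)

        // Wire the scheduler and backend together before the SparkContext
        // starts using them.
        override def initialize(scheduler: TaskScheduler, backend: SchedulerBackend): Unit =
          scheduler.asInstanceOf[TaskSchedulerImpl].initialize(backend)
      }
      {code}

      To have Spark pick this up, list the class in a META-INF/services/org.apache.spark.scheduler.ExternalClusterManager file on the driver's classpath and set the master URL to one that canCreate accepts, e.g. setMaster("datastore://...").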

            People

              Assignee: Hemant Bhanawat (hbhanawat)
              Reporter: Hemant Bhanawat (hbhanawat)
              Votes: 1
              Watchers: 10
