SPARK-11882: Allow for running Spark applications against a custom coarse-grained scheduler


    Details

    • Type: Wish
    • Status: Resolved
    • Priority: Minor
    • Resolution: Duplicate
    • Affects Version/s: None
    • Fix Version/s: None
    • Component/s: Spark Core, Spark Submit
    • Labels: None

      Description

      SparkContext decides which scheduler to use based on the master URI. What about running applications against a custom scheduler? Such a custom scheduler would simply extend CoarseGrainedSchedulerBackend.

      The custom scheduler would be created by a provided factory. Factories would be defined in the configuration as spark.scheduler.factory.<name>=<factory-class>, where <name> is the scheduler name. Once SparkContext determines that the master address does not belong to standalone, YARN, Mesos, local, or any other predefined scheduler, it would resolve the scheme from the provided master URI and look up the scheduler factory whose name matches that scheme, as sketched below.
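
      A minimal sketch of that resolution step in Scala. The trait name, method signature, and object below are assumptions made for illustration; only SparkConf and the CoarseGrainedSchedulerBackend mentioned above are real Spark types, and no such factory mechanism exists in Spark itself.

        import java.net.URI
        import org.apache.spark.SparkConf

        // Hypothetical factory interface; the name and signature are
        // assumptions, not an existing Spark API. In the proposal the
        // returned backend would extend CoarseGrainedSchedulerBackend;
        // AnyRef keeps this sketch free of Spark internals.
        trait CustomSchedulerFactory {
          def createSchedulerBackend(conf: SparkConf, masterUri: URI): AnyRef
        }

        object CustomSchedulerResolution {
          // The lookup SparkContext could perform for an unrecognized master
          // URL: use the URI scheme as the scheduler name and read the
          // factory class from spark.scheduler.factory.<name>.
          def resolveFactory(conf: SparkConf, master: String): CustomSchedulerFactory = {
            val scheme = new URI(master).getScheme // e.g. "custom" for custom://192.168.1.1
            val factoryClass = conf.get(s"spark.scheduler.factory.$scheme")
            Class.forName(factoryClass)
              .getDeclaredConstructor()
              .newInstance()
              .asInstanceOf[CustomSchedulerFactory]
          }
        }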

      For example, with:
        spark.scheduler.factory.custom=org.a.b.c.CustomSchedulerFactory
      the master address would be custom://192.168.1.1.
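
      A corresponding application-side sketch: the property name and the custom:// scheme come from this proposal and are not recognized by Spark as released, so the snippet only shows how the configuration would be wired up.

        import org.apache.spark.SparkConf

        object CustomSchedulerDemo {
          def main(args: Array[String]): Unit = {
            // Register the hypothetical factory under the name "custom" and
            // use the matching scheme in the master URL.
            val conf = new SparkConf()
              .setAppName("custom-scheduler-demo")
              .setMaster("custom://192.168.1.1")
              .set("spark.scheduler.factory.custom", "org.a.b.c.CustomSchedulerFactory")
            println(conf.get("spark.scheduler.factory.custom"))
          }
        }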


              People

              • Assignee: Unassigned
              • Reporter: Jacek Lewandowski (jlewandowski)
              • Votes: 0
              • Watchers: 6
