Spark / SPARK-20589

Allow limiting task concurrency per stage


Details

    • Type: Improvement
    • Status: Resolved
    • Priority: Major
    • Resolution: Incomplete
    • Affects Version/s: 2.1.0
    • Fix Version/s: None
    • Component/s: Scheduler, Spark Core

    Description

      It would be nice to have the ability to limit the number of concurrent tasks per stage. This is useful when a Spark job accesses another service and you don't want to DoS that service: for instance, Spark writing to HBase, or Spark issuing HTTP PUTs against a service. Often you want to do this without limiting the number of partitions.
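      In the absence of scheduler support, a common user-level workaround is to throttle the outbound calls inside each task rather than reduce the partition count. The sketch below is illustrative only (the function and constant names are not part of any Spark API, and the requested feature would live in the scheduler, not in user code): a process-wide semaphore caps in-flight requests per executor, and the partition-processing function would be handed to `rdd.mapPartitions`.

      ```python
      import threading

      # Per-executor cap on concurrent outbound requests.
      # This is an illustrative constant, not a Spark configuration setting.
      MAX_CONCURRENT_CALLS = 4
      _call_limit = threading.Semaphore(MAX_CONCURRENT_CALLS)

      def throttled_put(record, do_put):
          """Run do_put(record) while holding a semaphore slot, so at most
          MAX_CONCURRENT_CALLS requests are in flight in this process."""
          with _call_limit:
              return do_put(record)

      def process_partition(records, do_put):
          # In a real job this would be the body passed to
          # rdd.mapPartitions(...); do_put is the external-service call
          # (e.g. an HBase write or an HTTP PUT).
          return [throttled_put(r, do_put) for r in records]
      ```

      Note the limitation that makes this a workaround rather than a fix: the semaphore only bounds concurrency within one executor process, so the cluster-wide request rate still scales with the number of executors — which is exactly why a per-stage concurrency limit in the scheduler is being requested.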


          People

            Assignee: Unassigned
            Reporter: Thomas Graves (tgraves)
            Votes: 7
            Watchers: 18
