Spark / SPARK-21243

Limit the number of maps in a single shuffle fetch


    Details

    • Type: Improvement
    • Status: Resolved
    • Priority: Minor
    • Resolution: Fixed
    • Affects Version/s: 2.1.0, 2.1.1
    • Fix Version/s: 2.2.1, 2.3.0
    • Component/s: Spark Core
    • Labels:
      None

      Description

      Right now Spark can limit the number of parallel fetches and the amount of data in a single fetch, but one fetch to a host can still cover hundreds of blocks; in one instance we saw 450+ blocks in a single request. With hundreds of such requests and thousands of reducers fetching, the metadata adds up and can run the Node Manager out of memory. We should add a config that limits the number of map outputs per fetch to reduce the load on the NM.
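
      Since Spark Core is Scala, here is a minimal sketch of how a user might combine the existing fetch limits with the per-host cap this issue proposes. The property name spark.reducer.maxBlocksInFlightPerAddress matches the config the fix appears to have introduced (it is documented from 2.2.1 onward); the other two properties already exist, and all values and the sample job are illustrative only, not recommendations.

      {code:scala}
      import org.apache.spark.{SparkConf, SparkContext}

      object ShuffleFetchLimitExample {
        def main(args: Array[String]): Unit = {
          val conf = new SparkConf()
            .setAppName("shuffle-fetch-limit-example")
            // Existing knobs: cap the number of concurrent fetch requests
            // and the bytes requested in flight per reduce task.
            .set("spark.reducer.maxReqsInFlight", "8")
            .set("spark.reducer.maxSizeInFlight", "48m")
            // Knob proposed by this issue (name per the 2.2.1 docs, value
            // illustrative): cap the number of shuffle blocks fetched from
            // any single host:port, so one request no longer carries
            // metadata for hundreds of map outputs.
            .set("spark.reducer.maxBlocksInFlightPerAddress", "100")

          val sc = new SparkContext(conf)
          // A wide shuffle with many map and reduce partitions, the shape
          // of workload that stresses the shuffle service on the NM.
          val counts = sc
            .parallelize(0 until 1000000, numSlices = 200)
            .map(i => (i % 1000, 1))
            .reduceByKey(_ + _, 1000)
          println(counts.count())
          sc.stop()
        }
      }
      {code}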


              People

              • Assignee:
                Dhruve Ashar
                Reporter:
                Dhruve Ashar
                Shepherd:
                Thomas Graves
              • Votes:
                0
                Watchers:
                3
