SPARK-21243

Limit the number of maps in a single shuffle fetch


Details

    • Type: Improvement
    • Status: Resolved
    • Priority: Minor
    • Resolution: Fixed
    • Affects Version/s: 2.1.0, 2.1.1
    • Fix Version/s: 2.2.1, 2.3.0
    • Component/s: Spark Core
    • Labels: None

    Description

      Spark can already limit the number of parallel shuffle fetches and the amount of data in flight per fetch, but a single fetch request to one host can still ask for hundreds of blocks; in one instance we saw 450+ blocks in a single fetch. With hundreds of hosts like that and thousands of reducers fetching, the per-block metadata adds up and can run the YARN NodeManager (which hosts the external shuffle service) out of memory. We should add a configuration to limit the number of map outputs requested per fetch to reduce the load on the NodeManager.
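      For illustration, a minimal sketch (in Scala) of how a job could cap the number of shuffle blocks requested from one host per fetch. The property spark.reducer.maxBlocksInFlightPerAddress is the configuration the fix for this issue introduced (available from 2.2.1); the value 100 below and the toy job around it are illustrative choices, not part of this issue.

      import org.apache.spark.{SparkConf, SparkContext}

      object LimitBlocksPerFetchExample {
        def main(args: Array[String]): Unit = {
          val conf = new SparkConf()
            .setAppName("limit-blocks-per-fetch")
            .setMaster("local[2]")
            // Cap each shuffle fetch at 100 blocks per remote address so no
            // single host (e.g. a NodeManager running the external shuffle
            // service) is asked to serve metadata for hundreds of blocks in
            // one request. Illustrative value; the default is effectively
            // unlimited.
            .set("spark.reducer.maxBlocksInFlightPerAddress", "100")

          val sc = new SparkContext(conf)
          // A wide shuffle: 200 map tasks feeding 500 reducers, so each
          // reducer would otherwise request up to 200 blocks at once.
          val counts = sc.parallelize(1 to 1000000, numSlices = 200)
            .map(i => (i % 1000, 1))
            .reduceByKey(_ + _, numPartitions = 500)
          println(counts.count())
          sc.stop()
        }
      }

      The same property can also be set at submit time, e.g. --conf spark.reducer.maxBlocksInFlightPerAddress=100.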


            People

              Assignee: Dhruve Ashar
              Reporter: Dhruve Ashar
              Shepherd: Thomas Graves
              Votes: 0
              Watchers: 4
