Hadoop Map/Reduce
MAPREDUCE-3096

Add a good way to control the number of map/reduce tasks per node


Details

    • Type: Task
    • Status: Resolved
    • Priority: Major
    • Resolution: Won't Fix
    • Fix Version/s: None
    • Affects Version/s: 0.20.204.0
    • Component/s: None
    • Labels: None

    Description

      Currently, controlling the number of map/reduce tasks per node is a pain.

      I've tried many times and can't get it to work right. I'm also not the only person who seems to have this problem.

      There must be a better way to do it.

      Here's my proposal:

      Add the following methods to Job:
      setNumberOfMappersPerNode(int);
      setNumberOfReducersPerNode(int);
      setMaxMemoryPerMapper(int);
      setMaxMemoryPerReducer(int);
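
      For context on why per-job control is awkward: in the 0.20 line, the per-node task limits are cluster-side TaskTracker slot settings in mapred-site.xml, not job settings. A minimal sketch using the standard property names (the values shown are illustrative, not recommendations):

      ```xml
      <!-- mapred-site.xml on each TaskTracker node; changes require a TaskTracker restart -->
      <configuration>
        <!-- Maximum number of map tasks run simultaneously on this node -->
        <property>
          <name>mapred.tasktracker.map.tasks.maximum</name>
          <value>4</value>
        </property>
        <!-- Maximum number of reduce tasks run simultaneously on this node -->
        <property>
          <name>mapred.tasktracker.reduce.tasks.maximum</name>
          <value>2</value>
        </property>
      </configuration>
      ```

      Because these are daemon-side settings, a job submitter cannot adjust them at submission time, which is the gap the proposed Job methods would fill.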


          People

            Assignee: Unassigned
            Reporter: Arsen Zahray (menkaur)
            Votes: 0
            Watchers: 1

            Dates

              Created:
              Updated:
              Resolved: