Details

    • Type: Bug
    • Status: Resolved
    • Priority: Major
    • Resolution: Duplicate
    • Affects Version/s: 0.22.0, 0.23.0
    • Fix Version/s: None
    • Component/s: mrv2, task
    • Labels: None

Description

My job is currently showing >100% reduce completion. Some reduce tasks report far more than 100% complete; they appear to be in the "last merge pass" stage.
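As a rough illustration of the failure mode (a hypothetical sketch of the arithmetic only, not Hadoop's actual progress-reporting code): if phase progress is computed as bytes processed divided by an estimated total, and bytes retained in memory are counted a second time during the last merge pass, the reported fraction exceeds 1.0:

    // Hypothetical sketch: NOT Hadoop's actual progress code. Shows how a
    // progress ratio passes 100% when the denominator is an estimate that
    // misses bytes re-counted during the final merge pass.
    public class ProgressSketch {
        public static void main(String[] args) {
            long estimatedTotalBytes = 1000000L;   // planned merge input
            long bytesMergedFromDisk = 900000L;    // counted once, as expected
            long bytesRecountedInMemory = 400000L; // retained segments counted again
                                                   // in the last merge pass (assumed)

            double progress =
                (double) (bytesMergedFromDisk + bytesRecountedInMemory) / estimatedTotalBytes;

            // Prints "reduce progress: 130%"
            System.out.printf("reduce progress: %.0f%%%n", progress * 100);
        }
    }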

Issue Links

Duplicates: MAPREDUCE-2264

Activity

Ravi Prakash added a comment -

This seems to be a dup of MAPREDUCE-2264. Closing as such.
Andrew Hancock added a comment -

I am experiencing this bug in version 0.22 as well.

It happens any time I turn on the following config in my job:

    config.setFloat("mapreduce.reduce.input.buffer.percent", 0.1f);

If I remove this setting, the bug goes away.
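For reference, a minimal driver sketching where such a setting is applied follows; the job name, identity mapper/reducer, and paths are placeholders, not Andrew's actual job:

    // Hypothetical minimal driver; everything except the setFloat call is a
    // placeholder (identity mapper/reducer, argument paths, job name).
    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.Path;
    import org.apache.hadoop.io.LongWritable;
    import org.apache.hadoop.io.Text;
    import org.apache.hadoop.mapreduce.Job;
    import org.apache.hadoop.mapreduce.Mapper;
    import org.apache.hadoop.mapreduce.Reducer;
    import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
    import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

    public class ReproDriver {
        public static void main(String[] args) throws Exception {
            Configuration config = new Configuration();
            // The setting Andrew reports as the trigger: fraction of the reducer
            // heap used to retain map outputs in memory during the reduce.
            config.setFloat("mapreduce.reduce.input.buffer.percent", 0.1f);

            Job job = new Job(config, "repro");
            job.setJarByClass(ReproDriver.class);
            job.setMapperClass(Mapper.class);   // identity mapper (placeholder)
            job.setReducerClass(Reducer.class); // identity reducer (placeholder)
            job.setOutputKeyClass(LongWritable.class);
            job.setOutputValueClass(Text.class);
            FileInputFormat.addInputPath(job, new Path(args[0]));
            FileOutputFormat.setOutputPath(job, new Path(args[1]));
            System.exit(job.waitForCompletion(true) ? 0 : 1);
        }
    }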

When this option is turned on, the following symptoms are exhibited:

1. During the reduce phase, many of the reduce tasks quickly report > 100% progress. The job then reports 100% complete.

2. On occasion I have seen the overall progress then drop back below 100%, after which it oscillates between different values.

3. The job appears to complete normally in spite of the progress-reporting issues.

Todd Lipcon added a comment -

This was on 0.23.0 before release, so it might be fixed by now. I do think I was using intermediate compression and maybe some of the "in-memory merge" config options. I think this was the config I used for the job:

            <property>
              <name>io.sort.mb</name>
              <value>650</value>
            </property>
            <property>
              <name>mapreduce.map.sort.spill.percent</name>
              <value>0.98</value>
            </property>
            <property>
              <name>mapreduce.reduce.shuffle.input.buffer.percent</name>
              <value>0.8</value>
            </property>
            <property>
              <name>mapreduce.reduce.input.buffer.percent</name>
              <value>0.8</value>
            </property>
          
            <property>
              <name>io.sort.factor</name>
              <value>100</value>
            </property>
          

The job was a terasort.
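For anyone trying to reproduce this, a hypothetical driver that applies the same settings programmatically and runs the bundled TeraSort example might look like the following (input/output paths are placeholders, and the Hadoop examples jar is assumed to be on the classpath):

    // Hypothetical reproduction sketch: applies Todd's settings in code and
    // invokes the TeraSort example from the Hadoop examples jar.
    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.examples.terasort.TeraSort;
    import org.apache.hadoop.util.ToolRunner;

    public class TeraSortRepro {
        public static void main(String[] args) throws Exception {
            Configuration conf = new Configuration();
            conf.setInt("io.sort.mb", 650);
            conf.setFloat("mapreduce.map.sort.spill.percent", 0.98f);
            conf.setFloat("mapreduce.reduce.shuffle.input.buffer.percent", 0.8f);
            conf.setFloat("mapreduce.reduce.input.buffer.percent", 0.8f);
            conf.setInt("io.sort.factor", 100);
            // TeraSort implements Tool; its arguments are <input dir> <output dir>.
            // The paths below are placeholders.
            System.exit(ToolRunner.run(conf, new TeraSort(),
                    new String[] { "/terasort/in", "/terasort/out" }));
        }
    }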

Ravi Prakash added a comment -

Hi Todd!
Any clues on how this may be reproduced?


People

    • Assignee: Ravi Prakash
    • Reporter: Todd Lipcon
    • Votes: 0
    • Watchers: 5
