Hadoop Map/Reduce
MAPREDUCE-5856

Counter limits always use defaults even if JobClient is given a different Configuration

    Details

    • Type: Bug
    • Status: Resolved
    • Priority: Major
    • Resolution: Duplicate
    • Affects Version/s: 2.3.0, 2.4.0
    • Fix Version/s: None
    • Component/s: client
    • Labels: None

      Description

      If a job has more than the default number of counters (i.e. more than 120) and you create a JobClient with a Configuration in which the counter limit has been raised (e.g. to 500), JobClient still throws this exception:

      org.apache.hadoop.mapreduce.counters.LimitExceededException: Too many counters: 121 max=120
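
      For illustration, a minimal client-side sketch of the scenario above, assuming the counter limit is controlled by the mapreduce.job.counters.max property; the job ID is a placeholder for a job that defines more than 120 counters:

      import org.apache.hadoop.conf.Configuration;
      import org.apache.hadoop.mapred.Counters;
      import org.apache.hadoop.mapred.JobClient;
      import org.apache.hadoop.mapred.JobID;
      import org.apache.hadoop.mapred.RunningJob;

      public class CounterLimitRepro {
        public static void main(String[] args) throws Exception {
          // Raise the counter limit in the client-side Configuration
          // (default is 120; 500 matches the example in the description).
          Configuration conf = new Configuration();
          conf.setInt("mapreduce.job.counters.max", 500);

          // Build the JobClient from that Configuration.
          JobClient client = new JobClient(conf);

          // Placeholder job ID for a job that defines more than 120 counters.
          RunningJob running = client.getJob(JobID.forName("job_1400000000000_0001"));

          // Expected: the raised limit (500) is honored.
          // Observed: LimitExceededException ("Too many counters: 121 max=120"),
          // because the client-side limit check falls back to the defaults
          // instead of using the Configuration given to the JobClient.
          Counters counters = running.getCounters();
          System.out.println(counters);
        }
      }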
      


            People

            • Assignee: Robert Kanter
            • Reporter: Robert Kanter
            • Votes: 0
            • Watchers: 5
