Hadoop Map/Reduce
MAPREDUCE-5856

Counter limits always use defaults even if JobClient is given a different Configuration


Details

    • Type: Bug
    • Status: Resolved
    • Priority: Major
    • Resolution: Duplicate
    • Affects Version/s: 2.3.0, 2.4.0
    • Fix Version/s: None
    • Component/s: client
    • Labels: None

    Description

      If you have a job with more than the default number of counters (i.e. > 120), and you create a JobClient with a Configuration in which the counter limit has been raised (e.g. to 500), JobClient still throws this exception:

      org.apache.hadoop.mapreduce.counters.LimitExceededException: Too many counters: 121 max=120
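
      A minimal sketch of how this can surface, assuming the Hadoop 2.x counter-limit property mapreduce.job.counters.max and a job ID supplied on the command line (both illustrative, not taken from this report):

      import org.apache.hadoop.conf.Configuration;
      import org.apache.hadoop.mapred.JobClient;
      import org.apache.hadoop.mapred.JobID;
      import org.apache.hadoop.mapred.RunningJob;

      public class CounterLimitRepro {
        public static void main(String[] args) throws Exception {
          Configuration conf = new Configuration();
          // Raise the per-job counter limit above the default of 120
          // (property name assumed from Hadoop 2.x).
          conf.setInt("mapreduce.job.counters.max", 500);

          // JobClient is handed the Configuration with the raised limit...
          JobClient client = new JobClient(conf);
          RunningJob job = client.getJob(JobID.forName(args[0]));

          // ...yet fetching counters for a job that uses more than 120 of
          // them still fails with LimitExceededException, because the
          // client-side limit check keeps using the defaults.
          System.out.println(job.getCounters());
        }
      }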
      

      Attachments

        1. MAPREDUCE-5856.patch (1 kB, Robert Kanter)

        Issue Links

        Activity


          People

            Assignee: rkanter Robert Kanter
            Reporter: rkanter Robert Kanter
            Votes: 0
            Watchers: 6

            Dates

              Created:
              Updated:
              Resolved:
