Hadoop Map/Reduce / MAPREDUCE-3678

The Map task's logs should have the value of the input split it processed


Details

    • Type: New Feature
    • Status: Closed
    • Priority: Major
    • Resolution: Fixed
    • Affects Version/s: 1.0.0, 2.0.0-alpha
    • Fix Version/s: 1.2.0, 2.0.3-alpha
    • Component/s: mrv1, mrv2
    • Labels: None
    • Hadoop Flags: Reviewed
    • Release Note: A map task's syslog now carries basic info on the InputSplit it processed.

    Description

      It would be easier to debug some corner cases in tasks if we knew which input split was processed by that task. The map task's log should record this. In the jobdetails web UI, the split should also be displayed along with the Split Locations.

      A sample:
      Input Split
      hdfs://myserver:9000/userdata/sampleapp/inputdir/file1.csv - <split no>/<offset from beginning of file>

      This would be very beneficial for nailing down data quality issues in large-volume data processing.
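
      For illustration, here is a minimal sketch (assuming the new mrv2 Mapper API and a file-based input format) of how a job can log this itself from setup(); the class name SplitLoggingMapper is hypothetical, and per the release note the committed fix instead has the framework write this info into the map task's syslog:

      import java.io.IOException;

      import org.apache.commons.logging.Log;
      import org.apache.commons.logging.LogFactory;
      import org.apache.hadoop.io.LongWritable;
      import org.apache.hadoop.io.Text;
      import org.apache.hadoop.mapreduce.InputSplit;
      import org.apache.hadoop.mapreduce.Mapper;
      import org.apache.hadoop.mapreduce.lib.input.FileSplit;

      // Hypothetical example mapper that records which split it is processing.
      public class SplitLoggingMapper extends Mapper<LongWritable, Text, LongWritable, Text> {

        private static final Log LOG = LogFactory.getLog(SplitLoggingMapper.class);

        @Override
        protected void setup(Context context) throws IOException, InterruptedException {
          // The split assigned to this map task; for file-based input formats
          // this is typically a FileSplit carrying the path, start offset and length.
          InputSplit split = context.getInputSplit();
          if (split instanceof FileSplit) {
            FileSplit fileSplit = (FileSplit) split;
            LOG.info("Processing split: " + fileSplit.getPath()
                + " start=" + fileSplit.getStart()
                + " length=" + fileSplit.getLength());
          } else {
            LOG.info("Processing split: " + split);
          }
        }
      }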

      Attachments

        1. MAPREDUCE-3678.patch
          1 kB
          Harsh J
        2. MAPREDUCE-3678-branch-1.patch
          0.9 kB
          Harsh J

          People

            Assignee: Harsh J (qwertymaniac)
            Reporter: Bejoy KS (bejoyks)
            Votes: 0
            Watchers: 7
