Hadoop Map/Reduce / MAPREDUCE-3678

The Map task's logs should have the value of the input split it processed

    Details

    • Type: New Feature
    • Status: Closed
    • Priority: Major
    • Resolution: Fixed
    • Affects Version/s: 1.0.0, 2.0.0-alpha
    • Fix Version/s: 1.2.0, 2.0.3-alpha
    • Component/s: mrv1, mrv2
    • Labels:
      None
    • Hadoop Flags:
      Reviewed
    • Release Note:
      A map task's syslog now carries basic info on the InputSplit it processed.

    Description

    It would be easier to debug some corner cases in tasks if we knew which input split was processed by that task. The MapReduce TaskTracker log should include this information. In the jobdetails web UI, the split should also be displayed along with the Split Locations.

    A sample entry:
    Input Split
    hdfs://myserver:9000/userdata/sampleapp/inputdir/file1.csv - <split no>/<offset from beginning of file>

    This would be very helpful for nailing down data quality issues when processing large data volumes.
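    For illustration, here is a minimal sketch of how a job can already log this information itself from a Mapper's setup() method, using the standard org.apache.hadoop.mapreduce API (context.getInputSplit(), and FileSplit's getPath()/getStart()/getLength()). The class name SplitLoggingMapper is hypothetical; the committed change adds comparable framework-side logging to the task's syslog, per the release note above.

    {code:java}
// Hypothetical example: log the InputSplit being processed from a Mapper.
import java.io.IOException;

import org.apache.commons.logging.Log;
import org.apache.commons.logging.LogFactory;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.InputSplit;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.lib.input.FileSplit;

public class SplitLoggingMapper
    extends Mapper<LongWritable, Text, Text, LongWritable> {

  private static final Log LOG = LogFactory.getLog(SplitLoggingMapper.class);

  @Override
  protected void setup(Context context)
      throws IOException, InterruptedException {
    InputSplit split = context.getInputSplit();
    if (split instanceof FileSplit) {
      FileSplit fileSplit = (FileSplit) split;
      // Written to the task's syslog, e.g.:
      // Processing split: hdfs://myserver:9000/userdata/sampleapp/inputdir/file1.csv:0+67108864
      LOG.info("Processing split: " + fileSplit.getPath()
          + ":" + fileSplit.getStart() + "+" + fileSplit.getLength());
    } else {
      // Non-file splits (e.g. from other InputFormats) fall back to toString().
      LOG.info("Processing split: " + split);
    }
  }
}
    {code}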


    People

    • Assignee: Harsh J (qwertymaniac)
    • Reporter: Bejoy KS (bejoyks)
    • Votes: 0
    • Watchers: 7
