Hadoop Map/Reduce
MAPREDUCE-6864

Hadoop streaming creates 2 mappers when the input has only one block


Details

    • Type: Bug
    • Status: Open
    • Priority: Major
    • Resolution: Unresolved
    • Affects Version/s: 2.7.3
    • Fix Version/s: None
    • Component/s: mrv2
    • Labels: None

    Description

      If a streaming job is run against input that spans fewer than two blocks, two mappers are created, both operating on the same split and both producing (duplicate) output. In some cases the second mapper consistently fails. I have not seen the failure with input smaller than 10 bytes or larger than a couple of MB, but I have seen it with a 4 kB input.
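
      For reference, the expected behavior comes from how `FileInputFormat.getSplits` divides input into splits. The sketch below is a simplified, illustrative Python model of that split-count logic (the function name and defaults here are assumptions, not Hadoop code); it shows that a sub-block input should yield exactly one split, so the second mapper observed above is anomalous.

      ```python
      # Simplified model of Hadoop's FileInputFormat split computation.
      # Illustrative only; parameter names are not the actual Hadoop API.
      SPLIT_SLOP = 1.1  # the last split may be up to 10% larger than split_size

      def compute_splits(file_size, block_size, min_size=1, max_size=2**63 - 1):
          """Return (offset, length) byte ranges for the input splits of one file."""
          split_size = max(min_size, min(max_size, block_size))
          splits = []
          remaining = file_size
          # Carve off full-sized splits while more than SPLIT_SLOP of a split remains.
          while remaining / split_size > SPLIT_SLOP:
              splits.append((file_size - remaining, split_size))
              remaining -= split_size
          if remaining > 0:
              splits.append((file_size - remaining, remaining))
          return splits

      # A 4 kB file against the default 128 MB block size yields one split,
      # so only one mapper should be scheduled for it.
      print(len(compute_splits(4096, 128 * 1024 * 1024)))  # → 1
      ```

      Under this logic the duplicate mapper cannot come from the split computation itself for a single-block file, which suggests the streaming job is being handed the same split twice rather than the input being split twice.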

      Attachments

        Activity


          People

            Assignee: Unassigned
            Reporter: templedf (Daniel Templeton)

            Dates

              Created:
              Updated:
