Hadoop YARN / YARN-1476

Container out of memory


Details

    • Type: Bug
    • Status: Resolved
    • Priority: Major
    • Resolution: Duplicate
    • Affects Version/s: 2.2.0
    • Fix Version/s: None
    • Component/s: nodemanager
    • Labels: None
    • Environment:
      mapreduce.reduce.java.opts=-Xmx4000m
      mapreduce.reduce.shuffle.merge.percent=0.4
      mapreduce.reduce.shuffle.parallelcopies=5
      mapreduce.reduce.shuffle.input.buffer.percent=0.6
      mapreduce.reduce.shuffle.memory.limit.percent=0.17
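      Read together, these settings imply roughly the following shuffle budget (a sketch based on the documented meaning of the parameters, assuming the reduce JVM gets the full 4000 MB heap from -Xmx4000m):

        in-memory shuffle buffer:  0.6 of the reduce heap, i.e. up to about 2400 MB for buffered map outputs
        per-fetch in-memory limit: 0.17 of that buffer; larger map outputs are written to disk instead
        in-memory merge trigger:   0.4 of that buffer filled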

Description

      When I run a wordcount job over 60 GB of random words as input, it fails during the shuffle stage, with the reduce at 13%:
      Container [pid=21073,containerID=container_1385657333160_0001_01_000073] is running beyond physical memory limits. Current usage: 4.0 GB of 4 GB physical memory used; 5.5 GB of 13 GB virtual memory used. Killing container.
      Why does it need so much memory?
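      The likely reason: the reduce JVM heap (-Xmx4000m) is almost as large as the 4 GB container itself, so once the shuffle fills the heap there is little headroom left for the JVM's off-heap memory (thread stacks, permgen, native buffers), and the NodeManager's physical-memory check kills the container, as the log above shows. A common workaround, sketched below with assumed values rather than tested ones, is to keep the heap at roughly 75-80% of the container size, either by enlarging the container or shrinking the heap:

        # give the container headroom above the 4000 MB heap ...
        mapreduce.reduce.memory.mb=5120
        mapreduce.reduce.java.opts=-Xmx4000m

        # ... or shrink the heap to fit a 4096 MB container
        mapreduce.reduce.memory.mb=4096
        mapreduce.reduce.java.opts=-Xmx3276m

      Either form can go in mapred-site.xml or be passed per job with -D options.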


People

    Assignee: Unassigned
    Reporter: zhoujunkun (zjkyly)
    Votes: 0
    Watchers: 1
