Hadoop YARN / YARN-4714

[Java 8] Over usage of virtual memory


Details

    • Type: Bug
    • Status: Open
    • Priority: Blocker
    • Resolution: Unresolved
    • Hadoop Flags: Incompatible change
    • Release Note: The default value of "yarn.nodemanager.vmem-check-enabled" was changed to false.

    Description

      In our Hadoop 2 + Java 8 effort, we found a few jobs being killed by Hadoop due to excessive virtual memory allocation, even though their physical memory usage is low.

      The most common error message is "Container [pid=??,containerID=container_??] is running beyond virtual memory limits. Current usage: 365.1 MB of 1 GB physical memory used; 3.2 GB of 2.1 GB virtual memory used. Killing container."

      We see this problem for MR jobs as well as for Spark drivers/executors.
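      The limit in the error message comes from the NodeManager's virtual memory check: a container may use at most `yarn.nodemanager.vmem-pmem-ratio` (default 2.1) times its physical memory allocation, which is why a 1 GB container is killed at 2.1 GB of virtual memory. As the release note indicates, a common workaround is to disable this check or raise the ratio in `yarn-site.xml`. The property names below are from the stock YARN configuration; the ratio value 4 is an illustrative choice, not a recommendation:

      ```xml
      <!-- yarn-site.xml: workarounds for Java 8 virtual memory over-reservation -->

      <!-- Option 1: disable the NodeManager's virtual memory check entirely -->
      <property>
        <name>yarn.nodemanager.vmem-check-enabled</name>
        <value>false</value>
      </property>

      <!-- Option 2: keep the check but allow more virtual memory per unit of
           physical memory (the default ratio of 2.1 yields the "2.1 GB" limit
           seen in the error message for a 1 GB container) -->
      <property>
        <name>yarn.nodemanager.vmem-pmem-ratio</name>
        <value>4</value>
      </property>
      ```

      Disabling the check is safe in the sense that the physical memory check still applies; raising the ratio is the more conservative option if you want to keep some bound on virtual memory.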

      Attachments

        1. HADOOP-11364.01.patch (2 kB, Akira Ajisaka)


      People

        Assignee: kamrul Mohammad Islam
        Reporter: kamrul Mohammad Islam
        Votes: 5
        Watchers: 37
