Hadoop Common
HADOOP-6820

RunJar fails executing thousands of JARs within a single JVM with error "Too many open files"

    Details

    • Type: Bug
    • Status: Open
    • Priority: Minor
    • Resolution: Unresolved
    • Affects Version/s: 0.20.2
    • Fix Version/s: None
    • Component/s: util
    • Labels: None
    • Environment:

      OS: Linux; the Linux user is limited in the maximum number of open file descriptors (for example, ulimit -n shows 1024)

    Description

      According to Sun JVM bug 4167874 (http://bugs.sun.com/bugdatabase/view_bug.do?bug_id=4167874, present in JVMs up to Java 7), the JarFile objects created by sun.net.www.protocol.jar.JarFileFactory never get garbage collected, even if the classloader that loaded them goes away.
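
      The effect can be seen with a plain JarURLConnection (a minimal sketch; the JAR path below is hypothetical, and caching is on by default):

          import java.net.JarURLConnection;
          import java.net.URL;

          // Minimal sketch of the caching behavior behind Sun bug 4167874.
          // The JAR path is hypothetical.
          public class JarCacheLeak {
              public static void main(String[] args) throws Exception {
                  URL url = new URL("jar:file:/tmp/example.jar!/some-file.txt");
                  JarURLConnection conn = (JarURLConnection) url.openConnection();
                  // getUseCaches() is true by default, so the JarFile returned
                  // here is cached by sun.net.www.protocol.jar.JarFileFactory;
                  // its file descriptor stays open even after the classloader
                  // that triggered the load goes away.
                  conn.getJarFile();
                  // Repeating this over thousands of distinct JARs exhausts the
                  // process's open-file limit (e.g. ulimit -n 1024).
              }
          }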

      So, if the Linux user is limited in the maximum number of open file descriptors (for example, ulimit -n shows 1024) and runs RunJar.main(...) over thousands of JARs that contain other nested JARs (also loaded by a ClassLoader) within a single JVM, RunJar.main(...) throws the following exception: java.lang.RuntimeException: java.io.FileNotFoundException: /some-file.txt (Too many open files)
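
      A common workaround (a sketch only, not necessarily what the attached patch does) is to disable jar: URL caching, so that each JarFile is owned by the caller and can be closed as soon as the entry has been read:

          import java.io.InputStream;
          import java.net.JarURLConnection;
          import java.net.URL;
          import java.util.jar.JarFile;

          // Sketch of a workaround: bypass the JVM-wide JarFile cache and
          // close the file descriptor explicitly. The class and method names
          // are illustrative, not part of Hadoop.
          public class UncachedJarRead {
              static void readEntry(URL jarEntryUrl) throws Exception {
                  JarURLConnection conn = (JarURLConnection) jarEntryUrl.openConnection();
                  conn.setUseCaches(false);        // do not cache/share the JarFile
                  JarFile jar = conn.getJarFile(); // fresh JarFile, owned by the caller
                  try {
                      InputStream in = conn.getInputStream();
                      try {
                          while (in.read() != -1) { } // consume the entry
                      } finally {
                          in.close();
                      }
                  } finally {
                      jar.close();                  // releases the file descriptor
                  }
              }
          }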

    Attachments

    1. HADOOP-6820.patch (7 kB, Alexander Bondar)

    Activity

    Alexander Bondar made changes -
        Attachment: (none) → HADOOP-6820.patch [ 12446864 ]
    Alexander Bondar created issue -

    People

    • Assignee: Unassigned
    • Reporter: Alexander Bondar
    • Votes: 2
    • Watchers: 4

    Dates

    • Created:
    • Updated:
