Hadoop Common / HADOOP-6820

RunJar fails executing thousands of JARs within a single JVM with error "Too many open files"


Details

    • Type: Bug
    • Status: Resolved
    • Priority: Minor
    • Resolution: Fixed
    • Affects Version/s: 0.20.2
    • Fix Version/s: None
    • Component/s: util
    • Labels: None
    • Environment: Linux; user limited by the maximum number of open file descriptors (for example, ulimit -n shows 1024)

Description

    According to Sun JVM bug http://bugs.sun.com/bugdatabase/view_bug.do?bug_id=4167874 (present up to Java 7), the JarFile objects created by sun.net.www.protocol.jar.JarFileFactory never get garbage collected, even if the classloader that loaded them goes away.
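
    A minimal sketch of that behavior (the JAR path is hypothetical; any local JAR shows the same thing): once a jar: URL connection is opened with caching enabled, which is the default, JarFileFactory holds on to the JarFile and its open file descriptor indefinitely.

        import java.net.JarURLConnection;
        import java.net.URL;
        import java.util.jar.JarFile;

        public class JarCacheDemo {
            public static void main(String[] args) throws Exception {
                // Hypothetical local JAR path.
                URL url = new URL("jar:file:/tmp/example.jar!/");
                JarURLConnection conn = (JarURLConnection) url.openConnection();
                // useCaches defaults to true, so JarFileFactory caches this
                // JarFile; its file descriptor stays open even after every
                // consumer of the connection is gone.
                JarFile jf = conn.getJarFile();
                System.out.println(jf.getName() + " cached=" + conn.getUseCaches());
            }
        }

    Calling conn.setUseCaches(false) before getJarFile() opts out of the cache, at the cost of the caller having to close the returned JarFile itself.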

    So, if a Linux user is limited in the maximum number of open file descriptors (for example, ulimit -n shows 1024) and runs RunJar.main(...) over thousands of JARs that include other nested JARs (also loaded by a ClassLoader) within a single JVM, RunJar.main(...) eventually throws the following exception: java.lang.RuntimeException: java.io.FileNotFoundException: /some-file.txt (Too many open files)
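
    For illustration, a hedged sketch of the failure pattern (the directory and class name are hypothetical): each URLClassLoader that searches a JAR leaves a cached JarFile behind, so descriptors accumulate until the ulimit -n ceiling is hit. URLClassLoader.close(), shown in the finally block, only exists since Java 7 and postdates this report; any fix amounts to the same thing, releasing (or never caching) the JarFile handles once a JAR has run.

        import java.io.File;
        import java.net.URL;
        import java.net.URLClassLoader;

        public class ManyJarsDemo {
            public static void main(String[] args) throws Exception {
                // Hypothetical directory holding thousands of JARs; with
                // ulimit -n 1024 the loop dies with "Too many open files"
                // if the loaders leak their cached JarFile handles.
                File[] jars = new File("/tmp/jars").listFiles();
                if (jars == null) return;
                for (File jar : jars) {
                    URLClassLoader loader =
                            new URLClassLoader(new URL[] { jar.toURI().toURL() });
                    try {
                        loader.loadClass("com.example.Main"); // hypothetical class
                    } catch (ClassNotFoundException ignored) {
                        // Even a failed lookup opens (and caches) the JAR.
                    } finally {
                        loader.close(); // Java 7+: releases the cached handles
                    }
                }
            }
        }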

Attachments

    1. HADOOP-6820.patch (7 kB) by Alexander Bondar


People

    • Assignee: Unassigned
    • Reporter: Alexander Bondar (abondar)
    • Votes: 2
    • Watchers: 4
