Hadoop Common / HADOOP-3051

DataXceiver: java.io.IOException: Too many open files


Details

    • Type: Bug
    • Status: Closed
    • Priority: Major
    • Resolution: Duplicate
    • Affects Version/s: 0.17.0
    • Fix Version/s: None
    • Component/s: None
    • Labels: None

Description

      I just ran an experiment with the latest available nightly build (hadoop-2008-03-15), and after 2 minutes I'm getting tons of "java.io.IOException: Too many open files" exceptions, as shown here:

       2008-03-19 20:08:09,303 ERROR org.apache.hadoop.dfs.DataNode: 
      141.30.xxx.xxx:50010:DataXceiver: java.io.IOException: Too many open files
           at sun.nio.ch.IOUtil.initPipe(Native Method)
           at sun.nio.ch.EPollSelectorImpl.<init>(Unknown Source)
           at sun.nio.ch.EPollSelectorProvider.openSelector(Unknown Source)
           at sun.nio.ch.Util.getTemporarySelector(Unknown Source)
           at sun.nio.ch.SocketAdaptor.connect(Unknown Source)
           at org.apache.hadoop.dfs.DataNode$DataXceiver.writeBlock(DataNode.java:1114)
           at org.apache.hadoop.dfs.DataNode$DataXceiver.run(DataNode.java:956)
           at java.lang.Thread.run(Unknown Source)
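
      The trace suggests where the descriptors go: each blocking connect through sun.nio.ch.SocketAdaptor grabs a temporary Selector (Util.getTemporarySelector), and on Linux an epoll-based Selector holds an epoll descriptor plus a wakeup pipe, i.e. roughly three file descriptors per selector. A minimal sketch (plain NIO, not DataNode code) of how unclosed selectors alone can exhaust the per-process fd limit:

      import java.io.IOException;
      import java.nio.channels.Selector;
      import java.util.ArrayList;
      import java.util.List;

      public class SelectorLeakDemo {
          public static void main(String[] args) throws IOException {
              List<Selector> leaked = new ArrayList<Selector>();
              try {
                  // Each Selector.open() costs roughly 3 fds on Linux
                  // (epoll instance + wakeup pipe), mirroring the temporary
                  // selectors created per connect in the trace above.
                  while (true) {
                      leaked.add(Selector.open());
                  }
              } catch (IOException e) {
                  // "Too many open files" once ulimit -n is exhausted
                  System.err.println("Failed after " + leaked.size()
                      + " selectors: " + e.getMessage());
              } finally {
                  for (Selector s : leaked) {
                      s.close();
                  }
              }
          }
      }

      Raising ulimit -n only delays this kind of failure; the selectors have to be closed (or reused) for the DataNode to survive sustained load.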

      I ran the same experiment with the same high workload (50 DFS clients with 40 streams each, writing files concurrently to an 8-node DFS cluster) against the 0.16.1 release, and no exception is thrown. So it looks like a bug to me...
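
      For anyone trying to reproduce this, here is a rough sketch of one of the 50 load clients, using the standard org.apache.hadoop.fs API; the paths, buffer size, and per-stream file size are illustrative, not the exact values from the experiment:

      import java.io.IOException;
      import org.apache.hadoop.conf.Configuration;
      import org.apache.hadoop.fs.FSDataOutputStream;
      import org.apache.hadoop.fs.FileSystem;
      import org.apache.hadoop.fs.Path;

      public class DfsWriteLoad {
          public static void main(String[] args) throws IOException {
              final Configuration conf = new Configuration();
              final FileSystem fs = FileSystem.get(conf);
              final byte[] buf = new byte[64 * 1024];
              // 40 concurrent writer streams per client, as in the experiment
              for (int i = 0; i < 40; i++) {
                  final int stream = i;
                  new Thread(new Runnable() {
                      public void run() {
                          try {
                              // Illustrative path; any per-stream file works
                              Path p = new Path("/load/stream-" + stream);
                              FSDataOutputStream out = fs.create(p);
                              try {
                                  for (int j = 0; j < 1024; j++) {
                                      out.write(buf); // ~64 MB per stream
                                  }
                              } finally {
                                  out.close();
                              }
                          } catch (IOException e) {
                              e.printStackTrace();
                          }
                      }
                  }).start();
              }
          }
      }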


People

    Assignee: Raghu Angadi (rangadi)
    Reporter: André Martin (andremartin)
    Votes: 0
    Watchers: 1

Dates

    Created:
    Updated:
    Resolved: