Hadoop HDFS / HDFS-690

TestAppend2#testComplexAppend failed on "Too many open files"


Details

    • Type: Bug
    • Status: Closed
    • Priority: Blocker
    • Resolution: Fixed
    • Affects Version/s: 0.21.0
    • Fix Version/s: 0.21.0
    • Component/s: test
    • Labels: None
    • Hadoop Flags: Reviewed

    Description

      The append write failed with "Too many open files":
      some bytes failed to be appended to a file due to the following error:
      java.io.IOException: Cannot run program "stat": java.io.IOException: error=24, Too many open files
      at java.lang.ProcessBuilder.start(ProcessBuilder.java:459)
      at java.lang.Runtime.exec(Runtime.java:593)
      at java.lang.Runtime.exec(Runtime.java:466)
      at org.apache.hadoop.fs.FileUtil$HardLink.getLinkCount(FileUtil.java:644)
      at org.apache.hadoop.hdfs.server.datanode.ReplicaInfo.unlinkBlock(ReplicaInfo.java:205)
      at org.apache.hadoop.hdfs.server.datanode.FSDataset.append(FSDataset.java:1075)
      at org.apache.hadoop.hdfs.server.datanode.FSDataset.append(FSDataset.java:1058)
      at org.apache.hadoop.hdfs.server.datanode.BlockReceiver.<init>(BlockReceiver.java:110)
      at org.apache.hadoop.hdfs.server.datanode.DataXceiver.opWriteBlock(DataXceiver.java:258)
      at org.apache.hadoop.hdfs.protocol.DataTransferProtocol$Receiver.opWriteBlock(DataTransferProtocol.java:382)
      at org.apache.hadoop.hdfs.protocol.DataTransferProtocol$Receiver.processOp(DataTransferProtocol.java:323)
      at org.apache.hadoop.hdfs.server.datanode.DataXceiver.run(DataXceiver.java:111)
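      For context, the getLinkCount call in the stack trace shells out to an external stat process, and each exec opens pipe file descriptors; if streams are leaked under load, the DataNode eventually hits "error=24, Too many open files". Below is a hedged sketch of that pattern with the streams closed properly. It assumes GNU stat's "-c %h" (hard-link count) output on Linux; the class and method names are illustrative, not the actual Hadoop code.

```java
import java.io.BufferedReader;
import java.io.File;
import java.io.IOException;
import java.io.InputStreamReader;

public class LinkCount {
    // Illustrative stand-in for FileUtil$HardLink.getLinkCount: run
    // "stat -c %h <file>" and parse the hard-link count. Closing the
    // process output stream and destroying the process releases the
    // pipe descriptors that would otherwise accumulate.
    public static int getLinkCount(File f) throws IOException, InterruptedException {
        ProcessBuilder pb = new ProcessBuilder("stat", "-c", "%h", f.getAbsolutePath());
        pb.redirectErrorStream(true); // merge stderr into stdout: one stream to drain
        Process p = pb.start();
        try (BufferedReader in = new BufferedReader(new InputStreamReader(p.getInputStream()))) {
            String line = in.readLine();
            int rc = p.waitFor();
            if (rc != 0 || line == null) {
                throw new IOException("stat failed with exit code " + rc);
            }
            return Integer.parseInt(line.trim());
        } finally {
            p.destroy(); // release any remaining pipe descriptors
        }
    }

    public static void main(String[] args) throws Exception {
        File tmp = File.createTempFile("hdfs690", ".tmp");
        tmp.deleteOnExit();
        // a freshly created regular file has exactly one hard link
        System.out.println(getLinkCount(tmp));
    }
}
```

      The attached leakingThreads patches address the resource leak on the test side; the sketch above only illustrates why an unclosed exec exhausts descriptors.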

      Attachments

        1. leakingThreads.patch
          2 kB
          Hairong Kuang
        2. leakingThreads1.patch
          2 kB
          Hairong Kuang


          People

            Assignee: Hairong Kuang
            Reporter: Hairong Kuang
            Votes: 0
            Watchers: 3

            Dates

              Created:
              Updated:
              Resolved: