
HDFS-11711: DN should not delete the block on "Too many open files" exception


Details

    • Type: Bug
    • Status: Resolved
    • Priority: Critical
    • Resolution: Fixed
    • Affects Version/s: None
    • Fix Version/s: 2.9.0, 3.0.0-alpha4, 2.8.2
    • Component/s: datanode
    • Labels: None
    • Hadoop Flags: Reviewed

    Description

      We have seen the following scenario in one of our customer environments:

      • While the JobClient was writing "job.xml", there were pipeline failures and the block ended up on only one DN.
      • When the mapper read "job.xml", the DN hit a "Too many open files" exception (the system exceeded its open-file limit) and deleted the block. Hence the mapper failed to read the block and the job failed (see the sketch after this list).
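
      The gist of the problem: when the process runs out of file descriptors, opening the block file fails with a FileNotFoundException whose message contains "Too many open files", and the DN must not treat that as a missing replica and delete it. Below is a minimal sketch of this distinction, assuming the JDK's usual EMFILE message; the class and helper names are hypothetical, not the committed patch:

      {code:java}
      import java.io.File;
      import java.io.FileInputStream;
      import java.io.FileNotFoundException;
      import java.io.IOException;

      class BlockFileOpener {
        // On Linux the JDK surfaces fd exhaustion (EMFILE) as a
        // FileNotFoundException whose message ends with "(Too many open files)".
        private static final String TOO_MANY_OPEN_FILES = "Too many open files";

        /**
         * Opens a block file, distinguishing fd exhaustion from a genuinely
         * missing replica (hypothetical helper, not actual DataNode code).
         */
        FileInputStream openBlockFile(File blockFile) throws IOException {
          try {
            return new FileInputStream(blockFile);
          } catch (FileNotFoundException e) {
            String msg = e.getMessage();
            if (msg != null && msg.contains(TOO_MANY_OPEN_FILES)) {
              // Transient resource exhaustion: fail this read, but do NOT
              // invalidate the replica on disk.
              throw new IOException("fd exhaustion while opening " + blockFile, e);
            }
            // The file is really gone; only now may the caller treat the
            // replica as missing and schedule it for deletion.
            onMissingReplica(blockFile);
            throw e;
          }
        }

        /** Placeholder for replica invalidation/reporting. */
        private void onMissingReplica(File blockFile) {
          System.err.println("Replica missing on disk: " + blockFile);
        }
      }
      {code}

      With this distinction in place, a burst of fd exhaustion only fails individual reads; the replica stays on disk and later readers succeed once descriptors are released.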

      Attachments

        1. HDFS-11711.patch (1 kB) - Brahma Reddy Battula
        2. HDFS-11711-002.patch (7 kB) - Brahma Reddy Battula
        3. HDFS-11711-branch-2-002.patch (7 kB) - Brahma Reddy Battula
        4. HDFS-11711-003.patch (7 kB) - Brahma Reddy Battula
        5. HDFS-11711-004.patch (7 kB) - Brahma Reddy Battula
        6. HDFS-11711-branch-2-003.patch (7 kB) - Brahma Reddy Battula

            People

              Assignee: Brahma Reddy Battula
              Reporter: Brahma Reddy Battula
              Votes: 0
              Watchers: 15
