Hadoop HDFS / HDFS-6107

When a block can't be cached due to limited space on the DataNode, that block becomes uncacheable


Details

    • Type: Bug
    • Status: Closed
    • Priority: Major
    • Resolution: Fixed
    • Affects Version/s: 2.4.0
    • Fix Version/s: 2.4.0
    • Component/s: datanode
    • Labels: None

    Description

      When a block can't be cached due to limited space on the DataNode, that block becomes uncacheable. This is because the CachingTask fails to reset the block state in this error handling case.
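      As a rough sketch of the failure mode (this is not the actual DataNode code; the class and field names below, such as CachingTaskSketch and cacheStateMap, are hypothetical), the following Java fragment shows how returning early on an out-of-space error without rolling back the block's in-progress caching state leaves the block permanently uncacheable, and how resetting that state on failure keeps the block cacheable, which is the behavior the attached patch restores in the CachingTask's error-handling path:

        import java.util.Map;
        import java.util.concurrent.ConcurrentHashMap;

        public class CachingTaskSketch {
          enum CacheState { CACHING, CACHED }

          private final Map<Long, CacheState> cacheStateMap = new ConcurrentHashMap<>();
          private long availableBytes;

          CachingTaskSketch(long availableBytes) {
            this.availableBytes = availableBytes;
          }

          // Try to cache a block; returns true if the block ends up cached.
          synchronized boolean cacheBlock(long blockId, long blockLength) {
            CacheState prev = cacheStateMap.putIfAbsent(blockId, CacheState.CACHING);
            if (prev != null) {
              // A previous attempt already marked this block. If that attempt failed
              // without cleaning up, the block looks "in progress" forever and every
              // new cache directive for it is skipped.
              return prev == CacheState.CACHED;
            }
            if (blockLength > availableBytes) {
              // Buggy version: returned false here WITHOUT removing the CACHING entry,
              // so the block could never be cached again.
              // Fixed version: roll the state back so the block stays cacheable.
              cacheStateMap.remove(blockId);
              return false;
            }
            availableBytes -= blockLength;
            cacheStateMap.put(blockId, CacheState.CACHED);
            return true;
          }
        }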

      Attachments

        1. HDFS-6107.001.patch (11 kB, Colin McCabe)


          People

            Assignee: Colin McCabe (cmccabe)
            Reporter: Colin McCabe (cmccabe)
            Votes: 0
            Watchers: 4
