Hadoop HDFS / HDFS-6107

When a block can't be cached due to limited space on the DataNode, that block becomes uncacheable


    Details

    • Type: Bug
    • Status: Closed
    • Priority: Major
    • Resolution: Fixed
    • Affects Version/s: 2.4.0
    • Fix Version/s: 2.4.0
    • Component/s: datanode
    • Labels: None

      Description

      When a block can't be cached due to limited space on the DataNode, that block becomes uncacheable. This is because the CachingTask fails to reset the block's state in this error-handling case, leaving the block marked as if caching were still in progress.
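
      To make the failure mode concrete, here is a minimal Java sketch of the pattern described above. It is an illustration only, not the DataNode code: BlockCache, State, blockStates, and cacheBlock are invented names standing in for the CachingTask's bookkeeping. The block is marked as in-flight before space is reserved; if the out-of-space path returns without clearing that mark, every later caching attempt sees a pending entry and gives up, which is how the block becomes uncacheable.

      {code:java}
      import java.util.Map;
      import java.util.concurrent.ConcurrentHashMap;

      public class BlockCache {
        enum State { CACHING, CACHED }

        final Map<Long, State> blockStates = new ConcurrentHashMap<>();
        private final long maxBytes;
        private long usedBytes = 0;

        BlockCache(long maxBytes) {
          this.maxBytes = maxBytes;
        }

        void cacheBlock(long blockId, long length) {
          // A pending entry means a caching attempt is already in flight; skip.
          if (blockStates.putIfAbsent(blockId, State.CACHING) != null) {
            return;
          }
          synchronized (this) {
            if (usedBytes + length > maxBytes) {
              // Pre-HDFS-6107 behavior: returning here *without* the remove()
              // below left the block stuck in CACHING forever, so the
              // putIfAbsent() guard above rejected every retry.
              blockStates.remove(blockId);  // the fix: reset the block state
              return;
            }
            usedBytes += length;
          }
          blockStates.put(blockId, State.CACHED);
        }

        public static void main(String[] args) {
          BlockCache cache = new BlockCache(100);
          cache.cacheBlock(1L, 200);              // over capacity: attempt fails
          System.out.println(cache.blockStates);  // {} -- block 1 is not stuck
          cache.cacheBlock(2L, 50);               // a block that fits succeeds
          System.out.println(cache.blockStates);  // {2=CACHED}
        }
      }
      {code}

      (In the actual patch the cleanup happens in the CachingTask's error handling; the point is only that the transient "caching in progress" state must be rolled back when the attempt fails.)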

        Attachments

        1. HDFS-6107.001.patch (11 kB, Colin P. McCabe)

            People

            • Assignee: Colin P. McCabe (cmccabe)
            • Reporter: Colin P. McCabe (cmccabe)
            • Votes: 0
            • Watchers: 4
