FLUME-2245: HDFS files with errors unable to close

    Details

    • Type: Bug
    • Status: Resolved
    • Priority: Major
    • Resolution: Fixed
    • Affects Version/s: None
    • Fix Version/s: 1.6.0
    • Component/s: None
    • Labels: None

    Description

    This is running on a snapshot of Flume 1.5 at git hash 99db32ccd163daf9d7685f0e8485941701e1133d.

    When a datanode goes unresponsive for a significant amount of time (for example, during a long GC pause), an append failure occurs, followed by repeated timeouts in the log and a failure to close the stream. The relevant section of the logs is attached (starting where the problem first appears).

    The same log output repeats periodically, consistently running into a TimeoutException.

    Restarting Flume (or presumably just the HDFSSink) resolves the issue.

    The probable cause is discussed in the comments.
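
    For context, here is a minimal, hypothetical sketch of the kind of bounded close-and-retry handling that would avoid looping on the same TimeoutException indefinitely. This is not the attached FLUME-2245.patch; the class and method names, timeout, and retry count are assumptions for illustration only.

{code:java}
import java.util.concurrent.Callable;
import java.util.concurrent.ExecutionException;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;
import java.util.concurrent.TimeUnit;
import java.util.concurrent.TimeoutException;

/** Sketch: bound both the time and the number of attempts when closing a stream. */
public class BoundedClose {

  static boolean closeWithRetries(Callable<Void> closer, int maxAttempts,
                                  long timeoutSeconds) throws InterruptedException {
    ExecutorService executor = Executors.newSingleThreadExecutor();
    try {
      for (int attempt = 1; attempt <= maxAttempts; attempt++) {
        Future<Void> future = executor.submit(closer);
        try {
          future.get(timeoutSeconds, TimeUnit.SECONDS);  // wait a bounded time for close()
          return true;                                    // close succeeded
        } catch (TimeoutException e) {
          future.cancel(true);                            // abandon this attempt
          System.err.println("close() timed out, attempt " + attempt + "/" + maxAttempts);
        } catch (ExecutionException e) {
          System.err.println("close() failed: " + e.getCause());
        }
      }
      // After maxAttempts, give up and flag the file for later recovery
      // instead of retrying the same close() indefinitely.
      return false;
    } finally {
      executor.shutdownNow();
    }
  }

  public static void main(String[] args) throws InterruptedException {
    // A closer that never returns, simulating an unresponsive datanode.
    Callable<Void> hangingCloser = () -> {
      Thread.sleep(Long.MAX_VALUE);
      return null;
    };
    System.out.println("closed = " + closeWithRetries(hangingCloser, 3, 1));
  }
}
{code}

    Applied to the sink, the equivalent would be to stop retrying after a fixed number of failed close attempts rather than retrying forever, which is the behavior visible in the attached log.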

    Attachments

    1. FLUME-2245.patch (0.9 kB) - Brock Noland
    2. flume.log.file (81 kB) - Juhani Connolly
    3. flume.log.1133 (41 kB) - Juhani Connolly

    People

    • Assignee: Brock Noland
    • Reporter: Juhani Connolly
    • Votes: 0
    • Watchers: 6
