Hadoop HDFS / HDFS-3577

WebHdfsFileSystem can not read files larger than 24KB

    Details

    • Type: Bug
    • Status: Closed
    • Priority: Blocker
    • Resolution: Fixed
    • Affects Version/s: 0.23.3, 2.0.0-alpha
    • Fix Version/s: 0.23.3, 2.0.2-alpha
    • Component/s: webhdfs
    • Labels: None
    • Hadoop Flags: Reviewed

      Description

      If a file is large enough that the HTTP server running webhdfs/httpfs responds with chunked transfer encoding (more than 24 KB in the case of webhdfs), then the WebHdfsFileSystem client fails with an IOException with the message "Content-Length header is missing".
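
      A minimal reproduction sketch, assuming a webhdfs endpoint on localhost:50070 and an existing file larger than 24 KB; the host, port, and path below are placeholders, not values taken from this report:

          import java.net.URI;
          import org.apache.hadoop.conf.Configuration;
          import org.apache.hadoop.fs.FSDataInputStream;
          import org.apache.hadoop.fs.FileSystem;
          import org.apache.hadoop.fs.Path;

          public class ReadLargeFile {
            public static void main(String[] args) throws Exception {
              FileSystem fs = FileSystem.get(
                  URI.create("webhdfs://localhost:50070"), new Configuration());
              // Files over ~24 KB make the server reply with chunked transfer
              // encoding, which is what triggered the reported IOException.
              FSDataInputStream in = fs.open(new Path("/tmp/large-file"));
              byte[] buf = new byte[4096];
              while (in.read(buf) != -1) {
                // Drain the stream; the failure surfaced during these reads.
              }
              in.close();
            }
          }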

      It looks like WebHdfsFileSystem delegates opening of the input stream to the ByteRangeInputStream.URLOpener class, which checks for the Content-Length header; but when chunked transfer encoding is used the Content-Length header is not present, and the URLOpener.openInputStream() method throws an exception.
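
      A simplified sketch of the check implied above, not the actual patch (the class and method names here are illustrative): require Content-Length only when the server did not switch to chunked transfer encoding.

          import java.io.IOException;
          import java.io.InputStream;
          import java.net.HttpURLConnection;

          final class ChunkAwareOpener {
            static InputStream openInputStream(HttpURLConnection conn)
                throws IOException {
              if (conn.getHeaderField("Content-Length") == null) {
                // Chunked responses legitimately omit Content-Length, so its
                // absence alone must not be treated as an error.
                String te = conn.getHeaderField("Transfer-Encoding");
                if (!"chunked".equalsIgnoreCase(te)) {
                  throw new IOException("Content-Length header is missing");
                }
              }
              return conn.getInputStream();
            }
          }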

      Attachments

      1. h3577_20120705.patch (4 kB, Tsz Wo Nicholas Sze)
      2. h3577_20120708.patch (4 kB, Tsz Wo Nicholas Sze)
      3. h3577_20120714.patch (9 kB, Tsz Wo Nicholas Sze)
      4. h3577_20120716.patch (9 kB, Tsz Wo Nicholas Sze)
      5. h3577_20120717.patch (7 kB, Tsz Wo Nicholas Sze)

            People

             • Assignee: Tsz Wo Nicholas Sze
             • Reporter: Alejandro Abdelnur
             • Votes: 0
             • Watchers: 18
