Hadoop HDFS
HDFS-3788

distcp can't copy large files using webhdfs due to missing Content-Length header

    Details

    • Type: Bug
    • Status: Closed
    • Priority: Critical
    • Resolution: Fixed
    • Affects Version/s: 0.23.3, 2.0.0-alpha
    • Fix Version/s: 0.23.3, 2.0.2-alpha
    • Component/s: webhdfs
    • Labels: None
    • Hadoop Flags: Reviewed

      Description

      The following command fails when data1 contains a 3 GB file. It passes when using hftp, or when the directory contains only smaller (<2 GB) files, so this looks like a webhdfs issue with large files.

      hadoop distcp webhdfs://eli-thinkpad:50070/user/eli/data1 hdfs://localhost:8020/user/eli/data2
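      The <2 GB / >2 GB boundary in the report lines up with Integer.MAX_VALUE, and Java's java.net.HttpURLConnection.getContentLength() returns an int (-1 when the header is missing or the value is unknown). As a hedged illustration only, not the actual Hadoop or WebHDFS server code, the sketch below shows why a content length handled as a signed 32-bit int cannot represent the 3 GB file from the failing run:

```java
public class LargeFileLength {
    public static void main(String[] args) {
        // A 3 GiB file, as in the failing distcp run above (hypothetical size).
        long threeGiB = 3L * 1024 * 1024 * 1024;

        // A signed 32-bit int tops out just under 2 GiB (Integer.MAX_VALUE),
        // so a length field stored as an int cannot hold this size.
        System.out.println("3 GiB fits in an int: " + (threeGiB <= Integer.MAX_VALUE));

        // Narrowing to int wraps around and yields a negative value,
        // which int-based length plumbing would treat as invalid or unknown.
        System.out.println("int-cast length: " + (int) threeGiB);
    }
}
```

      Running this prints "3 GiB fits in an int: false" and a negative int-cast length, which is consistent with a server-side length field that cannot be emitted as a valid Content-Length header once the file crosses 2 GB.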

      Attachments

      1. 20120814NullEntity.patch
        11 kB
        Tsz Wo Nicholas Sze
      2. distcp-webhdfs-errors.txt
        12 kB
        Eli Collins
      3. h3788_20120813.patch
        4 kB
        Tsz Wo Nicholas Sze
      4. h3788_20120814.patch
        4 kB
        Tsz Wo Nicholas Sze
      5. h3788_20120814b.patch
        4 kB
        Tsz Wo Nicholas Sze
      6. h3788_20120815.patch
        4 kB
        Tsz Wo Nicholas Sze
      7. h3788_20120816.patch
        4 kB
        Tsz Wo Nicholas Sze

            People

            • Assignee: Tsz Wo Nicholas Sze
            • Reporter: Eli Collins
            • Votes: 0
            • Watchers: 9
