Hadoop Common / HADOOP-7199

Crash due to reuse of checksum files


Details

    • Type: Bug
    • Status: Resolved
    • Priority: Minor
    • Resolution: Not A Problem
    • Affects Version/s: 0.20.2
    • Fix Version/s: None
    • Component/s: fs
    • Labels: None
    • Environment: Cloudera CDH3B4 in pseudo mode on a Linux 2.6.32-28-generic #55-Ubuntu SMP x86_64 kernel, with Java HotSpot 64-Bit Server VM (build 19.1-b02, mixed mode)

    Description

copyFromLocalFile crashes if a checksum file exists on the local filesystem and the checksum does not match the file content. This will, for example, crash "hadoop fs -put ./foo ./foo" with a non-descriptive error.

      It is therefore not possible to do:

      1. copyToLocalFile(hdfsFile, localFile) // creates checksum file
      2. modify localFile
      3. copyFromLocalFile(localFile, hdfsFile) // uses old checksum

      Solution: do not reuse checksum files, or add a parameter to copyFromLocalFile that specifies that checksum files should not be reused.
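The failure mode in the three steps above can be illustrated with a minimal Python sketch (not Hadoop's actual code; Hadoop's ChecksumFileSystem stores per-chunk CRC32 checksums in a hidden ".<name>.crc" sidecar file, simplified here to one CRC32 over the whole file):

```python
import zlib
from pathlib import Path


def crc_file(local: Path) -> Path:
    # Hadoop-style hidden sidecar checksum file: ".<name>.crc"
    return local.with_name("." + local.name + ".crc")


def copy_to_local(data: bytes, local: Path) -> None:
    # Step 1: copying to the local filesystem writes the file
    # plus a sidecar checksum file.
    local.write_bytes(data)
    crc_file(local).write_text(str(zlib.crc32(data)))


def copy_from_local(local: Path) -> bytes:
    # Step 3: on upload, an existing sidecar checksum is reused and
    # verified against the current content. If the file was modified
    # in between (step 2), the stale checksum no longer matches and
    # the copy fails -- the crash described in this report.
    data = local.read_bytes()
    crc = crc_file(local)
    if crc.exists() and int(crc.read_text()) != zlib.crc32(data):
        raise IOError("Checksum error: stale checksum file for " + local.name)
    return data
```

Deleting the sidecar checksum file before the upload (so no stale checksum is reused) is the practical workaround this sketch suggests; in Hadoop itself, disabling local checksums via FileSystem.setWriteChecksum(false) before the copyToLocalFile avoids creating the sidecar file in the first place.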

      Attachments

        Activity

          People

            Assignee: Unassigned
            Reporter: Lars Ailo Bongo (larsab)
            Votes: 1
            Watchers: 7

            Dates

              Created:
              Updated:
              Resolved: