[TIKA-2123] CommonsDigester calculates wrong hashes on large files


Details

    • Type: Bug
    • Status: Resolved
    • Priority: Major
    • Resolution: Fixed
    • Affects Version/s: 1.13
    • Fix Version/s: 1.14, 2.0.0
    • Component/s: metadata
    • Labels: None

    Description

      When more than one algorithm is passed to the CommonsDigester constructor
      and a file larger than 7.5 MB is then digested, the calculated hashes are
      wrong for all algorithms except the first.

      The following code reproduces the bug:

      import java.io.BufferedInputStream;
      import java.io.File;
      import java.io.FileInputStream;

      import org.apache.tika.metadata.Metadata;
      import org.apache.tika.parser.utils.CommonsDigester;

      // The file that was used was a simple plain-text file with size > 7.5 MB
      File file = new File("testLargeFile.txt");

      BufferedInputStream bufferedInputStream = new BufferedInputStream(new FileInputStream(file));

      Metadata metadata = new Metadata();

      CommonsDigester digester = new CommonsDigester(20000000,
              CommonsDigester.DigestAlgorithm.MD5,
              CommonsDigester.DigestAlgorithm.SHA1,
              CommonsDigester.DigestAlgorithm.SHA256);

      digester.digest(bufferedInputStream, metadata, null);

      // Prints the correct MD5 but wrong SHA1 and SHA256 values
      System.out.println(metadata);

      Initial direction: it seems that the inner buffered stream is not reset
      back to position 0 after the first algorithm finishes, so each subsequent
      algorithm digests the stream from the wrong position.
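
      For illustration, here is a minimal sketch of the multi-algorithm pattern
      using plain java.security rather than Tika's actual implementation (the
      file name and the 20000000 mark limit are taken from the repro above):
      each digest pass consumes the stream, so it must be mark()ed and reset()
      between passes, and the mark limit has to cover the whole file or reset()
      will fail.

      import java.io.BufferedInputStream;
      import java.io.FileInputStream;
      import java.io.IOException;
      import java.io.InputStream;
      import java.security.MessageDigest;
      import java.security.NoSuchAlgorithmException;

      public class MultiDigestSketch {

          // Reads the stream to the end and returns the digest for one algorithm.
          static byte[] digest(InputStream in, String algorithm)
                  throws IOException, NoSuchAlgorithmException {
              MessageDigest md = MessageDigest.getInstance(algorithm);
              byte[] buffer = new byte[8192];
              int n;
              while ((n = in.read(buffer)) != -1) {
                  md.update(buffer, 0, n);
              }
              return md.digest();
          }

          static String toHex(byte[] bytes) {
              StringBuilder sb = new StringBuilder();
              for (byte b : bytes) {
                  sb.append(String.format("%02x", b & 0xff));
              }
              return sb.toString();
          }

          public static void main(String[] args) throws Exception {
              // Mark limit taken from the repro above; it must exceed the file
              // size, otherwise reset() throws "Resetting to invalid mark".
              // (This sketch buffers the whole file in memory.)
              int markLimit = 20000000;
              try (BufferedInputStream in = new BufferedInputStream(
                      new FileInputStream("testLargeFile.txt"))) {
                  for (String algorithm : new String[]{"MD5", "SHA-1", "SHA-256"}) {
                      in.mark(markLimit);                  // remember position 0
                      byte[] hash = digest(in, algorithm); // consumes the stream
                      in.reset();                          // rewind before the next pass
                      System.out.println(algorithm + ": " + toHex(hash));
                  }
              }
          }
      }

      If the stream were not rewound after the first pass, SHA-1 and SHA-256
      would digest an already-exhausted or mid-file stream, which would match
      the behavior reported above.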


          People

            Assignee: Unassigned
            Reporter: Yahav Amsalem (yahavamsi)
            Votes: 0
            Watchers: 3
