Hadoop Map/Reduce
MAPREDUCE-4680

Job history cleaner should only check timestamps of files in old enough directories


    Details

    • Type: Bug
    • Status: Closed
    • Priority: Major
    • Resolution: Fixed
    • Affects Version/s: 2.0.0-alpha
    • Fix Version/s: 2.3.0
    • Component/s: jobhistoryserver
    • Labels: None
    • Hadoop Flags: Reviewed

      Description

      Job history files are stored in yyyy/mm/dd folders. Currently, the job history cleaner checks the modification date of each file in every one of these folders to see whether it's past the maximum age. The load on HDFS could be reduced by only checking the ages of files in directories that are old enough, as determined by their name.
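      The sketch below illustrates the directory-level check the description proposes: parse the yyyy/mm/dd directory name into a date, skip directories that are too recent to contain expired files, and only list and age-check files in directories that pass. This is a minimal illustration, not the attached patch; the helper names (isDirOldEnough, cleanDirectory) and the maxAgeMs/nowMs parameters are assumptions for the example, not the actual HistoryFileManager API.

      import java.time.LocalDate;
      import java.time.ZoneOffset;
      import java.time.format.DateTimeFormatter;

      import org.apache.hadoop.fs.FileStatus;
      import org.apache.hadoop.fs.FileSystem;
      import org.apache.hadoop.fs.Path;

      public class HistoryCleanerSketch {

        private static final DateTimeFormatter DIR_FORMAT =
            DateTimeFormatter.ofPattern("yyyy/MM/dd");

        /** Returns true if a yyyy/mm/dd directory is old enough that it could
         *  contain files past the maximum age, so listing it is worthwhile. */
        static boolean isDirOldEnough(String yyyyMmDd, long maxAgeMs, long nowMs) {
          LocalDate dirDate = LocalDate.parse(yyyyMmDd, DIR_FORMAT);
          // Files in this directory were written no later than the end of that day.
          long endOfDayMs = dirDate.plusDays(1)
              .atStartOfDay(ZoneOffset.UTC).toInstant().toEpochMilli();
          return nowMs - endOfDayMs > maxAgeMs;
        }

        /** Deletes expired files, but only lists directories whose name-based
         *  check passes, sparing HDFS the per-file checks everywhere else. */
        static void cleanDirectory(FileSystem fs, Path dayDir, String yyyyMmDd,
                                   long maxAgeMs, long nowMs) throws Exception {
          if (!isDirOldEnough(yyyyMmDd, maxAgeMs, nowMs)) {
            return; // too recent to hold any expired files; skip the listing
          }
          for (FileStatus stat : fs.listStatus(dayDir)) {
            if (nowMs - stat.getModificationTime() > maxAgeMs) {
              fs.delete(stat.getPath(), false);
            }
          }
        }
      }

      Under this scheme the per-file modification-time checks still run, but only inside directories whose date-derived name shows they could hold expired files, which is what reduces the load on HDFS.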

        Attachments

        1. MAPREDUCE-4680.patch
          14 kB
          Robert Kanter
        2. MAPREDUCE-4680.patch
          14 kB
          Robert Kanter
        3. MAPREDUCE-4680.patch
          14 kB
          Robert Kanter
        4. MAPREDUCE-4680.patch
          15 kB
          Robert Kanter
        5. MAPREDUCE-4680.patch
          13 kB
          Robert Kanter


            People

            • Assignee: rkanter Robert Kanter
            • Reporter: sandyr Sandy Ryza
            • Votes: 0
            • Watchers: 9
