HADOOP-32: Creating job with InputDir set to non-existent directory locks up JobTracker


Details

    • Type: Bug
    • Status: Closed
    • Priority: Major
    • Resolution: Duplicate
    • Affects Version/s: None
    • Fix Version/s: 0.4.0
    • Component/s: None
    • Labels: None
    • Environment: hadoop-trunk, running distributed map reduce

Description

    At the very least, this affects anything using the default listFiles() from InputFormatBase. If no files are enumerated, an exception is thrown, but the JobTracker keeps attempting to run listFiles() for the job. Trying to stop the job with hadoop job -kill job_name only results in timeouts, and jobs started afterwards also make no progress. This happens every time with, for example, "wordcount" and a non-existent input path.
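
    As an illustration of a client-side guard (not the fix applied for this issue), the sketch below checks that the input directory exists before the job is submitted, so the failure surfaces in the client instead of locking up the JobTracker. It uses the Hadoop FileSystem API; the class name and default input path are hypothetical.

        import java.io.IOException;

        import org.apache.hadoop.conf.Configuration;
        import org.apache.hadoop.fs.FileSystem;
        import org.apache.hadoop.fs.Path;

        // Illustrative pre-submission check: fail fast in the client when the
        // input directory does not exist, instead of letting the JobTracker
        // discover the problem while enumerating input files.
        public class InputDirCheck {
            public static void main(String[] args) throws IOException {
                Configuration conf = new Configuration();
                // Hypothetical input path; in practice this comes from the job's arguments.
                Path inputDir = new Path(args.length > 0 ? args[0] : "/user/example/wordcount/input");

                FileSystem fs = inputDir.getFileSystem(conf);
                if (!fs.exists(inputDir)) {
                    // Refuse to submit the job rather than hand the JobTracker
                    // an input path it can never enumerate.
                    System.err.println("Input directory does not exist: " + inputDir);
                    System.exit(1);
                }
                // ... configure and submit the job as usual ...
            }
        }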


People

    Assignee: Owen O'Malley (omalley)
    Reporter: Bryan Pendleton (bpendleton)
    Votes: 0
    Watchers: 0
