Sqoop / SQOOP-443

Calling Sqoop with Hive import fails when run multiple times due to the retained output directory

    Details

    • Type: Improvement
    • Status: Resolved
    • Priority: Minor
    • Resolution: Fixed
    • Affects Version/s: 1.4.0-incubating, 1.4.1-incubating
    • Fix Version/s: 1.4.2
    • Component/s: None
    • Labels: None

      Description

      Hive does not remove the input directory of a "LOAD DATA" command in all cases. That input directory is actually Sqoop's export directory. Because the directory is left behind, running the same Sqoop command a second time fails with "org.apache.hadoop.mapred.FileAlreadyExistsException: Output directory $table already exists".
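
      To make the failure concrete, here is a minimal sketch of the sequence. The connection string, credentials, and table name (employees) are hypothetical, and the export directory is assumed to take Sqoop's default of a directory named after the table under the importing user's HDFS home directory:

          # First run: rows are imported into the export directory and then
          # loaded into Hive via LOAD DATA; the directory may be left behind.
          sqoop import \
            --connect jdbc:mysql://db.example.com/corp \
            --username someuser -P \
            --table employees \
            --hive-import

          # Second run of the exact same command fails before the MapReduce
          # job starts, roughly with:
          #   org.apache.hadoop.mapred.FileAlreadyExistsException:
          #     Output directory employees already exists
          sqoop import \
            --connect jdbc:mysql://db.example.com/corp \
            --username someuser -P \
            --table employees \
            --hive-import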

      The issue can be worked around by removing the directory manually, as sketched below, but that places an unnecessary burden on users. It also complicates running saved jobs, since an extra cleanup script has to be executed.
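
      As an illustration of the manual workaround mentioned above (using the same hypothetical user, path, and table name as in the sketch earlier), the leftover directory is removed before re-running the job:

          # Remove the retained export directory (older Hadoop releases use
          # "hadoop fs -rmr" instead of "-rm -r").
          hadoop fs -rm -r /user/someuser/employees

          # The same import command can then be executed again.
          sqoop import --connect jdbc:mysql://db.example.com/corp \
            --username someuser -P --table employees --hive-import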

        Attachments

        1. SQOOP-443.patch
          3 kB
          Jarek Jarcec Cecho
        2. SQOOP-443.patch
          3 kB
          Jarek Jarcec Cecho

          Activity

            People

            • Assignee: jarcec (Jarek Jarcec Cecho)
            • Reporter: jarcec (Jarek Jarcec Cecho)
            • Votes: 0
            • Watchers: 0

              Dates

              • Created:
              • Updated:
              • Resolved: