Hadoop Common
HADOOP-5887

Sqoop should create tables in Hive metastore after importing to HDFS


    • Type: New Feature
    • Status: Closed
    • Priority: Major
    • Resolution: Fixed
    • Affects Version/s: None
    • Fix Version/s: 0.21.0
    • Component/s: None
    • Release Note:
      New Sqoop argument --hive-import facilitates loading data into Hive.


      Sqoop (HADOOP-5815) imports tables into HDFS; a straightforward enhancement is to then generate a Hive DDL statement that recreates the table definition in the Hive metastore and moves the imported data from its upload target into the Hive warehouse directory.

      This feature enhancement makes that process automatic. An import is performed with Sqoop in the usual way; supplying the argument "--hive-import" causes it to then issue CREATE TABLE and LOAD DATA statements to a Hive shell. Sqoop generates a script file and attempts to run "$HIVE_HOME/bin/hive" on it, falling back to any "hive" binary on the $PATH; $HIVE_HOME can be overridden with --hive-home. As a result, no direct linking against Hive is necessary.
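A sketch of what an invocation might look like; the JDBC connect string, database, and table name below are placeholders, while --hive-import and --hive-home are the arguments described in this issue:

```shell
# Hypothetical import: pull the "employees" table into HDFS, then load it
# into Hive automatically. Connect string and table name are illustrative.
sqoop --connect jdbc:mysql://db.example.com/corp \
      --table employees \
      --hive-import \
      --hive-home /opt/hive
```

With --hive-home given, Sqoop would look for the Hive shell at /opt/hive/bin/hive instead of consulting $HIVE_HOME or the $PATH.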

      The unit tests provided with this enhancement use a mock implementation of 'bin/hive' that compares the script it is fed with one from a directory of "expected" scripts; the exact expected script is selected via an environment variable. The mock does not load data into a real Hive metastore, but manual testing has shown that the process works in practice, so the mock implementation is a reasonable unit-testing tool.
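The mock-hive idea can be sketched as a small shell helper (function and file names here are hypothetical, not taken from the patch): instead of executing Hive, the mock compares the script Sqoop feeds it against a known-good expected script.

```shell
# Minimal sketch of the mock comparison: exit status 0 if the generated
# script is byte-identical to the expected one, non-zero otherwise.
scripts_match() {
  diff -q "$1" "$2" >/dev/null
}
```

A test harness in this style would export an environment variable naming the expected script, then inspect the mock's exit status to pass or fail the test case.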

      1. HADOOP-5887.2.patch
        52 kB
        Aaron Kimball
      2. HADOOP-5887.patch
        52 kB
        Aaron Kimball




            • Assignee:
              Aaron Kimball
            • Votes: 1
            • Watchers: 1


              • Created: