Sqoop / SQOOP-2192

Sqoop import/export for an ORC-file Hive table failing


    Details

    • Type: Bug
    • Status: Open
    • Priority: Major
    • Resolution: Unresolved
    • Affects Version/s: 1.4.5
    • Fix Version/s: None
    • Component/s: hive-integration
    • Labels:
      None
    • Environment:

      Hadoop 2.6.0
      Hive 1.0.0
      Sqoop 1.4.5

      Description

      We are trying to import an RDBMS table into a Hive table so that we can run Hive DELETE and UPDATE queries on the imported data. For Hive to support DELETE and UPDATE queries, the following is required:
      1. The table must be declared with the transactional property
      2. The table must be stored in ORC format
      3. The table must be bucketed
      To do that, I created the Hive table as follows:
      create table bookinfo (
        md5 STRING, isbn STRING, bookid STRING, booktitle STRING,
        author STRING, yearofpub STRING, publisher STRING,
        imageurls STRING, imageurlm STRING, imageurll STRING,
        price DOUBLE, totalrating DOUBLE, totalusers BIGINT,
        maxrating INT, minrating INT, avgrating DOUBLE,
        rawscore DOUBLE, norm_score DOUBLE)
      clustered by (md5) into 10 buckets
      stored as orc
      TBLPROPERTIES ('transactional'='true');
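      (Editor's note, not part of the original report: Hive ACID tables also require the transaction manager to be enabled on the Hive side. Roughly the following session/hive-site.xml settings apply around Hive 1.0; exact keys and defaults vary by version, so treat this as a sketch:)

      ```sql
      -- Settings commonly required for Hive ACID tables (Hive ~1.0 era;
      -- a sketch only -- verify against your Hive version's docs):
      SET hive.support.concurrency = true;
      SET hive.txn.manager = org.apache.hadoop.hive.ql.lockmgr.DbTxnManager;
      SET hive.enforce.bucketing = true;        -- pre-Hive-2.0 only
      SET hive.exec.dynamic.partition.mode = nonstrict;
      SET hive.compactor.initiator.on = true;   -- metastore side
      SET hive.compactor.worker.threads = 1;    -- metastore side
      ```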

      Then I ran the Sqoop import:
      sqoop import --verbose --connect 'RDBMS_JDBC_URL' --driver JDBC_DRIVER --table bookinfo --null-string '\\N' --null-non-string '\\N' --username USER --password PASSWORD --hcatalog-database hive_test_trans --hcatalog-table bookinfo --hcatalog-storage-stanza "stored as orc" -m 1

      The following exception is thrown:
      15/03/09 16:28:59 ERROR tool.ImportTool: Encountered IOException running import job: org.apache.hive.hcatalog.common.HCatException : 2016 : Error operation not supported : Store into a partition with bucket definition from Pig/Mapreduce is not supported
      at org.apache.hive.hcatalog.mapreduce.HCatOutputFormat.setOutput(HCatOutputFormat.java:109)
      at org.apache.hive.hcatalog.mapreduce.HCatOutputFormat.setOutput(HCatOutputFormat.java:70)
      at org.apache.sqoop.mapreduce.hcat.SqoopHCatUtilities.configureHCat(SqoopHCatUtilities.java:339)
      at org.apache.sqoop.mapreduce.hcat.SqoopHCatUtilities.configureImportOutputFormat(SqoopHCatUtilities.java:753)
      at org.apache.sqoop.mapreduce.ImportJobBase.configureOutputFormat(ImportJobBase.java:98)
      at org.apache.sqoop.mapreduce.ImportJobBase.runImport(ImportJobBase.java:240)
      at org.apache.sqoop.manager.SqlManager.importTable(SqlManager.java:665)
      at org.apache.sqoop.tool.ImportTool.importTable(ImportTool.java:497)
      at org.apache.sqoop.tool.ImportTool.run(ImportTool.java:601)
      at org.apache.sqoop.Sqoop.run(Sqoop.java:143)
      at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
      at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:179)
      at org.apache.sqoop.Sqoop.runTool(Sqoop.java:218)
      at org.apache.sqoop.Sqoop.runTool(Sqoop.java:227)
      at org.apache.sqoop.Sqoop.main(Sqoop.java:236)
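      (Editor's note, not part of the original report: the error comes from HCatOutputFormat rejecting bucketed target tables, so a possible workaround is to import into a plain, non-bucketed ORC staging table first and then copy into the transactional table from Hive, which applies the bucketing on insert. This is a sketch only; the staging table name bookinfo_staging is a placeholder, and the connection arguments follow the report's placeholders:)

      ```shell
      # 1. Sqoop into a non-bucketed, non-transactional staging table,
      #    which HCatOutputFormat accepts.
      sqoop import --connect 'RDBMS_JDBC_URL' --driver JDBC_DRIVER \
        --table bookinfo --username USER --password PASSWORD \
        --null-string '\\N' --null-non-string '\\N' \
        --hcatalog-database hive_test_trans --hcatalog-table bookinfo_staging \
        --hcatalog-storage-stanza "stored as orc" -m 1

      # 2. Copy from the staging table into the bucketed transactional table;
      #    Hive handles the bucketing during the insert.
      hive -e "INSERT INTO TABLE hive_test_trans.bookinfo
               SELECT * FROM hive_test_trans.bookinfo_staging;"
      ```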

      Please let me know if any further details are required.

        Attachments

          Activity

            People

            • Assignee: venkatnrangan Venkat Ranganathan
            • Reporter: suniluiit Sunil Kumar
            • Votes: 6
            • Watchers: 15