Hive / HIVE-25567

Cannot import table from Tibero to Hive.


Details

    • Type: Bug
    • Status: Open
    • Priority: Major
    • Resolution: Unresolved
    • Affects Version/s: 3.1.0, 3.1.3
    • Component/s: CLI, Hive

    Description

      I am trying to import a table from the Tibero RDBMS into Hive. `sqoop eval`, `sqoop list-tables`, and `sqoop list-databases` all work, but `sqoop import` fails with the error below.

      ```

      sqoop import "-Dorg.apache.sqoop.splitter.allow_text_splitter=true" \
      > --connect "jdbc:tibero:thin:@hostname:8629:DB" \
      > --driver com.tmax.tibero.jdbc.TbDriver \
      > --username XXX --password XXX@2021 \
      > --split-by 'DORG_LATITUDE' \
      > --table DMSDBA.CMM_CADORG_TB \
      > --fields-terminated-by "," \
      > --hive-import \
      > --create-hive-table \
      > --hive-table DMSDBA_raw.CMM_CADORG_TB2 \
      > --hive-overwrite
      Warning: /usr/hdp/3.0.1.0-187/accumulo does not exist! Accumulo imports will fail.
      Please set $ACCUMULO_HOME to the root of your Accumulo installation.
      SLF4J: Class path contains multiple SLF4J bindings.
      SLF4J: Found binding in [jar:file:/usr/hdp/3.0.1.0-187/hadoop/lib/slf4j-log4j12-1.7.25.jar!/org/slf4j/impl/StaticLoggerBinder.class]
      SLF4J: Found binding in [jar:file:/usr/hdp/3.0.1.0-187/hive/lib/log4j-slf4j-impl-2.10.0.jar!/org/slf4j/impl/StaticLoggerBinder.class]
      SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
      SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
      21/09/28 09:54:10 INFO sqoop.Sqoop: Running Sqoop version: 1.4.8.3.0.1.0-187
      21/09/28 09:54:10 WARN tool.BaseSqoopTool: Setting your password on the command-line is insecure. Consider using -P instead.
      21/09/28 09:54:10 WARN sqoop.ConnFactory: Parameter --driver is set to an explicit driver however appropriate connection manager is not being set (via --connection-manager). Sqoop is going to fall back to org.apache.sqoop.manager.GenericJdbcManager. Please specify explicitly which connection manager should be used next time.
      21/09/28 09:54:10 INFO manager.SqlManager: Using default fetchSize of 1000
      21/09/28 09:54:10 INFO tool.CodeGenTool: Beginning code generation
      21/09/28 09:54:10 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM DMSDBA.CMM_CADORG_TB AS t WHERE 1=0
      21/09/28 09:54:10 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM DMSDBA.CMM_CADORG_TB AS t WHERE 1=0
      21/09/28 09:54:10 INFO orm.CompilationManager: HADOOP_MAPRED_HOME is /usr/hdp/3.0.1.0-187/hadoop-mapreduce
      21/09/28 09:54:12 INFO orm.CompilationManager: Writing jar file: /tmp/sqoop-hdfs/compile/159d1a61233887f50112dd4fb3fb129b/DMSDBA.CMM_CADORG_TB.jar
      21/09/28 09:54:13 INFO mapreduce.ImportJobBase: Beginning import of DMSDBA.CMM_CADORG_TB
      21/09/28 09:54:13 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM DMSDBA.CMM_CADORG_TB AS t WHERE 1=0
      21/09/28 09:54:14 INFO client.AHSProxy: Connecting to Application History server at itisbdp/10.107.7.20:10200
      21/09/28 09:54:14 INFO mapreduce.JobResourceUploader: Disabling Erasure Coding for path: /user/hdfs/.staging/job_1630059382070_0035
      21/09/28 09:54:16 INFO db.DBInputFormat: Using read commited transaction isolation
      21/09/28 09:54:16 INFO db.DataDrivenDBInputFormat: BoundingValsQuery: SELECT MIN(DORG_LATITUDE), MAX(DORG_LATITUDE) FROM DMSDBA.CMM_CADORG_TB
      21/09/28 09:54:16 WARN db.TextSplitter: Generating splits for a textual index column.
      21/09/28 09:54:16 WARN db.TextSplitter: If your database sorts in a case-insensitive order, this may result in a partial import or duplicate records.
      21/09/28 09:54:16 WARN db.TextSplitter: You are strongly encouraged to choose an integral split column.
      21/09/28 09:54:16 INFO mapreduce.JobSubmitter: number of splits:4
      21/09/28 09:54:16 INFO mapreduce.JobSubmitter: Submitting tokens for job: job_1630059382070_0035
      21/09/28 09:54:16 INFO mapreduce.JobSubmitter: Executing with tokens: []
      21/09/28 09:54:16 INFO conf.Configuration: found resource resource-types.xml at file:/etc/hadoop/3.0.1.0-187/0/resource-types.xml
      21/09/28 09:54:16 INFO impl.YarnClientImpl: Submitted application application_1630059382070_0035
      21/09/28 09:54:16 INFO mapreduce.Job: The url to track the job: http://itisbdp:8088/proxy/application_1630059382070_0035/
      21/09/28 09:54:16 INFO mapreduce.Job: Running job: job_1630059382070_0035
      21/09/28 09:54:21 INFO mapreduce.Job: Job job_1630059382070_0035 running in uber mode : false
      21/09/28 09:54:21 INFO mapreduce.Job: map 0% reduce 0%
      21/09/28 09:54:25 INFO mapreduce.Job: Task Id : attempt_1630059382070_0035_m_000001_0, Status : FAILED
      Error: java.io.IOException: SQLException in nextKeyValue
      at org.apache.sqoop.mapreduce.db.DBRecordReader.nextKeyValue(DBRecordReader.java:275)
      at org.apache.hadoop.mapred.MapTask$NewTrackingRecordReader.nextKeyValue(MapTask.java:568)
      at org.apache.hadoop.mapreduce.task.MapContextImpl.nextKeyValue(MapContextImpl.java:80)
      at org.apache.hadoop.mapreduce.lib.map.WrappedMapper$Context.nextKeyValue(WrappedMapper.java:91)
      at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:145)
      at org.apache.sqoop.mapreduce.AutoProgressMapper.run(AutoProgressMapper.java:64)
      at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:799)
      at org.apache.hadoop.mapred.MapTask.run(MapTask.java:347)
      at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:174)
      at java.security.AccessController.doPrivileged(Native Method)
      at javax.security.auth.Subject.doAs(Subject.java:422)
      at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1730)
      at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:168)
      Caused by: java.sql.SQLException: JDBC-8022:Invalid end of SQL.
      at line 1, column 1430 of null:
      BLED FROM DMSDBA.CMM_CADORG_TB AS DMSDBA.CMM_CADORG_TB WHERE ( DORG_LATITUDE >=
      ^
      at com.tmax.tibero.jdbc.err.TbError.makeSQLException(Unknown Source)
      at com.tmax.tibero.jdbc.err.TbError.newSQLException(Unknown Source)
      at com.tmax.tibero.jdbc.msg.common.TbMsgError.readErrorStackInfo(Unknown Source)
      at com.tmax.tibero.jdbc.msg.TbMsgEreply.deserialize(Unknown Source)
      at com.tmax.tibero.jdbc.comm.TbStream.readMsg(Unknown Source)
      at com.tmax.tibero.jdbc.comm.TbCommType4.prepareExecute(Unknown Source)
      at com.tmax.tibero.jdbc.driver.TbPreparedStatementImpl.executeCompleteSQL(Unknown Source)
      at com.tmax.tibero.jdbc.driver.TbPreparedStatementImpl.executeInternal(Unknown Source)
      at com.tmax.tibero.jdbc.driver.TbPreparedStatementImpl.executeQuery(Unknown Source)
      at com.tmax.tibero.jdbc.driver.TbPreparedStatement.executeQuery(Unknown Source)
      at org.apache.sqoop.mapreduce.db.DBRecordReader.executeQuery(DBRecordReader.java:109)
      at org.apache.sqoop.mapreduce.db.DBRecordReader.nextKeyValue(DBRecordReader.java:233)
      ... 12 more

      21/09/28 09:54:25 INFO mapreduce.Job: Task Id : attempt_1630059382070_0035_m_000003_0, Status : FAILED
      Error: java.io.IOException: SQLException in nextKeyValue
      at org.apache.sqoop.mapreduce.db.DBRecordReader.nextKeyValue(DBRecordReader.java:275)
      at org.apache.hadoop.mapred.MapTask$NewTrackingRecordReader.nextKeyValue(MapTask.java:568)
      at org.apache.hadoop.mapreduce.task.MapContextImpl.nextKeyValue(MapContextImpl.java:80)
      at org.apache.hadoop.mapreduce.lib.map.WrappedMapper$Context.nextKeyValue(WrappedMapper.java:91)
      at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:145)
      at org.apache.sqoop.mapreduce.AutoProgressMapper.run(AutoProgressMapper.java:64)
      at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:799)
      at org.apache.hadoop.mapred.MapTask.run(MapTask.java:347)
      at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:174)
      at java.security.AccessController.doPrivileged(Native Method)
      at javax.security.auth.Subject.doAs(Subject.java:422)
      at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1730)

      ```
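The failure looks specific to how the fallback GenericJdbcManager builds the per-split data query: the statement in the stack trace aliases the table with its own schema-qualified name (`... FROM DMSDBA.CMM_CADORG_TB AS DMSDBA.CMM_CADORG_TB ...`), and Tibero rejects the dotted alias with `JDBC-8022: Invalid end of SQL`. A minimal sketch of the difference (hypothetical helper names, not Sqoop source code):

```python
# Hypothetical sketch, NOT Sqoop source: it only mimics the shape of the
# per-split query seen in the log, to show why Tibero rejects it.

def split_query(table: str, split_col: str, lo, hi) -> str:
    # The generic JDBC path aliases the table with its own name, so a
    # schema-qualified --table value yields an illegal dotted alias:
    #   ... FROM DMSDBA.CMM_CADORG_TB AS DMSDBA.CMM_CADORG_TB WHERE ...
    return (f"SELECT * FROM {table} AS {table} "
            f"WHERE ( {split_col} >= {lo!r} ) AND ( {split_col} < {hi!r} )")

def safe_split_query(table: str, split_col: str, lo, hi) -> str:
    # One form Tibero would accept: a plain, dot-free alias.
    return (f"SELECT * FROM {table} t "
            f"WHERE ( t.{split_col} >= {lo!r} ) AND ( t.{split_col} < {hi!r} )")

broken = split_query("DMSDBA.CMM_CADORG_TB", "DORG_LATITUDE", "A", "M")
fixed = safe_split_query("DMSDBA.CMM_CADORG_TB", "DORG_LATITUDE", "A", "M")
print(broken)
print(fixed)
```

A commonly reported workaround for this class of GenericJdbcManager problem is to replace `--table` with a free-form query, e.g. `--query 'SELECT * FROM DMSDBA.CMM_CADORG_TB WHERE $CONDITIONS'` together with `--target-dir`, so Sqoop never generates the dotted alias itself; this has not been verified against Tibero here.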

      Attachments

        Activity

          People

            Assignee: 00ber Sushant Karki
            Reporter: iamravide Ravi Kumar
            Votes: 0
            Watchers: 1

            Dates

              Created:
              Updated:

              Time Tracking

                Original Estimate: 12h
                Remaining Estimate: 12h
                Time Spent: Not Specified