Sqoop (Retired) / SQOOP-3482

sqoop-import fails when importing records with datatype 'Money' in PostgreSQL


Details

    • Type: Bug
    • Status: Open
    • Priority: Minor
    • Resolution: Unresolved
    • Affects Version/s: 1.4.7
    • Fix Version/s: None
    • Component/s: connectors/postgresql
    • Labels: None

    Description

      Issue description:
      We are trying to sqoop-import records from a PostgreSQL server table that contains a field of datatype 'money'.
      During the import, the PostgreSQL JDBC driver tries to read this field as a 'double'. Because the money value comes back as a locale-formatted string (e.g. 100,000.00), the read fails and the record cannot be imported; the exception is:
      Bad value for type double : 100,000.00
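
      As a quick check outside Sqoop (a sketch, assuming psql access to the source database and the table defined below), casting the column away from 'money' already avoids the formatted value:

      -- trans_amount renders with grouping separators (e.g. 100,000.00), which the
      -- driver's getDouble() cannot parse; the ::numeric cast returns a plain number.
      SELECT trans_amount, trans_amount::numeric FROM tbl_cust;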

      Steps to reproduce the behaviour:

      Environment details -
      Tried with Hadoop 2.7.1 and 3.1.4
      Sqoop versions 1.4.6 and 1.4.7
      PostgreSQL JDBC driver: postgresql-42.2.16.jar
      PostgreSQL 12.4
      OS: Ubuntu 20.04.1 LTS

      Database Name : stg_db
      Table Name : tbl_cust
      Columns and DataType :
      cust_f_name varchar(25),
      cust_l_name varchar(25),
      trans_amount money
      The table contains one row, inserted as below:
      insert into tbl_cust values ('VR','Kumar',100000)
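
      For completeness, a minimal DDL sketch that recreates the table and the row in one go (assumed: no constraints or defaults beyond what is listed above):

      create table tbl_cust (
        cust_f_name varchar(25),
        cust_l_name varchar(25),
        trans_amount money
      );
      insert into tbl_cust values ('VR','Kumar',100000);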

      Sqoop statement (connection string):
      sqoop-import --connect jdbc:postgresql://192.168.0.130:5432/stg_db \
        --username postgres --password cornerstone \
        --table tbl_cust -m 1 --target-dir tbl_cust --delete-target-dir
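
      A possible workaround (not verified here): Sqoop's --map-column-java option can force the generated record class to read the money column as a String instead of a double, for example:

      sqoop-import --connect jdbc:postgresql://192.168.0.130:5432/stg_db \
        --username postgres --password cornerstone \
        --table tbl_cust -m 1 --target-dir tbl_cust --delete-target-dir \
        --map-column-java trans_amount=String

      Alternatively, a free-form import via --query that selects trans_amount::numeric should keep the money type away from the JDBC driver entirely.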

      Below is the log message -
      20/09/08 14:24:23 INFO mapreduce.Job: Task Id : attempt_1599505663642_0011_m_000000_0, Status : FAILED
      Error: java.io.IOException: SQLException in nextKeyValue
      at org.apache.sqoop.mapreduce.db.DBRecordReader.nextKeyValue(DBRecordReader.java:277)
      at org.apache.hadoop.mapred.MapTask$NewTrackingRecordReader.nextKeyValue(MapTask.java:556)
      at org.apache.hadoop.mapreduce.task.MapContextImpl.nextKeyValue(MapContextImpl.java:80)
      at org.apache.hadoop.mapreduce.lib.map.WrappedMapper$Context.nextKeyValue(WrappedMapper.java:91)
      at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:145)
      at org.apache.sqoop.mapreduce.AutoProgressMapper.run(AutoProgressMapper.java:64)
      at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:787)
      at org.apache.hadoop.mapred.MapTask.run(MapTask.java:341)
      at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:164)
      at java.security.AccessController.doPrivileged(Native Method)
      at javax.security.auth.Subject.doAs(Subject.java:422)
      at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1657)
      at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:158)
      Caused by: org.postgresql.util.PSQLException: Bad value for type double : 100,000.00
      at org.postgresql.jdbc.PgResultSet.toDouble(PgResultSet.java:3104)
      at org.postgresql.jdbc.PgResultSet.getDouble(PgResultSet.java:2432)
      at org.apache.sqoop.lib.JdbcWritableBridge.readDouble(JdbcWritableBridge.java:86)
      at com.cloudera.sqoop.lib.JdbcWritableBridge.readDouble(JdbcWritableBridge.java:69)
      at tabletodrop.readFields(tabletodrop.java:106)
      at org.apache.sqoop.mapreduce.db.DBRecordReader.nextKeyValue(DBRecordReader.java:244)
      ... 12 more

      However, other tables that do not have any field with datatype 'money' are imported without any issues.
      Screenshots of the record in the PostgreSQL table and of the log details are attached for reference.

      Attachments

        1. 1.JPG (123 kB, Vinodh Kumar R)
        2. 2.JPG (40 kB, Vinodh Kumar R)

      People

        Assignee: Unassigned
        Reporter: Vinodh Kumar R (vinodhrk)
        Votes: 0
        Watchers: 1
