Sqoop / SQOOP-2567

Sqoop import for Oracle fails with invalid precision/scale for decimal

    Details

    • Type: Bug
    • Status: Resolved
    • Priority: Major
    • Resolution: Fixed
    • Affects Version/s: 1.4.5
    • Fix Version/s: 1.5.0
    • Component/s: connectors
    • Environment: CDH5.3

    Description

      Sqoop import fails while creating an Avro data file from an Oracle source with decimal data. If a column in Oracle is defined as, say, Col1 DECIMAL(12,11), but some of the data has fewer digits in the scale, the import fails with the following error:

      Error: org.apache.avro.file.DataFileWriter$AppendWriteException: org.apache.avro.AvroTypeException: Cannot encode decimal with scale 10 as scale 11
      at org.apache.avro.file.DataFileWriter.append(DataFileWriter.java:296)
      at org.apache.sqoop.mapreduce.AvroOutputFormat$1.write(AvroOutputFormat.java:112)
      at org.apache.sqoop.mapreduce.AvroOutputFormat$1.write(AvroOutputFormat.java:108)
      at org.apache.hadoop.mapred.MapTask$NewDirectOutputCollector.write(MapTask.java:655)
      at org.apache.hadoop.mapreduce.task.TaskInputOutputContextImpl.write(TaskInputOutputContextImpl.java:89)
      at org.apache.hadoop.mapreduce.lib.map.WrappedMapper$Context.write(WrappedMapper.java:112)
      at org.apache.sqoop.mapreduce.AvroImportMapper.map(AvroImportMapper.java:73)
      at org.apache.sqoop.mapreduce.AvroImportMapper.map(AvroImportMapper.java:39)
      at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:145)
      at org.apache.sqoop.mapreduce.AutoProgressMapper.run(AutoProgressMapper.java:64)
      at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:784)
      at org.apache.hadoop.mapred.MapTask.run(MapTask.java:341)
      at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:168)
      at java.security.AccessController.doPrivileged(Native Method)
      at javax.security.auth.Subject.doAs(Subject.java:415)
      at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1642)
      at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:163)
      Caused by: org.apache.avro.AvroTypeException: Cannot encode decimal with scale 10 as scale 11
      at org.apache.avro.Conversions$DecimalConversion.toBytes(Conversions.java:68)
      at org.apache.avro.Conversions$DecimalConversion.toBytes(Conversions.java:39)
      at org.apache.avro.generic.GenericDatumWriter.convert(GenericDatumWriter.java:90)
      at org.apache.avro.generic.GenericDatumWriter.write(GenericDatumWriter.java:70)
      at org.apache.avro.reflect.ReflectDatumWriter.write(ReflectDatumWriter.java:143)
      at org.apache.avro.generic.GenericDatumWriter.writeWithoutConversion(GenericDatumWriter.java:112)
      at org.apache.avro.generic.GenericDatumWriter.write(GenericDatumWriter.java:73)
      at org.apache.avro.reflect.ReflectDatumWriter.write(ReflectDatumWriter.java:143)
      at org.apache.avro.generic.GenericDatumWriter.writeField(GenericDatumWriter.java:153)
      at org.apache.avro.reflect.ReflectDatumWriter.writeField(ReflectDatumWriter.java:175)
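
      The failure appears to come from Avro's strict scale check in Conversions.DecimalConversion: the writer rejects any BigDecimal whose scale differs from the scale declared by the schema's decimal logical type, instead of padding it. A minimal sketch that reproduces the mismatch outside of Sqoop (assuming Avro 1.8+ on the classpath; the column definition and value are only illustrative):

      import java.math.BigDecimal;
      import java.nio.ByteBuffer;

      import org.apache.avro.Conversions;
      import org.apache.avro.LogicalTypes;
      import org.apache.avro.Schema;

      public class DecimalScaleRepro {
        public static void main(String[] args) {
          // Avro schema for an Oracle-style DECIMAL(12,11) column: bytes + decimal logical type.
          Schema decimalSchema = LogicalTypes.decimal(12, 11)
              .addToSchema(Schema.create(Schema.Type.BYTES));
          Conversions.DecimalConversion conversion = new Conversions.DecimalConversion();

          // This value has only 10 digits after the decimal point (scale 10),
          // while the schema declares scale 11.
          BigDecimal value = new BigDecimal("0.1234567891");

          try {
            conversion.toBytes(value, decimalSchema, decimalSchema.getLogicalType());
          } catch (org.apache.avro.AvroTypeException e) {
            // Prints: Cannot encode decimal with scale 10 as scale 11
            System.out.println(e.getMessage());
          }

          // Padding the value to the declared schema scale before encoding avoids the exception.
          ByteBuffer ok = conversion.toBytes(value.setScale(11), decimalSchema,
              decimalSchema.getLogicalType());
          System.out.println("encoded " + ok.remaining() + " bytes after setScale(11)");
        }
      }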

      Also, when no precision is defined for the column in Oracle (in which case it presumably defaults to (38,0)), the import fails with:

      ERROR tool.ImportTool: Imported Failed: Invalid decimal precision: 0 (must be positive)
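
      This second error presumably comes from Avro's validation of the decimal logical type: the JDBC driver reports precision 0 for an Oracle NUMBER column declared without precision, and Avro requires a positive precision when the decimal logical type is attached to a schema. A minimal sketch (assuming Avro 1.8+; the JDBC metadata values are illustrative):

      import org.apache.avro.LogicalTypes;
      import org.apache.avro.Schema;

      public class ZeroPrecisionRepro {
        public static void main(String[] args) {
          // Values as a JDBC driver might report them for an Oracle NUMBER without precision.
          int jdbcPrecision = 0;
          int jdbcScale = 0;

          try {
            LogicalTypes.decimal(jdbcPrecision, jdbcScale)
                .addToSchema(Schema.create(Schema.Type.BYTES));
          } catch (IllegalArgumentException e) {
            // Prints: Invalid decimal precision: 0 (must be positive)
            System.out.println(e.getMessage());
          }
        }
      }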

    People

    • Assignee: Fero Szabo
    • Reporter: Suresh Deoda