SQOOP-2974: Importing data from Oracle into Hive fails when the file format is Parquet


      Description

Using Sqoop 1.99.6 to import data from Oracle into Hive 1.2.1, when "file format" is set to 2 (Parquet), the job fails with the following error:
      2016-06-24 10:25:55,530 DEBUG [communication thread] org.apache.hadoop.ipc.RPC: Call: ping 10
      2016-06-24 10:25:57,236 ERROR [OutputFormatLoader-consumer] org.apache.sqoop.job.mr.SqoopOutputFormatLoadExecutor: Error while loading data out of MR job.
      java.lang.NoSuchMethodError: parquet.bytes.CapacityByteArrayOutputStream.<init>(II)V
      at parquet.column.values.rle.RunLengthBitPackingHybridEncoder.<init>(RunLengthBitPackingHybridEncoder.java:128)
      at parquet.column.values.rle.RunLengthBitPackingHybridValuesWriter.<init>(RunLengthBitPackingHybridValuesWriter.java:36)
      at parquet.column.ParquetProperties.getColumnDescriptorValuesWriter(ParquetProperties.java:88)
      at parquet.column.impl.ColumnWriterV1.<init>(ColumnWriterV1.java:81)
      at parquet.column.impl.ColumnWriteStoreV1.newMemColumn(ColumnWriteStoreV1.java:68)
      at parquet.column.impl.ColumnWriteStoreV1.getColumnWriter(ColumnWriteStoreV1.java:56)
      at parquet.io.MessageColumnIO$MessageColumnIORecordConsumer.<init>(MessageColumnIO.java:183)
      at parquet.io.MessageColumnIO.getRecordWriter(MessageColumnIO.java:375)
      at parquet.hadoop.InternalParquetRecordWriter.initStore(InternalParquetRecordWriter.java:109)
      at parquet.hadoop.InternalParquetRecordWriter.<init>(InternalParquetRecordWriter.java:99)
      at parquet.hadoop.ParquetWriter.<init>(ParquetWriter.java:217)
      at parquet.hadoop.ParquetWriter.<init>(ParquetWriter.java:175)
      at parquet.avro.AvroParquetWriter.<init>(AvroParquetWriter.java:93)
      at org.kitesdk.data.spi.filesystem.ParquetAppender.open(ParquetAppender.java:66)
      at org.kitesdk.data.spi.filesystem.FileSystemWriter.initialize(FileSystemWriter.java:135)
      at org.kitesdk.data.spi.filesystem.FileSystemView.newWriter(FileSystemView.java:101)
      at org.kitesdk.data.spi.AbstractDataset.newWriter(AbstractDataset.java:58)
      at org.apache.sqoop.connector.kite.KiteDatasetExecutor.getOrNewWriter(KiteDatasetExecutor.java:91)
      at org.apache.sqoop.connector.kite.KiteDatasetExecutor.writeRecord(KiteDatasetExecutor.java:86)
      at org.apache.sqoop.connector.kite.KiteLoader.load(KiteLoader.java:72)
      at org.apache.sqoop.connector.kite.KiteLoader.load(KiteLoader.java:36)
      at org.apache.sqoop.job.mr.SqoopOutputFormatLoadExecutor$ConsumerThread.run(SqoopOutputFormatLoadExecutor.java:250)
      at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
      at java.util.concurrent.FutureTask.run(FutureTask.java:266)
      at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
      at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
      at java.lang.Thread.run(Thread.java:745)
      2016-06-24 10:25:57,276 INFO [main] org.apache.sqoop.job.mr.SqoopOutputFormatLoadExecutor: SqoopOutputFormatLoadExecutor::SqoopRecordWriter is about to be closed
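The `NoSuchMethodError: parquet.bytes.CapacityByteArrayOutputStream.<init>(II)V` means the JVM could not find a two-`int` constructor on `CapacityByteArrayOutputStream` at runtime, even though the Kite SDK code was compiled against a Parquet version that has it. This pattern is typical of a Parquet version conflict on the job classpath (e.g. an older `parquet-hadoop-bundle` shipped with Hive shadowing the version Sqoop's Kite connector expects). As a rough diagnostic sketch (not from the original report), the constructors a loaded class actually exposes can be checked via reflection; the example below uses `java.io.ByteArrayOutputStream` as a stand-in class, since it likewise has an `(int)` constructor but no `(int, int)` one:

```java
import java.lang.reflect.Constructor;

public class ConstructorCheck {
    // Returns true if `cls` declares a constructor with exactly the given
    // parameter types -- the same lookup the JVM performs when it links a
    // constructor call such as `<init>(II)V`.
    static boolean hasCtor(Class<?> cls, Class<?>... params) {
        try {
            Constructor<?> c = cls.getDeclaredConstructor(params);
            return c != null;
        } catch (NoSuchMethodException e) {
            return false;
        }
    }

    public static void main(String[] args) {
        // Stand-in class: ByteArrayOutputStream has an (int) constructor
        // but no (int, int) constructor, mirroring the failure pattern.
        Class<?> cls = java.io.ByteArrayOutputStream.class;
        System.out.println("(int):      " + hasCtor(cls, int.class));
        System.out.println("(int, int): " + hasCtor(cls, int.class, int.class));
    }
}
```

Running the same check against `parquet.bytes.CapacityByteArrayOutputStream` on the cluster's actual job classpath would show whether the `(int, int)` constructor is present there, which would confirm or rule out the version-conflict hypothesis.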


            People

            • Assignee:
              Unassigned
              Reporter:
              andre_xuxu xushiyong
• Votes:
  0
  Watchers:
  1
