Description
The ConvertTreeReaders (ConvertTreeReaderFactory) do not appear to handle Decimal64 column vectors. When possible, Hive creates VectorizedRowBatches (VRBs) with Decimal64ColumnVector columns; these are passed down to the ConvertTreeReaders, which results in the following ClassCastException:
Caused by: java.lang.ClassCastException: org.apache.hadoop.hive.ql.exec.vector.Decimal64ColumnVector cannot be cast to org.apache.hadoop.hive.ql.exec.vector.DecimalColumnVector
    at org.apache.orc.impl.ConvertTreeReaderFactory$DecimalFromAnyIntegerTreeReader.nextVector(ConvertTreeReaderFactory.java:1190) ~[orc-core-1.5.1.jar:1.5.1]
    at org.apache.orc.impl.TreeReaderFactory$StructTreeReader.nextBatch(TreeReaderFactory.java:2012) ~[orc-core-1.5.1.jar:1.5.1]
    at org.apache.orc.impl.RecordReaderImpl.nextBatch(RecordReaderImpl.java:1276) ~[orc-core-1.5.1.jar:1.5.1]
    at org.apache.hadoop.hive.ql.io.orc.RecordReaderImpl.ensureBatch(RecordReaderImpl.java:88) ~[hive-exec-4.0.0-SNAPSHOT.jar:4.0.0-SNAPSHOT]
    at org.apache.hadoop.hive.ql.io.orc.RecordReaderImpl.hasNext(RecordReaderImpl.java:104) ~[hive-exec-4.0.0-SNAPSHOT.jar:4.0.0-SNAPSHOT]
    at org.apache.hadoop.hive.ql.io.orc.OrcInputFormat$OrcRecordReader.next(OrcInputFormat.java:253) ~[hive-exec-4.0.0-SNAPSHOT.jar:4.0.0-SNAPSHOT]
    at org.apache.hadoop.hive.ql.io.orc.OrcInputFormat$OrcRecordReader.next(OrcInputFormat.java:228) ~[hive-exec-4.0.0-SNAPSHOT.jar:4.0.0-SNAPSHOT]
    at org.apache.hadoop.hive.ql.io.HiveContextAwareRecordReader.doNext(HiveContextAwareRecordReader.java:360) ~[hive-exec-4.0.0-SNAPSHOT.jar:4.0.0-SNAPSHOT]
    at org.apache.hadoop.hive.ql.io.CombineHiveRecordReader.doNext(CombineHiveRecordReader.java:167) ~[hive-exec-4.0.0-SNAPSHOT.jar:4.0.0-SNAPSHOT]
    at org.apache.hadoop.hive.ql.io.CombineHiveRecordReader.doNext(CombineHiveRecordReader.java:52) ~[hive-exec-4.0.0-SNAPSHOT.jar:4.0.0-SNAPSHOT]
    at org.apache.hadoop.hive.ql.io.HiveContextAwareRecordReader.next(HiveContextAwareRecordReader.java:116) ~[hive-exec-4.0.0-SNAPSHOT.jar:4.0.0-SNAPSHOT]
    at org.apache.hadoop.hive.shims.HadoopShimsSecure$CombineFileRecordReader.doNextWithExceptionHandler(HadoopShimsSecure.java:229) ~[hive-shims-common-4.0.0-SNAPSHOT.jar:4.0.0-SNAPSHOT]
    at org.apache.hadoop.hive.shims.HadoopShimsSecure$CombineFileRecordReader.next(HadoopShimsSecure.java:142) ~[hive-shims-common-4.0.0-SNAPSHOT.jar:4.0.0-SNAPSHOT]
    at org.apache.hadoop.mapred.MapTask$TrackedRecordReader.moveToNext(MapTask.java:205) ~[hadoop-mapreduce-client-core-3.1.0.jar:?]
    at org.apache.hadoop.mapred.MapTask$TrackedRecordReader.next(MapTask.java:191) ~[hadoop-mapreduce-client-core-3.1.0.jar:?]
    at org.apache.hadoop.mapred.MapRunner.run(MapRunner.java:52) ~[hadoop-mapreduce-client-core-3.1.0.jar:?]
    at org.apache.hadoop.hive.ql.exec.mr.ExecMapRunner.run(ExecMapRunner.java:37) ~[hive-exec-4.0.0-SNAPSHOT.jar:4.0.0-SNAPSHOT]
    at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:465) ~[hadoop-mapreduce-client-core-3.1.0.jar:?]
    at org.apache.hadoop.mapred.MapTask.run(MapTask.java:349) ~[hadoop-mapreduce-client-core-3.1.0.jar:?]
    at org.apache.hadoop.mapred.LocalJobRunner$Job$MapTaskRunnable.run(LocalJobRunner.java:271) ~[hadoop-mapreduce-client-common-3.1.0.jar:?]
    at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) ~[?:1.8.0_121]
    at java.util.concurrent.FutureTask.run(FutureTask.java:266) ~[?:1.8.0_121]
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142) ~[?:1.8.0_121]
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617) ~[?:1.8.0_121]
    at java.lang.Thread.run(Thread.java:745) ~[?:1.8.0_121]
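For context, the failure comes from the unconditional cast in DecimalFromAnyIntegerTreeReader.nextVector, which assumes the destination column is always a DecimalColumnVector. Below is a minimal sketch of the kind of dispatch that avoids the ClassCastException when the batch carries a Decimal64ColumnVector instead. The class name DecimalVectorWriteSketch and the helper setDecimalFromLong are illustrative assumptions (this is not the actual ORC fix), and overflow/null handling is omitted for brevity.

{code:java}
import org.apache.hadoop.hive.ql.exec.vector.ColumnVector;
import org.apache.hadoop.hive.ql.exec.vector.Decimal64ColumnVector;
import org.apache.hadoop.hive.ql.exec.vector.DecimalColumnVector;

public class DecimalVectorWriteSketch {

  // Hypothetical helper: write an integer value into row 'row' of a decimal
  // column vector, branching on the concrete vector type instead of casting
  // unconditionally to DecimalColumnVector (the cast that fails above).
  static void setDecimalFromLong(ColumnVector colVector, int row, long value) {
    if (colVector instanceof Decimal64ColumnVector) {
      Decimal64ColumnVector d64 = (Decimal64ColumnVector) colVector;
      // Decimal64ColumnVector stores the unscaled value in a long[],
      // so the integer must be scaled up by 10^scale.
      long scaled = value;
      for (int i = 0; i < d64.scale; i++) {
        scaled *= 10L;  // overflow checking omitted in this sketch
      }
      d64.vector[row] = scaled;
    } else {
      DecimalColumnVector dcv = (DecimalColumnVector) colVector;
      // The existing path: each row holds a HiveDecimalWritable.
      // Precision/scale enforcement is omitted here.
      dcv.vector[row].setFromLong(value);
    }
  }
}
{code}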
Issue Links
- is required by HIVE-19792: Upgrade orc to 1.5.2 and enable decimal_64 schema evolution tests (Closed)