Details
- Type: Bug
- Status: Closed
- Priority: Major
- Resolution: Fixed
- Fix Version/s: 2.0.0
- Labels: None
Description
When a column col1 is defined as decimal and the sum of the column overflows, we try to increase the decimal precision by 10. But if the precision has already reached 38 (the maximum), the overflow can still happen. Right now, when that happens, the following exception is thrown because Hive writes incorrect data.
Follow these steps to reproduce.
CREATE TABLE DECIMAL_PRECISION(dec decimal(38,18));
INSERT INTO DECIMAL_PRECISION VALUES(98765432109876543210.12345), (98765432109876543210.12345);
SELECT SUM(dec) FROM DECIMAL_PRECISION;
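For reference, the expected sum is 197530864219753086420.2469. Its integer part alone has 21 digits, so with the column's scale of 18 the result would need precision 39, one more than the 38-digit maximum, which is why the capped intermediate decimal type cannot represent it.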
Caused by: java.lang.ArrayIndexOutOfBoundsException: 1
    at org.apache.hadoop.hive.serde2.lazybinary.LazyBinaryUtils.readVInt(LazyBinaryUtils.java:314) ~[hive-exec-2.2.0-SNAPSHOT.jar:2.2.0-SNAPSHOT]
    at org.apache.hadoop.hive.serde2.lazybinary.LazyBinaryUtils.checkObjectByteInfo(LazyBinaryUtils.java:219) ~[hive-exec-2.2.0-SNAPSHOT.jar:2.2.0-SNAPSHOT]
    at org.apache.hadoop.hive.serde2.lazybinary.LazyBinaryStruct.parse(LazyBinaryStruct.java:142) ~[hive-exec-2.2.0-SNAPSHOT.jar:2.2.0-SNAPSHOT]
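Below is a minimal, self-contained Java sketch of the precision arithmetic described above. It is not Hive's actual SUM implementation; the class DecimalSumOverflowSketch and the helper intermediateSumPrecision are hypothetical names used only to illustrate why bumping the precision by 10 while capping it at 38 still cannot hold the result of the repro.

import java.math.BigDecimal;

// Illustrative sketch only, not Hive code: shows why growing the precision by 10
// but capping it at 38 still lets SUM over a decimal(38,18) column overflow.
public class DecimalSumOverflowSketch {
    static final int MAX_PRECISION = 38;   // maximum decimal precision

    // Hypothetical helper mirroring the rule described above: grow precision by 10, cap at 38.
    static int intermediateSumPrecision(int columnPrecision) {
        return Math.min(columnPrecision + 10, MAX_PRECISION);
    }

    public static void main(String[] args) {
        int columnPrecision = 38;
        int columnScale = 18;
        int sumPrecision = intermediateSumPrecision(columnPrecision);  // still 38

        BigDecimal value = new BigDecimal("98765432109876543210.12345");
        BigDecimal sum = value.add(value);                             // 197530864219753086420.24690

        // Digits needed by the sum: integer digits of the result plus the column's scale.
        int integerDigits = sum.precision() - sum.scale();             // 21
        int neededPrecision = integerDigits + columnScale;             // 21 + 18 = 39

        System.out.println("intermediate sum type: decimal(" + sumPrecision + "," + columnScale + ")");
        System.out.println("precision needed to hold the sum: " + neededPrecision);
        System.out.println("overflow: " + (neededPrecision > sumPrecision));  // true
    }
}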
Attachments
Issue Links
- relates to: HIVE-6459 Change the precison/scale for intermediate sum result in the avg() udf (Resolved)