Details
Type: Bug
Status: Resolved
Priority: Major
Resolution: Fixed
Description
#Wed Jul 09 10:28:32 PDT 2014
git.commit.id.abbrev=810a204
The way we determine how many significant digits to keep after the decimal point during decimal operations seems confusing, and in some cases just doesn't make sense. Here are the experiments I ran:
0: jdbc:drill:schema=dfs> select cast('12.3456' as decimal(18,6)) / cast('2.0' as decimal(38,1)) from data limit 1;
+-----------------------------+
|           EXPR$0            |
+-----------------------------+
| 6.1728000000000000000000000 |
+-----------------------------+
1 row selected (0.163 seconds)
0: jdbc:drill:schema=dfs> select cast('12.3456' as decimal(18,6)) / cast('2.0' as decimal(38,10)) from data limit 1;
+--------------------+
|       EXPR$0       |
+--------------------+
| 6.1728000000000000 |
+--------------------+
1 row selected (0.173 seconds)
0: jdbc:drill:schema=dfs> select cast('12.3456' as decimal(18,6)) / cast('2.0' as decimal(38,20)) from data limit 1;
+----------+
|  EXPR$0  |
+----------+
| 6.172800 |
+----------+
1 row selected (0.136 seconds)
0: jdbc:drill:schema=dfs> select cast('12.3456' as decimal(18,6)) / cast('2.0' as decimal(38,30)) from data limit 1;
+--------+
| EXPR$0 |
+--------+
| 6      |
+--------+
1 row selected (0.185 seconds)
Look at the last example: no decimal digits are displayed at all, even though I increased the divisor's scale.
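
For reference, the scales in the four results above are consistent with a divide rule that caps the result precision at 38, reserves (p1 - s1) + s2 digits for the integer part, and gives whatever precision is left over to the scale, clamped at zero. This is only an inference from the output, not a reading of Drill's code; a minimal Java sketch of the hypothesized rule:

public class DecimalDivideScaleRepro {
    static final int MAX_PRECISION = 38;

    // Hypothetical rule inferred from the query output above (not Drill's
    // actual code): reserve room for the dividend's integer digits plus the
    // divisor's scale, then give the remaining precision to the result scale.
    static int inferredResultScale(int p1, int s1, int p2, int s2) {
        int integerDigits = (p1 - s1) + s2;
        return Math.max(0, MAX_PRECISION - integerDigits);
    }

    public static void main(String[] args) {
        int[][] cases = { {18, 6, 38, 1}, {18, 6, 38, 10}, {18, 6, 38, 20}, {18, 6, 38, 30} };
        for (int[] c : cases) {
            System.out.printf("decimal(%d,%d) / decimal(%d,%d) -> result scale %d%n",
                    c[0], c[1], c[2], c[3], inferredResultScale(c[0], c[1], c[2], c[3]));
        }
        // Prints 25, 16, 6 and 0 -- matching the digits after the decimal
        // point in the four query results above.
    }
}

Under such a rule the divisor's scale eats directly into the result scale, so raising it from 1 to 30 drives the result scale to zero even though the quotient here needs only one integer digit.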