Details
- Type: Bug
- Priority: Major
- Status: Closed
- Resolution: Fixed
- Fix Version: v1.2
Description
I hit an error during a cube build: the "Build Dimension Dictionary" step failed after 0.15 minutes.
java.lang.IllegalArgumentException: Too many digits for NumberDictionary: 635696813322678783. Expect 16 digits before decimal point at max.
at org.apache.kylin.dict.NumberDictionary$NumberBytesCodec.encodeNumber(NumberDictionary.java:78)
at org.apache.kylin.dict.NumberDictionaryBuilder.addValue(NumberDictionaryBuilder.java:37)
at org.apache.kylin.dict.TrieDictionaryBuilder.addValue(TrieDictionaryBuilder.java:83)
at org.apache.kylin.dict.DictionaryGenerator.buildNumberDict(DictionaryGenerator.java:170)
at org.apache.kylin.dict.DictionaryGenerator.buildDictionaryFromValueEnumerator(DictionaryGenerator.java:67)
at org.apache.kylin.dict.DictionaryGenerator.buildDictionary(DictionaryGenerator.java:101)
at org.apache.kylin.dict.DictionaryManager.buildDictionary(DictionaryManager.java:211)
at org.apache.kylin.cube.CubeManager.buildDictionary(CubeManager.java:166)
at org.apache.kylin.cube.cli.DictionaryGeneratorCLI.processSegment(DictionaryGeneratorCLI.java:52)
at org.apache.kylin.cube.cli.DictionaryGeneratorCLI.processSegment(DictionaryGeneratorCLI.java:41)
at org.apache.kylin.job.hadoop.dict.CreateDictionaryJob.run(CreateDictionaryJob.java:52)
at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:76)
at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:90)
at org.apache.kylin.job.common.HadoopShellExecutable.doWork(HadoopShellExecutable.java:62)
at org.apache.kylin.job.execution.AbstractExecutable.execute(AbstractExecutable.java:107)
at org.apache.kylin.job.execution.DefaultChainedExecutable.doWork(DefaultChainedExecutable.java:51)
at org.apache.kylin.job.execution.AbstractExecutable.execute(AbstractExecutable.java:107)
at org.apache.kylin.job.impl.threadpool.DefaultScheduler$JobRunner.run(DefaultScheduler.java:130)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
at java.lang.Thread.run(Thread.java:745)
result code: 2
It turns out one of my dimension tables contains a bigint column whose max value is 9223372036854775807, the upper bound of the bigint type. I suspect that value is used as a sentinel to mark a special case. But regardless, shouldn't the NumberDictionary class be able to handle the full bigint range?
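For what it's worth, the mismatch is easy to confirm: Long.MAX_VALUE (the bigint upper bound) is 19 digits long, while the exception says the encoder only reserves 16 digits before the decimal point. A minimal sketch (plain Java, not using Kylin classes):

```java
public class DigitLimitDemo {
    public static void main(String[] args) {
        // 9223372036854775807, the upper bound of a signed 64-bit bigint
        long max = Long.MAX_VALUE;

        int digits = Long.toString(max).length();
        System.out.println("digits = " + digits); // 19

        // The exception message says NumberDictionary expects at most
        // 16 digits before the decimal point, so any value above
        // 9999999999999999 (16 nines) would trigger it.
        System.out.println("exceeds 16-digit limit: " + (digits > 16)); // true
    }
}
```

So any dictionary-encoded bigint column can overflow the limit with legitimate values, not just this sentinel.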