Details
Type: Bug
Status: Open
Priority: Major
Resolution: Unresolved
Affects Version/s: 1.2.0
Fix Version/s: None
Component/s: None
Environment: Spark on YARN, CarbonData 1.2.0, Hadoop 2.7, Spark 2.1.0, Hive 2.1.0
Description
When I tried to load data into a table (the data size is about 300 million), the log reported "Data load is partially successful for table".
When I then executed a delete operation on the table, it failed with the following error:
"java.lang.ArrayIndexOutOfBoundsException: 1
at org.apache.carbondata.core.mutate.CarbonUpdateUtil.getRequiredFieldFromTID(CarbonUpdateUtil.java:67)"
When I executed another delete operation with a where condition, it succeeded, but a subsequent select operation then failed with:
"java.lang.ArrayIndexOutOfBoundsException Driver stacktrace:
at org.apache.spark.scheduler.DAGScheduler.org$apache$spark$scheduler$DAGScheduler$$failJobAndIndependentStages(DAGScheduler.scala:1435)"
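For context, the sequence below is a minimal sketch of the reported steps as they might be run from a Scala spark-shell using the CarbonSession API of CarbonData 1.2.0. The store path, table name, schema, and input CSV path are placeholders chosen for illustration; they are not taken from the original report.

    import org.apache.spark.sql.SparkSession
    import org.apache.spark.sql.CarbonSession._

    // Placeholder store path; replace with the actual CarbonData store location.
    val carbon = SparkSession.builder()
      .appName("carbondata-delete-repro")
      .getOrCreateCarbonSession("hdfs://<namenode>/carbon/store")

    // Hypothetical table and schema, used only to illustrate the sequence of operations.
    carbon.sql("CREATE TABLE IF NOT EXISTS t_repro (id INT, name STRING) STORED BY 'carbondata'")

    // Step 1: large load; the report says this ends with "Data load is partially successful for table".
    carbon.sql("LOAD DATA INPATH 'hdfs://<namenode>/data/input.csv' INTO TABLE t_repro")

    // Step 2: delete without a where condition; reported to fail with
    // ArrayIndexOutOfBoundsException in CarbonUpdateUtil.getRequiredFieldFromTID.
    carbon.sql("DELETE FROM t_repro")

    // Step 3: delete with a where condition; reported to succeed.
    carbon.sql("DELETE FROM t_repro WHERE id = 1")

    // Step 4: subsequent select; reported to fail with ArrayIndexOutOfBoundsException on the driver.
    carbon.sql("SELECT count(*) FROM t_repro").show()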