CarbonData / CARBONDATA-3269

Range_column throwing ArrayIndexOutOfBoundsException when using KryoSerializer


Details

    • Type: Bug
    • Status: Resolved
    • Priority: Critical
    • Resolution: Fixed
    • Affects Version/s: None
    • Fix Version/s: 1.5.2
    • Component/s: None
    • Labels: None

    Description

      Reproduce:

      For the range_column feature, when "spark.serializer" is set to "org.apache.spark.serializer.KryoSerializer", data loading throws an ArrayIndexOutOfBoundsException.
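
      A minimal reproduction sketch in Scala (the table name carbon_range_column4 is taken from the log below; the store path, schema, range column, and CSV path are illustrative assumptions, not taken from the report):

        import org.apache.spark.sql.SparkSession
        // In CarbonData 1.5.x, CarbonSession._ adds getOrCreateCarbonSession to the builder.
        import org.apache.spark.sql.CarbonSession._

        object RangeColumnKryoRepro {
          def main(args: Array[String]): Unit = {
            // Setting spark.serializer to Kryo is what triggers the failure.
            val carbon = SparkSession.builder()
              .master("local[4]")
              .appName("RangeColumnKryoRepro")
              .config("spark.serializer", "org.apache.spark.serializer.KryoSerializer")
              .getOrCreateCarbonSession("/tmp/carbon.store")

            carbon.sql("DROP TABLE IF EXISTS carbon_range_column4")

            // range_column makes the load path shuffle rows through a range partitioner.
            carbon.sql(
              """CREATE TABLE carbon_range_column4 (id INT, name STRING, city STRING)
                |STORED AS carbondata
                |TBLPROPERTIES('range_column'='name')""".stripMargin)

            // With Kryo enabled, this load fails during the shuffle write with the
            // ArrayIndexOutOfBoundsException shown below.
            carbon.sql("LOAD DATA INPATH '/tmp/sample.csv' INTO TABLE carbon_range_column4")

            carbon.stop()
          }
        }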

      Exception:

      2019-01-25 13:00:19 ERROR DataLoadProcessorStepOnSpark$:367 - Data Loading failed for table carbon_range_column4
      java.lang.ArrayIndexOutOfBoundsException: 5
      at org.apache.spark.shuffle.sort.BypassMergeSortShuffleWriter.write(BypassMergeSortShuffleWriter.java:151)
      at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:96)
      at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:53)
      at org.apache.spark.scheduler.Task.run(Task.scala:108)
      at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:338)
      at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
      at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
      at java.lang.Thread.run(Thread.java:748)
      2019-01-25 13:00:19 ERROR TaskContextImpl:91 - Error in TaskFailureListener
      org.apache.carbondata.processing.loading.exception.CarbonDataLoadingException: Data Loading failed for table carbon_range_column4
      at org.apache.carbondata.spark.load.DataLoadProcessorStepOnSpark$.org$apache$carbondata$spark$load$DataLoadProcessorStepOnSpark$$wrapException(DataLoadProcessorStepOnSpark.scala:368)
      at org.apache.carbondata.spark.load.DataLoadProcessorStepOnSpark$$anonfun$convertFunc$3.apply(DataLoadProcessorStepOnSpark.scala:215)
      at org.apache.carbondata.spark.load.DataLoadProcessorStepOnSpark$$anonfun$convertFunc$3.apply(DataLoadProcessorStepOnSpark.scala:210)
      at org.apache.spark.TaskContext$$anon$2.onTaskFailure(TaskContext.scala:144)
      at org.apache.spark.TaskContextImpl$$anonfun$markTaskFailed$1.apply(TaskContextImpl.scala:107)
      at org.apache.spark.TaskContextImpl$$anonfun$markTaskFailed$1.apply(TaskContextImpl.scala:107)
      at org.apache.spark.TaskContextImpl$$anonfun$invokeListeners$1.apply(TaskContextImpl.scala:130)
      at org.apache.spark.TaskContextImpl$$anonfun$invokeListeners$1.apply(TaskContextImpl.scala:128)
      at scala.collection.mutable.ResizableArray$class.foreach(ResizableArray.scala:59)
      at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:48)
      at org.apache.spark.TaskContextImpl.invokeListeners(TaskContextImpl.scala:128)
      at org.apache.spark.TaskContextImpl.markTaskFailed(TaskContextImpl.scala:106)
      at org.apache.spark.scheduler.Task.run(Task.scala:113)
      at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:338)
      at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
      at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
      at java.lang.Thread.run(Thread.java:748)
      Caused by: java.lang.ArrayIndexOutOfBoundsException: 5
      at org.apache.spark.shuffle.sort.BypassMergeSortShuffleWriter.write(BypassMergeSortShuffleWriter.java:151)
      at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:96)
      at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:53)
      at org.apache.spark.scheduler.Task.run(Task.scala:108)
      ... 4 more


People

    • Assignee: David Cai (qiangcai)
    • Reporter: David Cai (qiangcai)
    • Votes: 0
    • Watchers: 1

Dates

    • Created:
    • Updated:
    • Resolved:

Time Tracking

    • Estimated: Not Specified
    • Remaining: 0h
    • Logged: 1h 50m