CarbonData / CARBONDATA-1290

[branch-1.1] delete problem


Details

    • Type: Bug
    • Status: Open
    • Priority: Major
    • Resolution: Unresolved

    Description

      1. The max function does not return the right result: a row with id=19999999 exists, yet max(id) returns 9999999.
      scala> cc.sql("select * from qqdata2.fullappend where id=19999999").show(false)
      +--------+------------------------+----------+-----------------+------+----+----------------------+----+
      |id      |qqnum                   |nick      |age              |gender|auth|qunnum                |mvcc|
      +--------+------------------------+----------+-----------------+------+----+----------------------+----+
      |19999999|19999999aaaaaaaa19999999|2009-05-27|19999999c19999999|1     |1   |19999999dddddd19999999|1   |
      +--------+------------------------+----------+-----------------+------+----+----------------------+----+

      scala> cc.sql("select max(id) from qqdata2.fullappend ").show(false)
      +-------+
      |max(id)|
      +-------+
      |9999999|
      +-------+
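
      The wrong value looks like a lexicographic comparison: as strings, "9999999" sorts above "19999999". A small Scala sketch of that behavior (an illustration only, not the actual CarbonData implementation):

      // Assumption: if max(id) were evaluated on the string form of id, it would reproduce the result above.
      val ids = Seq("19999999", "9999999", "100000")
      println(ids.max)               // 9999999  (string/lexicographic max)
      println(ids.map(_.toLong).max) // 19999999 (numeric max, the expected answer)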

      2. Delete fails with an error:
      scala> cc.sql("delete from qqdata2.fullappend where id>1 and id<100000").show
      17/07/11 17:32:33 AUDIT ProjectForDeleteCommand:[Thread-1] Delete data request has been received for qqdata2.fullappend.
      [Stage 21:> (0 + 2) / 2]17/07/11 17:32:52 WARN TaskSetManager: Lost task 1.0 in stage 21.0 (TID 40, executor 2): java.lang.ArrayIndexOutOfBoundsException: 1
      at org.apache.carbondata.core.mutate.CarbonUpdateUtil.getRequiredFieldFromTID(CarbonUpdateUtil.java:67)
      at org.apache.carbondata.core.mutate.CarbonUpdateUtil.getSegmentWithBlockFromTID(CarbonUpdateUtil.java:76)
      at org.apache.spark.sql.execution.command.deleteExecution$$anonfun$4.apply(IUDCommands.scala:555)
      at org.apache.spark.sql.execution.command.deleteExecution$$anonfun$4.apply(IUDCommands.scala:552)
      at scala.collection.Iterator$$anon$11.next(Iterator.scala:409)
      at org.apache.spark.shuffle.sort.BypassMergeSortShuffleWriter.write(BypassMergeSortShuffleWriter.java:150)
      at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:96)
      at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:53)
      at org.apache.spark.scheduler.Task.run(Task.scala:99)
      at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:322)
      at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
      at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
      at java.lang.Thread.run(Thread.java:745)
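
      The trace shows CarbonUpdateUtil.getRequiredFieldFromTID failing on index 1, i.e. indexing past the parts of a split tuple ID. A minimal Scala sketch of how that pattern produces ArrayIndexOutOfBoundsException: 1 when the TID carries fewer separators than expected (the "/"-separated layout and the helper name below are assumptions for illustration, not the actual CarbonUpdateUtil code):

      // Hypothetical helper: pick one positional field out of a "/"-separated tuple ID.
      def fieldFromTid(tid: String, ordinal: Int): String = tid.split("/")(ordinal)

      fieldFromTid("0/part-0-0_batchno0-0/0/0/0", 1) // ok: returns "part-0-0_batchno0-0"
      fieldFromTid("0", 1)                           // throws java.lang.ArrayIndexOutOfBoundsException: 1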

People

    Assignee: Unassigned
    Reporter: Sehriff [FFCS研究院]
