Hive / HIVE-7799

TRANSFORM failed in transform_ppr1.q [Spark Branch]

Log workAgile BoardRank to TopRank to BottomVotersWatch issueWatchersCreate sub-taskConvert to sub-taskMoveLinkCloneLabelsUpdate Comment AuthorReplace String in CommentUpdate Comment VisibilityDelete Comments
    XMLWordPrintableJSON

Details

    • Type: Bug
    • Status: Resolved
    • Priority: Major
    • Resolution: Fixed
    • Affects Version/s: None
    • Fix Version/s: 1.1.0
    • Component/s: Spark
    Description

      Here is the exception:

      2014-08-20 01:14:36,594 ERROR executor.Executor (Logging.scala:logError(96)) - Exception in task 0.0 in stage 1.0 (TID 0)
      java.lang.NullPointerException
              at org.apache.hadoop.hive.ql.exec.spark.HiveKVResultCache.next(HiveKVResultCache.java:113)
              at org.apache.hadoop.hive.ql.exec.spark.HiveBaseFunctionResultList$ResultIterator.next(HiveBaseFunctionResultList.java:124)
              at org.apache.hadoop.hive.ql.exec.spark.HiveBaseFunctionResultList$ResultIterator.next(HiveBaseFunctionResultList.java:82)
              at scala.collection.convert.Wrappers$JIteratorWrapper.next(Wrappers.scala:42)
              at scala.collection.Iterator$class.foreach(Iterator.scala:727)
              at scala.collection.AbstractIterator.foreach(Iterator.scala:1157)
              at org.apache.spark.shuffle.hash.HashShuffleWriter.write(HashShuffleWriter.scala:65)
              at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:68)
              at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:41)
              at org.apache.spark.scheduler.Task.run(Task.scala:54)
              at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:199)
              at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
              at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
              at java.lang.Thread.run(Thread.java:722)
      

      Basically, the cause is that RowContainer is misused (it does not allow writes once someone has started reading rows from it). I'm trying to figure out whether this is a general Hive issue or specific to Hive on Spark.
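
      To illustrate the constraint only (this is not Hive code and not the RowContainer API): the sketch below uses a hypothetical simplified buffer to show the call sequence that HiveKVResultCache can hit during TRANSFORM, where the producer keeps adding results after the consumer has already started reading. A RowContainer-backed buffer cannot accept that write-after-read sequence without being cleared or replaced first; all class and method names here are illustrative.

      import java.util.ArrayDeque;
      import java.util.Deque;

      // Hypothetical stand-in for a RowContainer-backed key/value buffer.
      public class IllustrativeKVCache {
        private final Deque<Object[]> buffer = new ArrayDeque<Object[]>();
        private boolean readStarted = false;

        // Producer side: called while the work emits transformed rows.
        void add(Object key, Object value) {
          if (readStarted) {
            // The state hit in this bug: the consumer already called next(),
            // but the producer still emits rows. A RowContainer would not
            // accept writes at this point; the buffer would have to be
            // cleared or swapped before taking more rows.
            throw new IllegalStateException("write after read is not allowed");
          }
          buffer.addLast(new Object[] { key, value });
        }

        // Consumer side: analogous to the ResultIterator in the stack trace.
        Object[] next() {
          readStarted = true;
          return buffer.pollFirst();
        }

        public static void main(String[] args) {
          IllustrativeKVCache cache = new IllustrativeKVCache();
          cache.add("k1", "v1");
          System.out.println(cache.next()[0]); // consumer starts reading
          cache.add("k2", "v2");               // throws: mirrors the broken interleaving
        }
      }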

      Attachments

        1. HIVE-7799.3-spark.patch (1.0 kB), Chengxiang Li
        2. HIVE-7799.2-spark.patch (3 kB), Chengxiang Li
        3. HIVE-7799.1-spark.patch (1.0 kB), Chengxiang Li

        Issue Links

        Activity


          People

            Assignee: Chengxiang Li
            Reporter: Chengxiang Li
            Votes: 0
            Watchers: 3

            Dates

              Created:
              Updated:
              Resolved:
