Hive / HIVE-7431

When run on spark cluster, some spark tasks may fail


    Details

    • Type: Bug
    • Status: Resolved
    • Priority: Major
    • Resolution: Fixed
    • Affects Version/s: None
    • Fix Version/s: 1.1.0
    • Component/s: None
    • Labels: None

      Description

      When running queries on spark, some spark tasks fail (usually the first couple of tasks) with the following stack trace:

      org.apache.hadoop.hive.ql.exec.mr.ExecMapper.configure(ExecMapper.java:154)
      org.apache.hadoop.hive.ql.exec.spark.HiveMapFunction.call(HiveMapFunction.java:60)
      org.apache.hadoop.hive.ql.exec.spark.HiveMapFunction.call(HiveMapFunction.java:35)
      org.apache.spark.api.java.JavaRDDLike$$anonfun$fn$7$1.apply(JavaRDDLike.scala:161)
      org.apache.spark.api.java.JavaRDDLike$$anonfun$fn$7$1.apply(JavaRDDLike.scala:161)
      org.apache.spark.rdd.RDD$$anonfun$12.apply(RDD.scala:559)
      org.apache.spark.rdd.RDD$$anonfun$12.apply(RDD.scala:559)
      org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:35)
      org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:262)
      org.apache.spark.rdd.RDD.iterator(RDD.scala:229)
      org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:158)
      ...

      Observed on a Spark standalone cluster; not verified for Spark on YARN or Mesos.
      NO PRECOMMIT TESTS. This is for spark branch only.
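
      For context, the trace above shows the per-partition entry point: HiveMapFunction.call() runs ExecMapper.configure() before any rows are processed, so a task that cannot rebuild its map-side state from the shipped job configuration dies at configure(). The sketch below only illustrates that configure-then-process pattern; the class, method, and configuration key names are illustrative assumptions, not the actual Hive or Spark code.

      // Minimal sketch of the configure-then-process pattern implied by the stack
      // trace. All names and the configuration key are illustrative assumptions.
      import java.util.Collections;
      import java.util.Iterator;

      import org.apache.hadoop.conf.Configuration;

      public class MapTaskSketch {

        // Analogous to ExecMapper.configure(): rebuild per-task state (the map-side
        // plan) from the job configuration shipped to the executor.
        static void configure(Configuration jobConf) {
          String planPath = jobConf.get("hive.exec.plan"); // illustrative key
          if (planPath == null) {
            // If the plan has not reached the executor, the task fails here,
            // surfacing as the reported trace at ExecMapper.configure(...).
            throw new IllegalStateException("Map plan not found in job configuration");
          }
          // ... deserialize the plan and initialize the operator tree ...
        }

        // Analogous to HiveMapFunction.call(): configure once per partition, then
        // push every input row through the configured operator tree.
        static void processPartition(Configuration jobConf, Iterator<String> rows) {
          configure(jobConf);
          while (rows.hasNext()) {
            String row = rows.next();
            // ... forward the row to the map operator ...
          }
        }

        public static void main(String[] args) {
          // With an empty configuration the plan lookup fails, reproducing the
          // failure mode described above.
          processPartition(new Configuration(), Collections.<String>emptyIterator());
        }
      }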

        Attachments

        1. HIVE-7431.1.patch
          3 kB
          Rui Li
        2. HIVE-7431.2.patch
          2 kB
          Rui Li

              People

              • Assignee: Rui Li
              • Reporter: Rui Li
              • Votes: 0
              • Watchers: 3
