Hive / HIVE-12649

Hive on Spark will resubmit the application when there are not enough resources to launch the YARN application master


Details

    • Type: Bug
    • Status: Resolved
    • Priority: Major
    • Resolution: Resolved
    • Affects Version/s: 1.1.1, 1.2.1
    • Fix Version/s: 1.3.0, 2.0.0
    • Component/s: None
    • Labels: None

    Description

      Hive on Spark estimates the reducer number when the query does not set one, which causes an application submission. The application will stay pending if the YARN queue's resources are insufficient.
      So there can be more than one pending application, because
      there is more than one estimate call. The failure is soft, so it doesn't prevent subsequent processing. We can make that a hard failure.

      The code in question is found at
      at org.apache.hadoop.hive.ql.exec.spark.SparkUtilities.getSparkSession(SparkUtilities.java:112)
      at org.apache.hadoop.hive.ql.optimizer.spark.SetSparkReducerParallelism.process(SetSparkReducerParallelism.java:115)
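
      The stack trace above shows each reducer-parallelism estimate reaching `getSparkSession`, and each such call can submit a fresh YARN application. A minimal sketch of the idea behind the fix (hypothetical names, not Hive's actual API): cache the session obtained on the first estimate call so repeated estimate calls reuse it instead of submitting again.

```java
import java.util.concurrent.atomic.AtomicInteger;

// Sketch only: SessionCacheSketch, submitApplication, and getSparkSession are
// illustrative stand-ins, not the real SparkUtilities implementation.
public class SessionCacheSketch {
    // Counts how many YARN applications we would have submitted.
    static final AtomicInteger submissions = new AtomicInteger();

    private static String cachedSession;

    // Stand-in for the expensive call that submits a YARN application.
    private static String submitApplication() {
        return "app-" + submissions.incrementAndGet();
    }

    // Analogous to SparkUtilities.getSparkSession: reuse the open session
    // rather than submitting a new application on every estimate call.
    static synchronized String getSparkSession() {
        if (cachedSession == null) {
            cachedSession = submitApplication();
        }
        return cachedSession;
    }

    public static void main(String[] args) {
        // Two reducer-parallelism estimates; only one submission happens.
        String first = getSparkSession();
        String second = getSparkSession();
        if (!first.equals(second) || submissions.get() != 1) {
            throw new AssertionError("expected a single cached submission");
        }
        System.out.println("submissions=" + submissions.get());
    }
}
```

      With a cache like this, an insufficient-resource queue leaves at most one application pending; turning the soft failure into a hard one (throwing instead of retrying on each estimate) would surface the problem immediately rather than queuing duplicates.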


People

    Assignee: Xuefu Zhang (xuefuz)
    Reporter: JoneZhang (JoyoungZhang@gmail.com)
    Votes: 0
    Watchers: 1

Dates

    Created:
    Updated:
    Resolved: