Hive / HIVE-15543

Don't try to get memory/cores to decide parallelism when Spark dynamic allocation is enabled


Details

    • Type: Improvement
    • Status: Resolved
    • Priority: Major
    • Resolution: Fixed
    • Affects Version/s: 2.2.0
    • Fix Version/s: 2.3.0
    • Component/s: Spark
    • Labels: None

Description

    Presently, Hive tries to get memory and core counts from the Spark application and uses them to determine RS parallelism. However, this doesn't make sense when Spark dynamic allocation is enabled, because the current numbers don't represent the available computing resources, especially when the SparkContext is initially launched.

    Thus, it makes sense not to do that when dynamic allocation is enabled.
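
    For illustration only, a minimal sketch of the idea (hypothetical class and method names, a made-up fallback heuristic, and a plain Map standing in for the Spark configuration; this is not the attached HIVE-15543.patch): check spark.dynamicAllocation.enabled and, when it is set, fall back to the configured default parallelism instead of deriving a number from the application's current memory/cores.

    {code:java}
    import java.util.Map;

    public class ReducerParallelismSketch {

      // True if the Spark application was started with dynamic allocation enabled.
      static boolean isDynamicAllocationEnabled(Map<String, String> sparkConf) {
        return Boolean.parseBoolean(
            sparkConf.getOrDefault("spark.dynamicAllocation.enabled", "false"));
      }

      // Pick RS parallelism. Under dynamic allocation the executor snapshot is
      // unreliable (it may be near zero right after SparkContext launch), so use
      // the configured default instead of a value derived from memory/cores.
      static int pickReducerParallelism(Map<String, String> sparkConf,
          int currentExecutorCores, int defaultParallelism) {
        if (isDynamicAllocationEnabled(sparkConf)) {
          return defaultParallelism;
        }
        // Static allocation: the current core count is a meaningful signal.
        return Math.max(defaultParallelism, currentExecutorCores);
      }

      public static void main(String[] args) {
        Map<String, String> conf = Map.of("spark.dynamicAllocation.enabled", "true");
        // Prints the default (8) rather than a value derived from 0 cores.
        System.out.println(pickReducerParallelism(conf, 0, 8));
      }
    }
    {code}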

Attachments

    1. HIVE-15543.patch (4 kB, Xuefu Zhang)


People

    Assignee: Xuefu Zhang (xuefuz)
    Reporter: Xuefu Zhang (xuefuz)
    Votes: 0
    Watchers: 2

Dates

    Created:
    Updated:
    Resolved:
