HIVE-15543

Don't try to get memory/cores to decide parallelism when Spark dynamic allocation is enabled


    Details

    • Type: Improvement
    • Status: Resolved
    • Priority: Major
    • Resolution: Fixed
    • Affects Version/s: 2.2.0
    • Fix Version/s: 2.3.0
    • Component/s: Spark
    • Labels:
      None

    Description

    Presently, Hive queries the Spark application for its memory and core counts and uses those numbers to determine ReduceSink (RS) parallelism. However, this doesn't make sense when Spark dynamic allocation is enabled, because the current numbers don't represent the actually available computing resources, especially right after the SparkContext is launched, when few executors have been allocated yet.

    Thus, it makes sense to skip that estimation when dynamic allocation is enabled.
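
    The sketch below illustrates the intended behavior; it is a minimal, hypothetical example, not the code in the attached patch. The class name, method signature, fallback constant, and sizing heuristic are all illustrative assumptions; only spark.dynamicAllocation.enabled is Spark's actual configuration property.

{code:java}
import java.util.Properties;

// Hypothetical sketch of the intended behavior, not HIVE-15543.patch itself.
public class ReducerParallelismEstimator {

  // Illustrative fallback; Hive's real default parallelism logic differs.
  private static final int DEFAULT_PARALLELISM = 1;

  /**
   * Decide ReduceSink (RS) parallelism. With dynamic allocation enabled,
   * the memory/core totals the Spark application reports at this moment
   * (often right after SparkContext startup, before executors ramp up)
   * are misleading, so fall back to a configured default instead.
   */
  public int estimateParallelism(Properties sparkConf,
                                 int availableCores,
                                 long availableMemoryMb,
                                 long estimatedInputMb) {
    boolean dynamicAllocation = Boolean.parseBoolean(
        sparkConf.getProperty("spark.dynamicAllocation.enabled", "false"));
    if (dynamicAllocation) {
      // Current numbers don't reflect the resources the job will get.
      return DEFAULT_PARALLELISM;
    }
    // Static allocation: the reported totals are stable, so size the
    // reducer count against them and the estimated input volume.
    long perCoreMb = availableMemoryMb / Math.max(availableCores, 1);
    int byData = (int) Math.max(1, estimatedInputMb / Math.max(perCoreMb, 1));
    return Math.max(1, Math.min(byData, availableCores));
  }
}
{code}

    With such a guard in place, a query running under spark.dynamicAllocation.enabled=true would no longer have its reducer count pinned to whatever handful of executors happened to exist at launch time.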

    Attachments

    • HIVE-15543.patch (4 kB, Xuefu Zhang)


    People

    • Assignee: Xuefu Zhang (xuefuz)
    • Reporter: Xuefu Zhang (xuefuz)
    • Votes: 0
    • Watchers: 2

    Dates

    • Created:
    • Updated:
    • Resolved: