
HIVE-8649: Increase level of parallelism in reduce phase [Spark Branch]


Details

  Type: Sub-task (parent: HIVE-7292 Hive on Spark)
  Status: Resolved
  Priority: Major
  Resolution: Fixed
  Affects Version/s: None
  Fix Version/s: 1.1.0
  Component/s: Spark
  Labels: None

Description

We calculate the number of reducers using the same code as MapReduce. However, reducers are vastly cheaper in Spark, and it is generally recommended to use many more reducers than in MR.

Sandy Ryza, who works on Spark, has some ideas about a heuristic.
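A rough sketch of the kind of heuristic this calls for, in Java. This is not the code from the attached patches; bytesPerReducer, maxReducers, and clusterDefaultParallelism are illustrative names, not Hive configuration keys. The idea: keep the MR-style estimate of one reducer per chunk of shuffled bytes, but raise it toward the cluster's available parallelism, since extra reduce tasks are cheap on Spark.

{code:java}
// Illustrative reducer-count heuristic for Hive on Spark.
// All names and constants are assumptions for this sketch, not the
// actual HIVE-8649 patch code or Hive configuration properties.
public final class ReducerHeuristic {

    /**
     * @param totalInputBytes           estimated bytes flowing into the reduce phase
     * @param bytesPerReducer           MR-style target bytes per reducer (e.g. 256 MB)
     * @param maxReducers               hard upper bound on the reducer count
     * @param clusterDefaultParallelism e.g. total executor cores available to the job
     */
    public static int estimateReducers(long totalInputBytes,
                                       long bytesPerReducer,
                                       int maxReducers,
                                       int clusterDefaultParallelism) {
        // MapReduce-derived estimate: ceil(input / bytesPerReducer), at least 1.
        long mrEstimate = Math.max(1L,
                (totalInputBytes + bytesPerReducer - 1) / bytesPerReducer);

        // Spark reduce tasks are cheap, so avoid leaving cores idle:
        // raise the estimate to at least the cluster's default parallelism.
        long sparkEstimate = Math.max(mrEstimate, clusterDefaultParallelism);

        return (int) Math.min(sparkEstimate, maxReducers);
    }

    public static void main(String[] args) {
        // 10 GB shuffled at 256 MB per reducer: MR-style logic picks 40,
        // but with 120 executor cores the heuristic bumps this to 120.
        int n = estimateReducers(10L << 30, 256L << 20, 999, 120);
        System.out.println("reducers = " + n); // prints: reducers = 120
    }
}
{code}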

Attachments

  1. HIVE-8649.2-spark.patch (6 kB, Jimmy Xiang)
  2. HIVE-8649.1-spark.patch (5 kB, Jimmy Xiang)

Activity


People

  Assignee: Jimmy Xiang (jxiang)
  Reporter: Brock Noland (brocknoland)
  Votes: 0
  Watchers: 4

Dates

  Created:
  Updated:
  Resolved:
