Description
We do not need duplicated logic to configure the number of reducers in SparkTask, because SetSparkReduceParallelism always sets the number of reducers during the compile phase.
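The refactor principle can be illustrated with a minimal sketch. This is not actual Hive code — the class, method names, and the byte-based estimation formula are hypothetical stand-ins — it only shows the idea: the compile-phase optimizer (the role SetSparkReduceParallelism plays) decides the reducer count once and records it in the configuration, so the task-execution side simply reads the value instead of re-running its own estimation logic.

```java
import java.util.HashMap;
import java.util.Map;

public class ReducerParallelismSketch {

    // Stand-in for the compile-phase optimizer: estimate the reducer
    // count once (here, a simple ceiling division of input size by a
    // per-reducer byte budget) and store it in the conf.
    static void compilePhase(Map<String, Integer> conf,
                             long inputBytes, long bytesPerReducer) {
        int reducers = (int) ((inputBytes + bytesPerReducer - 1) / bytesPerReducer);
        conf.put("reducers", Math.max(1, reducers));
    }

    // Stand-in for the task: no duplicated estimation logic,
    // just read the value the compiler already set.
    static int runTask(Map<String, Integer> conf) {
        return conf.get("reducers");
    }

    public static void main(String[] args) {
        Map<String, Integer> conf = new HashMap<>();
        compilePhase(conf, 1_000_000L, 256_000L); // ceil(1000000/256000) = 4
        System.out.println(runTask(conf)); // prints 4
    }
}
```

Keeping a single source of truth for the reducer count avoids the two code paths drifting apart, which is the motivation stated in the description.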
Attachments
Issue Links
- is part of HIVE-7292 Hive on Spark (Resolved)