Description
We currently calculate the number of reducers using the same code as MapReduce. However, reduce tasks are much cheaper in Spark, and it is generally recommended to use many more reducers than in MR.
Sandy Ryza, who works on Spark, has proposed some ideas for a heuristic.
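For reference, the MR-style calculation is roughly one reducer per fixed chunk of input, capped at a maximum. A minimal sketch of that logic follows; the function name, defaults, and cap are illustrative assumptions, not the actual Hive or Spark implementation:

```python
import math

def estimate_reducers(total_input_bytes,
                      bytes_per_reducer=256 * 1024 * 1024,  # assumed default chunk size
                      max_reducers=1009):                   # assumed cap, not Hive's actual value
    """MR-style heuristic: one reducer per bytes_per_reducer of input, capped."""
    reducers = math.ceil(total_input_bytes / bytes_per_reducer)
    return max(1, min(reducers, max_reducers))

# With cheaper Spark reduce tasks, one could simply lower bytes_per_reducer
# (or raise the cap) to get many more, smaller partitions.
print(estimate_reducers(10 * 1024 ** 3))  # 10 GB of input
```

Under these assumptions, shrinking `bytes_per_reducer` is the simplest way to scale the reducer count up for Spark while reusing the same formula.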