This is a requirement from Hive on Spark: mapPartitionsWithContext only exists in the Spark Scala API, and we expect to be able to access the task context from the Spark Java API as well.
For HIVE-7627 and HIVE-7843, Hive operators that are invoked inside a mapPartitions closure need to get the task ID. A sketch of what that access could look like is shown below.
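A minimal sketch, assuming the static accessor that SPARK-3543 proposes (TaskContext.get(), available since Spark 1.2) and the Spark 2.x Java API where the mapPartitions function returns an Iterator (in 1.x it returns an Iterable). The class name TaskIdFromJava is illustrative, not from the original report.

{code:java}
import java.util.Arrays;
import java.util.Collections;
import java.util.Iterator;

import org.apache.spark.TaskContext;
import org.apache.spark.api.java.JavaRDD;
import org.apache.spark.api.java.JavaSparkContext;

// Sketch: read the current task's IDs from inside a Java mapPartitions
// closure via the static accessor, without mapPartitionsWithContext.
public final class TaskIdFromJava {
  public static void main(String[] args) {
    JavaSparkContext sc = new JavaSparkContext("local[2]", "task-id-demo");
    JavaRDD<Integer> rdd = sc.parallelize(Arrays.asList(1, 2, 3, 4), 2);

    JavaRDD<String> ids = rdd.mapPartitions((Iterator<Integer> it) -> {
      // Static accessor from SPARK-3543; returns the context of the
      // task currently running in this thread.
      TaskContext ctx = TaskContext.get();
      String id = "stage=" + ctx.stageId()
          + " partition=" + ctx.partitionId()
          + " attempt=" + ctx.taskAttemptId();
      return Collections.singletonList(id).iterator();
    });

    ids.collect().forEach(System.out::println);
    sc.stop();
  }
}
{code}

This is the kind of access a Hive operator such as FileSinkOperator would need in order to derive a stable, per-task identifier instead of a random mapred.task.id.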
Issue links:
- Is contained by: SPARK-3543 Write TaskContext in Java and expose it through a static accessor
- Is depended upon by: HIVE-7627 FSStatsPublisher does fit into Spark multi-thread task mode [Spark Branch]
- Is depended upon by: HIVE-7843 orc_analyze.q fails due to random mapred.task.id in FileSinkOperator [Spark Branch]