Details
- Type: Bug
- Status: Resolved
- Priority: Critical
- Resolution: Fixed
Description
Right now we have these xWithContext methods, and they are a bit awkward (for instance, we don't support accessing the TaskContext from a normal map or filter operation). I'd propose the following:
1. Rewrite TaskContext in Java - it's a simple class. It can still refer to the Scala version of TaskMetrics.
2. Have a static method `TaskContext.get()` which will return the current in-scope TaskContext. Under the hood this uses a thread local variable similar to SparkEnv that the Executor sets.
3. Deprecate all of the existing xWithContext methods.
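A minimal sketch of what step 2 could look like, assuming a simplified Java TaskContext. The fields, `setTaskContext`/`unset` method names, and constructor are illustrative assumptions, not the actual Spark API; the point is the thread-local slot that the executor populates before running a task.

```java
// Hypothetical, simplified TaskContext in Java. The executor sets the
// thread-local before invoking the task body, so any user code running
// on that thread (e.g. inside a map or filter) can call TaskContext.get().
public class TaskContext {
    // Thread-local slot holding the context of the currently running task.
    private static final ThreadLocal<TaskContext> current = new ThreadLocal<>();

    private final int stageId;       // illustrative field
    private final long taskAttemptId; // illustrative field

    public TaskContext(int stageId, long taskAttemptId) {
        this.stageId = stageId;
        this.taskAttemptId = taskAttemptId;
    }

    // Static accessor: returns the in-scope TaskContext for this thread.
    public static TaskContext get() {
        return current.get();
    }

    // Called by the executor before running the task (assumed name).
    public static void setTaskContext(TaskContext tc) {
        current.set(tc);
    }

    // Called by the executor after the task finishes (assumed name).
    public static void unset() {
        current.remove();
    }

    public int stageId() { return stageId; }
    public long taskAttemptId() { return taskAttemptId; }

    public static void main(String[] args) {
        // Simulate the executor setting up the context for a task.
        TaskContext.setTaskContext(new TaskContext(1, 42L));
        // User code inside the task body can now reach it directly:
        TaskContext tc = TaskContext.get();
        System.out.println(tc.stageId() + ":" + tc.taskAttemptId());
        TaskContext.unset();
    }
}
```

Using `ThreadLocal.remove()` in `unset()` matters on executors that reuse threads from a pool; otherwise a stale context could leak into the next task scheduled on the same thread.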
Issue Links
- contains: SPARK-2895 Support mapPartitionsWithContext in Spark Java API (Resolved)
- is related to: SPARK-5549 Define TaskContext interface in Scala (Resolved)
- links to