Details
- Type: Improvement
- Status: Resolved
- Priority: Minor
- Resolution: Fixed
- Fix Version: 1.0.0
- Component: None
- Labels: None
Description
In writing my own RDD I ran into a few issues with things being private in Spark.
In compute I would like to return an iterator that respects task killing (as HadoopRDD does), but the mechanics for that live inside the private InterruptibleIterator. The exception I am supposed to throw, TaskKilledException, is also private to Spark.
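The pattern the description asks for can be sketched as follows. This is a sketch, not the reporter's code: it assumes InterruptibleIterator and TaskKilledException are accessible from user code (which this issue requested), and the class name MyRDD and the upstream parent RDD are hypothetical. It requires a Spark build on the classpath to compile.

```scala
import org.apache.spark.{InterruptibleIterator, Partition, TaskContext}
import org.apache.spark.rdd.RDD

// Hypothetical custom RDD; `parent` is an assumed upstream RDD[T].
class MyRDD[T: scala.reflect.ClassTag](parent: RDD[T]) extends RDD[T](parent) {

  // Reuse the parent's partitioning for this illustration.
  override def getPartitions: Array[Partition] = parent.partitions

  override def compute(split: Partition, context: TaskContext): Iterator[T] = {
    val underlying: Iterator[T] = parent.iterator(split, context)
    // Wrap the iterator so that iteration checks whether the task has been
    // killed and raises TaskKilledException, the same mechanism HadoopRDD
    // uses internally.
    new InterruptibleIterator[T](context, underlying)
  }
}
```

Without access to InterruptibleIterator, a custom RDD would have to reimplement the kill check by hand inside its own Iterator, which is exactly the duplication this issue asks to avoid.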
See also:
http://apache-spark-user-list.1001560.n3.nabble.com/Re-writing-my-own-RDD-td5558.html