Details
- Type: Bug
- Status: Closed
- Priority: Major
- Resolution: Fixed
- Fix Version/s: spark-branch
Description
A Spark task may be retried on the same executor after certain failures. On retry, the cached task can be reused. Since the operators in that task have already been initialized, they are not initialized again, so partial data left in these operators from the failed attempt can lead to wrong final results.
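A minimal sketch of the failure mode described above. The class and method names here are illustrative, not Hive's actual operator API: an operator buffers partial rows, the first attempt fails before flushing, and the retry reuses the same instance without calling its initialization again, so stale rows corrupt the result.

```java
import java.util.ArrayList;
import java.util.List;

// Hypothetical operator that accumulates rows before producing output.
class BufferingOperator {
    private final List<Integer> buffer = new ArrayList<>();

    // Resets internal state; skipped when a cached, already-initialized
    // task is reused on retry -- which is exactly the bug.
    void initialize() {
        buffer.clear();
    }

    void process(int row) {
        buffer.add(row);
    }

    // Produces the final result from the buffered rows.
    int flush() {
        return buffer.stream().mapToInt(Integer::intValue).sum();
    }
}

public class RetryDemo {
    public static void main(String[] args) {
        BufferingOperator op = new BufferingOperator();

        // First attempt: initialize, process some rows, then "fail" mid-task.
        op.initialize();
        op.process(1);
        op.process(2);
        // ... task fails here, before flush(); partial data remains buffered.

        // Retry on the same executor reuses the cached operator. Because it
        // is already initialized, initialize() is NOT called again.
        op.process(1);
        op.process(2);
        op.process(3);

        // Correct result would be 6; the stale partial data yields 9.
        System.out.println(op.flush());
    }
}
```

Calling initialize() at the start of every attempt (rather than only once per operator instance) would clear the leftover state and restore correct results.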