Hive / HIVE-9993

Retrying task could use cached bad operators [Spark Branch]

    Details

    • Type: Bug
    • Status: Closed
    • Priority: Major
    • Resolution: Fixed
    • Affects Version/s: spark-branch
    • Fix Version/s: 1.2.0
    • Component/s: None
    • Labels: None

      Description

      When a Spark task fails, it may be retried on the same executor. On retry, the cached task could be reused. Since the operators in the task have already been initialized, they won't be initialized again, and the partial data left in these operators could lead to wrong final results.
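      A minimal sketch of the failure mode, using a hypothetical buffering operator (the class and method names below are illustrative, not actual Hive code): if the operator's internal buffer is not reset when the task is retried, rows from the failed attempt are double-counted.

      ```java
      import java.util.ArrayList;
      import java.util.List;

      // Hypothetical stand-in for a Hive operator that accumulates rows.
      class BufferingOperator {
          private final List<Integer> buffer = new ArrayList<>();

          // Re-initialization must clear state left over from a failed attempt;
          // skipping this step is the bug described in this issue.
          void initialize() {
              buffer.clear();
          }

          void process(int row) {
              buffer.add(row);
          }

          int result() {
              return buffer.stream().mapToInt(Integer::intValue).sum();
          }
      }

      public class RetryDemo {
          public static void main(String[] args) {
              BufferingOperator op = new BufferingOperator();
              op.initialize();

              // First attempt: processes some rows, then the task "fails".
              op.process(1);
              op.process(2);

              // Retry on the same executor reuses the cached operator.
              // Without this explicit re-initialization, rows 1 and 2 from
              // the failed attempt would remain buffered and be counted twice.
              op.initialize();
              op.process(1);
              op.process(2);
              op.process(3);

              System.out.println(op.result()); // prints 6, not 9
          }
      }
      ```

      The fix direction is simply to ensure operators are (re)initialized for each task attempt rather than reused with stale state.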

            People

            • Assignee: jxiang Jimmy Xiang
            • Reporter: jxiang Jimmy Xiang
            • Votes: 0
            • Watchers: 3

              Dates

              • Created:
              • Updated:
              • Resolved: