Hive / HIVE-19439

MapWork shouldn't be reused when Spark task fails during initialization


Details

    • Type: Bug
    • Status: Open
    • Priority: Major
    • Resolution: Unresolved
    • Component/s: Spark

    Description

  Issue identified in HIVE-19388. When a Spark task fails while initializing the map operator, the task is retried with the same MapWork retrieved from the cache. This is problematic because the MapWork may be partially initialized, e.g. some operators may already be in the INIT state.
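To illustrate the failure mode, here is a minimal, hypothetical sketch (not Hive's actual code; the class names, cache, and fix strategy are assumptions for illustration). A work object is cached by key, a task attempt partially initializes it and then fails, and the retry observes the stale INIT state unless the entry is evicted from the cache on failure:

```java
import java.util.HashMap;
import java.util.Map;

// Hypothetical model of the HIVE-19439 problem: a cached "MapWork"
// whose partial initialization leaks into the retry attempt.
public class MapWorkCacheDemo {
    enum State { UNINIT, INIT }

    // Stand-in for Hive's MapWork: tracks operator state only.
    static class MapWork {
        State opState = State.UNINIT;
    }

    // Stand-in for the object cache keyed by the work's path.
    static final Map<String, MapWork> cache = new HashMap<>();

    static MapWork getWork(String key) {
        return cache.computeIfAbsent(key, k -> new MapWork());
    }

    // A task attempt that fails after partially initializing operators.
    // evictOnFailure models the proposed fix: don't reuse a partial MapWork.
    static boolean runTask(String key, boolean failDuringInit, boolean evictOnFailure) {
        MapWork work = getWork(key);
        work.opState = State.INIT;      // operator moves to INIT
        if (failDuringInit) {
            if (evictOnFailure) {
                cache.remove(key);      // retry will build a fresh MapWork
            }
            return false;               // attempt fails
        }
        return true;
    }

    public static void main(String[] args) {
        // Without eviction, the retry sees an operator already in INIT state.
        runTask("map_1", true, false);
        System.out.println("without eviction, retry sees: " + getWork("map_1").opState);

        cache.clear();

        // With eviction, the retry gets a fresh, uninitialized MapWork.
        runTask("map_1", true, true);
        System.out.println("with eviction, retry sees: " + getWork("map_1").opState);
    }
}
```

The sketch shows why simply retrying is unsafe: cache reuse makes the second attempt start from inconsistent operator state, whereas evicting the entry on failure restores the invariant that each attempt initializes a fresh MapWork.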

      Attachments

        Activity

          People

            Assignee: Unassigned
            Reporter: Rui Li (lirui)
            Votes: 0
            Watchers: 4
