PIG-1734

Pig needs a more efficient DAG execution

    Details

    • Type: Improvement
    • Status: Resolved
    • Priority: Major
    • Resolution: Duplicate
    • Affects Version/s: None
    • Fix Version/s: None
    • Component/s: None
    • Labels: None

      Description

      The current code uses Hadoop's JobControl to execute one stage at a time. The first stage includes all jobs with no dependencies, the second stage includes the jobs that depend only on jobs completed in the first stage, the third stage contains the jobs that depend on jobs from stages 1 and 2, and so on.

      The problem with this simplistic approach is that each stage starts only when the previous stage is completely over, which means that some branches of the DAG are unnecessarily blocked: a job whose own dependencies have finished still has to wait for unrelated jobs in the earlier stage. A rough illustration follows below.
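
      The sketch below is hypothetical (the DAG, class, and variable names are made up, and this is not Pig's code); it only shows how grouping jobs into stages by dependency depth forces a ready job to wait for its whole stage:

import java.util.*;

// Hypothetical sketch of the current stage-at-a-time behaviour: jobs are
// grouped into stages by dependency depth, and a stage only runs after the
// previous one has fully finished.
public class StagedExecutionSketch {
    public static void main(String[] args) {
        // Example DAG: A and B have no dependencies, C depends on A, D on C.
        Map<String, List<String>> deps = new LinkedHashMap<>();
        deps.put("A", List.of());
        deps.put("B", List.of());
        deps.put("C", List.of("A"));
        deps.put("D", List.of("C"));

        // Stage of a job = 1 + the highest stage among its dependencies.
        Map<String, Integer> stageOf = new LinkedHashMap<>();
        for (Map.Entry<String, List<String>> e : deps.entrySet()) {
            int stage = 0;
            for (String parent : e.getValue()) {
                stage = Math.max(stage, stageOf.get(parent) + 1);
            }
            stageOf.put(e.getKey(), stage);
        }

        // Prints {A=0, B=0, C=1, D=2}: even if A finishes long before B,
        // C cannot start until all of stage 0 (including B) is done.
        System.out.println(stageOf);
    }
}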

      We would need to do our own DAG management to solve this issue, which would be a pretty significant undertaking. Something we should look at in the future. A minimal sketch of the idea follows below.
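
      The following is a minimal sketch of what such DAG management could look like, assuming a simple in-process scheduler (illustrative only, not Pig's actual design): each job is submitted as soon as the last of its parents completes, so independent branches never wait on each other.

import java.util.*;
import java.util.concurrent.*;

// Illustrative sketch only (not Pig's implementation): a scheduler that
// submits a job as soon as the last of its parents completes, so independent
// branches of the DAG are never blocked by an unrelated slow job.
public class DagSchedulerSketch {
    private final Map<String, List<String>> children = new HashMap<>();   // parent -> dependents
    private final Map<String, Integer> pendingParents = new HashMap<>();  // job -> unfinished parents
    private final Map<String, Runnable> work;
    private final ExecutorService pool = Executors.newFixedThreadPool(4);
    private final CountDownLatch allDone;

    public DagSchedulerSketch(Map<String, List<String>> deps, Map<String, Runnable> work) {
        this.work = work;
        this.allDone = new CountDownLatch(deps.size());
        for (Map.Entry<String, List<String>> e : deps.entrySet()) {
            pendingParents.put(e.getKey(), e.getValue().size());
            for (String parent : e.getValue()) {
                children.computeIfAbsent(parent, k -> new ArrayList<>()).add(e.getKey());
            }
        }
    }

    public void run() throws InterruptedException {
        // Start every job that has no parents; the rest are released as parents finish.
        synchronized (this) {
            for (Map.Entry<String, Integer> e : pendingParents.entrySet()) {
                if (e.getValue() == 0) submit(e.getKey());
            }
        }
        allDone.await();
        pool.shutdown();
    }

    private void submit(String job) {
        pool.submit(() -> {
            work.get(job).run();
            onComplete(job);
        });
    }

    private synchronized void onComplete(String job) {
        allDone.countDown();
        // Release any child whose last remaining parent just finished.
        for (String child : children.getOrDefault(job, List.of())) {
            if (pendingParents.merge(child, -1, Integer::sum) == 0) {
                submit(child);
            }
        }
    }
}

      With the example DAG from the earlier sketch, C would be submitted the moment A completes, even while B is still running; under the current staged execution it would also wait for B.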

    People

    • Assignee: Unassigned
    • Reporter: Olga Natkovich (olgan)
    • Votes: 1
    • Watchers: 17

    Dates

    • Created:
    • Updated:
    • Resolved: