Apache Airflow · AIRFLOW-695

Retries do not execute because dagrun is in FAILED state


    Details

    • Type: Bug
    • Status: Closed
    • Priority: Blocker
    • Resolution: Fixed
    • Affects Version/s: None
    • Fix Version/s: None
    • Component/s: DagRun

      Description

      Currently on the latest master commit (15ff540ecd5e60e7ce080177ea3ea227582a4672), running with the LocalExecutor, task retries do not execute because the state of the corresponding dagrun changes to FAILED. The task instance is then blocked with the error "Task instance's dagrun was not in the 'running' state but in the state 'failed'," produced by the following lines: https://github.com/apache/incubator-airflow/blob/master/airflow/ti_deps/deps/dagrun_exists_dep.py#L48-L50
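
      The blocking behavior can be sketched as follows. This is a simplified paraphrase of the linked dependency check, not Airflow's actual API; the class and function names here are illustrative only:

      ```python
      # Illustrative sketch of the dagrun-exists dependency check that
      # produces the error above. Names are hypothetical, not Airflow's.
      class State:
          RUNNING = 'running'
          FAILED = 'failed'

      def dagrun_dep_status(dagrun_state):
          """A task instance passes this dep only if its dagrun is RUNNING."""
          if dagrun_state != State.RUNNING:
              reason = ("Task instance's dagrun was not in the 'running' "
                        "state but in the state '%s'." % dagrun_state)
              return (False, reason)
          return (True, None)
      ```

      Once the first task failure flips the dagrun to FAILED, this check returns a failed status for every subsequent scheduling attempt, so the retry is never executed even though the task still has retries remaining.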

      This error can be reproduced with the following simple DAG:

      DAG.py
              import datetime

              from airflow import models
              from airflow.operators.bash_operator import BashOperator

              dag = models.DAG(dag_id='test_retry_handling')
              task = BashOperator(
                  task_id='test_retry_handling_op',
                  bash_command='exit 1',
                  retries=1,
                  retry_delay=datetime.timedelta(minutes=1),
                  dag=dag,
                  owner='airflow',
                  start_date=datetime.datetime(2016, 2, 1, 0, 0, 0))


            People

            • Assignee: Unassigned
            • Reporter: harveyxia (Harvey Xia)
            • Votes: 0
            • Watchers: 8
