Spark / SPARK-2972

APPLICATION_COMPLETE not created in Python unless context explicitly stopped

Details

    • Type: Bug
    • Status: Closed
    • Priority: Major
    • Resolution: Won't Fix
    • Affects Version/s: 1.0.2
    • Fix Version/s: None
    • Component/s: PySpark
    • Labels: None
    • Environment: Cloudera 5.1, YARN master on Ubuntu Precise

    Description

      If you don't explicitly stop the SparkContext at the end of a Python application with sc.stop(), no APPLICATION_COMPLETE file is created and the job is not picked up by the history server.

      This is easy to reproduce with the pyspark shell (but it affects standalone scripts as well).

      The current workaround is to wrap the entire script in a try/finally block and stop the context manually.
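      A minimal sketch of the workaround described above. The application name and the computation are illustrative; the point is that sc.stop() in the finally clause runs even if the body raises, so the APPLICATION_COMPLETE marker is written and the history server sees the job:

      ```python
      from pyspark import SparkContext

      # Hypothetical app name for illustration.
      sc = SparkContext(appName="example-app")
      try:
          # Application logic goes here.
          total = sc.parallelize(range(10)).sum()
          print(total)
      finally:
          # Without this explicit stop, PySpark (as of 1.0.2) may exit
          # without writing the APPLICATION_COMPLETE marker file, and the
          # job never appears in the history server.
          sc.stop()
      ```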


            People

              Assignee: Unassigned
              Reporter: Shay Rojansky (roji)
              Votes: 0
              Watchers: 4
