  Spark / SPARK-4180

SparkContext constructor should throw exception if another SparkContext is already running


    Details

    • Type: Bug
    • Status: Resolved
    • Priority: Blocker
    • Resolution: Fixed
    • Affects Version/s: None
    • Fix Version/s: 1.2.0, 1.3.0
    • Component/s: Spark Core
    • Labels: None
    • Target Version/s:

      Description

      Spark does not currently support multiple concurrently-running SparkContexts in the same JVM (see SPARK-2243). Therefore, SparkContext's constructor should throw an exception if there is an active SparkContext that has not been shut down via stop().

      PySpark already does this, but the Scala SparkContext should do the same. The current behavior with multiple active contexts is unspecified and not well understood, and it may be the source of confusing errors (see the user error report in SPARK-4080, for example).

      This should be pretty easy to add: just add an activeSparkContext field to the SparkContext companion object and synchronize on it in the constructor and stop() methods; see PySpark's context.py for an example of this approach. A sketch of the idea follows.
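
      A minimal sketch of this approach in Scala is below; the class name DemoSparkContext, the activeContext field, and the markActive/markStopped helpers are illustrative stand-ins, not Spark's actual internals. The companion object tracks the running context behind a lock, the constructor registers itself and fails if another context is already active, and stop() clears the slot.

          // Hypothetical sketch: names here are illustrative, not Spark's real internals.
          class DemoSparkContext(val appName: String) {
            // Registering in the constructor enforces "at most one running context per JVM".
            DemoSparkContext.markActive(this)

            def stop(): Unit = DemoSparkContext.markStopped(this)
          }

          object DemoSparkContext {
            // Lock protecting the single-active-context invariant.
            private val lock = new Object
            // The context that is currently running, if any.
            private var activeContext: Option[DemoSparkContext] = None

            private def markActive(sc: DemoSparkContext): Unit = lock.synchronized {
              if (activeContext.isDefined) {
                throw new IllegalStateException(
                  "Only one SparkContext may be running in this JVM; call stop() on the " +
                  "existing context before constructing a new one.")
              }
              activeContext = Some(sc)
            }

            private def markStopped(sc: DemoSparkContext): Unit = lock.synchronized {
              if (activeContext.contains(sc)) {
                activeContext = None
              }
            }
          }

          // Usage (e.g. in the Scala REPL): the second construction fails until the
          // first context is stopped.
          val sc1 = new DemoSparkContext("app-1")
          // new DemoSparkContext("app-2")         // throws IllegalStateException
          sc1.stop()
          val sc2 = new DemoSparkContext("app-2")  // succeeds after stop()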


              People

              • Assignee: Josh Rosen (joshrosen)
              • Reporter: Josh Rosen (joshrosen)
              • Votes: 1
              • Watchers: 4
