Spark does not currently support multiple concurrently-running SparkContexts in the same JVM (see
SPARK-2243). Therefore, SparkContext's constructor should throw an exception if there is an active SparkContext that has not been shut down via stop().
PySpark already enforces this, and the Scala SparkContext should do the same. The current behavior with multiple active contexts is unspecified and poorly understood, and it may be the source of confusing errors (see the user error report in
SPARK-4080, for example).
This should be pretty easy to add: just add an activeSparkContext field to the SparkContext companion object and synchronize on it in the constructor and stop() methods; see PySpark's context.py file for an example of this approach, and the sketch below.
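A minimal sketch of what this could look like on the Scala side. The class body, the helper names (markActive, markStopped), and the exception message are illustrative assumptions, not Spark's actual internals; the real SparkContext constructor and stop() would simply call into the same kind of companion-object bookkeeping.

{code:scala}
class SparkContext {
  // Constructor: register this instance as the single active context,
  // throwing if another one has not yet been stopped.
  SparkContext.markActive(this)

  def stop(): Unit = {
    // ... real shutdown work (scheduler, listeners, etc.) would go here ...
    SparkContext.markStopped(this)
  }
}

object SparkContext {
  // Lock and field guarding the single active context, analogous to the
  // class-level bookkeeping PySpark does in context.py.
  private val activationLock = new Object()
  private var activeSparkContext: Option[SparkContext] = None

  // Hypothetical helper: fail fast if another context is still active.
  private def markActive(sc: SparkContext): Unit = activationLock.synchronized {
    if (activeSparkContext.isDefined) {
      throw new IllegalStateException(
        "Only one SparkContext may be running in this JVM (see SPARK-2243). " +
        "Call stop() on the existing context before creating a new one.")
    }
    activeSparkContext = Some(sc)
  }

  // Hypothetical helper: clear the active context when it is stopped.
  private def markStopped(sc: SparkContext): Unit = activationLock.synchronized {
    if (activeSparkContext.exists(_ eq sc)) {
      activeSparkContext = None
    }
  }
}
{code}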