- Type: Improvement
- Status: Resolved
- Priority: Major
- Resolution: Duplicate
- Affects Version/s: None
- Fix Version/s: None
- Component/s: Spark Core
- Labels: None
Since we don't support concurrently running SparkContexts / StreamingContexts in the same JVM, we should throw an error / exception when a user tries to create a new context while another one is still running.
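A minimal sketch of the kind of per-JVM guard this asks for, assuming a companion-object-style registry and a hypothetical ContextAlreadyRunningException; the names here are illustrative only and not Spark's actual implementation (which was done under SPARK-4180):

{code:scala}
import java.util.concurrent.atomic.AtomicReference

// Hypothetical exception type used for illustration; not Spark's actual class.
class ContextAlreadyRunningException(msg: String) extends RuntimeException(msg)

// Sketch of a per-JVM guard: track the single active context in a shared object
// and reject attempts to register a second one before the first is stopped.
object SingleContextGuard {
  private val activeContext = new AtomicReference[AnyRef](null)

  // Called from a context's constructor; throws if another context is already running.
  def registerNewContext(ctx: AnyRef): Unit = {
    if (!activeContext.compareAndSet(null, ctx)) {
      throw new ContextAlreadyRunningException(
        "Only one SparkContext/StreamingContext may run per JVM; " +
          "stop() the existing context before creating a new one.")
    }
  }

  // Called from stop(); frees the slot so a new context can be created later.
  def clearActiveContext(ctx: AnyRef): Unit = {
    activeContext.compareAndSet(ctx, null)
  }
}
{code}

With a guard like this, the second of two back-to-back constructions fails fast with a clear message instead of producing two contexts that silently interfere with each other.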
- duplicates SPARK-4180: SparkContext constructor should throw exception if another SparkContext is already running (Resolved)