Details
Type: Bug
Status: Resolved
Priority: Major
Resolution: Duplicate
Description
We have users who run a notebook cell that creates a new SparkContext in order to override some of the default initialization parameters:
from pyspark import SparkConf, SparkContext
from pyspark.sql import HiveContext

if 'sc' in globals():
    # Stop the running SparkContext if there is one.
    sc.stop()

conf = SparkConf().setAppName("app")
# conf.set('spark.sql.shuffle.partitions', '2000')
sc = SparkContext(conf=conf)
sqlContext = HiveContext(sc)
In Spark 2.0, this creates an invalid SQLContext that is still bound to the original SparkContext, because the HiveContext constructor goes through SparkSession.getOrCreate, which returns the cached SparkSession holding the old, stopped SparkContext. A SparkSession should be invalidated and no longer returned by getOrCreate once its SparkContext has been stopped.
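For reference, a common way to sidestep this in Spark 2.x notebooks is to restart through the SparkSession API instead of constructing a SparkContext and HiveContext by hand, since SparkSession.stop() stops the underlying SparkContext and drops the cached session that builder.getOrCreate() would otherwise return. This is only a sketch of that pattern, not the fix tracked by this ticket; the app name and the shuffle-partitions setting are illustrative, and whether it fully avoids the stale-session behavior may depend on the exact 2.0.x release.

from pyspark.sql import SparkSession

# Stop any existing session first; this also stops its SparkContext and
# clears the cached session, so the next getOrCreate() builds a fresh one.
if 'spark' in globals():
    spark.stop()

spark = (SparkSession.builder
         .appName("app")                                   # illustrative app name
         .config("spark.sql.shuffle.partitions", "2000")   # illustrative setting
         .enableHiveSupport()                               # replaces HiveContext usage
         .getOrCreate())

sc = spark.sparkContext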
Issue Links
- duplicates: SPARK-19055 SparkSession initialization will be associated with invalid SparkContext when new SparkContext is created to replace stopped SparkContext (Resolved)