Details
- Type: Bug
- Status: Resolved
- Priority: Critical
- Resolution: Duplicate
- Affects Version/s: 2.1.0
- Fix Version/s: None
- Component/s: None
Description
When using spark-submit with the code below,

SparkContext(conf=SparkConf().setAppName('foo'))

SparkContext ignores the conf argument.

This bug was introduced by this commit:
https://github.com/apache/spark/commit/5b77e66dd6a128c5992ab3bde418613f84be7009

which ignores any conf that does not have a _jconf:
https://github.com/apache/spark/blob/5b77e66dd6a128c5992ab3bde418613f84be7009/python/pyspark/context.py#L125

To work around this problem, call SparkContext._ensure_initialized() before calling SparkConf().

As I could not find test code for the commit above, and the SparkContext initialization process was too complicated for me, I was not able to write a patch fixing this problem.
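The mechanism above can be illustrated without a Spark installation. The sketch below is a minimal, self-contained model of the suspected logic, not real pyspark: FakeSparkConf and FakeSparkContext are hypothetical stand-ins, and the `jvm_running` flag mimics whether SparkContext._ensure_initialized() has already started the JVM (so that SparkConf gets a real `_jconf`).

```python
class FakeSparkConf:
    """Hypothetical stand-in for pyspark.SparkConf. When the JVM is not
    running, settings are held Python-side and _jconf stays None."""
    def __init__(self, jvm_running=False):
        self._jconf = object() if jvm_running else None  # JVM-side handle
        self._settings = {}

    def setAppName(self, name):
        self._settings["spark.app.name"] = name
        return self


class FakeSparkContext:
    """Hypothetical stand-in modeling the conf handling introduced by
    commit 5b77e66: a conf whose _jconf is None is silently replaced
    by a fresh conf, dropping the caller's settings."""
    def __init__(self, conf=None):
        if conf is not None and conf._jconf is not None:
            self._conf = conf  # JVM-backed conf is kept
        else:
            self._conf = FakeSparkConf(jvm_running=True)  # user settings lost


# JVM not yet started: the app name set by the caller is dropped.
lost = FakeSparkContext(conf=FakeSparkConf().setAppName("foo"))
print(lost._conf._settings)   # {}

# JVM already started (as after _ensure_initialized()): the conf survives.
kept = FakeSparkContext(conf=FakeSparkConf(jvm_running=True).setAppName("foo"))
print(kept._conf._settings)   # {'spark.app.name': 'foo'}
```

This is why the workaround helps: once the JVM is up, SparkConf() is created with a non-None `_jconf` and passes the check instead of being discarded.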
Attachments
Issue Links
- relates to: SPARK-19307 "SPARK-17387 caused ignorance of conf object passed to SparkContext" (Resolved)