  SPARK-19301

SparkContext is ignoring SparkConf when _jvm is not initialized on spark-submit


Details

    • Type: Bug
    • Status: Resolved
    • Priority: Critical
    • Resolution: Duplicate
    • Affects Version/s: 2.1.0
    • Fix Version/s: None
    • Component/s: PySpark
    • Labels: None

    Description

      When using spark-submit with the code below,

      SparkContext(conf=SparkConf().setAppName('foo'))

      SparkContext ignores the conf argument.
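      One way to observe the dropped settings (a minimal sketch; the print-based check is an assumption, not part of the original report):

      from pyspark import SparkConf, SparkContext

      sc = SparkContext(conf=SparkConf().setAppName('foo'))

      # Expected: 'foo'. Under spark-submit on 2.1.0 a different, default app
      # name shows up, because the Python-side conf was silently dropped.
      print(sc.getConf().get('spark.app.name'))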
      This bug was introduced by this commit:
      https://github.com/apache/spark/commit/5b77e66dd6a128c5992ab3bde418613f84be7009
      which ignores a conf that does not have a _jconf yet:
      https://github.com/apache/spark/blob/5b77e66dd6a128c5992ab3bde418613f84be7009/python/pyspark/context.py#L125
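      Roughly, the check at that line behaves like this (a paraphrased sketch of SparkContext._do_init, not the exact source):

      # Paraphrased from the linked commit: the passed-in conf is used only
      # when it is already backed by a JVM SparkConf.
      if conf is not None and conf._jconf is not None:
          self._conf = conf
      else:
          # A fresh SparkConf is created here; values set on a Python-only
          # conf (such as the app name above) are never copied over.
          self._conf = SparkConf(_jvm=SparkContext._jvm)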

      As a workaround, you have to call SparkContext._ensure_initialized() before creating the SparkConf.
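      A sketch of that workaround (assuming the script is launched with spark-submit):

      from pyspark import SparkConf, SparkContext

      # Bring up the JVM gateway first so that SparkConf gets a _jconf and is
      # no longer ignored by SparkContext.
      SparkContext._ensure_initialized()

      sc = SparkContext(conf=SparkConf().setAppName('foo'))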
      I could not find any test code for the commit above, and the SparkContext initialization process is too complicated for me to write a patch for this problem.

            People

              Assignee: Unassigned
              Reporter: Teppei Daito
              Votes: 0
              Watchers: 2
