SPARK-19301: SparkContext ignores SparkConf when _jvm is not initialized on spark-submit


    Details

    • Type: Bug
    • Status: Resolved
    • Priority: Critical
    • Resolution: Duplicate
    • Affects Version/s: 2.1.0
    • Fix Version/s: None
    • Component/s: PySpark
    • Labels: None

      Description

      When running the code below with spark-submit:

      from pyspark import SparkConf, SparkContext

      SparkContext(conf=SparkConf().setAppName('foo'))

      SparkContext ignores the conf argument.
      This bug was introduced by this commit:
      https://github.com/apache/spark/commit/5b77e66dd6a128c5992ab3bde418613f84be7009
      which ignores any conf that does not have a _jconf:
      https://github.com/apache/spark/blob/5b77e66dd6a128c5992ab3bde418613f84be7009/python/pyspark/context.py#L125
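      My reading of the check at the linked line, paraphrased rather than quoted: under spark-submit the JVM gateway has not been launched yet when the user's SparkConf() is constructed, so that conf keeps its settings only on the Python side, its _jconf stays None, and it falls into the else branch below.

      if conf is not None and conf._jconf is not None:
          # conf is backed by a JVM SparkConf, so it is used directly
          self._conf = conf
      else:
          # conf._jconf is None (the JVM was not running when SparkConf()
          # was built), so the user's settings are silently dropped here
          self._conf = SparkConf(_jvm=SparkContext._jvm)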

      To work around this problem, you have to call SparkContext._ensure_initialized() before constructing SparkConf().
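      A minimal sketch of that workaround (my own example; _ensure_initialized() is a private API, so this relies on internal behavior):

      from pyspark import SparkConf, SparkContext

      # Launch the JVM gateway first so that SparkConf() is backed by a JVM conf
      SparkContext._ensure_initialized()
      sc = SparkContext(conf=SparkConf().setAppName('foo'))

      Once the gateway is up, SparkConf() gets a non-None _jconf, so the check shown above accepts the user-supplied conf.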
      As I could not find test code for the commit above, and the SparkContext initialization process was too complicated for me, I was unable to write a patch to fix this problem.

              People

              • Assignee:
                Unassigned
              • Reporter:
                Teppei Daito (teppeidaito)
              • Votes:
                0
              • Watchers:
                2
