Spark / SPARK-2003

SparkContext(SparkConf) doesn't work in pyspark


    Details

    • Type: Bug
    • Status: Resolved
    • Priority: Major
    • Resolution: Won't Fix
    • Affects Version/s: 1.0.0
    • Fix Version/s: 1.0.1, 1.1.0
    • Component/s: Documentation, PySpark
    • Labels:
      None

      Description

      Using SparkConf with SparkContext as described in the Programming Guide does NOT work in Python:

        conf = SparkConf().setAppName("blah")
        sc = SparkContext(conf)

      When I tried this I got:

        AttributeError: 'SparkConf' object has no attribute '_get_object_id'

      (The equivalent code in Scala works fine:

        val conf = new SparkConf().setAppName("blah")
        val sc = new SparkContext(conf))

      I think this is because PySpark has no equivalent of the Scala constructor SparkContext(SparkConf).

      Workaround:
      If I explicitly pass conf as a keyword argument in the Python call, it does work:

        sconf = SparkConf().setAppName("blah")
        sc = SparkContext(conf=sconf)
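      The behavior above can be sketched without a Spark installation. In PySpark, SparkContext takes the master URL as its first positional parameter and accepts conf only as a keyword argument, so a positional SparkConf lands in the master slot. The stand-in classes below are a simplified illustration of that signature mismatch, not PySpark's actual implementation (the real failure surfaces later as the _get_object_id AttributeError inside Py4J):

        class SparkConf:
            def setAppName(self, name):
                self.app_name = name
                return self    # chainable, like the real SparkConf

        class SparkContext:
            # Simplified assumed parameter order: master comes first,
            # conf is keyword-only in practice.
            def __init__(self, master=None, appName=None, conf=None):
                if master is not None and not isinstance(master, str):
                    # Stand-in for the AttributeError raised deep in Py4J
                    raise TypeError("first positional argument must be the master URL")
                self.conf = conf

        conf = SparkConf().setAppName("blah")

        try:
            sc = SparkContext(conf)      # conf is bound to `master` -- fails
        except TypeError as e:
            print("positional call failed:", e)

        sc = SparkContext(conf=conf)     # keyword form binds correctly
        print("keyword call succeeded")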

        Attachments

          Activity

            People

            • Assignee: Unassigned
            • Reporter: dcarroll@cloudera.com Diana Carroll
            • Votes: 0
            • Watchers: 4

              Dates

              • Created:
              • Updated:
              • Resolved: