Spark / SPARK-2003

SparkContext(SparkConf) doesn't work in pyspark


Details

    • Type: Bug
    • Status: Resolved
    • Priority: Major
    • Resolution: Won't Fix
    • Affects Version/s: 1.0.0
    • Fix Version/s: 1.0.1, 1.1.0
    • Component/s: Documentation, PySpark
    • Labels: None

    Description

      Using SparkConf with SparkContext as described in the Programming Guide does NOT work in Python:
      conf = SparkConf().setAppName("blah")
      sc = SparkContext(conf)
      When I tried this, I got:
      AttributeError: 'SparkConf' object has no attribute '_get_object_id'

      The equivalent code in Scala works fine:
      val conf = new SparkConf().setAppName("blah")
      val sc = new SparkContext(conf)

      I think this is because PySpark has no equivalent of the Scala constructor SparkContext(SparkConf): in Python the first positional parameter of SparkContext is master, not conf, so the SparkConf instance gets bound to the wrong parameter.

      Workaround:
      If I explicitly pass conf as a keyword argument, it does work:
      sconf = SparkConf().setAppName("blah")
      sc = SparkContext(conf=sconf)
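
      A minimal sketch of why the positional call misbinds, using plain Python stand-ins (the simplified SparkContext/SparkConf classes below are illustrative, not the real PySpark classes; the real SparkContext.__init__ takes master first and conf further down its parameter list):

      ```python
      class SparkConf:
          """Stand-in for pyspark.SparkConf (simplified)."""
          def setAppName(self, name):
              self.name = name
              return self  # real SparkConf also returns self for chaining

      class SparkContext:
          """Stand-in for pyspark.SparkContext: master is the first
          positional parameter, conf is keyword-only in practice."""
          def __init__(self, master=None, appName=None, conf=None):
              self.master = master
              self.appName = appName
              self.conf = conf

      conf = SparkConf().setAppName("blah")

      # Positional call: the SparkConf lands in the `master` slot,
      # which PySpark then tries to treat as a string and fails on.
      sc_bad = SparkContext(conf)
      assert sc_bad.master is conf and sc_bad.conf is None

      # Keyword call: the SparkConf reaches the intended parameter.
      sc_ok = SparkContext(conf=conf)
      assert sc_ok.conf is conf
      ```

      This is why the keyword-argument workaround above succeeds: it routes the SparkConf past the positional master parameter.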

      Attachments

        Activity

          People

            Assignee: Unassigned
            Reporter: Diana Carroll (dcarroll@cloudera.com)
            Votes: 0
            Watchers: 5

            Dates

              Created:
              Updated:
              Resolved: