Spark / SPARK-17907

Not allowing more spark console


Details

    • Type: Bug
    • Status: Resolved
    • Priority: Major
    • Resolution: Invalid
    • Affects Version/s: 2.0.0
    • Fix Version/s: None
    • Component/s: PySpark
    • Labels: None

    Description

      We are exploring PySpark on a Spark cluster. We are able to initiate a single Spark console connection, but when we try to establish a second connection we get the following error.

      ---------------------------------------------------------------------------
      ValueError Traceback (most recent call last)
      <ipython-input-15-05f9533b85b9> in <module>()
      4 .set("spark.executor.memory", "1g")
      5 .set("spark.cores.max","1").set("spark.driver.allowMultipleContexts", "true") )
      ----> 6 sc = SparkContext(conf = conf)

      /opt/spark-2.0.0-bin-hadoop2.7/python/pyspark/context.py in __init__(self, master, appName, sparkHome, pyFiles, environment, batchSize, serializer, conf, gateway, jsc, profiler_cls)
      110 """
      111 self._callsite = first_spark_call() or CallSite(None, None, None)
      --> 112 SparkContext._ensure_initialized(self, gateway=gateway)
      113 try:
      114 self._do_init(master, appName, sparkHome, pyFiles, environment, batchSize, serializer,

      /opt/spark-2.0.0-bin-hadoop2.7/python/pyspark/context.py in _ensure_initialized(cls, instance, gateway)
      257 " created by %s at %s:%s "
      258 % (currentAppName, currentMaster,
      --> 259 callsite.function, callsite.file, callsite.linenum))
      260 else:
      261 SparkContext._active_spark_context = instance

      ValueError: Cannot run multiple SparkContexts at once; existing SparkContext(app=PYSPARK, master=spark://172.31.28.208:7077) created by __init__ at <ipython-input-2-c7c8de510121>:6

      The command we are using:

      conf = (SparkConf()
              .setMaster("spark://172.31.28.208:7077")
              .setAppName("sankar")
              .set("spark.executor.memory", "1g")
              .set("spark.cores.max","1").set("spark.driver.allowMultipleContexts", "true") )
      sc = SparkContext(conf = conf)
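      This behavior is expected rather than a bug: PySpark allows only one active SparkContext per driver process, and `spark.driver.allowMultipleContexts` does not bypass the Python-side check in `SparkContext._ensure_initialized` (it was a JVM-side option, later removed entirely). The guard can be sketched in plain Python; `MiniContext` below is a hypothetical stand-in, not part of PySpark:

```python
class MiniContext:
    """Hypothetical sketch of PySpark's one-context-per-process guard;
    mirrors the check in SparkContext._ensure_initialized."""

    _active = None  # process-wide singleton slot

    def __init__(self, app_name):
        if MiniContext._active is not None:
            # Same condition that raises the ValueError in the traceback above.
            raise ValueError(
                "Cannot run multiple SparkContexts at once; "
                "existing context app=%s" % MiniContext._active.app_name)
        self.app_name = app_name
        MiniContext._active = self

    @classmethod
    def getOrCreate(cls, app_name):
        # Reuse the active context instead of failing, analogous to
        # the real SparkContext.getOrCreate(conf).
        return cls._active if cls._active is not None else cls(app_name)

    def stop(self):
        # Clearing the slot permits a new context, like the real sc.stop().
        MiniContext._active = None


first = MiniContext("sankar")
try:
    MiniContext("second")               # raises ValueError, as in the traceback
except ValueError as e:
    print(e)
same = MiniContext.getOrCreate("second")  # returns the existing context instead
first.stop()
fresh = MiniContext("second")             # succeeds once the old one is stopped
```

      In real PySpark the equivalent remedies are `SparkContext.getOrCreate(conf)` to reuse the running context, or calling `sc.stop()` on the existing context before constructing a new one.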
      


          People

            Assignee: Unassigned
            Reporter: Sankar Mittapally (sankar.mittapally@creditvidya.com)
            Votes: 0
            Watchers: 1
