SPARK-38808

Windows, Spark 3.2.1: spark-shell fails with "SparkContext: Error initializing SparkContext"


Details

    • Type: Bug
    • Status: Resolved
    • Priority: Major
    • Resolution: Duplicate
    • Affects Version/s: 3.2.1
    • Fix Version/s: None
    • Component/s: Spark Shell
    • Labels: None

    Description

      Can't start spark-shell on Windows with Spark 3.2.1. Downgrading to Spark 3.1.3 fixed the problem, but I need the Pandas API on Spark, which is only available in 3.2 and later, so downgrading is not a solution for me.

      The bug and a workaround are also described in the Stack Overflow question linked below:

      https://stackoverflow.com/questions/69923603/spark-shell-command-throwing-this-error-sparkcontext-error-initializing-sparkc
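
      For context on why the downgrade is not viable: the Pandas API on Spark ships as the pyspark.pandas module starting with Spark 3.2.0. A minimal sketch of the kind of code that depends on it (illustrative data only, not taken from this report):

          # Requires pyspark >= 3.2.0; pyspark.pandas does not exist in 3.1.x,
          # which is why downgrading to 3.1.3 is not an option here.
          import pyspark.pandas as ps

          # Build a small pandas-on-Spark DataFrame and run a pandas-style operation.
          psdf = ps.DataFrame({"a": [1, 2, 3], "b": [4, 5, 6]})
          print(psdf.describe())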


People

    Assignee: Unassigned
    Reporter: Dudykevych Taras
    Votes: 0
    Watchers: 1

Dates

    Created:
    Updated:
    Resolved: