Spark / SPARK-38808

Windows, Spark 3.2.1: spark-shell command throwing this error: SparkContext: Error initializing SparkContext


Details

    • Type: Bug
    • Status: Resolved
    • Priority: Major
    • Resolution: Duplicate
    • Affects Version/s: 3.2.1
    • Fix Version/s: None
    • Component/s: Spark Shell
    • Labels: None

    Description

      Can't start spark-shell on Windows with Spark 3.2.1. Downgrading to Spark 3.1.3 fixes the problem, but I need the Pandas API on Spark, which is only available in 3.2 and later, so downgrading is not a solution for me.

      The bug and a workaround are also described at the link below:

      https://stackoverflow.com/questions/69923603/spark-shell-command-throwing-this-error-sparkcontext-error-initializing-sparkc
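
      Not part of the original report, but as a minimal sketch of why 3.2+ is required: the Pandas API on Spark lives in the pyspark.pandas module, which only ships with PySpark 3.2 and later (the data below is illustrative only).

      # Minimal sketch: pyspark.pandas is only available in PySpark >= 3.2,
      # which is why downgrading to 3.1.3 is not an option here.
      import pyspark.pandas as ps

      # Build a small pandas-on-Spark DataFrame and run a pandas-style operation.
      psdf = ps.DataFrame({"a": [1, 2, 3], "b": [4.0, 5.0, 6.0]})
      print(psdf.describe())

      # Convert to a plain Spark DataFrame when the Spark SQL API is needed.
      sdf = psdf.to_spark()
      sdf.show()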


          People

            Assignee: Unassigned
            Reporter: Dudykevych Taras
            Votes: 0
            Watchers: 2

