[SPARK-25651] spark-shell gets wrong version of Spark on Windows


Details

    • Type: Bug
    • Status: Resolved
    • Priority: Major
    • Resolution: Not A Problem
    • Affects Version/s: 2.3.2
    • Fix Version/s: None
    • Component/s: Spark Shell
    • Labels: None
    • Environment: Windows 10, running Spark 2.3.2

    Description

      I have multiple versions of Spark on my computer, and in particular SPARK_HOME is set to a Spark 2.0.2 installation.

      If I browse to the bin directory of my Spark 2.3.2 installation and run spark-shell, it incorrectly uses my Spark 2.0.2 installation as SPARK_HOME. In previous releases (verified in Spark 2.0.2 and Spark 2.2.0), spark-shell2.cmd set SPARK_HOME itself as follows:

      `set SPARK_HOME=%~dp0..`

      However, this line is not present in Spark 2.3.2; spark-shell2.cmd instead calls find-spark-home.cmd, which appears to incorrectly assume that an existing SPARK_HOME environment variable should take precedence over the installation the script was launched from.
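
      To illustrate the behaviour being described, here is a minimal batch sketch of the resolution order (not the verbatim find-spark-home.cmd, only an approximation of the logic reported above):

      ```bat
      @echo off
      rem Sketch of the SPARK_HOME resolution described above (illustration only).
      rem If SPARK_HOME is already set in the environment, it wins, even when this
      rem script lives in the bin directory of a different installation.
      if defined SPARK_HOME goto :eof

      rem Otherwise fall back to the pre-2.3 behaviour: the parent of the
      rem directory containing this script (%~dp0 expands to that directory).
      set SPARK_HOME=%~dp0..
      ```

      Under the pre-2.3 behaviour, `set SPARK_HOME=%~dp0..` ran unconditionally, so the installation you launched from always won; with the check above, a stale SPARK_HOME pointing at 2.0.2 wins instead. Consistent with the "Not A Problem" resolution, clearing the variable first (`set SPARK_HOME=` and then running spark-shell) lets the 2.3.2 scripts resolve their own installation directory.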

People

    • Assignee: Unassigned
    • Reporter: Nick Sutcliffe (nicksutcliffe)
    • Votes: 0
    • Watchers: 1
