Details
- Type: Bug
- Status: Resolved
- Priority: Major
- Resolution: Not A Problem
- Affects Version/s: 2.3.2
- Fix Version/s: None
- Component/s: None
- Environment: Windows 10, running Spark 2.3.2
Description
I have multiple versions of Spark on my computer, and in particular SPARK_HOME is set to a Spark 2.0.2 installation.
If I browse to the bin directory of my Spark 2.3.2 installation and run spark-shell, it incorrectly uses my Spark 2.0.2 installation as SPARK_HOME. Previously, spark-shell2.cmd set SPARK_HOME explicitly, as follows (verified in Spark 2.0.2 and Spark 2.2.0):
`set SPARK_HOME=%~dp0..`
However, this line is not present in Spark 2.3.2; spark-shell2.cmd instead calls find-spark-home.cmd, which appears to incorrectly assume that an existing SPARK_HOME environment variable should take precedence over the script's own location.
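For reference, the SPARK_HOME resolution in find-spark-home.cmd behaves roughly as sketched below (a paraphrase of the 2.3.x logic, not the verbatim script): the directory-based default is only applied when SPARK_HOME is unset, so a pre-existing value pointing at another installation wins.

```
rem Sketch of the SPARK_HOME resolution in find-spark-home.cmd (paraphrased,
rem not the verbatim 2.3.x script). %~dp0 expands to this script's directory.
if "x%SPARK_HOME%"=="x" (
  rem SPARK_HOME is unset: default to the parent of the bin directory.
  set SPARK_HOME=%~dp0..
)
rem If SPARK_HOME was already set in the environment, it is left untouched,
rem even when this script lives in a different installation's bin directory.
```

Under this logic, running bin\spark-shell from the 2.3.2 installation with SPARK_HOME pointing at 2.0.2 launches the 2.0.2 distribution.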