Details
Type: Test
Status: Resolved
Priority: Major
Resolution: Invalid
Affects Version/s: 3.1.0
Fix Version/s: None
Component/s: None
Description
.\bin\spark-submit2.cmd --driver-java-options "-Dlog4j.configuration=file:///%CD:\=/%/R/log4j.properties" --conf spark.hadoop.fs.defaultFS="file:///" R\pkg\tests\run-all.R
Presence of build for multiple Scala versions detected ( and ).
Remove one of them or, set SPARK_SCALA_VERSION= in \spark-env.cmd.
Visit for more details about setting environment variables in spark-env.cmd.
Either clean one of them or, set SPARK_SCALA_VERSION in spark-env.cmd.
Command exited with code 1
The load-spark-env script fails in AppVeyor as shown above. SPARK_SCALA_VERSION was temporarily set explicitly as a workaround; we should remove that and have this script detect the Scala version automatically.
Possibly related to SPARK-26132, SPARK-32227, and SPARK-32434.
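For reference, the automatic detection that the script should perform can be sketched in POSIX shell along the lines of what load-spark-env.sh does on Unix; the Windows load-spark-env.cmd would need equivalent logic. This is a minimal sketch, not the actual script: the function name, the directory layout argument, and the specific Scala versions (2.12/2.13) are illustrative assumptions.

```shell
#!/bin/sh
# Hypothetical sketch of SPARK_SCALA_VERSION auto-detection, modeled on the
# behavior described in the error above: if assembly builds for two Scala
# versions are both present, fail and ask the user to choose; otherwise
# pick the version whose build directory exists.
detect_scala_version() {
  assembly_root="$1"   # illustrative, e.g. "$SPARK_HOME/assembly/target"
  dir_212="$assembly_root/scala-2.12"
  dir_213="$assembly_root/scala-2.13"
  if [ -d "$dir_212" ] && [ -d "$dir_213" ]; then
    # Mirrors "Presence of build for multiple Scala versions detected."
    echo "Presence of build for multiple Scala versions detected." >&2
    return 1
  elif [ -d "$dir_213" ]; then
    echo "2.13"
  else
    echo "2.12"
  fi
}

# Demo with a temporary layout containing a single Scala build.
tmp=$(mktemp -d)
mkdir -p "$tmp/scala-2.12"
detect_scala_version "$tmp"   # prints 2.12
rm -rf "$tmp"
```

The key design point is that the script should only error out when the choice is genuinely ambiguous (two builds present and SPARK_SCALA_VERSION unset), rather than requiring the variable to be set explicitly in every environment.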