SPARK-32453: Remove SPARK_SCALA_VERSION environment variable and let load-spark-env scripts detect it in AppVeyor


Details

    • Type: Test
    • Status: Resolved
    • Priority: Major
    • Resolution: Invalid
    • Affects Version/s: 3.1.0
    • Fix Version/s: None
    • Component/s: Tests
    • Labels: None
    Description

      .\bin\spark-submit2.cmd --driver-java-options "-Dlog4j.configuration=file:///%CD:\=/%/R/log4j.properties" --conf spark.hadoop.fs.defaultFS="file:///" R\pkg\tests\run-all.R
      "Presence of build for multiple Scala versions detected ( and )."
      "Remove one of them or, set SPARK_SCALA_VERSION= in \spark-env.cmd."
      "Visit  for more details about setting environment variables in spark-env.cmd."
      "Either clean one of them or, set SPARK_SCALA_VERSION in spark-env.cmd."
      Command exited with code 1
      

      The load-spark-env script fails as above in AppVeyor. SPARK_SCALA_VERSION was temporarily set explicitly as a workaround, but we should remove that and let the script detect the Scala version automatically, as sketched below.

      Possibly related to SPARK-26132, SPARK-32227, and SPARK-32434.
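
      For context, here is a minimal sketch of what automatic detection in load-spark-env.cmd could look like, assuming builds land under assembly\target\scala-<version>; the 2.12/2.13 version numbers and the exact messages are illustrative assumptions, not the actual patch:

      rem Hypothetical sketch: derive SPARK_SCALA_VERSION from the build layout
      rem when the user has not set it, and fail only when the build is ambiguous.
      if not defined SPARK_SCALA_VERSION (
        if exist "%SPARK_HOME%\assembly\target\scala-2.12\" if exist "%SPARK_HOME%\assembly\target\scala-2.13\" (
          echo Presence of build for multiple Scala versions detected.
          echo Remove one of them, or set SPARK_SCALA_VERSION in spark-env.cmd.
          exit /b 1
        )
        if exist "%SPARK_HOME%\assembly\target\scala-2.13\" (
          set SPARK_SCALA_VERSION=2.13
        ) else (
          set SPARK_SCALA_VERSION=2.12
        )
      )

      Notably, the detection has to expand the directory variables correctly on Windows; the empty "( and )" in the log above suggests the current .cmd script loses the values it is trying to print.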


People

    Assignee: Unassigned
    Reporter: Hyukjin Kwon (gurwls223)
