Spark / SPARK-25075 Build and test Spark against Scala 2.13 / SPARK-33048

Fix SparkBuild.scala to recognize build settings for Scala 2.13


Details

    • Type: Sub-task
    • Status: Resolved
    • Priority: Minor
    • Resolution: Fixed
    • Affects Version/s: 3.0.1, 3.1.0
    • Fix Version/s: 3.1.0
    • Component/s: Build
    • Labels: None

    Description

      In SparkBuild.scala, the variable 'scalaBinaryVersion' is hardcoded to '2.12', so the
      environment variable 'SPARK_SCALA_VERSION' is also set to '2.12' even when building
      against Scala 2.13. This causes some test suites (e.g. SparkSubmitSuite) to fail.
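      The gist of the problem can be sketched as follows. This is a hypothetical illustration, not the actual SparkBuild.scala code: it contrasts a hardcoded binary version with one derived from the full Scala version string (the object and method names here are made up for the example).

      ```scala
      // Hypothetical sketch: hardcoding the Scala binary version vs. deriving it.
      object ScalaBinaryVersionSketch {
        // Hardcoded, as described in the issue: always "2.12",
        // regardless of which Scala version the build actually uses.
        val hardcoded: String = "2.12"

        // Derived: take the "major.minor" prefix of the full version.
        def binaryVersion(fullVersion: String): String =
          fullVersion.split('.').take(2).mkString(".")

        def main(args: Array[String]): Unit = {
          println(binaryVersion("2.13.3"))  // 2.13
          println(binaryVersion("2.12.10")) // 2.12
        }
      }
      ```

      With the derived form, a Scala 2.13 build would propagate '2.13' into 'SPARK_SCALA_VERSION' instead of the stale hardcoded value.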

      ===== TEST OUTPUT FOR o.a.s.deploy.SparkSubmitSuite: 'user classpath first in driver' =====
      
      20/10/02 08:55:30.234 redirect stderr for command /home/kou/work/oss/spark-scala-2.13/bin/spark-submit INFO Utils: Error: Could not find or load main class org.apache.spark.launcher.Main
      20/10/02 08:55:30.235 redirect stderr for command /home/kou/work/oss/spark-scala-2.13/bin/spark-submit INFO Utils: /home/kou/work/oss/spark-scala-2.13/bin/spark-class: line 96: CMD: bad array subscript
      

      This error occurs because the environment variables 'SPARK_JARS_DIR' and 'LAUNCH_CLASSPATH' are defined in bin/spark-class as follows:

      SPARK_JARS_DIR="${SPARK_HOME}/assembly/target/scala-$SPARK_SCALA_VERSION/jars"
      LAUNCH_CLASSPATH="${SPARK_HOME}/launcher/target/scala-$SPARK_SCALA_VERSION/classes:$LAUNCH_CLASSPATH"
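      A minimal sketch of how the mismatch plays out (SPARK_HOME here is a placeholder path; only the SPARK_JARS_DIR line mirrors bin/spark-class):

      ```shell
      # Assume a Scala 2.13 build, but SPARK_SCALA_VERSION still carries the
      # hardcoded value propagated from SparkBuild.scala.
      SPARK_HOME=/opt/spark
      SPARK_SCALA_VERSION=2.12

      # Same derivation as in bin/spark-class:
      SPARK_JARS_DIR="${SPARK_HOME}/assembly/target/scala-$SPARK_SCALA_VERSION/jars"
      echo "$SPARK_JARS_DIR"
      # -> /opt/spark/assembly/target/scala-2.12/jars
      # A Scala 2.13 build puts its jars under .../scala-2.13/jars instead, so
      # this directory does not exist, org.apache.spark.launcher.Main cannot be
      # found, and the launcher's CMD array ends up empty ("bad array subscript").
      ```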
      

      Attachments

        Activity

          People

            Assignee: Kousuke Saruta (sarutak)
            Reporter: Kousuke Saruta (sarutak)
            Votes: 0
            Watchers: 2

            Dates

              Created:
              Updated:
              Resolved: