SPARK-22962: Kubernetes app fails if local files are used

Details

    • Type: Bug
    • Status: Resolved
    • Priority: Major
    • Resolution: Fixed
    • Affects Version/s: 2.3.0
    • Fix Version/s: 2.3.0
    • Component/s: Kubernetes, Spark Core
    • Labels: None

    Description

      If you try to start a Spark app on Kubernetes using a local file as the app resource, the app will fail. For example:

      ./bin/spark-submit [[bunch of arguments]] /path/to/local/file.jar
      
      + /sbin/tini -s -- /bin/sh -c 'SPARK_CLASSPATH="${SPARK_HOME}/jars/*" &&     env | grep SPARK_JAVA_OPT_ | sed '\''s/[^=]*=\(.*\)/\1/g'\'' > /tmp/java_opts.txt &&     readarray -t SPARK_DRIVER_JAVA_OPTS < /tmp/java_opts.txt &&     if ! [ -z ${SPARK_MOUNTED_CLASSPATH+x} ]; then SPARK_CLASSPATH="$SPARK_MOUNTED_CLASSPATH:$SPARK_CLASSPATH"; fi &&     if ! [ -z ${SPARK_SUBMIT_EXTRA_CLASSPATH+x} ]; then SPARK_CLASSPATH="$SPARK_SUBMIT_EXTRA_CLASSPATH:$SPARK_CLASSPATH"; fi &&     if ! [ -z ${SPARK_MOUNTED_FILES_DIR+x} ]; then cp -R "$SPARK_MOUNTED_FILES_DIR/." .; fi &&     ${JAVA_HOME}/bin/java "${SPARK_DRIVER_JAVA_OPTS[@]}" -cp "$SPARK_CLASSPATH" -Xms$SPARK_DRIVER_MEMORY -Xmx$SPARK_DRIVER_MEMORY -Dspark.driver.bindAddress=$SPARK_DRIVER_BIND_ADDRESS $SPARK_DRIVER_CLASS $SPARK_DRIVER_ARGS'
      Error: Could not find or load main class com.cloudera.spark.tests.Sleeper
      

      Using an HTTP server to provide the app jar instead works around the problem.
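
      For reference, a minimal sketch of that workaround (the host names, port, class name, and container image below are placeholders, not taken from this report): serve the directory containing the jar over HTTP, then pass the http:// URL as the app resource.

      # Any HTTP server reachable from the cluster will do; here, serve the jar's directory on port 8000.
      python3 -m http.server 8000 &
      # Submit with the http:// URL instead of the local path.
      ./bin/spark-submit \
        --master k8s://https://<api-server-host>:<port> \
        --deploy-mode cluster \
        --class com.example.Main \
        --conf spark.kubernetes.container.image=<registry>/spark:2.3.0 \
        http://<host-serving-the-jar>:8000/file.jar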

      The k8s backend should either somehow make these files available to the cluster or error out with a more user-friendly message if that feature is not yet available.
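
      In the meantime, another way for a user to make the file available to the cluster by hand is to build the jar into the Spark container image and reference it with the local:// scheme, which resolves to a path inside the container rather than on the submitting machine. A sketch, assuming the jar was copied to /opt/app/file.jar when the image was built (image name, path, and class are placeholders):

      # The jar is already present inside the image, so nothing needs to be uploaded at submit time.
      ./bin/spark-submit \
        --master k8s://https://<api-server-host>:<port> \
        --deploy-mode cluster \
        --class com.example.Main \
        --conf spark.kubernetes.container.image=<registry>/spark-app:2.3.0 \
        local:///opt/app/file.jar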

          People

            Assignee: Yinan Li (liyinan926)
            Reporter: Marcelo Masiero Vanzin (vanzin)
            Votes: 0
            Watchers: 4
