Description
When an application is submitted to a Mesos cluster in cluster mode via the REST Submission API, the --py-files option is appended to the driver command in a hardcoded manner. This causes even a simple Java-based SparkPi job to fail.
This bug was introduced by SPARK-26466.
Here is the example job submission:
curl -X POST http://localhost:7077/v1/submissions/create \
  --header "Content-Type:application/json" \
  --data '{
    "action": "CreateSubmissionRequest",
    "appResource": "file:///opt/spark-3.0.0-bin-3.2.0/examples/jars/spark-examples_2.12-3.0.0.jar",
    "clientSparkVersion": "3.0.0",
    "appArgs": ["30"],
    "environmentVariables": {},
    "mainClass": "org.apache.spark.examples.SparkPi",
    "sparkProperties": {
      "spark.jars": "file:///opt/spark-3.0.0-bin-3.2.0/examples/jars/spark-examples_2.12-3.0.0.jar",
      "spark.driver.supervise": "false",
      "spark.executor.memory": "512m",
      "spark.driver.memory": "512m",
      "spark.submit.deployMode": "cluster",
      "spark.app.name": "SparkPi",
      "spark.master": "mesos://localhost:5050"
    }
  }'
The driver log then shows the failure:

20/08/20 20:19:57 WARN DependencyUtils: Local jar /var/lib/mesos/slaves/e6779377-08ec-4765-9bfc-d27082fbcfa1-S0/frameworks/e6779377-08ec-4765-9bfc-d27082fbcfa1-0000/executors/driver-20200820201954-0002/runs/d9d734e8-a299-4d87-8f33-b134c65c422b/spark.driver.memory=512m does not exist, skipping.
Error: Failed to load class org.apache.spark.examples.SparkPi.
20/08/20 20:19:57 INFO ShutdownHookManager: Shutdown hook called

Note that the configuration value spark.driver.memory=512m is being treated as the application jar path: the unconditionally appended --py-files option consumes the token that follows it, shifting the remaining driver command arguments.
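The argument-shift failure mode above can be illustrated with a minimal, hypothetical sketch of launcher-style option parsing (this is not Spark's actual parsing code; `PyFilesShiftDemo` and `primaryResource` are made-up names for illustration): when `--py-files` is emitted with no value, it swallows the next token, so a later argument is mistaken for the primary resource.

```scala
object PyFilesShiftDemo {
  // Hypothetical sketch, not Spark's real parser: options that expect a
  // value consume the token after them. If "--py-files" is appended
  // without a value, the token it swallows was meant to be something
  // else, and every later positional argument shifts by one.
  def primaryResource(args: List[String]): Option[String] = args match {
    case "--py-files" :: _ :: rest => primaryResource(rest) // value swallowed
    case resource :: _             => Some(resource)        // first positional arg
    case Nil                       => None
  }

  def main(unused: Array[String]): Unit = {
    val good = List("spark-examples_2.12-3.0.0.jar", "30")
    val bad  = List("--py-files", "spark-examples_2.12-3.0.0.jar", "30")
    println(primaryResource(good)) // the jar is found as expected
    println(primaryResource(bad))  // the app argument "30" is mistaken for the resource
  }
}
```

This mirrors the log above, where a shifted token (here the `spark.driver.memory=512m` property) ends up in the jar position and the real main class can no longer be loaded.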