Description
This is very similar to how SPARK_HOME caused problems for Spark on Mesos in SPARK-12345.
The `spark-submit` command sets `spark.mesos.driverEnv.SPARK_CONF_DIR` to whatever `SPARK_CONF_DIR` was in the environment of the command that submitted the job.
This doesn't make sense for most Mesos deployments, and it broke Spark for my team when we upgraded from 2.2.0 to 2.3.2. I haven't tested it, but I expect 2.4.0 has the same issue.
It prevents spark-env.sh from running on the driver, because SPARK_CONF_DIR now points to a non-existent directory instead of the conf directory of the unpacked Spark distribution in the Mesos sandbox, as it should.
I'm not that familiar with the Spark code base, but I think this could be fixed by simply adding a `&& k != "SPARK_CONF_DIR"` clause to this filter statement: https://github.com/apache/spark/blob/master/core/src/main/scala/org/apache/spark/deploy/rest/RestSubmissionClient.scala#L421
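For illustration, here is a sketch of what that change might look like. The method body below is paraphrased from RestSubmissionClient.scala rather than copied from master, so the surrounding details may differ; the only proposed change is the extra `k != "SPARK_CONF_DIR"` condition.

```scala
// Paraphrased sketch of RestSubmissionClient.filterSystemEnvironment
// (not copied verbatim from master). SPARK_HOME is already excluded
// here since SPARK-12345; the proposal is to exclude SPARK_CONF_DIR
// the same way, so the driver falls back to the conf directory of the
// Spark distribution unpacked in its Mesos sandbox.
private[rest] def filterSystemEnvironment(env: Map[String, String]): Map[String, String] = {
  env.filterKeys { k =>
    (k.startsWith("SPARK_") &&
      k != "SPARK_ENV_LOADED" &&
      k != "SPARK_HOME" &&
      k != "SPARK_CONF_DIR") ||   // proposed new exclusion
      k.startsWith("MESOS_")
  }
}
```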
Issue Links
- is caused by SPARK-22466: SPARK_CONF_DIR is not set by Spark's launch scripts with default value (Resolved)
- relates to SPARK-12345: Mesos cluster mode is broken (Resolved)