Spark / SPARK-25934

Mesos: SPARK_CONF_DIR should not be propagated by spark-submit


    Details

    • Type: Bug
    • Status: Resolved
    • Priority: Major
    • Resolution: Fixed
    • Affects Version/s: 2.3.2
    • Fix Version/s: 2.3.3, 2.4.1, 3.0.0
    • Component/s: Mesos
    • Labels:
      None

      Description

      This is very similar to how SPARK_HOME caused problems for Spark on Mesos in SPARK-12345.

      The `spark-submit` command sets `spark.mesos.driverEnv.SPARK_CONF_DIR` to whatever `SPARK_CONF_DIR` was in the environment of the command that submitted the job.

      This doesn't make sense for most Mesos deployments, and it broke Spark for my team when we upgraded from 2.2.0 to 2.3.2. I haven't tested it, but I think 2.4.0 will have the same issue.

      It prevents spark-env.sh from running, because `SPARK_CONF_DIR` now points to a non-existent directory instead of the unpacked Spark binary in the Mesos sandbox, as it should.

      I'm not that familiar with the Spark code base, but I think this could be fixed by simply adding a `&& k != "SPARK_CONF_DIR"` clause to this filter statement: https://github.com/apache/spark/blob/master/core/src/main/scala/org/apache/spark/deploy/rest/RestSubmissionClient.scala#L421
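      A minimal sketch of the proposed change, assuming the filter in `RestSubmissionClient` works roughly as described (the method name and the existing `SPARK_HOME` / `SPARK_ENV_LOADED` exclusions are assumptions based on the linked source, not a verbatim copy):

      ```scala
      // Sketch: filter the submitter's environment before forwarding it to the driver.
      // The proposed fix adds SPARK_CONF_DIR to the exclusion list so the submitter's
      // local config path is never propagated into the Mesos sandbox.
      def filterSystemEnvironment(env: Map[String, String]): Map[String, String] = {
        env.filter { case (k, _) =>
          (k.startsWith("SPARK_") &&
            k != "SPARK_ENV_LOADED" &&
            k != "SPARK_HOME" &&
            k != "SPARK_CONF_DIR") ||   // proposed additional clause
            k.startsWith("MESOS_")
        }
      }
      ```

      With this change, a `SPARK_CONF_DIR` set on the submitting machine is dropped, so the driver falls back to the conf directory of the Spark distribution unpacked in the Mesos sandbox.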

        Attachments

          Issue Links

            Activity

              People

              • Assignee:
                mmolek Matt Molek
              • Reporter:
                mmolek Matt Molek
              • Votes:
                0
              • Watchers:
                3

                Dates

                • Created:
                  Updated:
                  Resolved: