  Spark / SPARK-23857

In Mesos cluster mode, spark-submit requires the keytab to be available on the local file system.


    Details

    • Type: Bug
    • Status: Resolved
    • Priority: Minor
    • Resolution: Fixed
    • Affects Version/s: 2.3.0
    • Fix Version/s: 2.4.0
    • Component/s: Mesos
    • Labels: None

      Description

      Users may submit their jobs from a host external to the cluster, which may not have the required keytab available locally (also discussed here).

      Moreover, in cluster mode it does not make much sense to reference a local resource unless it is uploaded to, or stored somewhere in, the cluster. On YARN, HDFS is used for this; on Mesos, and certainly on DC/OS, the secret store is currently used for storing secrets, and consequently keytabs. There is a check here that makes spark-submit difficult to use in such deployment scenarios.

      On DC/OS the workaround is to submit directly to the Mesos dispatcher REST API, passing the spark.yarn.keytab property pointing to a path within the driver's container where the keytab will be mounted after it is fetched from the secret store at container launch time (a sketch of this is shown below). The goal is to make spark-submit flexible enough for Mesos in cluster mode, since DC/OS users often want to deploy this way.
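
      For illustration, here is a minimal sketch of that workaround, assuming a dispatcher reachable at spark-dispatcher.example.com:7077, a hypothetical application jar, and a hypothetical keytab secret mounted into the driver sandbox as user.keytab. The payload follows the Spark REST submission protocol (POST to /v1/submissions/create), but the exact property names, secret names, and mount path depend on the deployment and Spark version:

      import json
      import urllib.request

      # Hypothetical dispatcher endpoint; the Mesos dispatcher exposes the
      # standard Spark REST submission protocol on its port.
      DISPATCHER_URL = "http://spark-dispatcher.example.com:7077/v1/submissions/create"

      payload = {
          "action": "CreateSubmissionRequest",
          "appResource": "https://example.com/jars/my-job.jar",  # hypothetical jar location
          "mainClass": "com.example.MyJob",                      # hypothetical main class
          "appArgs": [],
          "clientSparkVersion": "2.3.0",
          "environmentVariables": {},
          "sparkProperties": {
              "spark.app.name": "my-job",
              "spark.submit.deployMode": "cluster",
              "spark.master": "mesos://spark-dispatcher.example.com:7077",
              # Mount the keytab secret into the driver sandbox (DC/OS secret store);
              # the secret name and file name here are placeholders.
              "spark.mesos.driver.secret.names": "my-keytab-secret",
              "spark.mesos.driver.secret.filenames": "user.keytab",
              # Point Spark at the container-local keytab path rather than a file
              # on the submitting host, which spark-submit's local-file check expects.
              "spark.yarn.keytab": "user.keytab",
              "spark.yarn.principal": "user@EXAMPLE.COM",
          },
      }

      request = urllib.request.Request(
          DISPATCHER_URL,
          data=json.dumps(payload).encode("utf-8"),
          headers={"Content-Type": "application/json; charset=UTF-8"},
      )
      with urllib.request.urlopen(request) as response:
          print(response.read().decode("utf-8"))

      Submitting this way bypasses spark-submit's check that the keytab exists on the local file system, which is the check this issue asks to relax for Mesos cluster mode.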


            People

            • Assignee: Stavros Kontopoulos (skonto)
            • Reporter: Stavros Kontopoulos (skonto)
            • Votes: 0
            • Watchers: 2
