Spark / SPARK-24960

k8s: explicitly expose ports on driver container


    Details

    • Type: Improvement
    • Status: Resolved
    • Priority: Minor
    • Resolution: Fixed
    • Affects Version/s: 2.2.0
    • Fix Version/s: 2.4.0
    • Component/s: Deploy, Kubernetes, Scheduler
    • Labels: None

      Description

      For the Kubernetes scheduler, the Driver Pod does not explicitly expose its ports. A Kubernetes environment can be set up so that Pod ports are closed by default and must be opened explicitly in the Pod spec. In such an environment, without this improvement, the Driver Service is unable to route requests (e.g. from the Executors) to the corresponding Driver Pod, which shows up on the Executor side as:

      Caused by: java.io.IOException: Failed to connect to org-apache-spark-examples-sparkpi-1519271450264-driver-svc.dev.svc.cluster.local:7078

       

      For posterity, this is a copy of the original issue filed in the now deprecated apache-spark-on-k8s repository.
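
      For illustration, the improvement amounts to declaring the driver's ports in the driver container spec so that the Driver Service has explicitly exposed ports to route to. Below is a minimal sketch using the fabric8 kubernetes-client (the client library used by Spark's Kubernetes backend); the port names, port numbers, and the container name/image are illustrative placeholders, not the exact values Spark uses.

      import io.fabric8.kubernetes.api.model.{Container, ContainerBuilder, ContainerPort, ContainerPortBuilder}

      object DriverPortsSketch {
        // Illustrative values; in Spark these would come from configuration such as
        // spark.driver.port and spark.blockManager.port.
        val DriverRpcPort = 7078
        val BlockManagerPort = 7079

        // Build an explicit containerPort entry for the driver Pod spec.
        def containerPort(name: String, port: Int): ContainerPort =
          new ContainerPortBuilder()
            .withName(name)
            .withContainerPort(port)
            .withProtocol("TCP")
            .build()

        // Attach the ports to the driver container so they are explicitly exposed.
        val driverContainer: Container = new ContainerBuilder()
          .withName("spark-kubernetes-driver")   // placeholder container name
          .withImage("spark:2.4.0")              // placeholder image
          .addToPorts(
            containerPort("driver-rpc-port", DriverRpcPort),
            containerPort("blockmanager", BlockManagerPort))
          .build()
      }

      In plain Pod-spec terms this corresponds to adding entries under spec.containers[].ports for the driver's RPC and block manager ports.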



            People

            • Assignee: Unassigned
            • Reporter: Adelbert Chang (adelbert.chang)
            • Votes: 0
            • Watchers: 2
