Spark / SPARK-25742

Is there a way to pass Azure Blob Storage credentials to the Spark on Kubernetes init-container?


Details

    • Type: Question
    • Status: Resolved
    • Priority: Minor
    • Resolution: Invalid
    • Affects Version/s: 2.3.2
    • Fix Version/s: None
    • Component/s: Kubernetes, Spark Core
    • Labels: None

    Description

      I'm trying to run Spark on a Kubernetes cluster in Azure. The idea is to store the Spark application jars and their dependencies in a container in Azure Blob Storage.

      I've tried this with a public container and it works fine, but with a private Blob Storage container the spark-init init-container doesn't download the jars.

      The equivalent for AWS S3 is as simple as adding the access key ID and secret key as environment variables, but I don't see how to do the same for Azure Blob Storage.
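
      For illustration, this is a sketch of what I'd expect to work, since the spark.hadoop.* prefix is Spark's standard way of forwarding arbitrary keys into the Hadoop Configuration, and the hadoop-azure (WASB) connector reads the account key from fs.azure.account.key.<account>.blob.core.windows.net. The account name myaccount, the container jars, and the image name are placeholders, and I haven't confirmed that the spark-init init-container actually honors this property when fetching wasb:// URLs:

        # Sketch: pass the storage account key through Spark's Hadoop
        # configuration, assuming the hadoop-azure (WASB) connector is on
        # the image's classpath. "myaccount", "jars", and the image name
        # are placeholders, not values from this issue.
        spark-submit \
          --master k8s://https://<k8s-apiserver-host>:443 \
          --deploy-mode cluster \
          --class com.example.Main \
          --conf spark.kubernetes.container.image=<spark-image> \
          --conf spark.hadoop.fs.azure.account.key.myaccount.blob.core.windows.net=<storage-account-key> \
          wasb://jars@myaccount.blob.core.windows.net/app.jar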


People

    Assignee: Unassigned
    Reporter: Oscar Bonilla
    Votes: 1
    Watchers: 2
