Details
Type: Question
Status: Resolved
Priority: Minor
Resolution: Invalid
Affects Version/s: 2.3.2
Fix Version/s: None
Component/s: None
Description
I'm trying to run Spark on a Kubernetes cluster in Azure. The idea is to store the Spark application jars and their dependencies in a container in Azure Blob Storage.
I've tried this with a public container and it works fine, but with a private Blob Storage container the spark-init init container fails to download the jars.
The equivalent in AWS S3 is as simple as adding the key_id and secret as environment variables, but I don't see how to do this for Azure Blob Storage.
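One possible approach is to pass the storage account key through as a Hadoop configuration property via the `spark.hadoop.` prefix, which the hadoop-azure (WASB) connector reads. This is a sketch, not a verified fix for the spark-init downloader; the storage account name `mystorageaccount`, container name `jars`, and the key value are placeholders:

```shell
# Sketch: supply the Azure Blob Storage account key as a Hadoop property.
# The hadoop-azure connector looks up the key under
# fs.azure.account.key.<account>.blob.core.windows.net.
spark-submit \
  --master k8s://https://<k8s-apiserver-host>:443 \
  --deploy-mode cluster \
  --conf spark.hadoop.fs.azure.account.key.mystorageaccount.blob.core.windows.net=<ACCOUNT_KEY> \
  --conf spark.kubernetes.container.image=<spark-image> \
  wasbs://jars@mystorageaccount.blob.core.windows.net/my-app.jar
```

Whether the init container honors these properties when fetching the application jar (as opposed to the driver/executors reading data at runtime) is exactly the open question here.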