Details
- Type: Improvement
- Status: Resolved
- Priority: Major
- Resolution: Fixed
- Affects Version/s: 2.4.0
Description
Currently, local dependencies are not supported with Spark on K8S, i.e. if the user has code or dependencies only on the client machine where they run spark-submit, then the current implementation has no way to make those visible to the Spark application running inside the K8S pods that get launched. This limits users to running only applications whose code and dependencies are either baked into the Docker images used, or available via some external, globally accessible file system such as HDFS, neither of which is a viable option for many users and environments.
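As an illustrative sketch (the API server address, image name, and file paths below are placeholders, not taken from the issue), a submission of the following shape is what this describes: deps.jar and helpers.py exist only on the submitting client, and the K8S backend has no mechanism to ship them to the driver and executor pods, so only resources already inside the container image (local:// URIs) or on a remote, globally accessible store can be used.

  # Hypothetical example: the client-local --jars and --py-files entries below
  # are exactly the kind of dependency the K8S backend cannot currently distribute.
  spark-submit \
    --master k8s://https://example.com:6443 \
    --deploy-mode cluster \
    --conf spark.kubernetes.container.image=example/spark:2.4.0 \
    --jars /path/on/client/deps.jar \
    --py-files /path/on/client/helpers.py \
    local:///opt/spark/examples/jars/spark-examples_2.11-2.4.0.jar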
Attachments
Issue Links
- Blocked: SPARK-27936 Support local dependency uploading from --py-files (Resolved)
- is duplicated by: SPARK-26789 [k8s] pyspark needs to upload local resources to driver and executor pods (Resolved)
- is related to: SPARK-31726 Make spark.files available in driver with cluster deploy mode on kubernetes (Open)
- relates to: SPARK-35084 [k8s] On Spark 3, jars listed in spark.jars and spark.jars.packages are not added to sparkContext (Resolved)
- links to