Details
- Type: Bug
- Status: Open
- Priority: Major
- Resolution: Unresolved
- Affects Version/s: 3.0.0
- Fix Version/s: None
- Component/s: None
- Environment: Spark worker running inside a Kubernetes pod with a Bitnami Spark image, and the driver running inside a Jupyter Spark Kubernetes pod.
Description
When Spark is run in local mode, everything works as expected. However, when Spark is run in client mode, the jars listed in `spark.jars.packages` are copied to the executor ($SPARK_HOME/work/<app id>/<executor id>) but are never added to the classpath.
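As a debugging aid, the jars the running context has registered for distribution can be listed from the driver. This is a sketch only: `listJars()` is public on the Scala SparkContext, but reaching it through `_jsc` is an internal PySpark path, not an official API.
```
# Print the jars SparkContext has registered for shipping to executors.
# listJars() is public on the Scala SparkContext; reaching it through
# the internal _jsc handle makes this a debugging aid only.
print(spark.sparkContext._jsc.sc().listJars())
```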
It might be worth noting that `spark.jars` does add the jars to the classpath, but unlike `spark.jars.packages` it doesn't automatically download the jars' transitive compile dependencies.
```
from pyspark.sql import SparkSession

# DEPENDENCY_PACKAGES is a comma-separated list of Maven coordinates,
# e.g. "groupId:artifactId:version".
spark = SparkSession.builder \
    .master(SPARK_MASTER) \
    .appName(APP_NAME) \
    ...
    .config("spark.jars.packages", DEPENDENCY_PACKAGES) \
    ...
    .getOrCreate()
```
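For comparison, here is a hedged sketch of the `spark.jars` route described above. The jar paths are placeholders, and because `spark.jars` performs no dependency resolution, every transitive dependency jar has to be listed by hand:
```
# Sketch of the spark.jars workaround; the paths below are placeholders.
# Unlike spark.jars.packages, spark.jars does not resolve transitive
# dependencies, so each dependency jar must be listed explicitly.
spark = SparkSession.builder \
    .master(SPARK_MASTER) \
    .appName(APP_NAME) \
    .config("spark.jars", "/path/to/library.jar,/path/to/dependency.jar") \
    .getOrCreate()
```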
Issue Links
- relates to SPARK-35084: [k8s] On Spark 3, jars listed in spark.jars and spark.jars.packages are not added to sparkContext (Resolved)