Details
- Type: Bug
- Status: Closed
- Priority: Major
- Resolution: Duplicate
- Affects Version/s: 2.4.0
- Fix Version/s: None
- Labels: None
Description
The kubernetes-model jar, which is packaged when the k8s profile is enabled, has a log4j configuration embedded in it:
$ jar tf /path/to/kubernetes-model-2.0.0.jar | grep log4j
log4j.properties
As a result, Spark always uses that log4j configuration instead of its own default (log4j-defaults.properties), unless the user overrides it by placing their own configuration on the classpath ahead of the kubernetes jar.
You can see this by running spark-shell. With the k8s jar on the classpath:
$ ./bin/spark-shell
...
Setting default log level to "WARN".
Removing the k8s jar:
$ ./bin/spark-shell
...
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
Setting default log level to "WARN".
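To check exactly which copies the driver sees, one can list every log4j.properties visible to the classloader from inside spark-shell. This is just a diagnostic sketch using the standard JDK getResources call; Log4j 1.x loads the first match the classloader returns:

$ ./bin/spark-shell
scala> // list every log4j.properties on the classpath, in lookup order
scala> val urls = getClass.getClassLoader.getResources("log4j.properties")
scala> while (urls.hasMoreElements) println(urls.nextElement)

If the only hit comes from the kubernetes-model jar, that is the file Log4j picks up; Spark only falls back to org/apache/spark/log4j-defaults.properties when no log4j.properties is found at all.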
The proper fix would be for the k8s jar not to ship that file, and then to upgrade the dependency in Spark, but if there's something easy we can do in the meantime...
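As one possible stopgap (a user-side workaround, not a fix in Spark itself), placing a log4j.properties in the conf directory should win, since SPARK_CONF_DIR goes on the classpath ahead of the jars, and Spark already ships a template for it:

$ cp conf/log4j.properties.template conf/log4j.properties

This doesn't address the root cause, but it restores control over logging until the dependency is upgraded.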
Issue Links
- duplicates SPARK-26742: Bump Kubernetes Client Version to 4.1.2 (Resolved)