That jar file, which is packaged when the k8s profile is enabled, has a log4j configuration embedded in it:
As a result, Spark always picks up that log4j configuration instead of its own default (log4j-defaults.properties), unless the user overrides it by placing their own configuration on the classpath ahead of the kubernetes one.
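As a workaround, a user can place their own log4j.properties where it is loaded before the jars (for a Spark distribution, typically by dropping one into conf/, which the launch scripts put ahead of the jars on the classpath). A minimal sketch, mirroring what Spark's bundled log4j-defaults.properties sets up:

```properties
# Console logging at INFO, matching Spark's usual default behavior
log4j.rootCategory=INFO, console
log4j.appender.console=org.apache.log4j.ConsoleAppender
log4j.appender.console.target=System.err
log4j.appender.console.layout=org.apache.log4j.PatternLayout
log4j.appender.console.layout.ConversionPattern=%d{yy/MM/dd HH:mm:ss} %p %c{1}: %m%n
```

Running with `-Dlog4j.debug=true` makes log4j print which configuration file it actually loaded, which is handy for confirming whose file won.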
You can see this by running spark-shell. With the k8s jar on the classpath:
With the k8s jar removed:
The proper fix would be for the k8s jar to stop shipping that file, and then to upgrade the dependency in Spark, but if there's something easy we can do in the meantime...
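For reference, a quick way to find which jars on a classpath bundle a log4j.properties (the directory path below is illustrative, and `jars_bundling` is just a throwaway helper name):

```python
import pathlib
import zipfile

def jars_bundling(resource, jar_dir):
    """Return names of jars under jar_dir that contain the given resource."""
    hits = []
    for jar in sorted(pathlib.Path(jar_dir).glob("*.jar")):
        with zipfile.ZipFile(jar) as zf:
            if resource in zf.namelist():
                hits.append(jar.name)
    return hits

# Example: point this at the Spark distribution's jars/ directory
# print(jars_bundling("log4j.properties", "/path/to/spark/jars"))
```

Any jar this reports is a candidate for hijacking log4j's default initialization, since log4j uses the first log4j.properties it finds on the classpath.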