Spark / SPARK-25590

kubernetes-model-2.0.0.jar masks default Spark logging config


Details

    • Type: Bug
    • Status: Closed
    • Priority: Major
    • Resolution: Duplicate
    • Affects Version/s: 2.4.0
    • Fix Version/s: None
    • Component/s: Kubernetes, Spark Core
    • Labels: None

    Description

      That jar file, which is packaged when the k8s profile is enabled, has a log4j configuration embedded in it:

      $ jar tf /path/to/kubernetes-model-2.0.0.jar | grep log4j
      log4j.properties
      

      As a result, Spark always uses that log4j configuration instead of its own default (log4j-defaults.properties), unless the user overrides it by placing their own configuration on the classpath ahead of the kubernetes jar.

      You can see this by running spark-shell. With the k8s jar on the classpath:

      $ ./bin/spark-shell 
      ...
      Setting default log level to "WARN"
      

      Removing the k8s jar:

      $ ./bin/spark-shell 
      ...
      Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
      Setting default log level to "WARN".
      

      The proper fix would be for the k8s jar to stop shipping that file, and then upgrade the dependency in Spark, but if there's something easy we can do in the meantime...
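
      One possible stopgap (an assumption on my part, not something the report prescribes) would be to strip the embedded config out of the local copy of the jar, so that Spark falls back to its own default:

```shell
# Sketch of a possible interim workaround (not from the report): delete the
# embedded log4j.properties from the bundled jar so Spark falls back to its
# own log4j-defaults.properties. The /path/to/ placeholder is the report's.
zip -d /path/to/kubernetes-model-2.0.0.jar log4j.properties

# Confirm the entry is gone (same check the report uses):
jar tf /path/to/kubernetes-model-2.0.0.jar | grep log4j
```

      Modifying a distributed jar is fragile, though; dropping the file upstream and upgrading the dependency remains the real fix.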

            People

              Assignee: Unassigned
              Reporter: Marcelo Masiero Vanzin (vanzin)
              Votes: 0
              Watchers: 4
