Spark / SPARK-25065

Driver and executors pick the wrong logging configuration file.


Details

    • Type: Bug
    • Status: In Progress
    • Priority: Major
    • Resolution: Unresolved
    • Affects Version/s: 2.4.4, 3.0.0
    • Fix Version/s: None
    • Component/s: Kubernetes, Spark Core
    • Labels: None

    Description

      Currently, when running in Kubernetes mode, Spark sets the necessary configuration properties by creating a spark.properties file and mounting it into the conf directory.

      The shipped Dockerfile intentionally does not copy conf into the image, and that is well understood. However, a user may want to ship a custom logging configuration file in the image's conf directory.

      It is not enough to copy that file into Spark's conf directory in the resulting image, because that directory is replaced during the Kubernetes conf-volume mount step.
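A possible workaround, sketched below, is to bake the custom file into a directory that the conf-volume mount does not touch and point log4j at it explicitly. The /opt/spark/log4j path and the use of the standard log4j.configuration system property are illustrative assumptions, not part of this report:

```shell
# Hypothetical: the image was built with the custom file copied to
# /opt/spark/log4j/log4j.properties, i.e. outside $SPARK_HOME/conf,
# which is replaced by the mounted conf volume.
spark-submit \
  --conf "spark.driver.extraJavaOptions=-Dlog4j.configuration=file:/opt/spark/log4j/log4j.properties" \
  --conf "spark.executor.extraJavaOptions=-Dlog4j.configuration=file:/opt/spark/log4j/log4j.properties" \
  ...
```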

       

      To reproduce, add -Dlog4j.debug to spark.(executor|driver).extraJavaOptions. With that flag, log4j reports which configuration file it loads; this showed that the provided log4j file is not picked up, and that the driver process instead loads the one bundled in the kubernetes-client jar.
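The reproduction step can be sketched as a spark-submit invocation; the master URL, container image, and application jar below are placeholders, not values from this report:

```shell
spark-submit \
  --master k8s://https://<api-server>:6443 \
  --deploy-mode cluster \
  --conf spark.kubernetes.container.image=<spark-image> \
  --conf "spark.driver.extraJavaOptions=-Dlog4j.debug" \
  --conf "spark.executor.extraJavaOptions=-Dlog4j.debug" \
  <application-jar>
```

With -Dlog4j.debug set, log4j prints its configuration-resolution steps to stderr, which is how the wrong file was identified.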
       

      Attachments

        Activity

          People

            Assignee: Unassigned
            Reporter: Prashant Sharma
            Votes: 0
            Watchers: 6

            Dates

              Created:
              Updated: