SPARK-14703: Spark uses SLF4J, but actually relies quite heavily on Log4J


Details

    • Type: Improvement
    • Status: Resolved
    • Priority: Minor
    • Resolution: Not A Problem
    • Affects Version/s: 1.6.0
    • Fix Version/s: None
    • Component/s: Spark Core, YARN
    • Environment: 1.6.0-cdh5.7.0, logback 1.1.3, yarn
    • Flags: Patch

    Description

      We've built a version of Hadoop CDH-5.7.0 in house with logback as the SLF4J provider, so that Hadoop logs go straight to logstash (for processing with logstash/elasticsearch), on top of our existing use of the logback backend.

      In trying to start spark-shell I discovered several points where the fact that we weren't actually running on Log4J caused the SparkContext not to be created or the YARN module not to load. There are many more places where we should probably be wrapping the logging more sensibly, but I have a basic patch that fixes some of the worst offenders (at least the ones that stop the SparkContext being created properly).
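      For anyone reproducing this, here is a minimal diagnostic sketch (not part of the attached patch; the object name is made up) that prints which backend SLF4J has actually bound, assuming SLF4J 1.7.x where StaticLoggerBinder is still public API:

          import org.slf4j.LoggerFactory
          import org.slf4j.impl.StaticLoggerBinder

          object WhichSlf4jBackend extends App {
            // Under a Log4J 1.2 binding this prints org.slf4j.impl.Log4jLoggerFactory;
            // under logback it reports ch.qos.logback.classic.LoggerContext, which is
            // what code assuming "real" Log4J ends up tripping over.
            println(StaticLoggerBinder.getSingleton.getLoggerFactoryClassStr)
            println(LoggerFactory.getILoggerFactory.getClass.getName)
          }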

      I'm prepared to accept that this is not a good solution, and that there probably needs to be some sort of better wrapper, perhaps in the Logging.scala trait, which handles this properly.
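      As a sketch of what such a wrapper could look like (hypothetical names, not Spark API; it reuses the kind of binding check Logging.scala already performs, and assumes SLF4J 1.7.x plus Log4J 1.2 on the classpath):

          import org.slf4j.impl.StaticLoggerBinder

          object Log4jGuard {
            // True only when SLF4J is actually bound to Log4J 1.2.
            val usingLog4j12: Boolean =
              "org.slf4j.impl.Log4jLoggerFactory".equals(
                StaticLoggerBinder.getSingleton.getLoggerFactoryClassStr)

            // Run Log4J-specific setup only under a real Log4J binding, so a
            // logback build skips it instead of failing at startup.
            def ifLog4j(body: => Unit): Unit = if (usingLog4j12) body
          }

          object Example extends App {
            // Hypothetical call site: touching the Log4J root logger is only
            // safe when Log4J is genuinely the backend.
            Log4jGuard.ifLog4j {
              org.apache.log4j.LogManager.getRootLogger
                .setLevel(org.apache.log4j.Level.WARN)
            }
          }

      Because body is passed by name, the org.apache.log4j classes are only resolved if the guard passes, so a build without Log4J on the classpath should not hit NoClassDefFoundError here.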

      Attachments

        1. spark-logback.patch (11 kB, Matthew Byng-Maddick)


          People

            Assignee: Unassigned
            Reporter: Matthew Byng-Maddick (mbm)
            Votes: 0
            Watchers: 4
