
[HDFS-1644] HDFS client initializes SecurityAuth.audit log file


Details

    • Type: Bug
    • Status: Resolved
    • Priority: Major
    • Resolution: Duplicate
    • Fix Version/s: None
    • Affects Version/s: 0.22.0
    • Component/s: hdfs-client
    • Labels: None
    • Environment: Java 6, CentOS 5.5

    Description

      There is a hidden problem: the HDFS client tries to initialize the SecurityAuth.audit log file, which it should not. The problem can be surfaced by pointing HADOOP_LOG_DIR at a directory that does not exist or is not writable by the running user.
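
      For example (the specific fs subcommand is only an illustration; the directory is the one that appears in the trace below):

      export HADOOP_LOG_DIR=/var/log/hadoop/nonexisted
      hadoop fs -ls /

      The client command still runs, but log4j fails while activating the security audit appender and prints: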

      log4j:ERROR setFile(null,true) call failed.
      java.io.FileNotFoundException: /var/log/hadoop/nonexisted/SecurityAuth.audit (No such file or directory)
      	at java.io.FileOutputStream.openAppend(Native Method)
      	at java.io.FileOutputStream.<init>(FileOutputStream.java:177)
      	at java.io.FileOutputStream.<init>(FileOutputStream.java:102)
      	at org.apache.log4j.FileAppender.setFile(FileAppender.java:290)
      	at org.apache.log4j.FileAppender.activateOptions(FileAppender.java:164)
      	at org.apache.log4j.DailyRollingFileAppender.activateOptions(DailyRollingFileAppender.java:216)
      	at org.apache.log4j.config.PropertySetter.activate(PropertySetter.java:257)
      	at org.apache.log4j.config.PropertySetter.setProperties(PropertySetter.java:133)
      	at org.apache.log4j.config.PropertySetter.setProperties(PropertySetter.java:97)
      	at org.apache.log4j.PropertyConfigurator.parseAppender(PropertyConfigurator.java:689)
      	at org.apache.log4j.PropertyConfigurator.parseCategory(PropertyConfigurator.java:647)
      	at org.apache.log4j.PropertyConfigurator.parseCatsAndRenderers(PropertyConfigurator.java:568)
      	at org.apache.log4j.PropertyConfigurator.doConfigure(PropertyConfigurator.java:442)
      	at org.apache.log4j.PropertyConfigurator.doConfigure(PropertyConfigurator.java:476)
      	at org.apache.log4j.helpers.OptionConverter.selectAndConfigure(OptionConverter.java:471)
      	at org.apache.log4j.LogManager.<clinit>(LogManager.java:125)
      	at org.apache.log4j.Logger.getLogger(Logger.java:105)
      	at org.apache.commons.logging.impl.Log4JLogger.getLogger(Log4JLogger.java:289)
      	at org.apache.commons.logging.impl.Log4JLogger.<init>(Log4JLogger.java:109)
      	at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
      	at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:39)
      	at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:27)
      	at java.lang.reflect.Constructor.newInstance(Constructor.java:513)
      	at org.apache.commons.logging.impl.LogFactoryImpl.createLogFromClass(LogFactoryImpl.java:1116)
      	at org.apache.commons.logging.impl.LogFactoryImpl.discoverLogImplementation(LogFactoryImpl.java:914)
      	at org.apache.commons.logging.impl.LogFactoryImpl.newInstance(LogFactoryImpl.java:604)
      	at org.apache.commons.logging.impl.LogFactoryImpl.getInstance(LogFactoryImpl.java:336)
      	at org.apache.commons.logging.impl.LogFactoryImpl.getInstance(LogFactoryImpl.java:310)
      	at org.apache.commons.logging.LogFactory.getLog(LogFactory.java:685)
      	at org.apache.hadoop.conf.Configuration.<clinit>(Configuration.java:141)
      	at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:57)
      	at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:79)
      	at org.apache.hadoop.fs.FsShell.main(FsShell.java:1895)
      log4j:ERROR Either File or DatePattern options are not set for appender [DRFAS].
      

      The log4j initialization should be set up differently for the HDFS server and the HDFS client. This bug also exists in the branch-0.20-security branch.
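
      One way to separate the two (a sketch of the general approach only, since the actual fix was tracked on the issue this one duplicates) is to route the SecurityLogger category through a substitution variable that defaults to log4j's NullAppender, so client processes never activate the DRFAS file appender, and have only the daemon start-up scripts override it, e.g. with -Dhadoop.security.logger=INFO,DRFAS. The variable and property names below are illustrative:

      # Client default: security audit events go to a no-op appender,
      # so nothing under ${hadoop.log.dir} is ever opened for writing.
      hadoop.security.logger=INFO,NullAppender
      hadoop.security.log.file=SecurityAuth.audit

      log4j.category.SecurityLogger=${hadoop.security.logger}
      log4j.additivity.SecurityLogger=false

      log4j.appender.NullAppender=org.apache.log4j.varia.NullAppender

      # File appender, only activated when a daemon overrides
      # hadoop.security.logger to reference DRFAS.
      log4j.appender.DRFAS=org.apache.log4j.DailyRollingFileAppender
      log4j.appender.DRFAS.File=${hadoop.log.dir}/${hadoop.security.log.file}
      log4j.appender.DRFAS.DatePattern=.yyyy-MM-dd
      log4j.appender.DRFAS.layout=org.apache.log4j.PatternLayout
      log4j.appender.DRFAS.layout.ConversionPattern=%d{ISO8601} %p %c: %m%n

      Because log4j's PropertyConfigurator only activates appenders that are actually referenced by some logger, a client left at the NullAppender default never calls setFile() on DRFAS, and the errors above go away.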


People

    Assignee: Unassigned
    Reporter: Eric Yang (eyang)
    Votes: 0
    Watchers: 1
