Hadoop Common / HADOOP-7750

DataNode: Cannot start secure cluster without privileged resources | tags/release-0.20.205.0-rc2


Details

    • release-0.20.205.0-rc2

    Description

      This tag compiles just fine, but after configuring it for a secure (Kerberos) cluster, the DataNode fails on startup with the error below:

      STARTUP_MSG: Starting DataNode
      STARTUP_MSG: host = hd3w94m7/10.152.94.111
      STARTUP_MSG: args = []
      STARTUP_MSG: version = 0.20.205.1
      STARTUP_MSG: build = http://svn.apache.org/repos/asf/hadoop/common/tags/release-0.20.205.0-rc2 -r 1179942; compiled by 'tpowell1' on Wed Oct 12 11:14:46 PDT 2011
      ************************************************************/
      2011-10-14 15:24:56,028 INFO org.apache.hadoop.metrics2.impl.MetricsConfig: loaded properties from hadoop-metrics2.properties
      2011-10-14 15:24:56,043 INFO org.apache.hadoop.metrics2.impl.MetricsSourceAdapter: MBean for source MetricsSystem,sub=Stats registered.
      2011-10-14 15:24:56,044 INFO org.apache.hadoop.metrics2.impl.MetricsSystemImpl: Scheduled snapshot period at 10 second(s).
      2011-10-14 15:24:56,044 INFO org.apache.hadoop.metrics2.impl.MetricsSystemImpl: DataNode metrics system started
      2011-10-14 15:24:56,192 INFO org.apache.hadoop.metrics2.impl.MetricsSourceAdapter: MBean for source ugi registered.
      2011-10-14 15:24:56,421 INFO org.apache.hadoop.security.UserGroupInformation: Asked the TGT renewer thread to terminate
      2011-10-14 15:24:57,241 INFO org.apache.hadoop.security.UserGroupInformation: Login successful for user hdfs/hd3w94m7@XXX using keytab file /home/tpowell1/hadoop.tags.release-0.20.205.0-rc2/conf/hdfs.keytab
      2011-10-14 15:24:57,242 ERROR org.apache.hadoop.hdfs.server.datanode.DataNode: java.lang.RuntimeException: Cannot start secure cluster without privileged resources.
      at org.apache.hadoop.hdfs.server.datanode.DataNode.startDataNode(DataNode.java:306)
      at org.apache.hadoop.hdfs.server.datanode.DataNode.<init>(DataNode.java:281)
      at org.apache.hadoop.hdfs.server.datanode.DataNode.makeInstance(DataNode.java:1545)
      at org.apache.hadoop.hdfs.server.datanode.DataNode.instantiateDataNode(DataNode.java:1484)
      at org.apache.hadoop.hdfs.server.datanode.DataNode.createDataNode(DataNode.java:1502)
      at org.apache.hadoop.hdfs.server.datanode.DataNode.secureMain(DataNode.java:1628)
      at org.apache.hadoop.hdfs.server.datanode.DataNode.main(DataNode.java:1645)

      2011-10-14 15:24:57,243 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: SHUTDOWN_MSG:
      /************************************************************
      SHUTDOWN_MSG: Shutting down DataNode at hd3w94m7.XXX/10.152.94.111
      ************************************************************/

      Checking the DataNode.java code, main() starts the DataNode with a null SecureResources:

      public static void main(String args[]) {
        secureMain(args, null);
      }

      This null resource seems to get passed all the way down to startDataNode(), where there is a null check, which in turn throws the error we see:

      void startDataNode(Configuration conf,
                         AbstractList<File> dataDirs, SecureResources resources
                         ) throws IOException {
        if (UserGroupInformation.isSecurityEnabled() && resources == null)
          throw new RuntimeException("Cannot start secure cluster without " +
              "privileged resources.");

People

  Assignee: Unassigned
  Reporter: Trevor Powell (catdaaaady)
  Votes: 0
  Watchers: 4
