Hadoop HDFS / HDFS-6462

NFS: fsstat request fails with secure HDFS


Details

    • Type: Bug
    • Status: Closed
    • Priority: Major
    • Resolution: Fixed
    • Affects Version/s: 2.2.0
    • Fix Version/s: 2.5.0
    • Component/s: nfs
    • Labels: None
    • Hadoop Flags: Reviewed

    Description

      fsstat fails in a secure (Kerberized) environment with the error below.

      Steps to reproduce:
      1) Create two users, UserA and UserB
      2) Create a group named GroupB
      3) Add the root and UserB users to GroupB; make sure UserA is not in GroupB
      4) Set the properties below (a sketch of how these settings are typically used together follows the reproduction steps)

      ===================================
      hdfs-site.xml
      ===================================
      <property>
        <name>dfs.nfs.keytab.file</name>
        <value>/tmp/keytab/UserA.keytab</value>
      </property>
      <property>
        <name>dfs.nfs.kerberos.principal</name>
        <value>UserA@EXAMPLE.COM</value>
      </property>
      ===================================
      core-site.xml
      ===================================
      <property>
        <name>hadoop.proxyuser.UserA.groups</name>
        <value>GroupB</value>
      </property>
      <property>
        <name>hadoop.proxyuser.UserA.hosts</name>
        <value>*</value>
      </property>
      

      5) Start the NFS server as UserA
      6) Mount the NFS export as the root user
      7) Run the command below

      [root@host1 ~]# df /tmp/tmp_mnt/
      df: `/tmp/tmp_mnt/': Input/output error
      df: no file systems processed
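
      For orientation, here is a minimal sketch, assuming core-site.xml and hdfs-site.xml with the settings above are on the classpath, of how the two property groups are typically combined (this is not the NFS gateway's actual code): the process logs in from the UserA keytab, then impersonates the requesting user (root, a member of GroupB) through a proxy UGI for each NameNode call. The class name and the path "/" are illustrative only.

      ===================================
      ProxyUserFsstatSketch.java (illustrative)
      ===================================
      import java.io.IOException;
      import java.security.PrivilegedExceptionAction;

      import org.apache.hadoop.conf.Configuration;
      import org.apache.hadoop.fs.FileSystem;
      import org.apache.hadoop.fs.FsStatus;
      import org.apache.hadoop.fs.Path;
      import org.apache.hadoop.security.UserGroupInformation;

      public class ProxyUserFsstatSketch {
        public static void main(String[] args) throws Exception {
          final Configuration conf = new Configuration();
          UserGroupInformation.setConfiguration(conf);

          // Log in as the service principal from its keytab, i.e. the identity
          // named by dfs.nfs.kerberos.principal / dfs.nfs.keytab.file.
          UserGroupInformation.loginUserFromKeytab(
              "UserA@EXAMPLE.COM", "/tmp/keytab/UserA.keytab");

          // Impersonate the user who issued the NFS request (root). The
          // hadoop.proxyuser.UserA.* settings in core-site.xml authorize this.
          UserGroupInformation proxyUgi = UserGroupInformation.createProxyUser(
              "root", UserGroupInformation.getLoginUser());

          // NameNode RPCs must run inside doAs() so the connection is
          // authenticated with UserA's Kerberos credentials on behalf of root.
          FsStatus status = proxyUgi.doAs(
              new PrivilegedExceptionAction<FsStatus>() {
                @Override
                public FsStatus run() throws IOException {
                  return FileSystem.get(conf).getStatus(new Path("/"));
                }
              });

          System.out.println("capacity=" + status.getCapacity()
              + " used=" + status.getUsed()
              + " remaining=" + status.getRemaining());
        }
      }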
      

      The NFS gateway log complains as below:

      2014-05-29 00:09:13,698 DEBUG nfs3.RpcProgramNfs3 (RpcProgramNfs3.java:fsstat(1654)) - NFS FSSTAT fileId: 16385
      2014-05-29 00:09:13,706 WARN  ipc.Client (Client.java:run(672)) - Exception encountered while connecting to the server : javax.security.sasl.SaslException: GSS initiate failed [Caused by GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)]
      2014-05-29 00:09:13,710 WARN  nfs3.RpcProgramNfs3 (RpcProgramNfs3.java:fsstat(1681)) - Exception
      java.io.IOException: Failed on local exception: java.io.IOException: javax.security.sasl.SaslException: GSS initiate failed [Caused by GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)]; Host Details : local host is: "host1/0.0.0.0"; destination host is: "host1":8020;
              at org.apache.hadoop.net.NetUtils.wrapException(NetUtils.java:764)
              at org.apache.hadoop.ipc.Client.call(Client.java:1414)
              at org.apache.hadoop.ipc.Client.call(Client.java:1363)
              at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:206)
              at com.sun.proxy.$Proxy14.getFsStats(Unknown Source)
              at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
              at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
              at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
              at java.lang.reflect.Method.invoke(Method.java:601)
              at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:190)
              at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:103)
              at com.sun.proxy.$Proxy14.getFsStats(Unknown Source)
              at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.getStats(ClientNamenodeProtocolTranslatorPB.java:554)
              at org.apache.hadoop.hdfs.DFSClient.getDiskStatus(DFSClient.java:2165)
              at org.apache.hadoop.hdfs.nfs.nfs3.RpcProgramNfs3.fsstat(RpcProgramNfs3.java:1659)
              at org.apache.hadoop.hdfs.nfs.nfs3.RpcProgramNfs3.handleInternal(RpcProgramNfs3.java:1961)
              at org.apache.hadoop.oncrpc.RpcProgram.messageReceived(RpcProgram.java:162)
              at org.jboss.netty.channel.SimpleChannelUpstreamHandler.handleUpstream(SimpleChannelUpstreamHandler.java:70)
              at org.jboss.netty.channel.DefaultChannelPipeline.sendUpstream(DefaultChannelPipeline.java:560)
              at org.jboss.netty.channel.DefaultChannelPipeline$DefaultChannelHandlerContext.sendUpstream(DefaultChannelPipeline.java:787)
              at org.jboss.netty.channel.Channels.fireMessageReceived(Channels.java:281)
              at org.apache.hadoop.oncrpc.RpcUtil$RpcMessageParserStage.messageReceived(RpcUtil.java:132)
              at org.jboss.netty.channel.SimpleChannelUpstreamHandler.handleUpstream(SimpleChannelUpstreamHandler.java:70)
              at org.jboss.netty.channel.DefaultChannelPipeline.sendUpstream(DefaultChannelPipeline.java:560)
              at org.jboss.netty.channel.DefaultChannelPipeline$DefaultChannelHandlerContext.sendUpstream(DefaultChannelPipeline.java:787)
              at org.jboss.netty.channel.Channels.fireMessageReceived(Channels.java:296)
              at org.jboss.netty.handler.codec.frame.FrameDecoder.unfoldAndFireMessageReceived(FrameDecoder.java:462)
              at org.jboss.netty.handler.codec.frame.FrameDecoder.callDecode(FrameDecoder.java:443)
              at org.jboss.netty.handler.codec.frame.FrameDecoder.messageReceived(FrameDecoder.java:303)
              at org.jboss.netty.channel.SimpleChannelUpstreamHandler.handleUpstream(SimpleChannelUpstreamHandler.java:70)
              at org.jboss.netty.channel.DefaultChannelPipeline.sendUpstream(DefaultChannelPipeline.java:560)
              at org.jboss.netty.channel.DefaultChannelPipeline.sendUpstream(DefaultChannelPipeline.java:555)
              at org.jboss.netty.channel.Channels.fireMessageReceived(Channels.java:268)
              at org.jboss.netty.channel.Channels.fireMessageReceived(Channels.java:255)
              at org.jboss.netty.channel.socket.nio.NioWorker.read(NioWorker.java:88)
              at org.jboss.netty.channel.socket.nio.AbstractNioWorker.process(AbstractNioWorker.java:107)
              at org.jboss.netty.channel.socket.nio.AbstractNioSelector.run(AbstractNioSelector.java:312)
              at org.jboss.netty.channel.socket.nio.AbstractNioWorker.run(AbstractNioWorker.java:88)
              at org.jboss.netty.channel.socket.nio.NioWorker.run(NioWorker.java:178)
              at org.jboss.netty.util.ThreadRenamingRunnable.run(ThreadRenamingRunnable.java:108)
              at org.jboss.netty.util.internal.DeadLockProofWorker$1.run(DeadLockProofWorker.java:42)
              at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
              at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
              at java.lang.Thread.run(Thread.java:722)
      Caused by: java.io.IOException: javax.security.sasl.SaslException: GSS initiate failed [Caused by GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)]
              at org.apache.hadoop.ipc.Client$Connection$1.run(Client.java:677)
              at java.security.AccessController.doPrivileged(Native Method)
              at javax.security.auth.Subject.doAs(Subject.java:415)
              at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1557)
              at org.apache.hadoop.ipc.Client$Connection.handleSaslConnectionFailure(Client.java:640)
              at org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:724)
              at org.apache.hadoop.ipc.Client$Connection.access$2800(Client.java:367)
              at org.apache.hadoop.ipc.Client.getConnection(Client.java:1462)
              at org.apache.hadoop.ipc.Client.call(Client.java:1381)
              ... 42 more
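
      The GSSException above ("Failed to find any Kerberos tgt") indicates that the UGI behind the connection to host1:8020 carried no Kerberos credentials when getFsStats was invoked, even though the gateway was started with a keytab. Below is a minimal diagnostic sketch, not the attached patch, that makes this visible: it prints whether the login and current UGIs hold Kerberos credentials and refreshes the keytab login if the TGT has expired. The class name is illustrative, and the same configuration files are assumed to be on the classpath.

      ===================================
      UgiCredentialCheck.java (illustrative)
      ===================================
      import org.apache.hadoop.conf.Configuration;
      import org.apache.hadoop.security.UserGroupInformation;

      public class UgiCredentialCheck {
        public static void main(String[] args) throws Exception {
          // Pick up hadoop.security.authentication etc. from the classpath.
          UserGroupInformation.setConfiguration(new Configuration());

          UserGroupInformation login = UserGroupInformation.getLoginUser();
          UserGroupInformation current = UserGroupInformation.getCurrentUser();

          // "Failed to find any Kerberos tgt" in the log above corresponds to
          // the UGI behind the RPC connection reporting no Kerberos credentials.
          System.out.println("login   user=" + login.getUserName()
              + " auth=" + login.getAuthenticationMethod()
              + " hasKerberosCredentials=" + login.hasKerberosCredentials());
          System.out.println("current user=" + current.getUserName()
              + " auth=" + current.getAuthenticationMethod()
              + " hasKerberosCredentials=" + current.hasKerberosCredentials());

          // If the process logged in from a keytab, this refreshes an expired
          // TGT; it is a no-op while the ticket is still valid.
          login.checkTGTAndReloginFromKeytab();
        }
      }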
      

      Attachments

        1. HDFS-6462.patch (2 kB, Brandon Li)


          People

            Assignee: brandonli (Brandon Li)
            Reporter: yeshavora (Yesha Vora)
            Votes: 0
            Watchers: 5
