Hadoop HDFS / HDFS-4841

FsShell commands using secure webhdfs fail ClientFinalizer shutdown hook

Details

    • Type: Bug
    • Status: Closed
    • Priority: Major
    • Resolution: Fixed
    • Affects Version/s: 3.0.0-alpha1
    • Fix Version/s: 2.1.0-beta
    • Component/s: security, webhdfs
    • Labels: None
    • Hadoop Flags: Reviewed

    Description

      Hadoop version:

      bash-4.1$ $HADOOP_HOME/bin/hadoop version
      Hadoop 3.0.0-SNAPSHOT
      Subversion git://github.com/apache/hadoop-common.git -r d5373b9c550a355d4e91330ba7cc8f4c7c3aac51
      Compiled by root on 2013-05-22T08:06Z
      From source with checksum 8c4cc9b1e8d6e8361431e00f64483f
      This command was run using /var/lib/hadoop-hdfs/hadoop-3.0.0-SNAPSHOT/share/hadoop/common/hadoop-common-3.0.0-SNAPSHOT.jar
      

      I'm seeing a problem when issuing FsShell commands using the webhdfs:// URI when security is enabled. The command completes but leaves a warning that ShutdownHook 'ClientFinalizer' failed.

      bash-4.1$ hadoop-3.0.0-SNAPSHOT/bin/hadoop fs -ls webhdfs://hdfs-upgrade-pseudo.ent.cloudera.com:50070/
      2013-05-22 09:46:55,710 INFO  [main] util.Shell (Shell.java:isSetsidSupported(311)) - setsid exited with exit code 0
      Found 3 items
      drwxr-xr-x   - hbase supergroup          0 2013-05-22 09:46 webhdfs://hdfs-upgrade-pseudo.ent.cloudera.com:50070/hbase
      drwxr-xr-x   - hdfs  supergroup          0 2013-05-22 09:46 webhdfs://hdfs-upgrade-pseudo.ent.cloudera.com:50070/tmp
      drwxr-xr-x   - hdfs  supergroup          0 2013-05-22 09:46 webhdfs://hdfs-upgrade-pseudo.ent.cloudera.com:50070/user
      2013-05-22 09:46:58,660 WARN  [Thread-3] util.ShutdownHookManager (ShutdownHookManager.java:run(56)) - ShutdownHook 'ClientFinalizer' failed, java.lang.IllegalStateException: Shutdown in progress, cannot add a shutdownHook
      java.lang.IllegalStateException: Shutdown in progress, cannot add a shutdownHook
      	at org.apache.hadoop.util.ShutdownHookManager.addShutdownHook(ShutdownHookManager.java:152)
      	at org.apache.hadoop.fs.FileSystem$Cache.getInternal(FileSystem.java:2400)
      	at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:2372)
      	at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:352)
      	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem$DtRenewer.getWebHdfs(WebHdfsFileSystem.java:1001)
      	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem$DtRenewer.cancel(WebHdfsFileSystem.java:1013)
      	at org.apache.hadoop.security.token.Token.cancel(Token.java:382)
      	at org.apache.hadoop.fs.DelegationTokenRenewer$RenewAction.cancel(DelegationTokenRenewer.java:152)
      	at org.apache.hadoop.fs.DelegationTokenRenewer$RenewAction.access$200(DelegationTokenRenewer.java:58)
      	at org.apache.hadoop.fs.DelegationTokenRenewer.removeRenewAction(DelegationTokenRenewer.java:241)
      	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.close(WebHdfsFileSystem.java:822)
      	at org.apache.hadoop.fs.FileSystem$Cache.closeAll(FileSystem.java:2446)
      	at org.apache.hadoop.fs.FileSystem$Cache$ClientFinalizer.run(FileSystem.java:2463)
      	at org.apache.hadoop.util.ShutdownHookManager$1.run(ShutdownHookManager.java:54)
      
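      The stack trace points at the mechanism: ClientFinalizer runs as a shutdown hook, closing the cached WebHdfsFileSystem cancels the delegation token, and DtRenewer.getWebHdfs() calls FileSystem.get(), which tries to register a new shutdown hook while shutdown is already in progress. A minimal sketch of that JVM rule, outside Hadoop (class name is illustrative, not from the Hadoop source):

      ```java
      // Sketch: registering a shutdown hook from inside a running shutdown
      // hook throws IllegalStateException, which is the same rule Hadoop's
      // ShutdownHookManager enforces in the stack trace above.
      public class HookDemo {
          public static void main(String[] args) {
              Runtime.getRuntime().addShutdownHook(new Thread(() -> {
                  try {
                      // Analogous to WebHdfsFileSystem.close() reaching
                      // FileSystem.get() during shutdown.
                      Runtime.getRuntime().addShutdownHook(new Thread(() -> {}));
                  } catch (IllegalStateException e) {
                      // At JVM exit this catch fires: "Shutdown in progress"
                      System.out.println("caught: " + e.getMessage());
                  }
              }));
              System.out.println("main exiting");
          }
      }
      ```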

      I've verified that FsShell commands against hdfs:// URIs and WebHDFS operations through curl both work successfully:

      bash-4.1$ hadoop-3.0.0-SNAPSHOT/bin/hadoop fs -ls /
      2013-05-22 09:46:43,663 INFO  [main] util.Shell (Shell.java:isSetsidSupported(311)) - setsid exited with exit code 0
      Found 3 items
      drwxr-xr-x   - hbase supergroup          0 2013-05-22 09:46 /hbase
      drwxr-xr-x   - hdfs  supergroup          0 2013-05-22 09:46 /tmp
      drwxr-xr-x   - hdfs  supergroup          0 2013-05-22 09:46 /user
      bash-4.1$ curl -i --negotiate -u : "http://hdfs-upgrade-pseudo.ent.cloudera.com:50070/webhdfs/v1/?op=GETHOMEDIRECTORY"
      HTTP/1.1 401 
      Cache-Control: must-revalidate,no-cache,no-store
      Date: Wed, 22 May 2013 16:47:14 GMT
      Pragma: no-cache
      Date: Wed, 22 May 2013 16:47:14 GMT
      Pragma: no-cache
      Content-Type: text/html; charset=iso-8859-1
      WWW-Authenticate: Negotiate
      Set-Cookie: hadoop.auth=;Path=/;Expires=Thu, 01-Jan-1970 00:00:00 GMT
      Content-Length: 1358
      Server: Jetty(6.1.26)
      
      HTTP/1.1 200 OK
      Cache-Control: no-cache
      Expires: Thu, 01-Jan-1970 00:00:00 GMT
      Date: Wed, 22 May 2013 16:47:14 GMT
      Pragma: no-cache
      Date: Wed, 22 May 2013 16:47:14 GMT
      Pragma: no-cache
      Content-Type: application/json
      Set-Cookie: hadoop.auth="u=hdfs&p=hdfs/hdfs-upgrade-pseudo.ent.cloudera.com@ENT.CLOUDERA.COM&t=kerberos&e=1369277234852&s=m3vJ7/pV831tBLkpOBb0Naa5N+g=";Path=/
      Transfer-Encoding: chunked
      Server: Jetty(6.1.26)
      
      {"Path":"/user/hdfs"}bash-4.1$ 
      

      When I disable security, the warning goes away.
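      For reference, security is toggled with the standard core-site.xml property below (shown generically; my exact configs are attached):

      ```xml
      <!-- With "kerberos" the ClientFinalizer warning appears;
           with "simple" (security disabled) it does not. -->
      <property>
        <name>hadoop.security.authentication</name>
        <value>kerberos</value>
      </property>
      ```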

      I'll attach my core-site.xml, hdfs-site.xml, NN and DN output logs.

      Attachments

        1. core-site.xml
          0.6 kB
          Stephen Chu
        2. hadoop-root-namenode-hdfs-upgrade-pseudo.ent.cloudera.com.out
          43 kB
          Stephen Chu
        3. hdfs-site.xml
          3 kB
          Stephen Chu
        4. jsvc.out
          318 kB
          Stephen Chu
        5. HDFS-4841.patch
          0.8 kB
          Robert Kanter


          People

            Assignee: Robert Kanter (rkanter)
            Reporter: Stephen Chu (schu)
            Votes: 0
            Watchers: 10
