Hadoop Common / HADOOP-15412

Hadoop KMS with HDFS keystore: No FileSystem for scheme "hdfs"


Details

    • Type: Bug
    • Status: Resolved
    • Priority: Major
    • Resolution: Won't Fix
    • Affects Version/s: 2.7.2, 2.9.0
    • Fix Version/s: None
    • Component/s: kms
    • Labels: None
    • Environment: RHEL 7.3, Hadoop 2.7.2 and 2.9.0

    Description

      I have been trying to configure the Hadoop KMS to use HDFS as its key provider backing store, but this functionality appears to be broken.

      I followed the Hadoop docs on this and added the following property to my kms-site.xml:

      <property> 
         <name>hadoop.kms.key.provider.uri</name>
         <value>jceks://hdfs@nn1.example.com/kms/test.jceks</value> 
         <description> 
            URI of the backing KeyProvider for the KMS. 
         </description> 
      </property>
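
      For comparison, the KMS starts fine with the default file-backed provider, which keeps the keystore on the local filesystem instead of HDFS. A minimal sketch of that configuration (the path here is illustrative, not from my setup):

      <property>
         <name>hadoop.kms.key.provider.uri</name>
         <value>jceks://file@/${user.home}/kms.keystore</value>
         <description>
            URI of the backing KeyProvider for the KMS.
         </description>
      </property>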

      That path exists in HDFS, and I expected the KMS to create the file test.jceks there for its keystore. However, the KMS failed to start with this error:

      ERROR: Hadoop KMS could not be started

      REASON: org.apache.hadoop.fs.UnsupportedFileSystemException: No FileSystem for scheme "hdfs"

      Stacktrace:
      ---------------------------------------------------
      org.apache.hadoop.fs.UnsupportedFileSystemException: No FileSystem for scheme "hdfs"
        at org.apache.hadoop.fs.FileSystem.getFileSystemClass(FileSystem.java:3220)
        at org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:3240)
        at org.apache.hadoop.fs.FileSystem.access$200(FileSystem.java:121)
        at org.apache.hadoop.fs.FileSystem$Cache.getInternal(FileSystem.java:3291)
        at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:3259)
        at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:470)
        at org.apache.hadoop.fs.Path.getFileSystem(Path.java:356)
        at org.apache.hadoop.crypto.key.JavaKeyStoreProvider.<init>(JavaKeyStoreProvider.java:132)
        at org.apache.hadoop.crypto.key.JavaKeyStoreProvider.<init>(JavaKeyStoreProvider.java:88)
        at org.apache.hadoop.crypto.key.JavaKeyStoreProvider$Factory.createProvider(JavaKeyStoreProvider.java:660)
        at org.apache.hadoop.crypto.key.KeyProviderFactory.get(KeyProviderFactory.java:96)
        at org.apache.hadoop.crypto.key.kms.server.KMSWebApp.contextInitialized(KMSWebApp.java:187)
        at org.apache.catalina.core.StandardContext.listenerStart(StandardContext.java:4276)
        at org.apache.catalina.core.StandardContext.start(StandardContext.java:4779)
        at org.apache.catalina.core.ContainerBase.addChildInternal(ContainerBase.java:803)
        at org.apache.catalina.core.ContainerBase.addChild(ContainerBase.java:780)
        at org.apache.catalina.core.StandardHost.addChild(StandardHost.java:583)
        at org.apache.catalina.startup.HostConfig.deployDirectory(HostConfig.java:1080)
        at org.apache.catalina.startup.HostConfig.deployDirectories(HostConfig.java:1003)
        at org.apache.catalina.startup.HostConfig.deployApps(HostConfig.java:507)
        at org.apache.catalina.startup.HostConfig.start(HostConfig.java:1322)
        at org.apache.catalina.startup.HostConfig.lifecycleEvent(HostConfig.java:325)
        at org.apache.catalina.util.LifecycleSupport.fireLifecycleEvent(LifecycleSupport.java:142)
        at org.apache.catalina.core.ContainerBase.start(ContainerBase.java:1069)
        at org.apache.catalina.core.StandardHost.start(StandardHost.java:822)
        at org.apache.catalina.core.ContainerBase.start(ContainerBase.java:1061)
        at org.apache.catalina.core.StandardEngine.start(StandardEngine.java:463)
        at org.apache.catalina.core.StandardService.start(StandardService.java:525)
        at org.apache.catalina.core.StandardServer.start(StandardServer.java:761)
        at org.apache.catalina.startup.Catalina.start(Catalina.java:595)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:498)
        at org.apache.catalina.startup.Bootstrap.start(Bootstrap.java:289)
        at org.apache.catalina.startup.Bootstrap.main(Bootstrap.java:414)

       
      From what I could understand, the error means that no FileSystem implementation is registered for the "hdfs" scheme in the KMS webapp. Searching for this error only turns up cases of missing hdfs-client jars after an upgrade, which does not apply here (it is a fresh installation). I have tested with Hadoop 2.7.2 and 2.9.0.
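
      To illustrate why the exception reads this way: FileSystem.getFileSystemClass resolves a scheme to an implementation class (via fs.<scheme>.impl configuration keys or ServiceLoader registrations on the classpath) and throws when nothing is registered. The toy sketch below models only that lookup behavior; the class and method names are invented for illustration and are not Hadoop's actual code.

      ```java
      import java.util.HashMap;
      import java.util.Map;

      // Toy model of a scheme-to-implementation lookup table, in the spirit of
      // Hadoop's FileSystem.getFileSystemClass. Not the real implementation.
      class SchemeLookup {
          private final Map<String, String> registry = new HashMap<>();

          // In Hadoop, registration happens via fs.<scheme>.impl config keys
          // or META-INF/services entries on the classpath.
          void register(String scheme, String implClass) {
              registry.put(scheme, implClass);
          }

          String resolve(String scheme) {
              String impl = registry.get(scheme);
              if (impl == null) {
                  // Mirrors the shape of UnsupportedFileSystemException's message.
                  throw new UnsupportedOperationException(
                      "No FileSystem for scheme \"" + scheme + "\"");
              }
              return impl;
          }

          public static void main(String[] args) {
              SchemeLookup lookup = new SchemeLookup();
              // Only the local scheme is registered, which is the situation
              // when no hdfs implementation is visible on the classpath.
              lookup.register("file", "org.apache.hadoop.fs.LocalFileSystem");
              System.out.println(lookup.resolve("file"));
              try {
                  lookup.resolve("hdfs");
              } catch (UnsupportedOperationException e) {
                  System.out.println(e.getMessage());
              }
          }
      }
      ```

      The point of the sketch: the exception does not mean HDFS is down, only that nothing on the KMS webapp's classpath answers for the "hdfs" scheme.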

      Thank you in advance.


      People

        Assignee: Unassigned
        Reporter: Pablo San José (pablosjv)