  Hadoop Common / HADOOP-10150 Hadoop cryptographic file system / HADOOP-10870

"Failed to load OpenSSL Cipher" error logs on systems with old OpenSSL versions


    Description

      I built Hadoop from the fs-encryption branch and deployed it (without enabling any security configurations) on a CentOS 6.4 VM with an old version of OpenSSL.

      [root@schu-enc hadoop-common]# rpm -qa | grep openssl
      openssl-1.0.0-27.el6_4.2.x86_64
      openssl-devel-1.0.0-27.el6_4.2.x86_64
      

      When I try a simple "hadoop fs -ls", I get:

      [hdfs@schu-enc hadoop-common]$ hadoop fs -ls
      2014-07-21 19:35:14,486 ERROR [main] crypto.OpensslCipher (OpensslCipher.java:<clinit>(87)) - Failed to load OpenSSL Cipher.
      java.lang.UnsatisfiedLinkError: Cannot find AES-CTR support, is your version of Openssl new enough?
      	at org.apache.hadoop.crypto.OpensslCipher.initIDs(Native Method)
      	at org.apache.hadoop.crypto.OpensslCipher.<clinit>(OpensslCipher.java:84)
      	at org.apache.hadoop.crypto.OpensslAesCtrCryptoCodec.<init>(OpensslAesCtrCryptoCodec.java:50)
      	at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
      	at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
      	at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
      	at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
      	at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:129)
      	at org.apache.hadoop.crypto.CryptoCodec.getInstance(CryptoCodec.java:55)
      	at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:591)
      	at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:561)
      	at org.apache.hadoop.hdfs.DistributedFileSystem.initialize(DistributedFileSystem.java:139)
      	at org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:2590)
      	at org.apache.hadoop.fs.FileSystem.access$200(FileSystem.java:89)
      	at org.apache.hadoop.fs.FileSystem$Cache.getInternal(FileSystem.java:2624)
      	at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:2606)
      	at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:368)
      	at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:167)
      	at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:352)
      	at org.apache.hadoop.fs.Path.getFileSystem(Path.java:296)
      	at org.apache.hadoop.fs.shell.PathData.expandAsGlob(PathData.java:325)
      	at org.apache.hadoop.fs.shell.Command.expandArgument(Command.java:228)
      	at org.apache.hadoop.fs.shell.Command.expandArguments(Command.java:211)
      	at org.apache.hadoop.fs.shell.Command.processRawArguments(Command.java:194)
      	at org.apache.hadoop.fs.shell.Command.run(Command.java:155)
      	at org.apache.hadoop.fs.FsShell.run(FsShell.java:287)
      	at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
      	at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:84)
      	at org.apache.hadoop.fs.FsShell.main(FsShell.java:340)
      2014-07-21 19:35:14,495 WARN  [main] crypto.CryptoCodec (CryptoCodec.java:getInstance(66)) - Crypto codec org.apache.hadoop.crypto.OpensslAesCtrCryptoCodec is not available.
      

      It would be an improvement to clean up and shorten this error log.
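
      A minimal sketch of the kind of cleanup suggested here, assuming the static initializer in OpensslCipher catches the UnsatisfiedLinkError, records a one-line reason, and keeps the full stack trace at DEBUG only (the loadingFailureReason field and its accessor below are illustrative, not the committed patch):

      import org.apache.commons.logging.Log;
      import org.apache.commons.logging.LogFactory;

      public final class OpensslCipher {
        private static final Log LOG = LogFactory.getLog(OpensslCipher.class);

        // Hypothetical field: null means the native cipher loaded successfully.
        private static final String loadingFailureReason;

        static {
          String reason = null;
          try {
            initIDs();  // native call; throws UnsatisfiedLinkError on old OpenSSL
          } catch (Throwable t) {
            reason = t.getMessage();
            // One short reason instead of an ERROR-level stack trace.
            LOG.debug("Failed to load OpenSSL Cipher.", t);
          }
          loadingFailureReason = reason;
        }

        // Hypothetical accessor so callers (e.g. checknative) can report why.
        public static String getLoadingFailureReason() {
          return loadingFailureReason;
        }

        private static native void initIDs();
      }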

      hadoop checknative shows the error as well:

      [hdfs@schu-enc ~]$ hadoop checknative
      2014-07-21 19:38:38,376 INFO  [main] bzip2.Bzip2Factory (Bzip2Factory.java:isNativeBzip2Loaded(70)) - Successfully loaded & initialized native-bzip2 library system-native
      2014-07-21 19:38:38,395 INFO  [main] zlib.ZlibFactory (ZlibFactory.java:<clinit>(49)) - Successfully loaded & initialized native-zlib library
      2014-07-21 19:38:38,411 ERROR [main] crypto.OpensslCipher (OpensslCipher.java:<clinit>(87)) - Failed to load OpenSSL Cipher.
      java.lang.UnsatisfiedLinkError: Cannot find AES-CTR support, is your version of Openssl new enough?
      	at org.apache.hadoop.crypto.OpensslCipher.initIDs(Native Method)
      	at org.apache.hadoop.crypto.OpensslCipher.<clinit>(OpensslCipher.java:84)
      	at org.apache.hadoop.util.NativeLibraryChecker.main(NativeLibraryChecker.java:82)
      Native library checking:
      hadoop:  true /home/hdfs/hadoop-3.0.0-SNAPSHOT/lib/native/libhadoop.so.1.0.0
      zlib:    true /lib64/libz.so.1
      snappy:  true /usr/lib64/libsnappy.so.1
      lz4:     true revision:99
      bzip2:   true /lib64/libbz2.so.1
      openssl: false 
      

      Thanks to cmccabe, who identified this issue as a bug.
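
      For the checknative output above, one option is for NativeLibraryChecker to print the recorded reason after "openssl: false" instead of a stack trace. A rough sketch, reusing the hypothetical getLoadingFailureReason() accessor from the previous snippet (again illustrative, not the committed patch):

      // Hypothetical helper for NativeLibraryChecker: summarize OpenSSL support
      // on a single line of the "hadoop checknative" report.
      private static String opensslStatusLine() {
        String reason = OpensslCipher.getLoadingFailureReason();  // illustrative accessor
        return reason == null
            ? "openssl: true"
            : "openssl: false " + reason;  // e.g. "Cannot find AES-CTR support, ..."
      }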

      Attachments

        HADOOP-10870-fs-enc.001.patch (8 kB, Colin McCabe)


          People

            Assignee: cmccabe (Colin McCabe)
            Reporter: schu (Stephen Chu)
