Ambari / AMBARI-22640

HBase Cannot Find LZO Classes After Being Patched


Details

    • Type: Bug
    • Status: Resolved
    • Priority: Critical
    • Resolution: Fixed
    • Affects Version/s: 2.6.1
    • Fix Version/s: 2.6.2
    • Component/s: None
    • Labels: None

    Description

      After patching HBase on a cluster where LZO compression is in use, the HBase compression smoke test fails with the following:

      2017-12-10 22:31:09,244|INFO|MainThread|machine.py:164 - run()||GUID=37b565a7-e164-4641-b335-19884c614ffd|Exception in thread "main" java.lang.RuntimeException: java.lang.ClassNotFoundException: com.hadoop.compression.lzo.LzoCodec
      2017-12-10 22:31:09,245|INFO|MainThread|machine.py:164 - run()||GUID=37b565a7-e164-4641-b335-19884c614ffd|at org.apache.hadoop.hbase.io.compress.Compression$Algorithm$1.buildCodec(Compression.java:130)
      2017-12-10 22:31:09,245|INFO|MainThread|machine.py:164 - run()||GUID=37b565a7-e164-4641-b335-19884c614ffd|at org.apache.hadoop.hbase.io.compress.Compression$Algorithm$1.getCodec(Compression.java:116)
      2017-12-10 22:31:09,245|INFO|MainThread|machine.py:164 - run()||GUID=37b565a7-e164-4641-b335-19884c614ffd|at org.apache.hadoop.hbase.io.compress.Compression$Algorithm.getCompressor(Compression.java:301)
      2017-12-10 22:31:09,245|INFO|MainThread|machine.py:164 - run()||GUID=37b565a7-e164-4641-b335-19884c614ffd|at org.apache.hadoop.hbase.io.encoding.HFileBlockDefaultEncodingContext.<init>(HFileBlockDefaultEncodingContext.java:90)
      2017-12-10 22:31:09,245|INFO|MainThread|machine.py:164 - run()||GUID=37b565a7-e164-4641-b335-19884c614ffd|at org.apache.hadoop.hbase.io.hfile.HFileBlock$Writer.<init>(HFileBlock.java:853)
      2017-12-10 22:31:09,246|INFO|MainThread|machine.py:164 - run()||GUID=37b565a7-e164-4641-b335-19884c614ffd|at org.apache.hadoop.hbase.io.hfile.HFileWriterV2.finishInit(HFileWriterV2.java:121)
      2017-12-10 22:31:09,246|INFO|MainThread|machine.py:164 - run()||GUID=37b565a7-e164-4641-b335-19884c614ffd|at org.apache.hadoop.hbase.io.hfile.HFileWriterV2.<init>(HFileWriterV2.java:113)
      2017-12-10 22:31:09,246|INFO|MainThread|machine.py:164 - run()||GUID=37b565a7-e164-4641-b335-19884c614ffd|at org.apache.hadoop.hbase.io.hfile.HFileWriterV3.<init>(HFileWriterV3.java:67)
      2017-12-10 22:31:09,246|INFO|MainThread|machine.py:164 - run()||GUID=37b565a7-e164-4641-b335-19884c614ffd|at org.apache.hadoop.hbase.io.hfile.HFileWriterV3$WriterFactoryV3.createWriter(HFileWriterV3.java:59)
      2017-12-10 22:31:09,246|INFO|MainThread|machine.py:164 - run()||GUID=37b565a7-e164-4641-b335-19884c614ffd|at org.apache.hadoop.hbase.io.hfile.HFile$WriterFactory.create(HFile.java:325)
      2017-12-10 22:31:09,246|INFO|MainThread|machine.py:164 - run()||GUID=37b565a7-e164-4641-b335-19884c614ffd|at org.apache.hadoop.hbase.util.CompressionTest.doSmokeTest(CompressionTest.java:127)
      2017-12-10 22:31:09,247|INFO|MainThread|machine.py:164 - run()||GUID=37b565a7-e164-4641-b335-19884c614ffd|at org.apache.hadoop.hbase.util.CompressionTest.main(CompressionTest.java:160)
      2017-12-10 22:31:09,247|INFO|MainThread|machine.py:164 - run()||GUID=37b565a7-e164-4641-b335-19884c614ffd|Caused by: java.lang.ClassNotFoundException: com.hadoop.compression.lzo.LzoCodec
      2017-12-10 22:31:09,247|INFO|MainThread|machine.py:164 - run()||GUID=37b565a7-e164-4641-b335-19884c614ffd|at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
      2017-12-10 22:31:09,247|INFO|MainThread|machine.py:164 - run()||GUID=37b565a7-e164-4641-b335-19884c614ffd|at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
      2017-12-10 22:31:09,247|INFO|MainThread|machine.py:164 - run()||GUID=37b565a7-e164-4641-b335-19884c614ffd|at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:331)
      2017-12-10 22:31:09,248|INFO|MainThread|machine.py:164 - run()||GUID=37b565a7-e164-4641-b335-19884c614ffd|at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
      2017-12-10 22:31:09,248|INFO|MainThread|machine.py:164 - run()||GUID=37b565a7-e164-4641-b335-19884c614ffd|at org.apache.hadoop.hbase.io.compress.Compression$Algorithm$1.buildCodec(Compression.java:126)
      2017-12-10 22:31:09,248|INFO|MainThread|machine.py:164 - run()||GUID=37b565a7-e164-4641-b335-19884c614ffd|... 11 more
      2017-12-10 22:31:09,260|INFO|MainThread|machine.py:164 - run()||GUID=37b565a7-e164-4641-b335-19884c614ffd|2017-12-10 22:31:09,258 ERROR [pool-1-thread-1] hdfs.DFSClient: Failed to close inode 18114
      2017-12-10 22:31:09,261|INFO|MainThread|machine.py:164 - run()||GUID=37b565a7-e164-4641-b335-19884c614ffd|org.apache.hadoop.ipc.RemoteException(org.apache.hadoop.hdfs.server.namenode.LeaseExpiredException): No lease on /user/hrt_qa/CompressionInput.txt (inode 18114): File does not exist. Holder DFSClient_NONMAPREDUCE_906196463_1 does not have any open files.
      2017-12-10 22:31:09,261|INFO|MainThread|machine.py:164 - run()||GUID=37b565a7-e164-4641-b335-19884c614ffd|at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.checkLease(FSNamesystem.java:3693)
      2017-12-10 22:31:09,261|INFO|MainThread|machine.py:164 - run()||GUID=37b565a7-e164-4641-b335-19884c614ffd|at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.completeFileInternal(FSNamesystem.java:3781)
      2017-12-10 22:31:09,261|INFO|MainThread|machine.py:164 - run()||GUID=37b565a7-e164-4641-b335-19884c614ffd|at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.completeFile(FSNamesystem.java:3748)
      2017-12-10 22:31:09,261|INFO|MainThread|machine.py:164 - run()||GUID=37b565a7-e164-4641-b335-19884c614ffd|at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.complete(NameNodeRpcServer.java:912)
      2017-12-10 22:31:09,262|INFO|MainThread|machine.py:164 - run()||GUID=37b565a7-e164-4641-b335-19884c614ffd|at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.complete(ClientNamenodeProtocolServerSideTranslatorPB.java:549)
      2017-12-10 22:31:09,262|INFO|MainThread|machine.py:164 - run()||GUID=37b565a7-e164-4641-b335-19884c614ffd|at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
      2017-12-10 22:31:09,262|INFO|MainThread|machine.py:164 - run()||GUID=37b565a7-e164-4641-b335-19884c614ffd|at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:640)
      2017-12-10 22:31:09,262|INFO|MainThread|machine.py:164 - run()||GUID=37b565a7-e164-4641-b335-19884c614ffd|at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:982)
      2017-12-10 22:31:09,262|INFO|MainThread|machine.py:164 - run()||GUID=37b565a7-e164-4641-b335-19884c614ffd|at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2351)
      2017-12-10 22:31:09,262|INFO|MainThread|machine.py:164 - run()||GUID=37b565a7-e164-4641-b335-19884c614ffd|at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2347)
      2017-12-10 22:31:09,263|INFO|MainThread|machine.py:164 - run()||GUID=37b565a7-e164-4641-b335-19884c614ffd|at java.security.AccessController.doPrivileged(Native Method)
      2017-12-10 22:31:09,263|INFO|MainThread|machine.py:164 - run()||GUID=37b565a7-e164-4641-b335-19884c614ffd|at javax.security.auth.Subject.doAs(Subject.java:422)
      2017-12-10 22:31:09,263|INFO|MainThread|machine.py:164 - run()||GUID=37b565a7-e164-4641-b335-19884c614ffd|at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1866)
      2017-12-10 22:31:09,263|INFO|MainThread|machine.py:164 - run()||GUID=37b565a7-e164-4641-b335-19884c614ffd|at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2347)
      2017-12-10 22:31:09,263|INFO|MainThread|machine.py:164 - run()||GUID=37b565a7-e164-4641-b335-19884c614ffd|
      2017-12-10 22:31:09,263|INFO|MainThread|machine.py:164 - run()||GUID=37b565a7-e164-4641-b335-19884c614ffd|at org.apache.hadoop.ipc.Client.getRpcResponse(Client.java:1554)
      2017-12-10 22:31:09,264|INFO|MainThread|machine.py:164 - run()||GUID=37b565a7-e164-4641-b335-19884c614ffd|at org.apache.hadoop.ipc.Client.call(Client.java:1498)
      2017-12-10 22:31:09,264|INFO|MainThread|machine.py:164 - run()||GUID=37b565a7-e164-4641-b335-19884c614ffd|at org.apache.hadoop.ipc.Client.call(Client.java:1398)
      2017-12-10 22:31:09,264|INFO|MainThread|machine.py:164 - run()||GUID=37b565a7-e164-4641-b335-19884c614ffd|at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:233)
      2017-12-10 22:31:09,264|INFO|MainThread|machine.py:164 - run()||GUID=37b565a7-e164-4641-b335-19884c614ffd|at com.sun.proxy.$Proxy10.complete(Unknown Source)
      2017-12-10 22:31:09,264|INFO|MainThread|machine.py:164 - run()||GUID=37b565a7-e164-4641-b335-19884c614ffd|at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.complete(ClientNamenodeProtocolTranslatorPB.java:503)
      2017-12-10 22:31:09,265|INFO|MainThread|machine.py:164 - run()||GUID=37b565a7-e164-4641-b335-19884c614ffd|at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
      2017-12-10 22:31:09,265|INFO|MainThread|machine.py:164 - run()||GUID=37b565a7-e164-4641-b335-19884c614ffd|at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
      2017-12-10 22:31:09,265|INFO|MainThread|machine.py:164 - run()||GUID=37b565a7-e164-4641-b335-19884c614ffd|at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
      2017-12-10 22:31:09,265|INFO|MainThread|machine.py:164 - run()||GUID=37b565a7-e164-4641-b335-19884c614ffd|at java.lang.reflect.Method.invoke(Method.java:498)
      2017-12-10 22:31:09,265|INFO|MainThread|machine.py:164 - run()||GUID=37b565a7-e164-4641-b335-19884c614ffd|at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:291)
      2017-12-10 22:31:09,266|INFO|MainThread|machine.py:164 - run()||GUID=37b565a7-e164-4641-b335-19884c614ffd|at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:203)
      2017-12-10 22:31:09,266|INFO|MainThread|machine.py:164 - run()||GUID=37b565a7-e164-4641-b335-19884c614ffd|at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:185)
      2017-12-10 22:31:09,266|INFO|MainThread|machine.py:164 - run()||GUID=37b565a7-e164-4641-b335-19884c614ffd|at com.sun.proxy.$Proxy11.complete(Unknown Source)
      2017-12-10 22:31:09,266|INFO|MainThread|machine.py:164 - run()||GUID=37b565a7-e164-4641-b335-19884c614ffd|at org.apache.hadoop.hdfs.DFSOutputStream.completeFile(DFSOutputStream.java:2496)
      2017-12-10 22:31:09,266|INFO|MainThread|machine.py:164 - run()||GUID=37b565a7-e164-4641-b335-19884c614ffd|at org.apache.hadoop.hdfs.DFSOutputStream.closeImpl(DFSOutputStream.java:2472)
      2017-12-10 22:31:09,266|INFO|MainThread|machine.py:164 - run()||GUID=37b565a7-e164-4641-b335-19884c614ffd|at org.apache.hadoop.hdfs.DFSOutputStream.close(DFSOutputStream.java:2437)
      2017-12-10 22:31:09,267|INFO|MainThread|machine.py:164 - run()||GUID=37b565a7-e164-4641-b335-19884c614ffd|at org.apache.hadoop.hdfs.DFSClient.closeAllFilesBeingWritten(DFSClient.java:949)
      2017-12-10 22:31:09,267|INFO|MainThread|machine.py:164 - run()||GUID=37b565a7-e164-4641-b335-19884c614ffd|at org.apache.hadoop.hdfs.DFSClient.closeOutputStreams(DFSClient.java:981)
      2017-12-10 22:31:09,267|INFO|MainThread|machine.py:164 - run()||GUID=37b565a7-e164-4641-b335-19884c614ffd|at org.apache.hadoop.hdfs.DistributedFileSystem.close(DistributedFileSystem.java:1211)
      2017-12-10 22:31:09,267|INFO|MainThread|machine.py:164 - run()||GUID=37b565a7-e164-4641-b335-19884c614ffd|at org.apache.hadoop.fs.FileSystem$Cache.closeAll(FileSystem.java:2886)
      2017-12-10 22:31:09,267|INFO|MainThread|machine.py:164 - run()||GUID=37b565a7-e164-4641-b335-19884c614ffd|at org.apache.hadoop.fs.FileSystem$Cache$ClientFinalizer.run(FileSystem.java:2903)
      2017-12-10 22:31:09,267|INFO|MainThread|machine.py:164 - run()||GUID=37b565a7-e164-4641-b335-19884c614ffd|at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
      2017-12-10 22:31:09,268|INFO|MainThread|machine.py:164 - run()||GUID=37b565a7-e164-4641-b335-19884c614ffd|at java.util.concurrent.FutureTask.run(FutureTask.java:266)
      2017-12-10 22:31:09,268|INFO|MainThread|machine.py:164 - run()||GUID=37b565a7-e164-4641-b335-19884c614ffd|at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
      2017-12-10 22:31:09,268|INFO|MainThread|machine.py:164 - run()||GUID=37b565a7-e164-4641-b335-19884c614ffd|at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
      2017-12-10 22:31:09,268|INFO|MainThread|machine.py:164 - run()||GUID=37b565a7-e164-4641-b335-19884c614ffd|at java.lang.Thread.run(Thread.java:745)
      2017-12-10 22:31:09,741|INFO|MainThread|machine.py:189 - run()||GUID=37b565a7-e164-4641-b335-19884c614ffd|Exit Code: 1
      2017-12-10 22:31:19,776|INFO|MainThread|conftest.py:242 - pytest_report_teststatus()|TEST "test_CompressionTool[lzo]" FAILED in 13.87 seconds
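
      The trace above comes from HBase's compression smoke test (org.apache.hadoop.hbase.util.CompressionTest). As a minimal standalone sketch, not part of the attached Ambari patch, the check below simply tries to load the codec class named in the ClassNotFoundException; if it is run with the classpath the patched HBase resolves, it fails the same way whenever the hadoop-lzo jar is no longer on that classpath.

        // Hypothetical helper for verification only: attempts to load the LZO codec
        // class from the current classpath and reports which jar provided it.
        public class LzoClasspathCheck {
            public static void main(String[] args) {
                String codecClass = "com.hadoop.compression.lzo.LzoCodec";
                try {
                    Class<?> codec = Class.forName(codecClass);
                    // Print where the class was loaded from, which points at the jar on disk.
                    Object location = codec.getProtectionDomain().getCodeSource() == null
                            ? "<bootstrap classpath>"
                            : codec.getProtectionDomain().getCodeSource().getLocation();
                    System.out.println("Found " + codecClass + " in " + location);
                } catch (ClassNotFoundException e) {
                    // Same failure mode as the smoke test in the log above.
                    System.err.println(codecClass + " is not on the classpath: " + e);
                    System.exit(1);
                }
            }
        }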
      

      Attachments

        1. AMBARI-22640.patch (2 kB), Jonathan Hurley


            People

              Assignee: Jonathan Hurley
              Reporter: Jonathan Hurley
              Votes: 0
              Watchers: 2
