HBase / HBASE-27754

[HBCK2] generateMissingTableDescriptorFile should throw a write permission error and fail



    Description

Run hbck2 generateMissingTableDescriptorFile as a user that does not have permission to write to HDFS.
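
For reference, the command can be invoked like this (the jar path and table name below are placeholders; adjust them to the installed hbase-operator-tools build):

    hbase hbck -j /path/to/hbase-hbck2.jar generateMissingTableDescriptorFile <table-name>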

Actual
The tool completes with a success message, even though it does not actually generate/write the table descriptor file, because the user lacks write permission.

Expected
The tool should fail with a permission error and must not log the success message 'Table descriptor written successfully. Orphan table xxxx fixed.'

Debug dump

With debug logging enabled, the incorrect behaviour is visible: the HDFS create fails with 'Permission denied', the cleanup of the temporary .tableinfo file fails as well, yet the tool still logs success.

      2023-03-24T19:03:16,890 DEBUG [IPC Parameter Sending Thread #0] ipc.Client: IPC Client (199657303) connection to hostname/ip_address:port_num from root sending #31 org.apache.hadoop.hdfs.protocol.ClientProtocol.getFileInfo
      2023-03-24T19:03:16,893 DEBUG [IPC Client (199657303) connection to hostname/ip_address:port_num from root] ipc.Client: IPC Client (199657303) connection to hostname/ip_address:port_num from root got value #31
      2023-03-24T19:03:16,894 DEBUG [main] ipc.ProtobufRpcEngine: Call: getFileInfo took 4ms
      2023-03-24T19:03:16,894 DEBUG [main] hdfs.DFSClient: /apps/hbase/data/data/default/ittable-2090120905/.tmp/.tableinfo.0000000010: masked={ masked: rw-r--r--, unmasked: rw-rw-rw- }
      2023-03-24T19:03:16,895 DEBUG [IPC Parameter Sending Thread #0] ipc.Client: IPC Client (199657303) connection to hostname/ip_address:port_num from root sending #32 org.apache.hadoop.hdfs.protocol.ClientProtocol.create
      2023-03-24T19:03:16,897 DEBUG [IPC Client (199657303) connection to hostname/ip_address:port_num from root] ipc.Client: IPC Client (199657303) connection to hostname/ip_address:port_num from root got value #32
      2023-03-24T19:03:16,898 DEBUG [main] retry.RetryInvocationHandler: Exception while invoking call #32 ClientNamenodeProtocolTranslatorPB.create over null. Not retrying because try once and fail.
      org.apache.hadoop.ipc.RemoteException: Permission denied: user=root, access=WRITE, inode="/apps/hbase/data/data/default/ittable-2090120905/.tmp":hdfs:hdfs:drwxr-xr-x
      	at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.check(FSPermissionChecker.java:399)
      	at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:255)
      	at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:193)
      	at org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkPermission(FSDirectory.java:1896)
      	at org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkPermission(FSDirectory.java:1880)
      	at org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkAncestorAccess(FSDirectory.java:1839)
      	at org.apache.hadoop.hdfs.server.namenode.FSDirWriteFileOp.resolvePathForStartFile(FSDirWriteFileOp.java:323)
      	at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.startFileInt(FSNamesystem.java:2513)
      	at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.startFile(FSNamesystem.java:2457)
      	at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.create(NameNodeRpcServer.java:791)
      	at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.create(ClientNamenodeProtocolServerSideTranslatorPB.java:478)
      	at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
      	at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:528)
      	at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:1086)
      	at org.apache.hadoop.ipc.Server$RpcCall.run(Server.java:1031)
      	at org.apache.hadoop.ipc.Server$RpcCall.run(Server.java:959)
      	at java.security.AccessController.doPrivileged(Native Method)
      	at javax.security.auth.Subject.doAs(Subject.java:422)
      	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1762)
      	at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2963)
      
      	at org.apache.hadoop.ipc.Client.getRpcResponse(Client.java:1587) ~[hadoop-common-hadoop_version.jar:?]
      	at org.apache.hadoop.ipc.Client.call(Client.java:1533) ~[hadoop-common-hadoop_version.jar:?]
      	at org.apache.hadoop.ipc.Client.call(Client.java:1430) ~[hadoop-common-hadoop_version.jar:?]
      	at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:233) ~[hadoop-common-hadoop_version.jar:?]
      	at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:118) ~[hadoop-common-hadoop_version.jar:?]
      	at com.sun.proxy.$Proxy25.create(Unknown Source) ~[?:?]
      	at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.create(ClientNamenodeProtocolTranslatorPB.java:372) ~[hadoop-hdfs-client-hadoop_version.jar:?]
      	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[?:java_version]
      	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) ~[?:java_version]
      	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[?:java_version]
      	at java.lang.reflect.Method.invoke(Method.java:498) ~[?:java_version]
      	at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:422) ~[hadoop-common-hadoop_version.jar:?]
      	at org.apache.hadoop.io.retry.RetryInvocationHandler$Call.invokeMethod(RetryInvocationHandler.java:165) ~[hadoop-common-hadoop_version.jar:?]
      	at org.apache.hadoop.io.retry.RetryInvocationHandler$Call.invoke(RetryInvocationHandler.java:157) ~[hadoop-common-hadoop_version.jar:?]
      	at org.apache.hadoop.io.retry.RetryInvocationHandler$Call.invokeOnce(RetryInvocationHandler.java:95) ~[hadoop-common-hadoop_version.jar:?]
      	at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:359) ~[hadoop-common-hadoop_version.jar:?]
      	at com.sun.proxy.$Proxy26.create(Unknown Source) ~[?:?]
      	at org.apache.hadoop.hdfs.DFSOutputStream.newStreamForCreate(DFSOutputStream.java:276) ~[hadoop-hdfs-client-hadoop_version.jar:?]
      	at org.apache.hadoop.hdfs.DFSClient.create(DFSClient.java:1222) ~[hadoop-hdfs-client-hadoop_version.jar:?]
      	at org.apache.hadoop.hdfs.DFSClient.create(DFSClient.java:1201) ~[hadoop-hdfs-client-hadoop_version.jar:?]
      	at org.apache.hadoop.hdfs.DFSClient.create(DFSClient.java:1139) ~[hadoop-hdfs-client-hadoop_version.jar:?]
      	at org.apache.hadoop.hdfs.DistributedFileSystem$8.doCall(DistributedFileSystem.java:534) ~[hadoop-hdfs-client-hadoop_version.jar:?]
      	at org.apache.hadoop.hdfs.DistributedFileSystem$8.doCall(DistributedFileSystem.java:531) ~[hadoop-hdfs-client-hadoop_version.jar:?]
      	at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81) ~[hadoop-common-hadoop_version.jar:?]
      	at org.apache.hadoop.hdfs.DistributedFileSystem.create(DistributedFileSystem.java:545) ~[hadoop-hdfs-client-hadoop_version.jar:?]
      	at org.apache.hadoop.hdfs.DistributedFileSystem.create(DistributedFileSystem.java:472) ~[hadoop-hdfs-client-hadoop_version.jar:?]
      	at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:1125) ~[hadoop-common-hadoop_version.jar:?]
      	at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:1105) ~[hadoop-common-hadoop_version.jar:?]
      	at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:994) ~[hadoop-common-hadoop_version.jar:?]
      	at org.apache.hbase.HBCKFsTableDescriptors.writeTD(HBCKFsTableDescriptors.java:391) ~[hbase-hbck2-hbase_op_tools_version.jar:hbase_op_tools_version]
      	at org.apache.hbase.HBCKFsTableDescriptors.writeTableDescriptor(HBCKFsTableDescriptors.java:365) ~[hbase-hbck2-hbase_op_tools_version.jar:hbase_op_tools_version]
      	at org.apache.hbase.HBCKFsTableDescriptors.createTableDescriptorForTableDirectory(HBCKFsTableDescriptors.java:439) ~[hbase-hbck2-hbase_op_tools_version.jar:hbase_op_tools_version]
      	at org.apache.hbase.HBCKFsTableDescriptors.createTableDescriptor(HBCKFsTableDescriptors.java:411) ~[hbase-hbck2-hbase_op_tools_version.jar:hbase_op_tools_version]
      	at org.apache.hbase.MissingTableDescriptorGenerator.generateTableDescriptorFileIfMissing(MissingTableDescriptorGenerator.java:93) ~[hbase-hbck2-hbase_op_tools_version.jar:hbase_op_tools_version]
      	at org.apache.hbase.HBCK2.doCommandLine(HBCK2.java:1034) ~[hbase-hbck2-hbase_op_tools_version.jar:hbase_op_tools_version]
      	at org.apache.hbase.HBCK2.run(HBCK2.java:830) ~[hbase-hbck2-hbase_op_tools_version.jar:hbase_op_tools_version]
      	at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:76) ~[hadoop-common-hadoop_version.jar:?]
      	at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:90) ~[hadoop-common-hadoop_version.jar:?]
      	at org.apache.hbase.HBCK2.main(HBCK2.java:1145) ~[hbase-hbck2-hbase_op_tools_version.jar:hbase_op_tools_version]
      2023-03-24T19:03:16,902 DEBUG [main] hbase.HBCKFsTableDescriptors: Failed write and/or rename; retrying
      org.apache.hadoop.security.AccessControlException: Permission denied: user=root, access=WRITE, inode="/apps/hbase/data/data/default/ittable-2090120905/.tmp":hdfs:hdfs:drwxr-xr-x
      	at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.check(FSPermissionChecker.java:399)
      	at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:255)
      	at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:193)
      	at org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkPermission(FSDirectory.java:1896)
      	at org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkPermission(FSDirectory.java:1880)
      	at org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkAncestorAccess(FSDirectory.java:1839)
      	at org.apache.hadoop.hdfs.server.namenode.FSDirWriteFileOp.resolvePathForStartFile(FSDirWriteFileOp.java:323)
      	at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.startFileInt(FSNamesystem.java:2513)
      	at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.startFile(FSNamesystem.java:2457)
      	at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.create(NameNodeRpcServer.java:791)
      	at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.create(ClientNamenodeProtocolServerSideTranslatorPB.java:478)
      	at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
      	at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:528)
      	at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:1086)
      	at org.apache.hadoop.ipc.Server$RpcCall.run(Server.java:1031)
      	at org.apache.hadoop.ipc.Server$RpcCall.run(Server.java:959)
      	at java.security.AccessController.doPrivileged(Native Method)
      	at javax.security.auth.Subject.doAs(Subject.java:422)
      	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1762)
      	at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2963)
      
      	at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method) ~[?:java_version]
      	at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62) ~[?:java_version]
      	at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45) ~[?:java_version]
      	at java.lang.reflect.Constructor.newInstance(Constructor.java:423) ~[?:java_version]
      	at org.apache.hadoop.ipc.RemoteException.instantiateException(RemoteException.java:121) ~[hadoop-common-hadoop_version.jar:?]
      	at org.apache.hadoop.ipc.RemoteException.unwrapRemoteException(RemoteException.java:88) ~[hadoop-common-hadoop_version.jar:?]
      	at org.apache.hadoop.hdfs.DFSOutputStream.newStreamForCreate(DFSOutputStream.java:281) ~[hadoop-hdfs-client-hadoop_version.jar:?]
      	at org.apache.hadoop.hdfs.DFSClient.create(DFSClient.java:1222) ~[hadoop-hdfs-client-hadoop_version.jar:?]
      	at org.apache.hadoop.hdfs.DFSClient.create(DFSClient.java:1201) ~[hadoop-hdfs-client-hadoop_version.jar:?]
      	at org.apache.hadoop.hdfs.DFSClient.create(DFSClient.java:1139) ~[hadoop-hdfs-client-hadoop_version.jar:?]
      	at org.apache.hadoop.hdfs.DistributedFileSystem$8.doCall(DistributedFileSystem.java:534) ~[hadoop-hdfs-client-hadoop_version.jar:?]
      	at org.apache.hadoop.hdfs.DistributedFileSystem$8.doCall(DistributedFileSystem.java:531) ~[hadoop-hdfs-client-hadoop_version.jar:?]
      	at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81) ~[hadoop-common-hadoop_version.jar:?]
      	at org.apache.hadoop.hdfs.DistributedFileSystem.create(DistributedFileSystem.java:545) ~[hadoop-hdfs-client-hadoop_version.jar:?]
      	at org.apache.hadoop.hdfs.DistributedFileSystem.create(DistributedFileSystem.java:472) ~[hadoop-hdfs-client-hadoop_version.jar:?]
      	at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:1125) ~[hadoop-common-hadoop_version.jar:?]
      	at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:1105) ~[hadoop-common-hadoop_version.jar:?]
      	at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:994) ~[hadoop-common-hadoop_version.jar:?]
      	at org.apache.hbase.HBCKFsTableDescriptors.writeTD(HBCKFsTableDescriptors.java:391) ~[hbase-hbck2-hbase_op_tools_version.jar:hbase_op_tools_version]
      	at org.apache.hbase.HBCKFsTableDescriptors.writeTableDescriptor(HBCKFsTableDescriptors.java:365) ~[hbase-hbck2-hbase_op_tools_version.jar:hbase_op_tools_version]
      	at org.apache.hbase.HBCKFsTableDescriptors.createTableDescriptorForTableDirectory(HBCKFsTableDescriptors.java:439) ~[hbase-hbck2-hbase_op_tools_version.jar:hbase_op_tools_version]
      	at org.apache.hbase.HBCKFsTableDescriptors.createTableDescriptor(HBCKFsTableDescriptors.java:411) ~[hbase-hbck2-hbase_op_tools_version.jar:hbase_op_tools_version]
      	at org.apache.hbase.MissingTableDescriptorGenerator.generateTableDescriptorFileIfMissing(MissingTableDescriptorGenerator.java:93) ~[hbase-hbck2-hbase_op_tools_version.jar:hbase_op_tools_version]
      	at org.apache.hbase.HBCK2.doCommandLine(HBCK2.java:1034) ~[hbase-hbck2-hbase_op_tools_version.jar:hbase_op_tools_version]
      	at org.apache.hbase.HBCK2.run(HBCK2.java:830) ~[hbase-hbck2-hbase_op_tools_version.jar:hbase_op_tools_version]
      	at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:76) ~[hadoop-common-hadoop_version.jar:?]
      	at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:90) ~[hadoop-common-hadoop_version.jar:?]
      	at org.apache.hbase.HBCK2.main(HBCK2.java:1145) ~[hbase-hbck2-hbase_op_tools_version.jar:hbase_op_tools_version]
      Caused by: org.apache.hadoop.ipc.RemoteException: Permission denied: user=root, access=WRITE, inode="/apps/hbase/data/data/default/ittable-2090120905/.tmp":hdfs:hdfs:drwxr-xr-x
      	at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.check(FSPermissionChecker.java:399)
      	at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:255)
      	at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:193)
      	at org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkPermission(FSDirectory.java:1896)
      	at org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkPermission(FSDirectory.java:1880)
      	at org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkAncestorAccess(FSDirectory.java:1839)
      	at org.apache.hadoop.hdfs.server.namenode.FSDirWriteFileOp.resolvePathForStartFile(FSDirWriteFileOp.java:323)
      	at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.startFileInt(FSNamesystem.java:2513)
      	at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.startFile(FSNamesystem.java:2457)
      	at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.create(NameNodeRpcServer.java:791)
      	at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.create(ClientNamenodeProtocolServerSideTranslatorPB.java:478)
      	at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
      	at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:528)
      	at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:1086)
      	at org.apache.hadoop.ipc.Server$RpcCall.run(Server.java:1031)
      	at org.apache.hadoop.ipc.Server$RpcCall.run(Server.java:959)
      	at java.security.AccessController.doPrivileged(Native Method)
      	at javax.security.auth.Subject.doAs(Subject.java:422)
      	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1762)
      	at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2963)
      
      	at org.apache.hadoop.ipc.Client.getRpcResponse(Client.java:1587) ~[hadoop-common-hadoop_version.jar:?]
      	at org.apache.hadoop.ipc.Client.call(Client.java:1533) ~[hadoop-common-hadoop_version.jar:?]
      	at org.apache.hadoop.ipc.Client.call(Client.java:1430) ~[hadoop-common-hadoop_version.jar:?]
      	at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:233) ~[hadoop-common-hadoop_version.jar:?]
      	at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:118) ~[hadoop-common-hadoop_version.jar:?]
      	at com.sun.proxy.$Proxy25.create(Unknown Source) ~[?:?]
      	at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.create(ClientNamenodeProtocolTranslatorPB.java:372) ~[hadoop-hdfs-client-hadoop_version.jar:?]
      	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[?:java_version]
      	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) ~[?:java_version]
      	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[?:java_version]
      	at java.lang.reflect.Method.invoke(Method.java:498) ~[?:java_version]
      	at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:422) ~[hadoop-common-hadoop_version.jar:?]
      	at org.apache.hadoop.io.retry.RetryInvocationHandler$Call.invokeMethod(RetryInvocationHandler.java:165) ~[hadoop-common-hadoop_version.jar:?]
      	at org.apache.hadoop.io.retry.RetryInvocationHandler$Call.invoke(RetryInvocationHandler.java:157) ~[hadoop-common-hadoop_version.jar:?]
      	at org.apache.hadoop.io.retry.RetryInvocationHandler$Call.invokeOnce(RetryInvocationHandler.java:95) ~[hadoop-common-hadoop_version.jar:?]
      	at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:359) ~[hadoop-common-hadoop_version.jar:?]
      	at com.sun.proxy.$Proxy26.create(Unknown Source) ~[?:?]
      	at org.apache.hadoop.hdfs.DFSOutputStream.newStreamForCreate(DFSOutputStream.java:276) ~[hadoop-hdfs-client-hadoop_version.jar:?]
      	... 21 more
      2023-03-24T19:03:16,907 DEBUG [IPC Parameter Sending Thread #0] ipc.Client: IPC Client (199657303) connection to hostname/ip_address:port_num from root sending #33 org.apache.hadoop.hdfs.protocol.ClientProtocol.getFileInfo
      2023-03-24T19:03:16,908 DEBUG [IPC Client (199657303) connection to hostname/ip_address:port_num from root] ipc.Client: IPC Client (199657303) connection to hostname/ip_address:port_num from root got value #33
      2023-03-24T19:03:16,908 DEBUG [main] ipc.ProtobufRpcEngine: Call: getFileInfo took 2ms
      2023-03-24T19:03:16,908 WARN  [main] hbase.HBCKFsTableDescriptors: Failed cleanup of hdfs://hostname:port_num/apps/hbase/data/data/default/ittable-2090120905/.tmp/.tableinfo.0000000010
      2023-03-24T19:03:16,909 INFO  [main] hbase.MissingTableDescriptorGenerator: Table descriptor written successfully. Orphan table ittable-2090120905 fixed.
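
The dump above suggests the mechanism: writeTD fails with an AccessControlException, the retry loop behind writeTableDescriptor logs "Failed write and/or rename; retrying" until it gives up, and generateTableDescriptorFileIfMissing then logs success without checking whether a descriptor file was actually written. Below is a minimal Java sketch of that shape and of the obvious fix; the class, the retry count, and all method bodies are hypothetical simplifications for illustration, not the actual hbase-operator-tools source (only the method names are taken from the stack trace above).

import java.io.IOException;

// Hypothetical, simplified illustration of the pattern visible in the debug
// dump above; NOT the actual HBCKFsTableDescriptors source.
public class DescriptorWriteSketch {

  private static final int MAX_ATTEMPTS = 10; // assumed retry budget

  // Buggy shape: every IOException (including the HDFS
  // AccessControlException) is caught and retried; once the attempts are
  // exhausted the method returns null instead of propagating the failure.
  static String writeTableDescriptor() {
    for (int attempt = 0; attempt < MAX_ATTEMPTS; attempt++) {
      try {
        return writeTD(); // stand-in for FileSystem.create(...) plus the write
      } catch (IOException ioe) {
        System.out.println("DEBUG Failed write and/or rename; retrying");
      }
    }
    return null; // the failure silently becomes "no result"
  }

  // Caller ignores the result, so it reports success unconditionally.
  static void generateTableDescriptorFileIfMissing() {
    writeTableDescriptor();
    System.out.println(
        "INFO Table descriptor written successfully. Orphan table fixed.");
  }

  // Fixed shape: surface the failure, as the issue title asks.
  static void generateTableDescriptorFileIfMissingFixed() throws IOException {
    if (writeTableDescriptor() == null) {
      throw new IOException(
          "Failed to write table descriptor after " + MAX_ATTEMPTS + " attempts");
    }
    System.out.println(
        "INFO Table descriptor written successfully. Orphan table fixed.");
  }

  private static String writeTD() throws IOException {
    // Simulates the HDFS create that is denied when the user lacks WRITE access.
    throw new IOException("Permission denied: user=root, access=WRITE");
  }

  public static void main(String[] args) {
    generateTableDescriptorFileIfMissing(); // prints "success" despite ten failures
  }
}

With this shape, a caller that never inspects the return value cannot distinguish a successful write from repeated swallowed AccessControlExceptions, which matches the "Failed cleanup ... / Table descriptor written successfully" sequence at the end of the dump.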
      

       

People

Assignee: Nihal Jain (nihaljain.cs)
Reporter: Nihal Jain (nihaljain.cs)