
ATLAS-335: Kerberized cluster: Atlas fails to come up with HBase as backend


Details

    • Type: Bug
    • Status: Resolved
    • Priority: Blocker
    • Resolution: Fixed
    • Affects Version/s: 0.6-incubating
    • Fix Version/s: 0.6-incubating
    • Component/s: None
    • Labels: None

    Description

      With a secure cluster deployed using Ambari, I tried following the steps mentioned in the doc below (setting HBase as the storage backend), and it looks like Atlas fails to come up with a GSSException.
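
      For context, this is roughly what the Titan HBase storage backend does when Atlas starts: it builds an HBase client configuration and opens a client connection, which means the Atlas process itself must hold valid Kerberos credentials at that point. A minimal sketch; the class name and ZooKeeper hosts are placeholders, not taken from this cluster:

      import org.apache.hadoop.conf.Configuration;
      import org.apache.hadoop.hbase.HBaseConfiguration;
      import org.apache.hadoop.hbase.client.Connection;
      import org.apache.hadoop.hbase.client.ConnectionFactory;

      public class HBaseBackendConnect {
          public static void main(String[] args) throws Exception {
              // Build the HBase client configuration; normally this comes from
              // the hbase-site.xml that Ambari places on the Atlas host.
              Configuration conf = HBaseConfiguration.create();
              conf.set("hbase.zookeeper.quorum", "zk-1,zk-2,zk-3"); // placeholder hosts
              // On a kerberized cluster the client must use SASL/Kerberos.
              conf.set("hbase.security.authentication", "kerberos");
              // This is the point where the process needs a valid Kerberos TGT;
              // without one, the connection fails with the GSSException shown below.
              Connection connection = ConnectionFactory.createConnection(conf);
              connection.close();
          }
      }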

      From the logs below, it looks like the "kinit" (authentication) step is not done by Ambari. Isn't this supposed to be handled by Ambari?

      2015-11-23 11:22:14,100 WARN  - [hconnection-0x1b969687-shared--pool1-t1:] ~ Exception encountered while connecting to the server : javax.security.sasl.SaslException: GSS initiate failed [Caused by GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)] (AbstractRpcClient:699)
      2015-11-23 11:22:14,100 FATAL - [hconnection-0x1b969687-shared--pool1-t1:] ~ SASL authentication failed. The most likely cause is missing or invalid credentials. Consider 'kinit'. (AbstractRpcClient:709)
      javax.security.sasl.SaslException: GSS initiate failed [Caused by GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)]
              at com.sun.security.sasl.gsskerb.GssKrb5Client.evaluateChallenge(GssKrb5Client.java:212)
              at org.apache.hadoop.hbase.security.HBaseSaslRpcClient.saslConnect(HBaseSaslRpcClient.java:179)
              at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection.setupSaslConnection(RpcClientImpl.java:642)
              at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection.access$600(RpcClientImpl.java:166)
              at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection$2.run(RpcClientImpl.java:769)
              at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection$2.run(RpcClientImpl.java:766)
              at java.security.AccessController.doPrivileged(Native Method)
              at javax.security.auth.Subject.doAs(Subject.java:415)
              at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1657)
              at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection.setupIOstreams(RpcClientImpl.java:766)
              at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection.writeRequest(RpcClientImpl.java:920)
              at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection.tracedWriteRequest(RpcClientImpl.java:889)
              at org.apache.hadoop.hbase.ipc.RpcClientImpl.call(RpcClientImpl.java:1222)
              at org.apache.hadoop.hbase.ipc.AbstractRpcClient.callBlockingMethod(AbstractRpcClient.java:213)
              at org.apache.hadoop.hbase.ipc.AbstractRpcClient$BlockingRpcChannelImplementation.callBlockingMethod(AbstractRpcClient.java:287)
              at org.apache.hadoop.hbase.protobuf.generated.ClientProtos$ClientService$BlockingStub.scan(ClientProtos.java:32651)
              at org.apache.hadoop.hbase.client.ScannerCallable.openScanner(ScannerCallable.java:372)
              at org.apache.hadoop.hbase.client.ScannerCallable.call(ScannerCallable.java:199)
              at org.apache.hadoop.hbase.client.ScannerCallable.call(ScannerCallable.java:62)
              at org.apache.hadoop.hbase.client.RpcRetryingCaller.callWithoutRetries(RpcRetryingCaller.java:200)
              at org.apache.hadoop.hbase.client.ScannerCallableWithReplicas$RetryingRPC.call(ScannerCallableWithReplicas.java:346)
              at org.apache.hadoop.hbase.client.ScannerCallableWithReplicas$RetryingRPC.call(ScannerCallableWithReplicas.java:320)
              at org.apache.hadoop.hbase.client.RpcRetryingCaller.callWithRetries(RpcRetryingCaller.java:126)
              at org.apache.hadoop.hbase.client.ResultBoundedCompletionService$QueueingFuture.run(ResultBoundedCompletionService.java:64)
              at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
              at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
              at java.lang.Thread.run(Thread.java:745)
      Caused by: GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)
              at sun.security.jgss.krb5.Krb5InitCredential.getInstance(Krb5InitCredential.java:147)
              at sun.security.jgss.krb5.Krb5MechFactory.getCredentialElement(Krb5MechFactory.java:121)
              at sun.security.jgss.krb5.Krb5MechFactory.getMechanismContext(Krb5MechFactory.java:187)
              at sun.security.jgss.GSSManagerImpl.getMechanismContext(GSSManagerImpl.java:223)
              at sun.security.jgss.GSSContextImpl.initSecContext(GSSContextImpl.java:212)
              at sun.security.jgss.GSSContextImpl.initSecContext(GSSContextImpl.java:179)
              at com.sun.security.sasl.gsskerb.GssKrb5Client.evaluateChallenge(GssKrb5Client.java:193)
              ... 26 more
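
      The "Failed to find any Kerberos tgt" above means the Atlas process had no ticket when the HBase client opened its SASL connection. For a long-running service, the usual approach is to log in from the service keytab programmatically rather than rely on an interactive kinit, whose ticket eventually expires. A minimal sketch using Hadoop's UserGroupInformation, with the principal and keytab path taken from the kinit command below (whether Atlas should do this itself or Ambari should arrange it is exactly the question raised here):

      import org.apache.hadoop.conf.Configuration;
      import org.apache.hadoop.security.UserGroupInformation;

      public class AtlasKerberosLogin {
          public static void main(String[] args) throws Exception {
              // Tell the Hadoop security layer that Kerberos is in use.
              Configuration conf = new Configuration();
              conf.set("hadoop.security.authentication", "kerberos");
              UserGroupInformation.setConfiguration(conf);

              // Log in from the service keytab; this is the programmatic
              // equivalent of the manual kinit shown below.
              UserGroupInformation.loginUserFromKeytab(
                      "atlas/os-u14-testing-1-atlas-1.novalocal@HWQE.HORTONWORKS.COM",
                      "/etc/security/keytabs/atlas.service.keytab");

              // A long-running daemon should also re-login before the TGT
              // expires; a manual kinit only lasts for the ticket lifetime.
              UserGroupInformation.getLoginUser().checkTGTAndReloginFromKeytab();
          }
      }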
      

      I tried this "kinit" step manually through the command line as user "atlas":
      kinit -k -t /etc/security/keytabs/atlas.service.keytab atlas/os-u14-testing-1-atlas-1.novalocal@HWQE.HORTONWORKS.COM

      After this step, restarting Atlas through the Ambari UI results in a new exception.

      Caused by: org.apache.hadoop.hbase.security.AccessDeniedException: org.apache.hadoop.hbase.security.AccessDeniedException: Insufficient permissions for user 'atlas/os-u14-testing-1-atlas-1.novalocal@HWQE.HORTONWORKS.COM' (action=create)
              at org.apache.ranger.authorization.hbase.AuthorizationSession.publishResults(AuthorizationSession.java:254)
              at org.apache.ranger.authorization.hbase.RangerAuthorizationCoprocessor.authorizeAccess(RangerAuthorizationCoprocessor.java:592)
              at org.apache.ranger.authorization.hbase.RangerAuthorizationCoprocessor.requirePermission(RangerAuthorizationCoprocessor.java:657)
              at org.apache.ranger.authorization.hbase.RangerAuthorizationCoprocessor.preCreateTable(RangerAuthorizationCoprocessor.java:762)
              at org.apache.ranger.authorization.hbase.RangerAuthorizationCoprocessor.preCreateTable(RangerAuthorizationCoprocessor.java:493)
              at org.apache.hadoop.hbase.master.MasterCoprocessorHost$11.call(MasterCoprocessorHost.java:213)
              at org.apache.hadoop.hbase.master.MasterCoprocessorHost.execOperation(MasterCoprocessorHost.java:1095)
              at org.apache.hadoop.hbase.master.MasterCoprocessorHost.preCreateTable(MasterCoprocessorHost.java:209)
              at org.apache.hadoop.hbase.master.HMaster.createTable(HMaster.java:1517)
              at org.apache.hadoop.hbase.master.MasterRpcServices.createTable(MasterRpcServices.java:449)
              at org.apache.hadoop.hbase.protobuf.generated.MasterProtos$MasterService$2.callBlockingMethod(MasterProtos.java:51097)
              at org.apache.hadoop.hbase.ipc.RpcServer.call(RpcServer.java:2114)
              at org.apache.hadoop.hbase.ipc.CallRunner.run(CallRunner.java:101)
              at org.apache.hadoop.hbase.ipc.RpcExecutor.consumerLoop(RpcExecutor.java:130)
              at org.apache.hadoop.hbase.ipc.RpcExecutor$1.run(RpcExecutor.java:107)
              at java.lang.Thread.run(Thread.java:745)
      
              at sun.reflect.GeneratedConstructorAccessor10.newInstance(Unknown Source)
              at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
              at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
              at org.apache.hadoop.ipc.RemoteException.instantiateException(RemoteException.java:106)
              at org.apache.hadoop.ipc.RemoteException.unwrapRemoteException(RemoteException.java:95)
              at org.apache.hadoop.hbase.client.RpcRetryingCaller.translateException(RpcRetryingCaller.java:226)
              at org.apache.hadoop.hbase.client.RpcRetryingCaller.translateException(RpcRetryingCaller.java:240)
              at org.apache.hadoop.hbase.client.RpcRetryingCaller.callWithRetries(RpcRetryingCaller.java:140)
              at org.apache.hadoop.hbase.client.HBaseAdmin.executeCallable(HBaseAdmin.java:3917)
              at org.apache.hadoop.hbase.client.HBaseAdmin.createTableAsyncV2(HBaseAdmin.java:636)
              at org.apache.hadoop.hbase.client.HBaseAdmin.createTable(HBaseAdmin.java:557)
              at org.apache.hadoop.hbase.client.HBaseAdmin.createTable(HBaseAdmin.java:490)
              at com.thinkaurelius.titan.diskstorage.hbase.HBaseAdmin1_0.createTable(HBaseAdmin1_0.java:84)
              at com.thinkaurelius.titan.diskstorage.hbase.HBaseStoreManager.createTable(HBaseStoreManager.java:743)
              at com.thinkaurelius.titan.diskstorage.hbase.HBaseStoreManager.ensureTableExists(HBaseStoreManager.java:707)
              ... 101 more
      

      For the "Insufficient permissions for user" exception above, it looks like we have to add a policy in Ranger under the HBase policies to grant the required permissions.

      Shouldn't this policy be created automatically as part of the Atlas deployment?
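
      To illustrate what such automation might look like: Ranger exposes a public REST API for creating policies, so a deployment step could grant the atlas principal the needed HBase permissions programmatically. A rough sketch, assuming Ranger's legacy v1 endpoint /service/public/api/policy; the endpoint, JSON field names, repository name ("cl1_hbase"), table names, and admin credentials below are all assumptions that vary by Ranger version and cluster, so treat this purely as an illustration:

      import java.io.OutputStream;
      import java.net.HttpURLConnection;
      import java.net.URL;
      import java.nio.charset.StandardCharsets;
      import java.util.Base64;

      public class CreateRangerHBasePolicy {
          public static void main(String[] args) throws Exception {
              // Placeholder Ranger admin host and legacy v1 policy endpoint.
              String rangerUrl = "http://ranger-host:6080/service/public/api/policy";
              // Illustrative policy body: grant the "atlas" user full access to
              // the assumed Atlas table; permission names vary by Ranger version.
              String json = "{"
                      + "\"policyName\": \"atlas-hbase\","
                      + "\"repositoryName\": \"cl1_hbase\","
                      + "\"repositoryType\": \"hbase\","
                      + "\"tables\": \"atlas_titan\","
                      + "\"columnFamilies\": \"*\","
                      + "\"columns\": \"*\","
                      + "\"isEnabled\": true,"
                      + "\"permMapList\": [{"
                      + "\"userList\": [\"atlas\"],"
                      + "\"permList\": [\"Read\", \"Write\", \"Create\", \"Admin\"]"
                      + "}]}";

              HttpURLConnection conn =
                      (HttpURLConnection) new URL(rangerUrl).openConnection();
              conn.setRequestMethod("POST");
              conn.setRequestProperty("Content-Type", "application/json");
              // Ranger admin credentials; placeholder values.
              String auth = Base64.getEncoder()
                      .encodeToString("admin:admin".getBytes(StandardCharsets.UTF_8));
              conn.setRequestProperty("Authorization", "Basic " + auth);
              conn.setDoOutput(true);
              try (OutputStream out = conn.getOutputStream()) {
                  out.write(json.getBytes(StandardCharsets.UTF_8));
              }
              System.out.println("Ranger responded: " + conn.getResponseCode());
          }
      }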

      snapshot: https://monosnap.com/file/HrKk9dU2u3p9ZONodju5WDCDVeaNoO

      Attachments

        1. ATLAS-335.patch (2 kB, Suma Shivaprasad)


          People

            Assignee: Suma Shivaprasad (suma.shivaprasad)
            Reporter: Ayub Pathan (ayubpathan)
            Votes: 0
            Watchers: 3
