Metron (Retired) / METRON-1197

Profiler topology fails to write profile data to HBase table on a Kerberized cluster


Details

    • Type: Bug
    • Status: Done
    • Priority: Major
    • Resolution: Done
    • Version: 0.4.1

    Description

      The Profiler fails to write profile data to the HBase table. I see a 'javax.security.sasl.SaslException: GSS initiate failed' exception when the Profiler tries to write the data to the HBase table.

      I tried publishing the message below to the 'indexing' topic 5 times:

      { "ip_src_addr": "10.0.0.1", "protocol": "HTTPS", "length": "10", "bytes_in": 234, "timestamp": "1505909543000" }
      

      Below is the output of the console consumer for the 'indexing' topic:

      [metron@nat-r7-fsvs-metron-1 bin]$ ./kafka-console-consumer.sh  --zookeeper nat-r7-fsvs-metron-1.openstacklocal --topic indexing --security-protocol PLAINTEXTSASL
      {metadata.broker.list=nat-r7-fsvs-metron-3.openstacklocal:6667,nat-r7-fsvs-metron-7.openstacklocal:6667,nat-r7-fsvs-metron-9.openstacklocal:6667,nat-r7-fsvs-metron-11.openstacklocal:6667,nat-r7-fsvs-metron-8.openstacklocal:6667,nat-r7-fsvs-metron-5.openstacklocal:6667,nat-r7-fsvs-metron-12.openstacklocal:6667,nat-r7-fsvs-metron-1.openstacklocal:6667,nat-r7-fsvs-metron-6.openstacklocal:6667,nat-r7-fsvs-metron-2.openstacklocal:6667,nat-r7-fsvs-metron-10.openstacklocal:6667, request.timeout.ms=30000, client.id=console-consumer-83924, security.protocol=PLAINTEXTSASL}
      
      
      { "ip_src_addr": "10.0.0.1", "protocol": "HTTPS", "length": "10", "bytes_in": 234, "timestamp": "1505909543000" }
      { "ip_src_addr": "10.0.0.1", "protocol": "HTTPS", "length": "10", "bytes_in": 234, "timestamp": "1505909543000" }
      { "ip_src_addr": "10.0.0.1", "protocol": "HTTPS", "length": "10", "bytes_in": 234, "timestamp": "1505909543000" }
      { "ip_src_addr": "10.0.0.1", "protocol": "HTTPS", "length": "10", "bytes_in": 234, "timestamp": "1505909543000" }
      { "ip_src_addr": "10.0.0.1", "protocol": "HTTPS", "length": "10", "bytes_in": 234, "timestamp": "1505909543000" }
      
      {"period.start":1505916900000,"period":1673241,"enrichmentsplitterbolt.splitter.end.ts":"1505917778229","profile":"calender-effects","enrichmentsplitterbolt.splitter.begin.ts":"1505917778228","is_alert":"true","source.type":"profiler","threatintelsplitterbolt.splitter.end.ts":"1505917778234","threatinteljoinbolt.joiner.ts":"1505917778237","enrichmentjoinbolt.joiner.ts":"1505917778232","period.end":1505917800000,"threatintelsplitterbolt.splitter.begin.ts":"1505917778234","entity":"10.0.0.1","timestamp":1505917778104}
      {"period.start":1505916900000,"period":1673241,"enrichmentsplitterbolt.splitter.end.ts":"1505917778230","profile":"source_ip_counter","enrichmentsplitterbolt.splitter.begin.ts":"1505917778230","is_alert":"true","source.type":"profiler","threatintelsplitterbolt.splitter.end.ts":"1505917778235","threatinteljoinbolt.joiner.ts":"1505917778238","enrichmentjoinbolt.joiner.ts":"1505917778233","period.end":1505917800000,"threatintelsplitterbolt.splitter.begin.ts":"1505917778235","entity":"10.0.0.1","timestamp":1505917778105}
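      As a sanity check on the profiler messages above: the period fields are internally consistent with a 15-minute period duration. This is an observation about the specific values in the output, not a claim about Metron internals (a minimal sketch):

      ```python
      # Values copied from the profiler JSON output above.
      period_start = 1505916900000  # epoch millis
      period_end = 1505917800000    # epoch millis
      period_id = 1673241           # the 'period' field

      # The window spans 900,000 ms, i.e. a 15-minute profiler.period.duration.
      duration_ms = period_end - period_start
      print(duration_ms)  # 900000

      # The 'period' field matches the window start divided by the duration.
      assert period_start // duration_ms == period_id
      ```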
      

      After waiting for the configured 'profiler.period.duration', the Profiler attempts to write the data to the HBase table. Below is the exception from the Profiler worker log:

      2017-09-20 14:29:38.881 o.a.h.h.i.AbstractRpcClient [WARN] Exception encountered while connecting to the server : javax.security.sasl.SaslException: GSS initiate failed [Caused by GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)]
      2017-09-20 14:29:38.882 o.a.h.h.i.AbstractRpcClient [ERROR] SASL authentication failed. The most likely cause is missing or invalid credentials. Consider 'kinit'.
      javax.security.sasl.SaslException: GSS initiate failed
      	at com.sun.security.sasl.gsskerb.GssKrb5Client.evaluateChallenge(GssKrb5Client.java:211) ~[?:1.8.0_144]
      	at org.apache.hadoop.hbase.security.HBaseSaslRpcClient.saslConnect(HBaseSaslRpcClient.java:179) ~[stormjar.jar:?]
      	at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection.setupSaslConnection(RpcClientImpl.java:609) ~[stormjar.jar:?]
      	at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection.access$600(RpcClientImpl.java:154) [stormjar.jar:?]
      	at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection$2.run(RpcClientImpl.java:735) ~[stormjar.jar:?]
      	at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection$2.run(RpcClientImpl.java:732) ~[stormjar.jar:?]
      	at java.security.AccessController.doPrivileged(Native Method) ~[?:1.8.0_144]
      	at javax.security.auth.Subject.doAs(Subject.java:422) ~[?:1.8.0_144]
      	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1657) ~[stormjar.jar:?]
      	at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection.setupIOstreams(RpcClientImpl.java:732) [stormjar.jar:?]
      	at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection.writeRequest(RpcClientImpl.java:885) [stormjar.jar:?]
      	at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection.tracedWriteRequest(RpcClientImpl.java:854) [stormjar.jar:?]
      	at org.apache.hadoop.hbase.ipc.RpcClientImpl.call(RpcClientImpl.java:1180) [stormjar.jar:?]
      	at org.apache.hadoop.hbase.ipc.AbstractRpcClient.callBlockingMethod(AbstractRpcClient.java:213) [stormjar.jar:?]
      	at org.apache.hadoop.hbase.ipc.AbstractRpcClient$BlockingRpcChannelImplementation.callBlockingMethod(AbstractRpcClient.java:287) [stormjar.jar:?]
      	at org.apache.hadoop.hbase.protobuf.generated.ClientProtos$ClientService$BlockingStub.scan(ClientProtos.java:32651) [stormjar.jar:?]
      	at org.apache.hadoop.hbase.client.ClientSmallScanner$SmallScannerCallable.call(ClientSmallScanner.java:201) [stormjar.jar:?]
      	at org.apache.hadoop.hbase.client.ClientSmallScanner$SmallScannerCallable.call(ClientSmallScanner.java:180) [stormjar.jar:?]
      	at org.apache.hadoop.hbase.client.RpcRetryingCaller.callWithoutRetries(RpcRetryingCaller.java:200) [stormjar.jar:?]
      	at org.apache.hadoop.hbase.client.ScannerCallableWithReplicas$RetryingRPC.call(ScannerCallableWithReplicas.java:369) [stormjar.jar:?]
      	at org.apache.hadoop.hbase.client.ScannerCallableWithReplicas$RetryingRPC.call(ScannerCallableWithReplicas.java:343) [stormjar.jar:?]
      	at org.apache.hadoop.hbase.client.RpcRetryingCaller.callWithRetries(RpcRetryingCaller.java:126) [stormjar.jar:?]
      	at org.apache.hadoop.hbase.client.ResultBoundedCompletionService$QueueingFuture.run(ResultBoundedCompletionService.java:64) [stormjar.jar:?]
      	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149) [?:1.8.0_144]
      	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624) [?:1.8.0_144]
      	at java.lang.Thread.run(Thread.java:748) [?:1.8.0_144]
      Caused by: org.ietf.jgss.GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)
      	at sun.security.jgss.krb5.Krb5InitCredential.getInstance(Krb5InitCredential.java:147) ~[?:1.8.0_144]
      	at sun.security.jgss.krb5.Krb5MechFactory.getCredentialElement(Krb5MechFactory.java:122) ~[?:1.8.0_144]
      	at sun.security.jgss.krb5.Krb5MechFactory.getMechanismContext(Krb5MechFactory.java:187) ~[?:1.8.0_144]
      	at sun.security.jgss.GSSManagerImpl.getMechanismContext(GSSManagerImpl.java:224) ~[?:1.8.0_144]
      	at sun.security.jgss.GSSContextImpl.initSecContext(GSSContextImpl.java:212) ~[?:1.8.0_144]
      	at sun.security.jgss.GSSContextImpl.initSecContext(GSSContextImpl.java:179) ~[?:1.8.0_144]
      	at com.sun.security.sasl.gsskerb.GssKrb5Client.evaluateChallenge(GssKrb5Client.java:192) ~[?:1.8.0_144]
      	... 25 more
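      For context, 'Failed to find any Kerberos tgt' generally means the worker JVM has no usable Kerberos credentials at the moment the HBase RPC connection is set up. One common remediation pattern on Kerberized Storm clusters is to ship a client JAAS file that logs in from a keytab and point the worker JVMs at it. The fragment below is a generic sketch only; the section name, principal, keytab path, and file locations are illustrative assumptions, not values taken from this issue:

      ```
      /* client_jaas.conf — illustrative sketch, not from this issue.
         Workers would reference it via a JVM option such as
         -Djava.security.auth.login.config=/etc/storm/conf/client_jaas.conf */
      Client {
         com.sun.security.auth.module.Krb5LoginModule required
         useKeyTab=true
         keyTab="/etc/security/keytabs/metron.headless.keytab"
         storeKey=true
         useTicketCache=false
         principal="metron@EXAMPLE.COM";
      };
      ```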
      

          People

            nickwallen Nick Allen
            mohandv Mohan Venkateshaiah
