Spark / SPARK-19997

proxy-user fails to connect to a Kerberos-configured metastore


Details

    • Type: Bug
    • Status: Resolved
    • Priority: Major
    • Resolution: Duplicate
    • Affects Version/s: 2.1.0
    • Fix Version/s: None
    • Component/s: SQL
    • Labels: None

    Description

      Start running spark-sql via --proxy-user on a Kerberos-configured Hadoop cluster with a Kerberos-secured metastore:

      bin/spark-sql --proxy-user hzyaoqin
      

      It fails with the following error:

      17/03/17 16:05:41 INFO hive.metastore: Trying to connect to metastore with URI thrift://xxxxxxx:9083
      17/03/17 16:05:41 ERROR transport.TSaslTransport: SASL negotiation failure
      javax.security.sasl.SaslException: GSS initiate failed [Caused by GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)]
      	at com.sun.security.sasl.gsskerb.GssKrb5Client.evaluateChallenge(GssKrb5Client.java:212)
      	at org.apache.thrift.transport.TSaslClientTransport.handleSaslStartMessage(TSaslClientTransport.java:94)
      	at org.apache.thrift.transport.TSaslTransport.open(TSaslTransport.java:271)
      	at org.apache.thrift.transport.TSaslClientTransport.open(TSaslClientTransport.java:37)
      	at org.apache.hadoop.hive.thrift.client.TUGIAssumingTransport$1.run(TUGIAssumingTransport.java:52)
      	at org.apache.hadoop.hive.thrift.client.TUGIAssumingTransport$1.run(TUGIAssumingTransport.java:49)
      	at java.security.AccessController.doPrivileged(Native Method)
      	at javax.security.auth.Subject.doAs(Subject.java:415)
      	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1698)
      	at org.apache.hadoop.hive.thrift.client.TUGIAssumingTransport.open(TUGIAssumingTransport.java:49)
      	at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.open(HiveMetaStoreClient.java:420)
      	at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:236)
      	at org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient.<init>(SessionHiveMetaStoreClient.java:74)
      	at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
      	at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
      	at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
      	at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
      	at org.apache.hadoop.hive.metastore.MetaStoreUtils.newInstance(MetaStoreUtils.java:1521)
      	at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.<init>(RetryingMetaStoreClient.java:86)
      	at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:132)
      	at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:104)
      	at org.apache.hadoop.hive.ql.metadata.Hive.createMetaStoreClient(Hive.java:3005)
      	at org.apache.hadoop.hive.ql.metadata.Hive.getMSC(Hive.java:3024)
      	at org.apache.hadoop.hive.ql.metadata.Hive.getAllDatabases(Hive.java:1234)
      	at org.apache.hadoop.hive.ql.metadata.Hive.reloadFunctions(Hive.java:174)
      	at org.apache.hadoop.hive.ql.metadata.Hive.<clinit>(Hive.java:166)
      	at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:503)
      	at org.apache.spark.sql.hive.client.HiveClientImpl.<init>(HiveClientImpl.scala:192)
      	at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
      	at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
      	at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
      	at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
      	at org.apache.spark.sql.hive.client.IsolatedClientLoader.createClient(IsolatedClientLoader.scala:264)
      	at org.apache.spark.sql.hive.HiveUtils$.newClientForMetadata(HiveUtils.scala:366)
      	at org.apache.spark.sql.hive.HiveUtils$.newClientForMetadata(HiveUtils.scala:270)
      	at org.apache.spark.sql.hive.HiveExternalCatalog.<init>(HiveExternalCatalog.scala:65)
      	at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
      	at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
      	at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
      	at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
      	at org.apache.spark.sql.internal.SharedState$.org$apache$spark$sql$internal$SharedState$$reflect(SharedState.scala:166)
      	at org.apache.spark.sql.internal.SharedState.<init>(SharedState.scala:86)
      	at org.apache.spark.sql.SparkSession$$anonfun$sharedState$1.apply(SparkSession.scala:101)
      	at org.apache.spark.sql.SparkSession$$anonfun$sharedState$1.apply(SparkSession.scala:101)
      	at scala.Option.getOrElse(Option.scala:121)
      	at org.apache.spark.sql.SparkSession.sharedState$lzycompute(SparkSession.scala:101)
      	at org.apache.spark.sql.SparkSession.sharedState(SparkSession.scala:100)
      	at org.apache.spark.sql.internal.SessionState.<init>(SessionState.scala:157)
      	at org.apache.spark.sql.hive.HiveSessionState.<init>(HiveSessionState.scala:32)
      	at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
      	at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
      	at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
      	at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
      	at org.apache.spark.sql.SparkSession$.org$apache$spark$sql$SparkSession$$reflect(SparkSession.scala:978)
      	at org.apache.spark.sql.SparkSession.sessionState$lzycompute(SparkSession.scala:110)
      	at org.apache.spark.sql.SparkSession.sessionState(SparkSession.scala:109)
      	at org.apache.spark.sql.SparkSession$Builder$$anonfun$getOrCreate$5.apply(SparkSession.scala:878)
      	at org.apache.spark.sql.SparkSession$Builder$$anonfun$getOrCreate$5.apply(SparkSession.scala:878)
      	at scala.collection.mutable.HashMap$$anonfun$foreach$1.apply(HashMap.scala:99)
      	at scala.collection.mutable.HashMap$$anonfun$foreach$1.apply(HashMap.scala:99)
      	at scala.collection.mutable.HashTable$class.foreachEntry(HashTable.scala:230)
      	at scala.collection.mutable.HashMap.foreachEntry(HashMap.scala:40)
      	at scala.collection.mutable.HashMap.foreach(HashMap.scala:99)
      	at org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:878)
      	at org.apache.spark.sql.hive.thriftserver.SparkSQLEnv$.init(SparkSQLEnv.scala:47)
      	at org.apache.spark.sql.hive.thriftserver.SparkSQLCLIDriver.<init>(SparkSQLCLIDriver.scala:288)
      	at org.apache.spark.sql.hive.thriftserver.SparkSQLCLIDriver$.main(SparkSQLCLIDriver.scala:137)
      	at org.apache.spark.sql.hive.thriftserver.SparkSQLCLIDriver.main(SparkSQLCLIDriver.scala)
      	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
      	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
      	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
      	at java.lang.reflect.Method.invoke(Method.java:606)
      	at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:738)
      	at org.apache.spark.deploy.SparkSubmit$$anon$1.run(SparkSubmit.scala:169)
      	at org.apache.spark.deploy.SparkSubmit$$anon$1.run(SparkSubmit.scala:167)
      	at java.security.AccessController.doPrivileged(Native Method)
      	at javax.security.auth.Subject.doAs(Subject.java:415)
      	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1698)
      	at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:167)
      	at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:212)
      	at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:126)
      	at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
      Caused by: GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)
      	at sun.security.jgss.krb5.Krb5InitCredential.getInstance(Krb5InitCredential.java:147)
      	at sun.security.jgss.krb5.Krb5MechFactory.getCredentialElement(Krb5MechFactory.java:121)
      	at sun.security.jgss.krb5.Krb5MechFactory.getMechanismContext(Krb5MechFactory.java:187)
      	at sun.security.jgss.GSSManagerImpl.getMechanismContext(GSSManagerImpl.java:223)
      	at sun.security.jgss.GSSContextImpl.initSecContext(GSSContextImpl.java:212)
      	at sun.security.jgss.GSSContextImpl.initSecContext(GSSContextImpl.java:179)
      	at com.sun.security.sasl.gsskerb.GssKrb5Client.evaluateChallenge(GssKrb5Client.java:193)
      	... 81 more
      17/03/17 16:05:41 WARN hive.metastore: Failed to connect to the MetaStore Server...
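
      The bottom frames of the trace (SparkSubmit$$anon$1.run wrapped in UserGroupInformation.doAs) show the impersonation path involved. Below is a minimal sketch of that mechanism, not Spark's exact code; the user name comes from the repro above, and the comments state my reading of the failure: the proxy UGI carries no TGT of its own, so the GSSAPI handshake with the metastore finds no Kerberos credentials.

      import java.security.PrivilegedExceptionAction

      import org.apache.hadoop.security.UserGroupInformation

      // Sketch of what spark-submit does for --proxy-user (compare the
      // SparkSubmit frames above): the real, kinit'ed user wraps the
      // application in doAs() as the proxy user.
      object ProxyUserSketch {
        def main(args: Array[String]): Unit = {
          val realUser  = UserGroupInformation.getCurrentUser  // holds the TGT
          val proxyUser = UserGroupInformation.createProxyUser("hzyaoqin", realUser)

          proxyUser.doAs(new PrivilegedExceptionAction[Unit] {
            override def run(): Unit = {
              // Runs as "hzyaoqin", but this UGI has neither a TGT nor a
              // metastore delegation token, so a SASL/GSSAPI connection to
              // the metastore fails exactly as in the trace above.
            }
          })
        }
      }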
      

      The same error occurs for:

      sbin/start-thriftserver.sh --proxy-user hzyaoqin
      bin/spark-shell --proxy-user hzyaoqin
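
      A plausible workaround, sketched below, is for the real (kinit'ed) user to fetch a metastore delegation token on behalf of the proxy user before entering doAs, so the client can authenticate with the token (DIGEST) instead of GSSAPI. This is a sketch under assumptions, not a confirmed fix: it assumes Hive 1.2's IMetaStoreClient.getDelegationToken and the Hadoop token APIs, and the token alias is an arbitrary name chosen here for illustration.

      import org.apache.hadoop.hive.conf.HiveConf
      import org.apache.hadoop.hive.metastore.HiveMetaStoreClient
      import org.apache.hadoop.io.Text
      import org.apache.hadoop.security.UserGroupInformation
      import org.apache.hadoop.security.token.{Token, TokenIdentifier}

      // Hedged sketch: the real user asks the metastore for a delegation
      // token owned by the proxy user, then attaches it to the proxy UGI
      // before any doAs block opens a metastore connection.
      object MetastoreTokenSketch {
        def proxyUserWithToken(proxyName: String): UserGroupInformation = {
          val realUser = UserGroupInformation.getCurrentUser   // has the TGT
          val client   = new HiveMetaStoreClient(new HiveConf())
          try {
            // Token owned by proxyName, renewable by the real user.
            val tokenStr = client.getDelegationToken(proxyName, realUser.getUserName)
            val token    = new Token[TokenIdentifier]()
            token.decodeFromUrlString(tokenStr)

            val proxyUser = UserGroupInformation.createProxyUser(proxyName, realUser)
            // The alias only names the entry in the credential map; the
            // SASL layer selects the token by its kind.
            proxyUser.addToken(new Text("hive.metastore.delegation.token"), token)
            proxyUser
          } finally {
            client.close()
          }
        }
      }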
      


            People

              Unassigned Unassigned
              Qin Yao Kent Yao 2
              Votes:
              0 Vote for this issue
              Watchers:
              3 Start watching this issue
