HIVE-16930

HoS should verify the value of Kerberos principal and keytab file before adding them to spark-submit command parameters


    Details

    • Type: Bug
    • Status: Closed
    • Priority: Major
    • Resolution: Fixed
    • Affects Version/s: None
    • Fix Version/s: 3.0.0, 2.4.0, 2.3.2
    • Component/s: Spark
    • Labels:
      None

      Description

      When Kerberos is enabled, Hive CLI fails to run Hive on Spark queries:

      >hive -e "set hive.execution.engine=spark; create table if not exists test(a int); select count(*) from test" --hiveconf hive.root.logger=INFO,console > /var/tmp/hive_log.txt > /var/tmp/hive_log_2.txt 
      
      
      17/06/16 16:13:13 [main]: ERROR client.SparkClientImpl: Error while waiting for client to connect. 
      java.util.concurrent.ExecutionException: java.lang.RuntimeException: Cancel client 'a5de85d1-6933-43e7-986f-5f8e5c001b5f'. Error: Child process exited before connecting back with error log Error: Cannot load main class from JAR file:/tmp/spark-submit.7196051517706529285.properties 
      Run with --help for usage help or --verbose for debug output 
      
              at io.netty.util.concurrent.AbstractFuture.get(AbstractFuture.java:37) 
              at org.apache.hive.spark.client.SparkClientImpl.<init>(SparkClientImpl.java:107) 
              at org.apache.hive.spark.client.SparkClientFactory.createClient(SparkClientFactory.java:80) 
              at org.apache.hadoop.hive.ql.exec.spark.RemoteHiveSparkClient.createRemoteClient(RemoteHiveSparkClient.java:100) 
              at org.apache.hadoop.hive.ql.exec.spark.RemoteHiveSparkClient.<init>(RemoteHiveSparkClient.java:96) 
              at org.apache.hadoop.hive.ql.exec.spark.HiveSparkClientFactory.createHiveSparkClient(HiveSparkClientFactory.java:66) 
              at org.apache.hadoop.hive.ql.exec.spark.session.SparkSessionImpl.open(SparkSessionImpl.java:62) 
              at org.apache.hadoop.hive.ql.exec.spark.session.SparkSessionManagerImpl.getSession(SparkSessionManagerImpl.java:114) 
              at org.apache.hadoop.hive.ql.exec.spark.SparkUtilities.getSparkSession(SparkUtilities.java:111) 
              at org.apache.hadoop.hive.ql.exec.spark.SparkTask.execute(SparkTask.java:97) 
              at org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:160) 
              at org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.java:100) 
              at org.apache.hadoop.hive.ql.Driver.launchTask(Driver.java:1972) 
              at org.apache.hadoop.hive.ql.Driver.execute(Driver.java:1685) 
              at org.apache.hadoop.hive.ql.Driver.runInternal(Driver.java:1421) 
              at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1205) 
              at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1195) 
              at org.apache.hadoop.hive.cli.CliDriver.processLocalCmd(CliDriver.java:220) 
              at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:172) 
              at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:383) 
              at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:318) 
              at org.apache.hadoop.hive.cli.CliDriver.executeDriver(CliDriver.java:720) 
              at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:693) 
              at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:628) 
              at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) 
              at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57) 
              at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) 
              at java.lang.reflect.Method.invoke(Method.java:606) 
              at org.apache.hadoop.util.RunJar.run(RunJar.java:221) 
              at org.apache.hadoop.util.RunJar.main(RunJar.java:136) 
      Caused by: java.lang.RuntimeException: Cancel client 'a5de85d1-6933-43e7-986f-5f8e5c001b5f'. Error: Child process exited before connecting back with error log Error: Cannot load main class from JAR file:/tmp/spark-submit.7196051517706529285.properties 
      Run with --help for usage help or --verbose for debug output 
      
              at org.apache.hive.spark.client.rpc.RpcServer.cancelClient(RpcServer.java:179) 
              at org.apache.hive.spark.client.SparkClientImpl$3.run(SparkClientImpl.java:490) 
              at java.lang.Thread.run(Thread.java:745) 
      17/06/16 16:13:13 [Driver]: WARN client.SparkClientImpl: Child process exited with code 1 
      

      In the log, the following message shows up:

      17/06/16 16:13:12 [main]: INFO client.SparkClientImpl: Running client driver with argv: /usr/lib/spark/bin/spark-submit --executor-cores 1 --executor-memory 268435456 --principal hive/nightly58-1.gce.cloudera.com@GCE.CLOUDERA.COM --keytab  --properties-file /tmp/spark-submit.7196051517706529285.properties --class org.apache.hive.spark.client.RemoteDriver /usr/lib/hive/lib/hive-exec-1.1.0-cdh5.8.5.jar --remote-host nightly58-1.gce.cloudera.com --remote-port 36074 --conf hive.spark.client.connect.timeout=1000 --conf hive.spark.client.server.connect.timeout=90000 --conf hive.spark.client.channel.log.level=null --conf hive.spark.client.rpc.max.size=52428800 --conf hive.spark.client.rpc.threads=8 --conf hive.spark.client.secret.bits=256 --conf hive.spark.client.rpc.server.address=null
      

      There isn't any value after the "--keytab" parameter, which makes the spark-submit command line syntactically invalid: spark-submit consumes the next token ("--properties-file") as the keytab path and then fails to load the main class.

      Hive should verify the setting before using it.
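The verification could look something like the sketch below (hypothetical class and method names, not the actual HIVE-16930 patch): emit the `--principal`/`--keytab` pair only when both values are non-blank, so spark-submit never receives a flag with a missing value.

```java
import java.util.ArrayList;
import java.util.Collections;
import java.util.List;

public class KerberosArgCheck {
    // Hypothetical sketch: only emit --principal/--keytab when BOTH values
    // are non-blank; otherwise emit neither, so the spark-submit command
    // line stays well-formed (avoiding the failure shown in the log above).
    static List<String> kerberosArgs(String principal, String keytab) {
        boolean principalSet = principal != null && !principal.trim().isEmpty();
        boolean keytabSet = keytab != null && !keytab.trim().isEmpty();
        if (principalSet && keytabSet) {
            List<String> argv = new ArrayList<>();
            argv.add("--principal");
            argv.add(principal);
            argv.add("--keytab");
            argv.add(keytab);
            return argv;
        }
        // Skipping both keeps the command valid; a real implementation
        // would also log a warning about the incomplete Kerberos config.
        return Collections.emptyList();
    }

    public static void main(String[] args) {
        // Blank keytab (the broken case from this issue): no Kerberos args.
        System.out.println(kerberosArgs("hive/host@EXAMPLE.COM", ""));
        // Both values set: the pair is emitted together.
        System.out.println(kerberosArgs("hive/host@EXAMPLE.COM", "/etc/hive.keytab"));
    }
}
```

Dropping the pair when either value is blank is one reasonable choice; alternatively Hive could fail fast with a clear configuration error instead of silently skipping Kerberos authentication.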

        Attachments

        1. HIVE-16930.1.patch
          3 kB
          Yibing Shi

              People

              • Assignee:
                Yibing Shi
              • Reporter:
                Yibing Shi
              • Votes:
                0
              • Watchers:
                5
