Details
- Type: Improvement
- Status: Open
- Priority: Minor
- Resolution: Unresolved
- Affects Version/s: 1.2.2
- Fix Version/s: None
- Component/s: None
- Environment:
  spark: spark-1.6.0-cdh5.16.1
  hive: hive-1.1.0-cdh5.16.1
  hadoop: hadoop-2.6.0-cdh5.16.2
Description
Hive on Spark jobs fail with the error below; this happens only when the cluster is busy.
Excerpt from the exception stack of one of the YARN applications:
19/12/02 16:19:00 ERROR yarn.ApplicationMaster: Uncaught exception:
java.util.concurrent.ExecutionException: javax.security.sasl.SaslException: Client closed before SASL negotiation finished.
at io.netty.util.concurrent.AbstractFuture.get(AbstractFuture.java:37)
at org.apache.hive.spark.client.RemoteDriver.&lt;init&gt;(RemoteDriver.java:156)
at org.apache.hive.spark.client.RemoteDriver.main(RemoteDriver.java:556)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.spark.deploy.yarn.ApplicationMaster$$anon$2.run(ApplicationMaster.scala:567)
Caused by: javax.security.sasl.SaslException: Client closed before SASL negotiation finished.
at org.apache.hive.spark.client.rpc.Rpc$SaslClientHandler.dispose(Rpc.java:455)
at org.apache.hive.spark.client.rpc.SaslHandler.channelInactive(SaslHandler.java:90)
Would it help to retry the connection setup in this case?
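As a minimal sketch of the retry idea: RemoteDriver currently fails permanently on the first SaslException, so a retry wrapper with exponential backoff around the connection attempt could paper over transient failures on a busy cluster. Note this is a hypothetical illustration, not existing Hive API; `retry` and `RetryWithBackoff` are names invented here.

```java
import java.util.concurrent.Callable;

/**
 * Hypothetical sketch (not part of Hive): retry a failing connection
 * attempt with exponential backoff before giving up.
 */
public class RetryWithBackoff {

    public static <T> T retry(Callable<T> attempt, int maxAttempts, long initialBackoffMs)
            throws Exception {
        long backoff = initialBackoffMs;
        Exception last = null;
        for (int i = 1; i <= maxAttempts; i++) {
            try {
                return attempt.call();
            } catch (Exception e) {
                last = e;                  // remember the failure
                if (i < maxAttempts) {
                    Thread.sleep(backoff); // wait before the next attempt
                    backoff *= 2;          // exponential backoff
                }
            }
        }
        throw last;                        // all attempts exhausted
    }

    public static void main(String[] args) throws Exception {
        // Simulate a connection that fails twice (as under cluster load),
        // then succeeds on the third attempt.
        final int[] calls = {0};
        String result = retry(() -> {
            if (++calls[0] < 3) {
                throw new javax.security.sasl.SaslException(
                        "Client closed before SASL negotiation finished.");
            }
            return "connected";
        }, 5, 10);
        System.out.println(result + " after " + calls[0] + " attempts");
    }
}
```

The alternative (or complementary) mitigation is raising the SASL/RPC handshake timeouts (e.g. `hive.spark.client.connect.timeout` and `hive.spark.client.server.connect.timeout`) so a busy cluster has more time to complete the negotiation before the client closes the channel.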