Details
- Type: Bug
- Status: Resolved
- Priority: Blocker
- Resolution: Fixed
Description
I see this exception in the Spark History Server (SHS) log:
========================================
Warning: Master yarn-client is deprecated since 2.0. Please use master "yarn" with specified deploy mode instead.
Exception in thread "main" java.lang.IllegalArgumentException: requirement failed: Keytab file: none does not exist
  at scala.Predef$.require(Predef.scala:224)
  at org.apache.spark.deploy.SparkSubmit$.doPrepareSubmitEnvironment(SparkSubmit.scala:390)
  at org.apache.spark.deploy.SparkSubmit$.prepareSubmitEnvironment(SparkSubmit.scala:250)
  at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:171)
  at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:137)
  at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
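For context, the failure comes from a keytab-existence check in SparkSubmit.prepareSubmitEnvironment (the frames above). Below is a minimal sketch of what that check roughly does; it is my simplified reconstruction, not the exact Spark source. With the placeholder value "none" as the keytab path, the second require fails because no such file exists:

import java.io.File

// Simplified reconstruction (assumption, not copied from the Spark source):
// the keytab value is only checked for file existence, so a placeholder such
// as "none" or "" trips the requirement.
def checkKeytab(principal: String, keytab: String): Unit = {
  if (principal != null) {
    require(keytab != null, "Keytab must be specified when principal is specified")
    require(new File(keytab).exists(), s"Keytab file: $keytab does not exist")
  }
}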
After I removed the spark.yarn.keytab/principal properties, it started working fine. By the way, this cluster is NOT kerberized, so it is strange that SHS tries to use these properties at all. At the same time, the properties spark.history.kerberos.keytab/principal are also present, but they cause no issues. I expect the question of why spark.yarn.keytab/principal were added during the stack upgrade if the cluster is not kerberized, so here is the answer:
<transfer operation="copy" from-type="spark2-defaults" from-key="spark.history.kerberos.keytab" to-key="spark.yarn.keytab" default-value="" if-type="spark2-thrift-sparkconf" if-key="spark.yarn.keytab" if-key-state="absent"/>
<transfer operation="copy" from-type="spark2-defaults" from-key="spark.history.kerberos.principal" to-key="spark.yarn.principal" default-value="" if-type="spark2-thrift-sparkconf" if-key="spark.yarn.principal" if-key-state="absent"/>
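On this non-kerberized cluster, spark.history.kerberos.keytab/principal hold the placeholder value "none", so after the upgrade the copied properties end up looking roughly like this (illustrative values; they match the "none" that appears in the exception above):

spark.history.kerberos.keytab none
spark.history.kerberos.principal none
spark.yarn.keytab none
spark.yarn.principal none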
I assumed that if "spark.history.kerberos.keytab/principal" can be present in a non-kerberized cluster, then "spark.yarn.keytab/principal" could be added too, and we have the same logic for many other components in Ambari. So the question is: should this be fixed on the Ambari side, i.e. add spark.yarn.keytab/principal only if Kerberos is enabled, or should a condition be modified/added on the SPARK side so that these properties are not used when Kerberos is disabled or the value is empty/"none"? A possible Spark-side guard is sketched below.
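The sketch below is only an illustration under the assumption that an empty or "none" keytab should be treated as "no keytab configured"; the helper names are hypothetical, not an actual Spark patch:

import java.io.File

// Hypothetical helper: treat an empty or "none" value as "not configured".
def effectiveKeytab(raw: String): Option[String] =
  Option(raw).map(_.trim).filter(v => v.nonEmpty && !v.equalsIgnoreCase("none"))

// Hypothetical replacement for the strict check: only require the keytab file
// to exist when a real path is actually configured.
def checkKeytabIfConfigured(keytab: String): Unit =
  for (k <- effectiveKeytab(keytab)) {
    require(new File(k).exists(), s"Keytab file: $k does not exist")
  }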