Ambari / AMBARI-24718

STS fails after start, after stack upgrade from 3.0.1 to 3.0.3


Details

    • Type: Bug
    • Status: Resolved
    • Priority: Blocker
    • Resolution: Fixed
    • Affects Version/s: None
    • Fix Version/s: 2.7.3
    • Component/s: None

    Description

      See this exception in the SHS log:

      ========================================
      Warning: Master yarn-client is deprecated since 2.0. Please use master "yarn" with specified deploy mode instead.
      Exception in thread "main" java.lang.IllegalArgumentException: requirement failed: Keytab file: none does not exist
       at scala.Predef$.require(Predef.scala:224)
       at org.apache.spark.deploy.SparkSubmit$.doPrepareSubmitEnvironment(SparkSubmit.scala:390)
       at org.apache.spark.deploy.SparkSubmit$.prepareSubmitEnvironment(SparkSubmit.scala:250)
       at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:171)
       at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:137)
       at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
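
      For reference, here is a minimal sketch of the kind of check that produces this message; the names are assumptions based on the stack trace, not the actual SparkSubmit source. The submit path requires the configured keytab path to exist on disk, so a literal value like "none" fails the check:

      import java.io.File

      object KeytabCheckSketch {
        // Hypothetical illustration of the failing requirement seen in the trace above;
        // with keytab = "none" this throws
        // java.lang.IllegalArgumentException: requirement failed: Keytab file: none does not exist
        def checkKeytab(keytab: String): Unit =
          require(new File(keytab).exists(), s"Keytab file: $keytab does not exist")
      }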
      

      After I removed the spark.yarn.keytab/principal properties, it started working fine. By the way, this cluster is NOT kerberized, so it is strange that SHS tries to use these properties at all. At the same time the spark.history.kerberos.keytab/principal properties are also present, but they cause no issues. If you are wondering why spark.yarn.keytab/principal were added during the stack upgrade even though the cluster is not kerberized, here is the answer:

      <transfer operation="copy" from-type="spark2-defaults" from-key="spark.history.kerberos.keytab" to-key="spark.yarn.keytab" default-value="" if-type="spark2-thrift-sparkconf" if-key="spark.yarn.keytab" if-key-state="absent"/>
       <transfer operation="copy" from-type="spark2-defaults" from-key="spark.history.kerberos.principal" to-key="spark.yarn.principal" default-value="" if-type="spark2-thrift-sparkconf" if-key="spark.yarn.principal" if-key-state="absent"/>
      

      I assumed that if "spark.history.kerberos.keytab/principal" can be present on a non-kerberized cluster, then "spark.yarn.keytab/principal" could be added too; we have the same logic for many other components in Ambari. So the question is: should this be fixed on the Ambari side, i.e. add spark.yarn.keytab/principal only if Kerberos is enabled, or should a condition be modified/added on the SPARK side so these properties are not used when Kerberos is disabled or the value is empty/none?
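
      If it were handled on the SPARK side, one option is to treat an empty or "none" keytab as "not configured". A minimal sketch, assuming the property arrives as an empty string or the literal "none" (hypothetical helper, not the shipped fix):

      import org.apache.spark.SparkConf

      object KeytabGuard {
        // Hypothetical helper: return the keytab only when it holds a real path,
        // so a non-kerberized cluster never reaches the keytab existence check.
        def effectiveKeytab(conf: SparkConf): Option[String] =
          conf.getOption("spark.yarn.keytab")
            .map(_.trim)
            .filter(v => v.nonEmpty && !v.equalsIgnoreCase("none"))
      }

      The Ambari-side alternative mentioned above would instead make the upgrade transfer conditional on Kerberos being enabled, so the properties are never written on a non-kerberized cluster.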

       


People

    Assignee: Vitaly Brodetskyi (vbrodetskyi)
    Reporter: Vitaly Brodetskyi (vbrodetskyi)


Time Tracking

    Original Estimate: Not Specified
    Remaining Estimate: 0h
    Time Spent: 2.5h