SPARK-17944

sbin/start-* scripts' use of `hostname -f` fails on Solaris


Details

    • Type: Bug
    • Status: Resolved
    • Priority: Trivial
    • Resolution: Fixed
    • Affects Version/s: 2.0.1
    • Fix Version/s: 2.1.0
    • Component/s: Deploy
    • Labels: None
    • Environment: Solaris 10, Solaris 11

    Description

      $SPARK_HOME/sbin/start-master.sh fails:

      $ ./start-master.sh 
      usage: hostname [[-t] system_name]
             hostname [-D]
      starting org.apache.spark.deploy.master.Master, logging to /home/eoshaugh/local/spark/logs/spark-eoshaugh-org.apache.spark.deploy.master.Master-1-m7-16-002-ld1.out
      failed to launch org.apache.spark.deploy.master.Master:
          --properties-file FILE Path to a custom Spark properties file.
                                 Default is conf/spark-defaults.conf.
      full log in /home/eoshaugh/local/spark/logs/spark-eoshaugh-org.apache.spark.deploy.master.Master-1-m7-16-002-ld1.out
      

      I found SPARK-17546, which changed the invocation of hostname in sbin/start-master.sh, sbin/start-slaves.sh, and sbin/start-mesos-dispatcher.sh to `hostname -f`; the `-f` flag is not a valid command-line option for the Solaris hostname implementation.
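
      For reference, the relevant line in those scripts now looks roughly like the following (paraphrased from sbin/start-master.sh; the variable name and surrounding code may differ by release):

        if [ "$SPARK_MASTER_HOST" = "" ]; then
          # `hostname -f` prints the FQDN on Linux/BSD, but the Solaris hostname rejects -f with a usage error
          SPARK_MASTER_HOST="`hostname -f`"
        fi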

      As a workaround, Solaris users can substitute the following for `hostname -f`:

      `/usr/sbin/check-hostname | awk '{print $NF}'`
      

      Admittedly not an obvious fix, but it provides equivalent functionality.
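
      One way to apply that substitution locally (a sketch only, not necessarily the fix committed for this issue; SPARK_MASTER_HOST is assumed from sbin/start-master.sh) is to use `hostname -f` only where the flag is accepted and fall back to check-hostname otherwise:

        # Use `hostname -f` where supported; on Solaris derive the FQDN via check-hostname instead
        if hostname -f >/dev/null 2>&1; then
          SPARK_MASTER_HOST="`hostname -f`"
        else
          SPARK_MASTER_HOST="`/usr/sbin/check-hostname | awk '{print $NF}'`"
        fi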

      Attachments

        Activity

          People

            Assignee: Erik O'Shaughnessy (erik.oshaughnessy)
            Reporter: Erik O'Shaughnessy (erik.oshaughnessy)
            Votes: 0
            Watchers: 2

            Dates

              Created:
              Updated:
              Resolved: