Spark / SPARK-8395

spark-submit documentation is incorrect


    Details

    • Type: Improvement
    • Status: Closed
    • Priority: Minor
    • Resolution: Fixed
    • Affects Version/s: 1.4.0
    • Fix Version/s: 1.4.1, 1.5.0
    • Component/s: Documentation
    • Labels:
      None

      Description

      Using a fresh checkout of 1.4.0-bin-hadoop2.6

      if you run
      ./start-slave.sh 1 spark://localhost:7077

      you get
      failed to launch org.apache.spark.deploy.worker.Worker:
      Default is conf/spark-defaults.conf.
      15/06/16 13:11:08 INFO Utils: Shutdown hook called

      it seems the worker number is not being accepted as described here:
      https://spark.apache.org/docs/latest/spark-standalone.html

      The documentation says:
      ./sbin/start-slave.sh <worker#> <master-spark-URL>

      but the start-slave.sh script states:
      usage="Usage: start-slave.sh <spark-master-URL> where <spark-master-URL> is like spark://localhost:7077"

      I have checked for similar issues using:
      https://issues.apache.org/jira/browse/SPARK-6552?jql=text%20~%20%22start-slave%22

      and found nothing similar, so I am raising this as an issue.
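
      Based on the usage string quoted above, a working invocation under the 1.4.0 scripts appears to drop the worker-number argument entirely. A minimal sketch (assumes a fresh extraction of the 1.4.0 binary distribution, run from its root directory, with no master already running):

      ```shell
      # Start the standalone master first; by default it listens on
      # spark://<hostname>:7077
      ./sbin/start-master.sh

      # Per the 1.4.0 script's own usage string, pass only the master URL --
      # no <worker#> argument, contrary to the standalone-mode documentation
      ./sbin/start-slave.sh spark://localhost:7077
      ```

      This matches the usage string the script prints, not the documented `./sbin/start-slave.sh <worker#> <master-spark-URL>` form.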


              People

              • Assignee: Sean R. Owen (srowen)
              • Reporter: Dev Lakhani (devl.development)
              • Votes: 0
              • Watchers: 2
