Using a fresh checkout of 1.4.0-bin-hadoop2.6, if you run

./start-slave.sh 1 spark://localhost:7077

the launch fails with:
failed to launch org.apache.spark.deploy.worker.Worker:
Default is conf/spark-defaults.conf.
15/06/16 13:11:08 INFO Utils: Shutdown hook called
It seems the worker number is not being accepted as described in the documentation, which says:
./sbin/start-slave.sh <worker#> <master-spark-URL>
but the start-slave.sh script itself states:
usage="Usage: start-slave.sh <spark-master-URL> where <spark-master-URL> is like spark://localhost:7077"
I have checked for similar issues using:

and found nothing similar, so I am raising this as an issue.