SPARK-967

start-slaves.sh uses local path from master on remote slave nodes


Details

    • Type: Bug
    • Status: Resolved
    • Priority: Trivial
    • Resolution: Not A Problem
    • Affects Version/s: 0.8.0, 0.8.1, 0.9.0
    • Fix Version/s: None
    • Component/s: Deploy

    Description

      If a slave node's home path differs from the master's, start-slaves.sh fails to start a worker instance on it; on the other nodes it behaves as expected. In my case:

      $ ./bin/start-slaves.sh
      node05.dev.vega.ru: bash: line 0: cd: /usr/home/etsvigun/spark/bin/..: No such file or directory
      node04.dev.vega.ru: org.apache.spark.deploy.worker.Worker running as process 4796. Stop it first.
      node03.dev.vega.ru: org.apache.spark.deploy.worker.Worker running as process 61348. Stop it first.

      I don't mention /usr/home anywhere; the only environment variable I set is $SPARK_HOME, defined relative to $HOME on every node. This makes me think some script takes `pwd` on the master and tries to use it on the slaves.
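
      A minimal sketch of what I suspect is happening (illustrative only, not the actual Spark scripts): the launcher resolves its own directory to an absolute path on the master, then embeds that path verbatim in the command it runs over ssh, so a slave with a different home layout fails the cd:

      #!/usr/bin/env bash
      # Illustrative sketch of the suspected failure mode, not Spark's code.
      # On the master, resolve the launcher's directory to an absolute path,
      # e.g. /usr/home/etsvigun/spark/bin:
      bin=$(cd "$(dirname "$0")" && pwd)

      # The master-side path is then embedded verbatim in the remote command,
      # so each slave tries to cd into the *master's* directory:
      while read -r slave; do
        ssh -n "$slave" "cd $bin/.. && ./bin/start-slave.sh" &
      done < "$SPARK_HOME/conf/slaves"
      wait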

      Spark version: fb6875dd5c9334802580155464cef9ac4d4cc1f0
      OS: FreeBSD 8.4
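
      A possible workaround, assuming the standalone scripts expect an identical installation path on every node: make the master's absolute path resolve on the odd slave too, e.g. with a symlink. The hostname is from the output above; the target path is an assumption:

      # Hypothetical workaround: on the slave whose home prefix differs,
      # symlink the master's absolute path to the local Spark install.
      ssh node05.dev.vega.ru 'mkdir -p /usr/home/etsvigun && ln -s "$HOME/spark" /usr/home/etsvigun/spark'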


          People

            Assignee: Unassigned
            Reporter: Evgeniy Tsvigun (etsvigun)
            Votes: 2
            Watchers: 5
