Details

    • Type: Improvement
    • Status: Closed
    • Priority: Blocker
    • Resolution: Fixed
    • Affects Version/s: 0.6.0
    • Fix Version/s: 0.7.0
    • Component/s: deployment
    • Labels:
      None

      Description

      Deployment recipes need to be updated to install and configure a Spark standalone cluster.

          Activity

          Konstantin Boudnik added a comment -

          The recipe should be pretty trivial. A Spark standalone cluster doesn't need much beyond a master node name and port number (which could be the same as the cluster head node with the standard port 7077).

          Every Spark worker will have to specify
          {{STANDALONE_SPARK_MASTER_HOST}} in
          /etc/spark/conf/spark-env.sh and the cluster will come up properly.
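          As a sketch, the worker-side configuration described above might look like the following; the master hostname is hypothetical, and SPARK_MASTER_PORT is assumed here alongside the variable named in the comment:

          ```shell
          # /etc/spark/conf/spark-env.sh (on each worker node) -- illustrative sketch.
          # Point every worker at the standalone master; the hostname below is a
          # placeholder, and 7077 is Spark's standard master port.
          export STANDALONE_SPARK_MASTER_HOST=head-node.example.com  # hypothetical host
          export SPARK_MASTER_PORT=7077                              # assumed variable
          ```

          With that in place, starting the worker daemons should have them register against the head node's master at port 7077.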

          Roman Shaposhnik added a comment -

          Please review

          Konstantin Boudnik added a comment -

          +1 - the patch looks right, although I haven't tested it yet.


            People

            • Assignee:
              Roman Shaposhnik
              Reporter:
              Konstantin Boudnik
            • Votes:
              0
              Watchers:
              2

              Dates

              • Created:
                Updated:
                Resolved:

                Development