Spark / SPARK-919

spark-ec2 launch --resume doesn't re-initialize all modules


Details

    • Type: Bug
    • Status: Resolved
    • Priority: Major
    • Resolution: Fixed
    • Affects Version: 0.8.0
    • Fix Version: 0.8.1
    • Component: EC2
    • Labels: None

    Description

      I launched a Spark cluster using the new EC2 scripts, stopped it with stop, restarted it with start, and then ran launch --resume to re-deploy the Spark configurations.

      It looks like the script exits after initializing Spark if it's already installed:

      Initializing spark
      ~ ~/spark-ec2
      Spark seems to be installed. Exiting.
      Connection to ec2-*-.amazonaws.com closed.
      Spark standalone cluster started at http://ec2-*.compute-1.amazonaws.com:8080
      Ganglia started at http://ec2-*-1.amazonaws.com:5080/ganglia
      Done!
      

      The problem appears to be that the module init scripts are run by sourcing them into the top-level setup.sh script rather than executing them in subshells, so a module script's own exit command terminates the top-level setup script:

      # Install / Init module
      for module in $MODULES; do
        echo "Initializing $module"
        if [[ -e $module/init.sh ]]; then
          source $module/init.sh
        fi
      done
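
      One way to contain the module's exit is to execute each init.sh in a child shell instead of sourcing it, so the exit only terminates that child process. The following is a minimal, self-contained sketch of the difference, using a throwaway module directory under /tmp (the directory and script names are hypothetical, not from spark-ec2):

      ```shell
      #!/usr/bin/env bash
      # Create a fake module whose init script exits early, mimicking
      # "Spark seems to be installed. Exiting."
      mkdir -p /tmp/demo-module
      cat > /tmp/demo-module/init.sh <<'EOF'
      echo "Module seems to be installed. Exiting."
      exit 0
      EOF

      # If we used `source /tmp/demo-module/init.sh` here, the script's
      # `exit 0` would run in the current shell and kill this loop, so no
      # later modules would be initialized and "Done!" would never print.
      for module in /tmp/demo-module; do
        echo "Initializing $module"
        if [[ -e $module/init.sh ]]; then
          bash "$module/init.sh"   # child shell: the module's exit is contained
        fi
      done
      echo "Done!"   # still reached, unlike with `source`
      ```

      Running the script in a subshell with ( source ... ) would work equally well; the key point is that exit must not execute in the same shell process as the module loop.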
      


          People

            Assignee: Josh Rosen
            Reporter: Josh Rosen
