  Spark / SPARK-11991

spark_ec2.py does not perform sanity checks on hostnames


Details

    • Type: Bug
    • Status: Resolved
    • Priority: Major
    • Resolution: Fixed
    • Affects Version/s: 1.5.2
    • Fix Version/s: 2.0.0
    • Component/s: EC2
    • Labels: None

    Description

      `ec2/spark_ec2.py` does not perform any sanity checks on hostnames when testing connectivity in `is_ssh_available` and its descendants.

      This causes unexpected behavior when running a cluster in a VPC subnet without public IPs if `--private-ips` is not given. While `--private-ips` should arguably be required in this context, the current failure mode is suboptimal.

      [ ... ]
      All 1 slaves granted
      Launched master in us-west-1c, regid = r-redacted
      Waiting for AWS to propagate instance metadata...
      Waiting for cluster to enter 'ssh-ready' state............Password:

      What has happened here is that the public DNS name for the instance is an empty string, causing the SSH check later in the script to inadvertently connect to localhost when testing connectivity to the cluster. The password prompt above is OS X's sshd asking to authenticate the user on that connection.
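      A minimal sketch of the kind of guard the report is asking for, assuming the hostname comes back from the EC2 API as an empty string when no public DNS name exists. `check_ssh_host` is a hypothetical helper for illustration, not a function in spark_ec2.py:

      ```python
      def check_ssh_host(host):
          """Reject empty or whitespace-only hostnames before invoking ssh.

          Without a guard like this, passing an empty host string to the
          ssh subprocess can silently target localhost, which is the
          failure mode described above.
          """
          if host is None or host.strip() == "":
              raise ValueError(
                  "instance has no usable hostname; "
                  "did you forget --private-ips?")
          return host.strip()
      ```

      Failing fast with an explicit error would replace the misleading local password prompt with an actionable message.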


          People

            Assignee: Jeremy Derr (jcderr)
            Reporter: Jeremy Derr (jcderr)
            Votes: 0
            Watchers: 2

            Dates

              Created:
              Updated:
              Resolved: