Kafka / KAFKA-8344

Fix vagrant-up.sh to work with AWS properly


Details

    • Type: Bug
    • Status: Resolved
    • Priority: Major
    • Resolution: Fixed
    • Affects Version/s: None
    • Fix Version/s: 2.3.0
    • Component/s: system tests
    • Labels: None

    Description

      I tried to run vagrant/vagrant-up.sh --aws with the following Vagrantfile.local:

      enable_dns = true
      enable_hostmanager = false
      
      # EC2
      ec2_access_key = "********************"
      ec2_secret_key = "****************************************"
      ec2_keypair_name = "keypair"
      ec2_keypair_file = "/path/to/keypair/file"
      ec2_region = "ap-northeast-1"
      ec2_ami = "ami-0905ffddadbfd01b7"
      ec2_security_groups = "sg-********"
      ec2_subnet_id = "subnet-********"
      
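      Before running vagrant-up.sh --aws with a Vagrantfile.local like the above, the relevant Vagrant plugins need to be installed. This is a sketch of the usual prerequisites; the plugin names are taken from the typical Kafka vagrant setup, so verify them against vagrant/README.md in your checkout:

```shell
# Assumed prerequisites for the AWS path of vagrant-up.sh:
vagrant plugin install vagrant-aws          # AWS provider for Vagrant
vagrant plugin install vagrant-hostmanager  # manages /etc/hosts entries across machines
```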

      The EC2 instances were created successfully, but the script then failed with the following error:

      $ vagrant/vagrant-up.sh --aws
      
      (snip)
      
      An active machine was found with a different provider. Vagrant
      currently allows each machine to be brought up with only a single
      provider at a time. A future version will remove this limitation.
      Until then, please destroy the existing machine to up with a new
      provider.
      
      Machine name: zk1
      Active provider: aws
      Requested provider: virtualbox
      

      It seems that the vagrant hostmanager command also requires the --provider=aws option, in addition to vagrant up. Without it, hostmanager apparently falls back to the default provider (virtualbox) and aborts because the machines are already active under the aws provider. With that option added, the script succeeded:

      $ git diff
      diff --git a/vagrant/vagrant-up.sh b/vagrant/vagrant-up.sh
      index 6a4ef9564..9210a5357 100755
      --- a/vagrant/vagrant-up.sh
      +++ b/vagrant/vagrant-up.sh
      @@ -220,7 +220,7 @@ function bring_up_aws {
                   # We still have to bring up zookeeper/broker nodes serially
                   echo "Bringing up zookeeper/broker machines serially"
                   vagrant up --provider=aws --no-parallel --no-provision $zk_broker_machines $debug
      -            vagrant hostmanager
      +            vagrant hostmanager --provider=aws
                   vagrant provision
               fi
      
      @@ -231,11 +231,11 @@ function bring_up_aws {
                   local vagrant_rsync_temp_dir=$(mktemp -d);
                   TMPDIR=$vagrant_rsync_temp_dir vagrant_batch_command "vagrant up $debug --provider=aws" "$worker_machines" "$max_parallel"
                   rm -rf $vagrant_rsync_temp_dir
      -            vagrant hostmanager
      +            vagrant hostmanager --provider=aws
               fi
           else
               vagrant up --provider=aws --no-parallel --no-provision $debug
      -        vagrant hostmanager
      +        vagrant hostmanager --provider=aws
               vagrant provision
           fi
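      As an aside, the same fix could also be shaped as a single wrapper so the provider flag cannot be forgotten at any call site. This is a sketch only, not the committed patch; run_hostmanager is a hypothetical helper name:

```shell
#!/usr/bin/env bash
# Sketch: forward an optional provider to `vagrant hostmanager`.
# AWS call sites would pass "aws" once; local call sites pass nothing.
run_hostmanager() {
    local provider="$1"
    if [ -n "$provider" ]; then
        vagrant hostmanager --provider="$provider"
    else
        vagrant hostmanager
    fi
}
```

      Each of the three hunks above would then become run_hostmanager aws.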
      
      $ vagrant/vagrant-up.sh --aws
      
      (snip)
      
      ==> broker3: Running provisioner: shell...
          broker3: Running: /tmp/vagrant-shell20190509-25399-8f1wgz.sh
          broker3: Killing server
          broker3: No kafka server to stop
          broker3: Starting server
      $ vagrant status
      Current machine states:
      
      zk1                       running (aws)
      broker1                   running (aws)
      broker2                   running (aws)
      broker3                   running (aws)
      
      This environment represents multiple VMs. The VMs are all listed
      above with their current state. For more information about a specific
      VM, run `vagrant status NAME`.
      $ vagrant ssh broker1
      
      (snip)
      
      ubuntu@ip-172-16-0-62:~$ /opt/kafka-dev/bin/kafka-topics.sh --bootstrap-server broker1:9092,broker2:9092,broker3:9092 --create --partitions 1 --replication-factor 3 --topic sandbox
      
      (snip)
      
      ubuntu@ip-172-16-0-62:~$ /opt/kafka-dev/bin/kafka-topics.sh --bootstrap-server broker1:9092,broker2:9092,broker3:9092 --list
      
      (snip)
      
      sandbox
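      To exercise the replicated topic end to end on such a cluster, the stock console tools can be used as a further smoke test. A sketch, assuming the same paths and topic name as above (note the console producer uses the pre-2.5 --broker-list spelling):

```shell
# Produce one record to the replicated topic (run inside broker1).
echo "hello" | /opt/kafka-dev/bin/kafka-console-producer.sh \
    --broker-list broker1:9092,broker2:9092,broker3:9092 \
    --topic sandbox

# Read it back from the beginning, stopping after one record.
/opt/kafka-dev/bin/kafka-console-consumer.sh \
    --bootstrap-server broker1:9092,broker2:9092,broker3:9092 \
    --topic sandbox --from-beginning --max-messages 1
```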
      

            People

              Assignee: Kengo Seki (sekikn)
              Reporter: Kengo Seki (sekikn)
