Ambari / AMBARI-23000

Timeline Service V2.0 reader install fails if wget is not already installed on the host

Details

    • Type: Bug
    • Status: Resolved
    • Priority: Major
    • Resolution: Fixed
    • Affects Version/s: None
    • Fix Version/s: 3.0.0
    • Component/s: None

    Description

      Encountered while testing Atlantic Beta 1.
      Timeline Service V2.0 reader install fails if wget is not already
      installed on the host.
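
      The failing step is the Execute'd shell pipeline shown in the logs
      below: hbase_service.py composes a wget-based download command and
      runs it as root without first declaring wget as a dependency, so a
      minimal host image fails with exit code 127 (command not found). A
      minimal sketch of the kind of guard that avoids this (not
      necessarily the committed patch; the function body is hypothetical
      and reuses only the resources and retry settings visible in this
      log) would ensure the package the same way the agent already
      ensures unzip and curl:

      from resource_management.core.resources.packaging import Package
      from resource_management.core.resources.system import Execute

      def install_hbase(env):
          # Hypothetical guard: declare wget before invoking it; the retry
          # settings mirror those used for unzip/curl in the stdout below.
          Package('wget', retry_on_repo_unavailability=False, retry_count=5)

          # Same pipeline the agent runs today (URL and paths taken from
          # the log below).
          hbase_download_cmd = (
              "umask 0022;"
              "wget --no-cookies --no-check-certificate "
              "http://public-repo-1.hortonworks.com/ARTIFACTS/dist/hbase/1.2.6/"
              "hadoop-2.7.5/hbase-1.2.6-bin.tar.gz && "
              "tar -xzf hbase-1.2.6-bin.tar.gz && "
              "rm -rf hbase-1.2.6-bin.tar.gz && "
              "rm -rf /usr/hdp/3.0.0.0-809/hadoop-yarn-hbase && "
              "mv hbase-* /usr/hdp/3.0.0.0-809/hadoop-yarn-hbase"
          )
          Execute(hbase_download_cmd, user="root", logoutput=True)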

      stderr:
      Traceback (most recent call last):
      File "/var/lib/ambari-agent/cache/common-services/YARN/3.0.0.3.0/package/scripts/timelinereader.py", line 101, in <module>
      ApplicationTimelineReader().execute()
      File "/usr/lib/python2.6/site-packages/resource_management/libraries/script/script.py", line 376, in execute
      method(env)
      File "/var/lib/ambari-agent/cache/common-services/YARN/3.0.0.3.0/package/scripts/timelinereader.py", line 45, in install
      hbase_service.install_hbase(env)
      File "/var/lib/ambari-agent/cache/common-services/YARN/3.0.0.3.0/package/scripts/hbase_service.py", line 82, in install_hbase
      Execute(hbase_download_cmd, user="root", logoutput=True)
      File "/usr/lib/python2.6/site-packages/resource_management/core/base.py", line 166, in _init_
      self.env.run()
      File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 160, in run
      self.run_action(resource, action)
      File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 124, in run_action
      provider_action()
      File "/usr/lib/python2.6/site-packages/resource_management/core/providers/system.py", line 262, in action_run
      tries=self.resource.tries, try_sleep=self.resource.try_sleep)
      File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 72, in inner
      result = function(command, **kwargs)
      File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 102, in checked_call
      tries=tries, try_sleep=try_sleep, timeout_kill_strategy=timeout_kill_strategy)
      File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 150, in _call_wrapper
      result = _call(command, **kwargs_copy)
      File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 303, in _call
      raise ExecutionFailed(err_msg, code, out, err)
      resource_management.core.exceptions.ExecutionFailed: Execution of 'umask 0022;wget --no-cookies --no-check-certificate http://public-repo-1.hortonworks.com/ARTIFACTS/dist/hbase/1.2.6/hadoop-2.7.5/hbase-1.2.6-bin.tar.gz && tar -xzf hbase-1.2.6-bin.tar.gz && rm -rf hbase-1.2.6-bin.tar.gz && rm -rf /usr/hdp/3.0.0.0-809/hadoop-yarn-hbase && mv hbase-* /usr/hdp/3.0.0.0-809/hadoop-yarn-hbase' returned 127. -bash: wget: command not found
      stdout:
      2018-02-13 07:39:31,705 - Stack Feature Version Info: Cluster Stack=3.0, Command Stack=None, Command Version=None -> 3.0
      2018-02-13 07:39:31,711 - Using hadoop conf dir: /usr/hdp/current/hadoop-client/conf
      2018-02-13 07:39:31,713 - Group['livy'] {}
      2018-02-13 07:39:31,714 - Group['spark'] {}
      2018-02-13 07:39:31,714 - Group['hdfs'] {}
      2018-02-13 07:39:31,714 - Group['zeppelin'] {}
      2018-02-13 07:39:31,714 - Group['hadoop'] {}
      2018-02-13 07:39:31,715 - Group['users'] {}
      2018-02-13 07:39:31,715 - Group['knox'] {}
      2018-02-13 07:39:31,716 - User['hive'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
      2018-02-13 07:39:31,717 - User['zookeeper'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
      2018-02-13 07:39:31,718 - User['infra-solr'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
      2018-02-13 07:39:31,719 - User['atlas'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
      2018-02-13 07:39:31,720 - User['ams'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
      2018-02-13 07:39:31,721 - User['tez'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop', 'users'], 'uid': None}
      2018-02-13 07:39:31,722 - User['zeppelin'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['zeppelin', 'hadoop'], 'uid': None}
      2018-02-13 07:39:31,723 - User['livy'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['livy', 'hadoop'], 'uid': None}
      2018-02-13 07:39:31,724 - User['spark'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['spark', 'hadoop'], 'uid': None}
      2018-02-13 07:39:31,724 - User['ambari-qa'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop', 'users'], 'uid': None}
      2018-02-13 07:39:31,725 - User['kafka'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
      2018-02-13 07:39:31,726 - User['hdfs'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hdfs', 'hadoop'], 'uid': None}
      2018-02-13 07:39:31,727 - User['yarn'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
      2018-02-13 07:39:31,728 - User['mapred'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
      2018-02-13 07:39:31,729 - User['hbase'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
      2018-02-13 07:39:31,731 - User['knox'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop', 'knox'], 'uid': None}
      2018-02-13 07:39:31,732 - File['/var/lib/ambari-agent/tmp/changeUid.sh'] {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}
      2018-02-13 07:39:31,735 - Execute['/var/lib/ambari-agent/tmp/changeUid.sh ambari-qa /tmp/hadoop-ambari-qa,/tmp/hsperfdata_ambari-qa,/home/ambari-qa,/tmp/ambari-qa,/tmp/sqoop-ambari-qa 0'] {'not_if': '(test $(id -u ambari-qa) -gt 1000) || (false)'}
      2018-02-13 07:39:31,744 - Skipping Execute['/var/lib/ambari-agent/tmp/changeUid.sh ambari-qa /tmp/hadoop-ambari-qa,/tmp/hsperfdata_ambari-qa,/home/ambari-qa,/tmp/ambari-qa,/tmp/sqoop-ambari-qa 0'] due to not_if
      2018-02-13 07:39:31,744 - Directory['/tmp/hbase-hbase'] {'owner': 'hbase', 'create_parents': True, 'mode': 0775, 'cd_access': 'a'}
      2018-02-13 07:39:31,746 - File['/var/lib/ambari-agent/tmp/changeUid.sh'] {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}
      2018-02-13 07:39:31,748 - File['/var/lib/ambari-agent/tmp/changeUid.sh'] {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}
      2018-02-13 07:39:31,749 - call['/var/lib/ambari-agent/tmp/changeUid.sh hbase'] {}
      2018-02-13 07:39:31,759 - call returned (0, '1014')
      2018-02-13 07:39:31,760 - Execute['/var/lib/ambari-agent/tmp/changeUid.sh hbase /home/hbase,/tmp/hbase,/usr/bin/hbase,/var/log/hbase,/tmp/hbase-hbase 1014'] {'not_if': '(test $(id -u hbase) -gt 1000) || (false)'}
      2018-02-13 07:39:31,766 - Skipping Execute['/var/lib/ambari-agent/tmp/changeUid.sh hbase /home/hbase,/tmp/hbase,/usr/bin/hbase,/var/log/hbase,/tmp/hbase-hbase 1014'] due to not_if
      2018-02-13 07:39:31,766 - Group['hdfs'] {}
      2018-02-13 07:39:31,767 - User['hdfs'] {'fetch_nonlocal_groups': True, 'groups': ['hdfs', 'hadoop', u'hdfs']}
      2018-02-13 07:39:31,767 - FS Type:
      2018-02-13 07:39:31,767 - Directory['/etc/hadoop'] {'mode': 0755}
      2018-02-13 07:39:31,786 - File['/usr/hdp/current/hadoop-client/conf/hadoop-env.sh'] {'content': InlineTemplate(...), 'owner': 'hdfs', 'group': 'hadoop'}
      2018-02-13 07:39:31,787 - Directory['/var/lib/ambari-agent/tmp/hadoop_java_io_tmpdir'] {'owner': 'hdfs', 'group': 'hadoop', 'mode': 01777}
      2018-02-13 07:39:31,803 - Repository['HDP-3.0-repo-1'] {'append_to_file': False, 'base_url': 'http://s3.amazonaws.com/dev.hortonworks.com/HDP/centos7/3.x/BUILDS/3.0.0.0-809', 'action': ['create'], 'components': [u'HDP', 'main'], 'repo_template': '[{{repo_id}}]\nname={{repo_id}}\n{% if mirror_list %}mirrorlist={{mirror_list}}{% else %}baseurl={{base_url}}{% endif %}\n\npath=/\nenabled=1\ngpgcheck=0', 'repo_file_name': 'ambari-hdp-1', 'mirror_list': None}
      2018-02-13 07:39:31,811 - File['/etc/yum.repos.d/ambari-hdp-1.repo'] {'content': '[HDP-3.0-repo-1]\nname=HDP-3.0-repo-1\nbaseurl=http://s3.amazonaws.com/dev.hortonworks.com/HDP/centos7/3.x/BUILDS/3.0.0.0-809\n\npath=/\nenabled=1\ngpgcheck=0'}
      2018-02-13 07:39:31,812 - Writing File['/etc/yum.repos.d/ambari-hdp-1.repo'] because contents don't match
      2018-02-13 07:39:31,813 - Repository['HDP-UTILS-1.1.0.21-repo-1'] {'append_to_file': True, 'base_url': 'http://public-repo-1.hortonworks.com/HDP-UTILS-1.1.0.21/repos/centos7', 'action': ['create'], 'components': [u'HDP-UTILS', 'main'], 'repo_template': '[{{repo_id}}]\nname={{repo_id}}\n{% if mirror_list %}mirrorlist={{mirror_list}}{% else %}baseurl={{base_url}}{% endif %}\n\npath=/\nenabled=1\ngpgcheck=0', 'repo_file_name': 'ambari-hdp-1', 'mirror_list': None}
      2018-02-13 07:39:31,816 - File['/etc/yum.repos.d/ambari-hdp-1.repo'] {'content': '[HDP-3.0-repo-1]\nname=HDP-3.0-repo-1\nbaseurl=http://s3.amazonaws.com/dev.hortonworks.com/HDP/centos7/3.x/BUILDS/3.0.0.0-809\n\npath=/\nenabled=1\ngpgcheck=0\n[HDP-UTILS-1.1.0.21-repo-1]\nname=HDP-UTILS-1.1.0.21-repo-1\nbaseurl=http://public-repo-1.hortonworks.com/HDP-UTILS-1.1.0.21/repos/centos7\n\npath=/\nenabled=1\ngpgcheck=0'}
      2018-02-13 07:39:31,817 - Writing File['/etc/yum.repos.d/ambari-hdp-1.repo'] because contents don't match
      2018-02-13 07:39:31,817 - Package['unzip'] {'retry_on_repo_unavailability': False, 'retry_count': 5}
      2018-02-13 07:39:31,902 - Skipping installation of existing package unzip
      2018-02-13 07:39:31,902 - Package['curl'] {'retry_on_repo_unavailability': False, 'retry_count': 5}
      2018-02-13 07:39:31,912 - Skipping installation of existing package curl
      2018-02-13 07:39:31,912 - Package['hdp-select'] {'retry_on_repo_unavailability': False, 'retry_count': 5}
      2018-02-13 07:39:31,922 - Skipping installation of existing package hdp-select
      2018-02-13 07:39:31,922 - Skipping installation of existing package hdp-select
      2018-02-13 07:39:32,207 - Looking for matching packages in the following repositories: HDP-3.0-repo-1, HDP-UTILS-1.1.0.21-repo-1
      2018-02-13 07:39:34,268 - Package['hadoop_3_0_0_0_809-yarn'] {'retry_on_repo_unavailability': False, 'retry_count': 5}
      2018-02-13 07:39:34,352 - Skipping installation of existing package hadoop_3_0_0_0_809-yarn
      2018-02-13 07:39:34,354 - Package['hadoop_3_0_0_0_809-mapreduce'] {'retry_on_repo_unavailability': False, 'retry_count': 5}
      2018-02-13 07:39:34,363 - Skipping installation of existing package hadoop_3_0_0_0_809-mapreduce
      2018-02-13 07:39:34,365 - Package['hadoop_3_0_0_0_809-hdfs'] {'retry_on_repo_unavailability': False, 'retry_count': 5}
      2018-02-13 07:39:34,380 - Skipping installation of existing package hadoop_3_0_0_0_809-hdfs
      2018-02-13 07:39:34,380 - Skipping installation of existing package hadoop_3_0_0_0_809-hdfs
      2018-02-13 07:39:34,393 - Using hadoop conf dir: /usr/hdp/current/hadoop-client/conf
      2018-02-13 07:39:34,394 - Stack Feature Version Info: Cluster Stack=3.0, Command Stack=None, Command Version=None -> 3.0
      2018-02-13 07:39:34,394 - call['ambari-python-wrap /usr/bin/hdp-select status hadoop-yarn-resourcemanager'] {'timeout': 20}
      2018-02-13 07:39:34,421 - call returned (0, 'hadoop-yarn-resourcemanager - 3.0.0.0-809')
      2018-02-13 07:39:34,461 - Using hadoop conf dir: /usr/hdp/current/hadoop-client/conf
      2018-02-13 07:39:34,472 - Execute['umask 0022;wget --no-cookies --no-check-certificate http://public-repo-1.hortonworks.com/ARTIFACTS/dist/hbase/1.2.6/hadoop-2.7.5/hbase-1.2.6-bin.tar.gz && tar -xzf hbase-1.2.6-bin.tar.gz && rm -rf hbase-1.2.6-bin.tar.gz && rm -rf /usr/hdp/3.0.0.0-809/hadoop-yarn-hbase && mv hbase-* /usr/hdp/3.0.0.0-809/hadoop-yarn-hbase'] {'logoutput': True, 'user': 'root'}
      -bash: wget: command not found

      Command failed after 1 tries
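
      Note that the stdout above shows the agent unconditionally ensuring
      unzip and curl, but not wget. An alternative, equally hypothetical
      sketch would build the same pipeline around curl, which is therefore
      guaranteed to be present (curl -O saves under the remote file name,
      like wget's default, and -k skips certificate checks, like
      --no-check-certificate):

      def hbase_download_cmd(url, dest):
          # Hypothetical helper, not the committed patch: the failing
          # command with curl substituted for wget.
          tarball = url.rsplit('/', 1)[-1]
          return ("umask 0022;"
                  "curl -k -O {url} && "
                  "tar -xzf {tarball} && rm -rf {tarball} && "
                  "rm -rf {dest} && mv hbase-* {dest}").format(
                      url=url, tarball=tarball, dest=dest)

      Either variant removes the implicit dependency on wget being baked
      into the host image.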

      Attachments

        1. AMBARI-23000.patch
          2 kB
          Andrew Onischuk
        2. AMBARI-23000.patch
          2 kB
          Andrew Onischuk
        3. AMBARI-23000.patch
          2 kB
          Andrew Onischuk
        4. AMBARI-23000.patch
          2 kB
          Andrew Onischuk
        5. AMBARI-23000.patch
          2 kB
          Andrew Onischuk

            People

              Assignee: aonishuk (Andrew Onischuk)
              Reporter: aonishuk (Andrew Onischuk)
              Votes: 0
              Watchers: 1

                Time Tracking

                  Estimated: Not Specified
                  Remaining: 0h
                  Logged: 1h