Details

Type: Bug
Status: Resolved
Priority: Major
Resolution: Fixed
Description
I'm seeing the following error during HDP 3 beta installation.
Traceback (most recent call last):
  File "/usr/lib/python2.6/site-packages/resource_management/core/providers/package/__init__.py", line 268, in _call_with_retries
    code, out = func(cmd, **kwargs)
  File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 72, in inner
    result = function(command, **kwargs)
  File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 102, in checked_call
    tries=tries, try_sleep=try_sleep, timeout_kill_strategy=timeout_kill_strategy)
  File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 150, in _call_wrapper
    result = _call(command, **kwargs_copy)
  File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 303, in _call
    raise ExecutionFailed(err_msg, code, out, err)
ExecutionFailed: Execution of '/usr/bin/yum -d 0 -e 0 -y install mysql-community-release' returned 1. Error: Nothing to do
The above exception was the cause of the following exception:
Traceback (most recent call last):
  File "/var/lib/ambari-agent/cache/common-services/HIVE/3.0.0.3.0/package/scripts/mysql_server.py", line 68, in <module>
    MysqlServer().execute()
  File "/usr/lib/python2.6/site-packages/resource_management/libraries/script/script.py", line 376, in execute
    method(env)
  File "/var/lib/ambari-agent/cache/common-services/HIVE/3.0.0.3.0/package/scripts/mysql_server.py", line 37, in install
    self.install_packages(env)
  File "/usr/lib/python2.6/site-packages/resource_management/libraries/script/script.py", line 831, in install_packages
    retry_count=agent_stack_retry_count)
  File "/usr/lib/python2.6/site-packages/resource_management/core/base.py", line 166, in __init__
    self.env.run()
  File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 160, in run
    self.run_action(resource, action)
  File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 124, in run_action
    provider_action()
  File "/usr/lib/python2.6/site-packages/resource_management/core/providers/package/__init__.py", line 53, in action_install
    self.install_package(package_name, self.resource.use_repos, self.resource.skip_repos)
  File "/usr/lib/python2.6/site-packages/resource_management/core/providers/package/yumrpm.py", line 251, in install_package
    self.checked_call_with_retries(cmd, sudo=True, logoutput=self.get_logoutput())
  File "/usr/lib/python2.6/site-packages/resource_management/core/providers/package/__init__.py", line 251, in checked_call_with_retries
    return self._call_with_retries(cmd, is_checked=True, **kwargs)
  File "/usr/lib/python2.6/site-packages/resource_management/core/providers/package/__init__.py", line 268, in _call_with_retries
    code, out = func(cmd, **kwargs)
  File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 72, in inner
    result = function(command, **kwargs)
  File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 102, in checked_call
    tries=tries, try_sleep=try_sleep, timeout_kill_strategy=timeout_kill_strategy)
  File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 150, in _call_wrapper
    result = _call(command, **kwargs_copy)
  File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 303, in _call
    raise ExecutionFailed(err_msg, code, out, err)
resource_management.core.exceptions.ExecutionFailed: Execution of '/usr/bin/yum -d 0 -e 0 -y install mysql-community-release' returned 1. Error: Nothing to do
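
For context: yum exits with status 1 and "Error: Nothing to do" when it finds no installable candidate for the requested package, typically because no enabled repository provides it. Ambari's shell wrapper then converts the nonzero exit code into the ExecutionFailed above. A minimal sketch of that wrapper pattern (illustration only, not the actual resource_management code):

{code:python}
import subprocess

class ExecutionFailed(Exception):
    """Raised when a shell command exits with a nonzero status."""

def checked_call(cmd):
    # Run the command, capture its output, and raise when the exit code
    # is nonzero -- the same shape as the error message shown above.
    proc = subprocess.Popen(cmd, stdout=subprocess.PIPE,
                            stderr=subprocess.PIPE, universal_newlines=True)
    out, err = proc.communicate()
    if proc.returncode != 0:
        raise ExecutionFailed("Execution of '%s' returned %d. %s"
                              % (" ".join(cmd), proc.returncode, err.strip()))
    return proc.returncode, out

# checked_call(["/usr/bin/yum", "-d", "0", "-e", "0", "-y",
#               "install", "mysql-community-release"])
{code}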
stdout: /var/lib/ambari-agent/data/output-49.txt
2018-02-16 18:25:54,593 - Stack Feature Version Info: Cluster Stack=3.0, Command Stack=None, Command Version=None -> 3.0
2018-02-16 18:25:54,605 - Using hadoop conf dir: /usr/hdp/3.0.0.0-814/hadoop/conf
2018-02-16 18:25:54,608 - Group['livy'] {}
2018-02-16 18:25:54,609 - Group['spark'] {}
2018-02-16 18:25:54,610 - Group['hdfs'] {}
2018-02-16 18:25:54,610 - Group['zeppelin'] {}
2018-02-16 18:25:54,610 - Group['hadoop'] {}
2018-02-16 18:25:54,611 - Group['users'] {}
2018-02-16 18:25:54,611 - Group['knox'] {}
2018-02-16 18:25:54,612 - User['hive']
2018-02-16 18:25:54,614 - User['zookeeper'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2018-02-16 18:25:54,615 - User['infra-solr'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2018-02-16 18:25:54,617 - User['atlas'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2018-02-16 18:25:54,618 - User['ams'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2018-02-16 18:25:54,620 - User['tez'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop', 'users'], 'uid': None}
2018-02-16 18:25:54,621 - User['zeppelin'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['zeppelin', 'hadoop'], 'uid': None}
2018-02-16 18:25:54,623 - User['livy'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['livy', 'hadoop'], 'uid': None}
2018-02-16 18:25:54,624 - User['spark'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['spark', 'hadoop'], 'uid': None}
2018-02-16 18:25:54,626 - User['ambari-qa'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop', 'users'], 'uid': None}
2018-02-16 18:25:54,627 - User['kafka'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2018-02-16 18:25:54,629 - User['hdfs'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hdfs', 'hadoop'], 'uid': None}
2018-02-16 18:25:54,630 - User['yarn'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2018-02-16 18:25:54,632 - User['mapred'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2018-02-16 18:25:54,633 - User['hbase'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2018-02-16 18:25:54,635 - User['knox'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop', 'knox'], 'uid': None}
2018-02-16 18:25:54,636 - File['/var/lib/ambari-agent/tmp/changeUid.sh'] {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}
2018-02-16 18:25:54,639 - Execute['/var/lib/ambari-agent/tmp/changeUid.sh ambari-qa /tmp/hadoop-ambari-qa,/tmp/hsperfdata_ambari-qa,/home/ambari-qa,/tmp/ambari-qa,/tmp/sqoop-ambari-qa 0'] {'not_if': '(test $(id -u ambari-qa) -gt 1000) || (false)'}
2018-02-16 18:25:54,650 - Skipping Execute['/var/lib/ambari-agent/tmp/changeUid.sh ambari-qa /tmp/hadoop-ambari-qa,/tmp/hsperfdata_ambari-qa,/home/ambari-qa,/tmp/ambari-qa,/tmp/sqoop-ambari-qa 0'] due to not_if
2018-02-16 18:25:54,651 - Directory['/tmp/hbase-hbase']
2018-02-16 18:25:54,653 - File['/var/lib/ambari-agent/tmp/changeUid.sh'] {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}
2018-02-16 18:25:54,657 - File['/var/lib/ambari-agent/tmp/changeUid.sh'] {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}
2018-02-16 18:25:54,658 - call['/var/lib/ambari-agent/tmp/changeUid.sh hbase'] {}
2018-02-16 18:25:54,672 - call returned (0, '1015')
2018-02-16 18:25:54,674 - Execute['/var/lib/ambari-agent/tmp/changeUid.sh hbase /home/hbase,/tmp/hbase,/usr/bin/hbase,/var/log/hbase,/tmp/hbase-hbase 1015']
2018-02-16 18:25:54,683 - Skipping Execute['/var/lib/ambari-agent/tmp/changeUid.sh hbase /home/hbase,/tmp/hbase,/usr/bin/hbase,/var/log/hbase,/tmp/hbase-hbase 1015'] due to not_if
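
(The "Skipping Execute[...] due to not_if" lines above are expected: an Execute resource with a not_if guard is skipped when the guard command exits 0, i.e. when the work is already done. A rough sketch of those semantics, for illustration only:

{code:python}
import subprocess

def execute(command, not_if=None):
    # Sketch of the Execute/not_if behavior visible in the log: run the
    # guard through the shell first; a zero exit status means "already
    # done", so the main command is skipped.
    if not_if is not None and subprocess.call(not_if, shell=True) == 0:
        print("Skipping Execute[%s] due to not_if" % command)
        return
    subprocess.check_call(command, shell=True)
{code}
)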
2018-02-16 18:25:54,684 - Group['hdfs'] {}
2018-02-16 18:25:54,685 - User['hdfs']
2018-02-16 18:25:54,687 - FS Type:
2018-02-16 18:25:54,687 - Directory['/etc/hadoop']
2018-02-16 18:25:54,731 - File['/usr/hdp/3.0.0.0-814/hadoop/conf/hadoop-env.sh'] {'content': InlineTemplate(...), 'owner': 'hdfs', 'group': 'hadoop'}
2018-02-16 18:25:54,732 - Writing File['/usr/hdp/3.0.0.0-814/hadoop/conf/hadoop-env.sh'] because contents don't match
2018-02-16 18:25:54,733 - Directory['/var/lib/ambari-agent/tmp/hadoop_java_io_tmpdir']
2018-02-16 18:25:54,771 - Repository['HDP-3.0-repo-1'] {'append_to_file': False, 'base_url': 'http://s3.amazonaws.com/dev.hortonworks.com/HDP/centos7/3.x/BUILDS/3.0.0.0-814', 'action': ['create'], 'components': [u'HDP', 'main'], 'repo_template': '[{{repo_id}}]\nname={{repo_id}}\n{% if mirror_list %}mirrorlist={{mirror_list}}{% else %}baseurl={{base_url}}{% endif %}\n\npath=/\nenabled=1\ngpgcheck=0', 'repo_file_name': 'ambari-hdp-1', 'mirror_list': None}
2018-02-16 18:25:54,786 - File['/etc/yum.repos.d/ambari-hdp-1.repo'] {'content': '[HDP-3.0-repo-1]\nname=HDP-3.0-repo-1\nbaseurl=http://s3.amazonaws.com/dev.hortonworks.com/HDP/centos7/3.x/BUILDS/3.0.0.0-814\n\npath=/\nenabled=1\ngpgcheck=0'}
2018-02-16 18:25:54,787 - Writing File['/etc/yum.repos.d/ambari-hdp-1.repo'] because contents don't match
2018-02-16 18:25:54,788 - Repository with url http://s3.amazonaws.com/dev.hortonworks.com/HDP-GPL/centos7/3.x/BUILDS/3.0.0.0-814 is not created due to its tags: set([u'GPL'])
2018-02-16 18:25:54,788 - Repository['HDP-UTILS-1.1.0.22-repo-1'] {'append_to_file': True, 'base_url': 'http://s3.amazonaws.com/dev.hortonworks.com/HDP-UTILS-1.1.0.22/repos/centos7', 'action': ['create'], 'components': [u'HDP-UTILS', 'main'], 'repo_template': '[{{repo_id}}]\nname={{repo_id}}\n{% if mirror_list %}mirrorlist={{mirror_list}}{% else %}baseurl={{base_url}}{% endif %}\n\npath=/\nenabled=1\ngpgcheck=0', 'repo_file_name': 'ambari-hdp-1', 'mirror_list': None}
2018-02-16 18:25:54,796 - File['/etc/yum.repos.d/ambari-hdp-1.repo']
2018-02-16 18:25:54,796 - Writing File['/etc/yum.repos.d/ambari-hdp-1.repo'] because contents don't match
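
(The repo_template in the Repository lines above is a Jinja2 template that gets rendered into /etc/yum.repos.d/ambari-hdp-1.repo. A minimal sketch of the rendering, assuming the jinja2 package and using the values from the log:

{code:python}
from jinja2 import Template

# Template string as logged for Repository['HDP-3.0-repo-1'].
repo_template = ("[{{repo_id}}]\nname={{repo_id}}\n"
                 "{% if mirror_list %}mirrorlist={{mirror_list}}"
                 "{% else %}baseurl={{base_url}}{% endif %}"
                 "\n\npath=/\nenabled=1\ngpgcheck=0")

# mirror_list is None, so the baseurl branch is taken; the output matches
# the File['/etc/yum.repos.d/ambari-hdp-1.repo'] content written above.
print(Template(repo_template).render(
    repo_id="HDP-3.0-repo-1",
    mirror_list=None,
    base_url="http://s3.amazonaws.com/dev.hortonworks.com/"
             "HDP/centos7/3.x/BUILDS/3.0.0.0-814"))
{code}
)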
2018-02-16 18:25:54,798 - Package['unzip']
2018-02-16 18:25:54,943 - Skipping installation of existing package unzip
2018-02-16 18:25:54,944 - Package['curl']
2018-02-16 18:25:54,960 - Skipping installation of existing package curl
2018-02-16 18:25:54,961 - Package['hdp-select']
2018-02-16 18:25:54,976 - Skipping installation of existing package hdp-select
2018-02-16 18:25:54,995 - Skipping stack-select on MYSQL_SERVER because it does not exist in the stack-select package structure.
2018-02-16 18:25:55,435 - Using hadoop conf dir: /usr/hdp/3.0.0.0-814/hadoop/conf
2018-02-16 18:25:55,471 - call['ambari-python-wrap /usr/bin/hdp-select status hive-server2']
2018-02-16 18:25:55,523 - call returned (0, 'hive-server2 - 3.0.0.0-814')
2018-02-16 18:25:55,525 - Stack Feature Version Info: Cluster Stack=3.0, Command Stack=None, Command Version=None -> 3.0
2018-02-16 18:25:55,567 - File['/var/lib/ambari-agent/cred/lib/CredentialUtil.jar']
2018-02-16 18:25:55,569 - Not downloading the file from http://will-hdp-1.field.hortonworks.com:8080/resources/CredentialUtil.jar, because /var/lib/ambari-agent/tmp/CredentialUtil.jar already exists
2018-02-16 18:25:55,570 - checked_call[('/usr/jdk64/jdk1.8.0_112/bin/java', '-cp', u'/var/lib/ambari-agent/cred/lib/*', 'org.apache.ambari.server.credentialapi.CredentialUtil', 'get', 'javax.jdo.option.ConnectionPassword', '-provider', u'jceks://file/var/lib/ambari-agent/cred/conf/mysql_server/hive-site.jceks')] {}
2018-02-16 18:25:56,859 - checked_call returned (0, 'SLF4J: Failed to load class "org.slf4j.impl.StaticLoggerBinder".\nSLF4J: Defaulting to no-operation (NOP) logger implementation\nSLF4J: See http://www.slf4j.org/codes.html#StaticLoggerBinder for further details.\nFeb 16, 2018 6:25:56 PM org.apache.hadoop.util.NativeCodeLoader <clinit>\nWARNING: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable\nhive')
2018-02-16 18:25:56,873 - Package['mysql-community-release']
2018-02-16 18:25:57,038 - Installing package mysql-community-release ('/usr/bin/yum -d 0 -e 0 -y install mysql-community-release')
2018-02-16 18:25:58,091 - Execution of '/usr/bin/yum -d 0 -e 0 -y install mysql-community-release' returned 1. Error: Nothing to do
2018-02-16 18:25:58,091 - Failed to install package mysql-community-release. Executing '/usr/bin/yum clean metadata'
2018-02-16 18:25:58,386 - Retrying to install package mysql-community-release after 30 seconds
2018-02-16 18:26:37,003 - Skipping stack-select on MYSQL_SERVER because it does not exist in the stack-select package structure.
Command failed after 1 tries
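
The tail of the log shows the agent's recovery path: after the failed install it runs '/usr/bin/yum clean metadata', waits 30 seconds, retries, and finally gives up ("Command failed after 1 tries"). Since none of the repositories configured above appears to provide mysql-community-release (that package normally comes from MySQL's own community repository), the retry cannot succeed. A rough sketch of that retry loop, for illustration only, with the package name and timing taken from the log:

{code:python}
import subprocess
import time

def install_with_retries(package, tries=2, try_sleep=30):
    # Illustration of the retry behavior in the log: on a failed install,
    # clean the yum metadata, sleep, and retry; re-raise on the last attempt.
    cmd = ["/usr/bin/yum", "-d", "0", "-e", "0", "-y", "install", package]
    for attempt in range(tries):
        try:
            return subprocess.check_call(cmd)
        except subprocess.CalledProcessError:
            if attempt == tries - 1:
                raise
            subprocess.call(["/usr/bin/yum", "clean", "metadata"])
            time.sleep(try_sleep)

# install_with_retries("mysql-community-release")
{code}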