Details
- Type: Bug
- Status: Resolved
- Priority: Major
- Resolution: Invalid
- Environment: CentOS 6.6
Description
2015-06-12 07:15:59,122 - u"Execute['('chmod', 'a+x', u'/usr/jdk64')']" {'not_if': 'test -e /usr/jdk64/jdk1.7.0_67/bin/java', 'sudo': True}
2015-06-12 07:15:59,166 - Skipping u"Execute['('chmod', 'a+x', u'/usr/jdk64')']" due to not_if
2015-06-12 07:15:59,166 - u"Execute['mkdir -p /var/lib/ambari-agent/data/tmp/jdk && cd /var/lib/ambari-agent/data/tmp/jdk && tar -xf /var/lib/ambari-agent/data/tmp/AMBARI-artifacts//jdk-7u67-linux-x64.tar.gz && ambari-sudo.sh cp -rp /var/lib/ambari-agent/data/tmp/jdk/* /usr/jdk64']"
2015-06-12 07:15:59,209 - Skipping u"Execute['mkdir -p /var/lib/ambari-agent/data/tmp/jdk && cd /var/lib/ambari-agent/data/tmp/jdk && tar -xf /var/lib/ambari-agent/data/tmp/AMBARI-artifacts//jdk-7u67-linux-x64.tar.gz && ambari-sudo.sh cp -rp /var/lib/ambari-agent/data/tmp/jdk/* /usr/jdk64']" due to not_if
2015-06-12 07:15:59,209 - u"Execute['('chgrp', '-R', u'hadoop', u'/usr/jdk64/jdk1.7.0_67')']"
2015-06-12 07:15:59,293 - u"Execute['('chown', '-R', 'root', u'/usr/jdk64/jdk1.7.0_67')']" {'sudo': True}
2015-06-12 07:15:59,529 - u"Package['spark_2_2_*']" {}
2015-06-12 07:16:00,193 - Installing package spark_2_2_* ('/usr/bin/yum -d 0 -e 0 -y install 'spark_2_2_*'')
2015-06-12 07:16:01,115 - Error while executing command 'install':
Traceback (most recent call last):
  File "/usr/lib/python2.6/site-packages/resource_management/libraries/script/script.py", line 214, in execute
    method(env)
  File "/var/lib/ambari-agent/cache/common-services/SPARK/1.2.0.2.2/package/scripts/spark_client.py", line 42, in install
    self.install_packages(env)
  File "/usr/lib/python2.6/site-packages/resource_management/libraries/script/script.py", line 289, in install_packages
    Package(name)
  File "/usr/lib/python2.6/site-packages/resource_management/core/base.py", line 148, in __init__
    self.env.run()
  File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 152, in run
    self.run_action(resource, action)
  File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 118, in run_action
    provider_action()
  File "/usr/lib/python2.6/site-packages/resource_management/core/providers/package/__init__.py", line 43, in action_install
    self.install_package(package_name, self.resource.use_repos)
  File "/usr/lib/python2.6/site-packages/resource_management/core/providers/package/yumrpm.py", line 51, in install_package
    shell.checked_call(cmd, sudo=True, logoutput=self.get_logoutput())
  File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 70, in inner
    return function(command, **kwargs)
  File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 82, in checked_call
    return _call(command, logoutput, True, cwd, env, preexec_fn, user, wait_for_finish, timeout, path, sudo, on_new_line)
  File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 199, in _call
    raise Fail(err_msg)
Fail: Execution of '/usr/bin/yum -d 0 -e 0 -y install 'spark_2_2_*'' returned 1. Error: Nothing to do
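The `Error: Nothing to do` from yum means no package matching the `spark_2_2_*` wildcard was visible in any enabled repository on the host, which is consistent with the Invalid resolution (a repository/environment problem rather than an Ambari code bug). A quick way to confirm this on the affected node is a sketch like the following (assumes standard yum tooling; the grep pattern for repo names is illustrative):

```shell
# Does any available package match the wildcard Ambari tried to install?
/usr/bin/yum -d 0 -e 0 list available 'spark_2_2_*'

# If nothing is listed, check which HDP repositories are actually enabled.
yum repolist enabled | grep -i hdp
```

If the first command prints nothing, the HDP 2.2 repository that ships the Spark packages is either not configured or not reachable from this host.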
2015-06-12 07:16:01,159 - Could not determine HDP version for component spark-client by calling '/usr/bin/hdp-select status spark-client > /tmp/tmpoZBbY4'. Return Code: 1, Output: ERROR: Invalid package - spark-client
Packages:
accumulo-client
accumulo-gc
accumulo-master
accumulo-monitor
accumulo-tablet
accumulo-tracer
falcon-client
falcon-server
flume-server
hadoop-client
hadoop-hdfs-datanode
hadoop-hdfs-journalnode
hadoop-hdfs-namenode
hadoop-hdfs-nfs3
hadoop-hdfs-portmap
hadoop-hdfs-secondarynamenode
hadoop-mapreduce-historyserver
hadoop-yarn-nodemanager
hadoop-yarn-resourcemanager
hadoop-yarn-timelineserver
hbase-client
hbase-master
hbase-regionserver
hive-metastore
hive-server2
hive-webhcat
kafka-broker
knox-server
mahout-client
oozie-client
oozie-server
phoenix-client
ranger-admin
ranger-usersync
slider-client
sqoop-client
sqoop-server
storm-client
storm-nimbus
storm-slider-client
storm-supervisor
zookeeper-client
zookeeper-server
Aliases:
accumulo-server
all
client
hadoop-hdfs-server
hadoop-mapreduce-server
hadoop-yarn-server
hive-server
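Note that `spark-client` appears nowhere in the package list above, which matches the `Invalid package - spark-client` error: the `hdp-select` script installed on this host does not know about the Spark component, so it can neither register nor report it. One way to inspect what this host's tooling actually recognizes (subcommand names are from the stock hdp-select script; treat them as assumptions for other stack versions):

```shell
# Which HDP stack versions are installed on this host?
/usr/bin/hdp-select versions

# Is any spark component registered with hdp-select?
/usr/bin/hdp-select status | grep -i spark || echo "no spark components registered"
```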