Description
Service Check for WebHCat is failing.
ambari-server-2.4.0.0-4910.x86_64
ambari-server --hash
acfa1c0e7a9b8513eb74747008a43d70728e07bb
Pig tarball was copied to the wrong location
# Expected location
[root@c6404 ~]# su hdfs -c "hdfs dfs -ls hdfs:///hdp/apps/2.5.0.0-267"
Found 3 items
dr-xr-xr-x   - hdfs hdfs          0 2016-04-25 23:20 hdfs:///hdp/apps/2.5.0.0-267/mapreduce
dr-xr-xr-x   - hdfs hdfs          0 2016-04-26 20:32 hdfs:///hdp/apps/2.5.0.0-267/slider
dr-xr-xr-x   - hdfs hdfs          0 2016-04-26 20:32 hdfs:///hdp/apps/2.5.0.0-267/tez

# Incorrect location
[root@c6404 ~]# su hdfs -c "hdfs dfs -ls hdfs:///HDP/apps/2.5.0.0-267/pig/pig.tar.gz"
-r--r--r--   3 hdfs hadoop   98902307 2016-04-26 20:33 hdfs:///HDP/apps/2.5.0.0-267/pig/pig.tar.gz

The tarball does exist on the host running Hive Server:

[root@c6404 ~]# ls -la /usr/hdp/current/pig-client/pig.tar.gz
-rw-r--r-- 1 root root 98902307 Apr 24 14:00 /usr/hdp/current/pig-client/pig.tar.gz
Traceback (most recent call last):
  File "/var/lib/ambari-agent/cache/common-services/HIVE/0.12.0.2.0/package/scripts/service_check.py", line 167, in <module>
    HiveServiceCheck().execute()
  File "/usr/lib/python2.6/site-packages/resource_management/libraries/script/script.py", line 248, in execute
    method(env)
  File "/var/lib/ambari-agent/cache/common-services/HIVE/0.12.0.2.0/package/scripts/service_check.py", line 95, in service_check
    webhcat_service_check()
  File "/usr/lib/python2.6/site-packages/ambari_commons/os_family_impl.py", line 89, in thunk
    return fn(*args, **kwargs)
  File "/var/lib/ambari-agent/cache/common-services/HIVE/0.12.0.2.0/package/scripts/webhcat_service_check.py", line 125, in webhcat_service_check
    logoutput=True)
  File "/usr/lib/python2.6/site-packages/resource_management/core/base.py", line 155, in __init__
    self.env.run()
  File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 160, in run
    self.run_action(resource, action)
  File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 124, in run_action
    provider_action()
  File "/usr/lib/python2.6/site-packages/resource_management/core/providers/system.py", line 273, in action_run
    tries=self.resource.tries, try_sleep=self.resource.try_sleep)
  File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 70, in inner
    result = function(command, **kwargs)
  File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 92, in checked_call
    tries=tries, try_sleep=try_sleep)
  File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 140, in _call_wrapper
    result = _call(command, **kwargs_copy)
  File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 293, in _call
    raise Fail(err_msg)
resource_management.core.exceptions.Fail: Execution of '/var/lib/ambari-agent/tmp/templetonSmoke.sh c6404.ambari.apache.org ambari-qa 50111 idtest.ambari-qa.1461702877.77.pig no_keytab false kinit no_principal /var/lib/ambari-agent/tmp' returned 1. Templeton Smoke Test (pig cmd): Failed. : {"error":"File hdfs:///hdp/apps/2.5.0.0-267/pig/pig.tar.gz does not exist."} http_code <500>
This happens because the code in params_linux.py of the Hive service should cast stack_name to lowercase when building the tarball destination paths. With a stack name of "HDP", the tarballs are copied under /HDP/apps/..., while the service check looks under /hdp/apps/...:
hive_tar_source = "{0}/{1}/hive/hive.tar.gz".format(stack_root, STACK_VERSION_PATTERN)
pig_tar_source = "{0}/{1}/pig/pig.tar.gz".format(stack_root, STACK_VERSION_PATTERN)
hive_tar_dest_file = "/{0}/apps/{1}/hive/hive.tar.gz".format(stack_name, STACK_VERSION_PATTERN)
pig_tar_dest_file = "/{0}/apps/{1}/pig/pig.tar.gz".format(stack_name, STACK_VERSION_PATTERN)
hadoop_streaming_tar_source = "{0}/{1}/hadoop-mapreduce/hadoop-streaming.jar".format(stack_root, STACK_VERSION_PATTERN)
sqoop_tar_source = "{0}/{1}/sqoop/sqoop.tar.gz".format(stack_root, STACK_VERSION_PATTERN)
hadoop_streaming_tar_dest_dir = "/{0}/apps/{1}/mapreduce/".format(stack_name, STACK_VERSION_PATTERN)
sqoop_tar_dest_dir = "/{0}/apps/{1}/sqoop/".format(stack_name, STACK_VERSION_PATTERN)
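A minimal sketch of the proposed fix. The tar_dest_file helper below is hypothetical (params_linux.py assigns the paths inline, as shown above); the only suggested change is the .lower() call on stack_name:

```python
# Hypothetical helper illustrating the suggested fix: lowercase stack_name
# before formatting the HDFS destination path, so a stack named "HDP" still
# resolves to the /hdp/apps/... directory that the service check expects.
def tar_dest_file(stack_name, stack_version, component):
    return "/{0}/apps/{1}/{2}/{2}.tar.gz".format(
        stack_name.lower(), stack_version, component)

# Without .lower(), the same inputs would yield /HDP/apps/... and the
# WebHCat smoke test would fail to find pig.tar.gz.
print(tar_dest_file("HDP", "2.5.0.0-267", "pig"))
# → /hdp/apps/2.5.0.0-267/pig/pig.tar.gz
```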