Details
- Type: Bug
- Status: Resolved
- Priority: Blocker
- Resolution: Fixed
Description
With HDP 2.2, Ambari needs to copy the tarballs/jars from the local file system to a specific location in HDFS.
The tarballs/jars no longer include a version number (either the component version or the HDP stack version + build) in their names, but the destination folder in HDFS does contain the HDP version (e.g., 2.2.0.0-999).
/hdp/apps/$(hdp-stack-version)
|---- mapreduce/mapreduce.tar.gz
|---- mapreduce/hadoop-streaming.jar (needed by WebHCat; on the local file system it is a symlink to a versioned file, so the copy to HDFS needs to follow the link)
|---- tez/tez.tar.gz
|---- pig/pig.tar.gz
|---- hive/hive.tar.gz
|---- sqoop/sqoop.tar.gz
Furthermore, the folders created in HDFS need to have a permission of 0555, while files need 0444.
The owner should be hdfs, and the group should be hadoop.
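The layout and permission rules above can be sketched as the sequence of `hdfs dfs` commands such tooling would issue. This is a minimal illustration, not Ambari's actual implementation: the helper name `copy_commands` and the local source root `/usr/hdp/current` are assumptions, while the version, destination paths, modes, and `hdfs:hadoop` ownership come from the description.

```python
# Hypothetical sketch: build the hdfs CLI command sequence for copying the
# unversioned tarballs/jars into the versioned HDFS destination folder.
hdp_version = "2.2.0.0-999"  # example stack version from the description
dest = f"/hdp/apps/{hdp_version}"

# Component -> files to place under /hdp/apps/<version>/<component>/
tarballs = {
    "mapreduce": ["mapreduce.tar.gz", "hadoop-streaming.jar"],
    "tez": ["tez.tar.gz"],
    "pig": ["pig.tar.gz"],
    "hive": ["hive.tar.gz"],
    "sqoop": ["sqoop.tar.gz"],
}

def copy_commands(local_root="/usr/hdp/current"):  # local_root is an assumption
    cmds = []
    for component, files in tarballs.items():
        folder = f"{dest}/{component}"
        cmds.append(f"hdfs dfs -mkdir -p {folder}")
        cmds.append(f"hdfs dfs -chmod 555 {folder}")     # folders: 0555
        for name in files:
            # -put reads the local file's contents, so a local symlink
            # (e.g., hadoop-streaming.jar) is followed automatically.
            cmds.append(f"hdfs dfs -put -f {local_root}/{component}/{name} {folder}/")
            cmds.append(f"hdfs dfs -chmod 444 {folder}/{name}")  # files: 0444
    cmds.append(f"hdfs dfs -chown -R hdfs:hadoop {dest}")  # owner hdfs, group hadoop
    return cmds
```

In practice the commands would run as the `hdfs` superuser so the final `chown` succeeds.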
Attachments
Issue Links
- blocks: AMBARI-7844 Ambari needs to fix client configuration files to point to versioned tarballs (Resolved)
- is related to: AMBARI-13383 Ambari to install/manage Slider tarball to HDFS (Resolved)
- relates to: AMBARI-7825 Rolling Upgrades - hdfs:///apps/tez/tez.tar.gz needs to be versioned (Resolved)
- requires: AMBARI-7892 WebHCat to support versioned rpms in Ambari (Resolved)