1. I don't recall anything about it. I usually make sure all dependencies are Apache compliant and that the ones we package have the right inter-dependencies set up (i.e. they don't ship their own Hadoop jars). Beyond that, the dependencies a project picks are the responsibility of that project.
2. I usually make sure each project does not ship its own zookeeper jar. So I expect to find a symlink to the zookeeper jars in that Flume package. If that's not the case, please open a ticket.
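A quick way to check this yourself is to see whether the jar in the package is a symlink or a bundled copy. The paths below are hypothetical (the sketch builds a throwaway layout in a temp directory); substitute the actual install locations on your system:

```shell
# Sketch: distinguish a symlinked zookeeper jar from a bundled copy.
# The directory layout here is a made-up example, not the real Bigtop one.
set -e
dir=$(mktemp -d)
mkdir -p "$dir/zookeeper" "$dir/flume/lib"
touch "$dir/zookeeper/zookeeper.jar"
# A correctly packaged project points at the shared zookeeper jar:
ln -s "$dir/zookeeper/zookeeper.jar" "$dir/flume/lib/zookeeper.jar"

jar="$dir/flume/lib/zookeeper.jar"
if [ -L "$jar" ]; then
  echo "symlink -> $(readlink "$jar")"
else
  echo "bundled copy -- worth opening a ticket"
fi
```

On a real install you would point `jar` at the packaged Flume lib directory instead of the temp layout.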
3. If I understand your question correctly, this is something we can help projects with (helping projects ship with ASF-only dependencies, all at the same version), but it is outside of our control. As a matter of policy, we don't patch any upstream tarball, so we can't override dependencies if a project doesn't provide such a feature. As an example, I see Apache Hadoop pulling in jars such as clover, guava, guice, hsqldb and protocol buffers. None of these dependencies are under the ASF, and Apache Hadoop would be unlikely to work if we stripped every non-ASF jar from its resulting build. From my point of view, this is a non-issue. The goal of Apache Bigtop, as I see it, is to provide a point of integration for all ASF-compliant projects related to Apache Hadoop. So I would not have any issue providing packaging, tests and deployment recipes for ASF-compliant projects. But I do not speak for the community.
4. We don't patch anything, but we can still provide alternative implementations. In this case, it was done for packaging/practical reasons rather than to work around CDH Hadoop.
Are you trying to use flume-0.9.3? Why not use Apache Flume (incubating) 0.1.0 instead?