Hadoop-hdfs depends on hadoop-mapreduce for testing, hence a cycle.
Right now I am playing tricks with symlinks to hook up the lib directories, so that whatever I build in one project is automatically picked up by the adjacent one, and documenting what I am doing for the Hadoop wiki, but it's a bit complex. I see two ways out:
- flatten: pull the hadoop-hdfs.run-test-hdfs-with-mr bit out into a subproject that depends on both hadoop-hdfs and hadoop-mapreduce (see the first sketch below)
- bootstrap via the central repository: rather than have copies of artifacts in the different bits of SVN, stick some alpha releases of everything up onto the central repository. Then you can use Ivy to pull things in, so when I build hdfs the latest versions of common and mapreduce get picked up. If I publish locally, I get the version I ask for, but the default would be to get the last release on the central repo.
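
For #1, the new subproject's ivy.xml would just declare both former halves of the cycle as ordinary upstream dependencies. A minimal sketch only; the module name hdfs-with-mr-tests and the revisions are illustrative, not anything checked in:

    <ivy-module version="2.0">
      <info organisation="org.apache.hadoop" module="hdfs-with-mr-tests"/>
      <dependencies>
        <!-- both sides of the former test cycle become plain upstream deps -->
        <dependency org="org.apache.hadoop" name="hadoop-hdfs" rev="latest.integration"/>
        <dependency org="org.apache.hadoop" name="hadoop-mapreduce" rev="latest.integration"/>
      </dependencies>
    </ivy-module>
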
I'm coming round in favour of #2, because it helps us debug the publishing process with, say, a fortnightly alpha release of the artifacts (PMC approval still needed, incidentally), so that when the time comes to do real beta releases, the POMs and the like are stable.
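
For #2, the resolver setup would be a chain in ivysettings.xml that checks the local publish area before falling back to the central repository, so a local publish shadows the alphas. Again a sketch under those assumptions; the patterns are Ivy's defaults, nothing Hadoop-specific:

    <ivysettings>
      <settings defaultResolver="default"/>
      <resolvers>
        <chain name="default" returnFirst="true">
          <!-- anything published locally wins over the central repo -->
          <filesystem name="local">
            <ivy pattern="${ivy.default.ivy.user.dir}/local/[organisation]/[module]/[revision]/ivys/ivy.xml"/>
            <artifact pattern="${ivy.default.ivy.user.dir}/local/[organisation]/[module]/[revision]/[type]s/[artifact].[ext]"/>
          </filesystem>
          <!-- otherwise take the last alpha pushed to the central repository -->
          <ibiblio name="central" m2compatible="true"/>
        </chain>
      </resolvers>
    </ivysettings>

With a dynamic rev like latest.release in the depending project's ivy.xml, a plain build resolves against central; publishing into the local resolver with the <ivy:publish> Ant task gets you the exact version you asked for instead.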