HBase / HBASE-8224

Publish hbase build against h1 and h2 adding '-hadoop1' or '-hadoop2' to version string

    Details

    • Type: Task
    • Status: Closed
    • Priority: Blocker
    • Resolution: Fixed
    • Affects Version/s: None
    • Fix Version/s: 0.98.0, 0.95.2
    • Component/s: None
    • Labels: None
    • Hadoop Flags: Reviewed
    • Release Note:
      When we publish artifacts, we add -hadoop1 or -hadoop2 to the version string to distinguish hbase compiled against hadoop1 from hbase compiled against hadoop2. This issue adds a script that runs against the checked-out poms and derives poms with the -hadoop1 or -hadoop2 modification applied; the derived poms also add/remove the appropriate set of modules to bundle. See the refguide for the final doc on how to make use of this script when building (TODO); the refguide will replace the notes here.

      Description

      So we can publish both the hadoop1 and the hadoop2 jars to a maven repository, and so we can publish two packages, one for hadoop1 and one for hadoop2, our only alternative given how maven works (to the best of my knowledge, and after consulting others) is to amend the version string to include hadoop1 or hadoop2.
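
      For illustration, a rough sketch of the intended end state (assumed artifact coordinates, mirroring the consumer snippets quoted in the comments below): a downstream pom then selects the Hadoop line purely by version string, with no profile flag needed.

        <!-- hypothetical downstream dependency once the -hadoopN versions are published -->
        <dependency>
          <groupId>org.apache.hbase</groupId>
          <artifactId>hbase-client</artifactId>
          <version>0.95.2-hadoop2</version>
        </dependency>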

      Attachments

      1. hbase-8224-proto1.patch
        2 kB
        Enis Soztutar
      2. 8224v5.txt
        53 kB
        stack
      3. 8224-adding.classifiers.txt
        2 kB
        stack
      4. 8224.gen.scriptv3.txt
        53 kB
        stack
      5. 8224.gen.scriptv3.txt
        52 kB
        stack
      6. 8224.gen.script.txt
        28 kB
        stack


          Activity

          Hudson added a comment -

          SUCCESS: Integrated in HBase-TRUNK-on-Hadoop-2.0.0 #656 (See https://builds.apache.org/job/HBase-TRUNK-on-Hadoop-2.0.0/656/)
          HBASE-8224 Publish hbase build against h1 and h2 adding '-hadoop1' or '-hadoop2' to version string; ADD MISSED FILE (stack: rev 1511444)

          • /hbase/trunk/dev-support/generate-hadoopX-poms.sh
          Hudson added a comment -

          SUCCESS: Integrated in hbase-0.95-on-hadoop2 #223 (See https://builds.apache.org/job/hbase-0.95-on-hadoop2/223/)
          HBASE-8224 Publish hbase build against h1 and h2 adding '-hadoop1' or '-hadoop2' to version string (stack: rev 1511443)

          • /hbase/branches/0.95/dev-support/generate-hadoopX-poms.sh
          stack added a comment -

          Reclosing after adding missing file and filling in the release note. Reopen again if I have it wrong.

          Hudson added a comment -

          SUCCESS: Integrated in hbase-0.95 #413 (See https://builds.apache.org/job/hbase-0.95/413/)
          HBASE-8224 Publish hbase build against h1 and h2 adding '-hadoop1' or '-hadoop2' to version string (stack: rev 1511443)

          • /hbase/branches/0.95/dev-support/generate-hadoopX-poms.sh
          Hudson added a comment -

          SUCCESS: Integrated in HBase-TRUNK #4352 (See https://builds.apache.org/job/HBase-TRUNK/4352/)
          HBASE-8224 Publish hbase build against h1 and h2 adding '-hadoop1' or '-hadoop2' to version string; ADD MISSED FILE (stack: rev 1511444)

          • /hbase/trunk/dev-support/generate-hadoopX-poms.sh
          stack added a comment -

          Jonathan Hsieh Sorry about that. Fixed. Let me add a release note too (doc to follow when I exercise the first release using this script).

          Sergey Shelukhin added a comment -

          Do you guys want to populate release notes, so that the potential consumers of these who were not involved in the JIRA could understand what changed? Thanks

          Jonathan Hsieh added a comment -

          Hey Stack, the svn commit seems to be missing the dev-support scripts. Can you or someone who tested it commit the correct version?

          Hudson added a comment -

          SUCCESS: Integrated in hbase-0.95-on-hadoop2 #215 (See https://builds.apache.org/job/hbase-0.95-on-hadoop2/215/)
          HBASE-8224 Publish hbase build against h1 and h2 adding '-hadoop1' or '-hadoop2' to version string (stack: rev 1509811)

          • /hbase/branches/0.95/hbase-client/pom.xml
          • /hbase/branches/0.95/hbase-common/pom.xml
          • /hbase/branches/0.95/hbase-common/src/main/java/org/apache/hadoop/hbase/util/JVM.java
          • /hbase/branches/0.95/hbase-examples/pom.xml
          • /hbase/branches/0.95/hbase-hadoop1-compat/pom.xml
          • /hbase/branches/0.95/hbase-hadoop2-compat/pom.xml
          • /hbase/branches/0.95/hbase-it/pom.xml
          • /hbase/branches/0.95/hbase-prefix-tree/pom.xml
          • /hbase/branches/0.95/hbase-server/pom.xml
          • /hbase/branches/0.95/hbase-server/src/main/java/org/apache/hadoop/hbase/constraint/package-info.java
          • /hbase/branches/0.95/hbase-server/src/main/java/org/apache/hadoop/hbase/thrift/HThreadedSelectorServerArgs.java
          • /hbase/branches/0.95/hbase-server/src/test/java/org/apache/hadoop/hbase/master/TestMasterNoCluster.java
          • /hbase/branches/0.95/pom.xml
          Hudson added a comment -

          SUCCESS: Integrated in HBase-TRUNK #4334 (See https://builds.apache.org/job/HBase-TRUNK/4334/)
          HBASE-8224 Publish hbase build against h1 and h2 adding '-hadoop1' or '-hadoop2' to version string (stack: rev 1509813)

          • /hbase/trunk/hbase-client/pom.xml
          • /hbase/trunk/hbase-common/pom.xml
          • /hbase/trunk/hbase-common/src/main/java/org/apache/hadoop/hbase/util/JVM.java
          • /hbase/trunk/hbase-examples/pom.xml
          • /hbase/trunk/hbase-hadoop1-compat/pom.xml
          • /hbase/trunk/hbase-hadoop2-compat/pom.xml
          • /hbase/trunk/hbase-it/pom.xml
          • /hbase/trunk/hbase-prefix-tree/pom.xml
          • /hbase/trunk/hbase-server/pom.xml
          • /hbase/trunk/hbase-server/src/main/java/org/apache/hadoop/hbase/constraint/package-info.java
          • /hbase/trunk/hbase-server/src/main/java/org/apache/hadoop/hbase/thrift/HThreadedSelectorServerArgs.java
          • /hbase/trunk/hbase-server/src/test/java/org/apache/hadoop/hbase/master/TestMasterNoCluster.java
          • /hbase/trunk/pom.xml
          Hudson added a comment -

          FAILURE: Integrated in HBase-TRUNK-on-Hadoop-2.0.0 #649 (See https://builds.apache.org/job/HBase-TRUNK-on-Hadoop-2.0.0/649/)
          HBASE-8224 Publish hbase build against h1 and h2 adding '-hadoop1' or '-hadoop2' to version string (stack: rev 1509813)

          • /hbase/trunk/hbase-client/pom.xml
          • /hbase/trunk/hbase-common/pom.xml
          • /hbase/trunk/hbase-common/src/main/java/org/apache/hadoop/hbase/util/JVM.java
          • /hbase/trunk/hbase-examples/pom.xml
          • /hbase/trunk/hbase-hadoop1-compat/pom.xml
          • /hbase/trunk/hbase-hadoop2-compat/pom.xml
          • /hbase/trunk/hbase-it/pom.xml
          • /hbase/trunk/hbase-prefix-tree/pom.xml
          • /hbase/trunk/hbase-server/pom.xml
          • /hbase/trunk/hbase-server/src/main/java/org/apache/hadoop/hbase/constraint/package-info.java
          • /hbase/trunk/hbase-server/src/main/java/org/apache/hadoop/hbase/thrift/HThreadedSelectorServerArgs.java
          • /hbase/trunk/hbase-server/src/test/java/org/apache/hadoop/hbase/master/TestMasterNoCluster.java
          • /hbase/trunk/pom.xml
          Hudson added a comment -

          SUCCESS: Integrated in hbase-0.95 #396 (See https://builds.apache.org/job/hbase-0.95/396/)
          HBASE-8224 Publish hbase build against h1 and h2 adding '-hadoop1' or '-hadoop2' to version string (stack: rev 1509811)

          • /hbase/branches/0.95/hbase-client/pom.xml
          • /hbase/branches/0.95/hbase-common/pom.xml
          • /hbase/branches/0.95/hbase-common/src/main/java/org/apache/hadoop/hbase/util/JVM.java
          • /hbase/branches/0.95/hbase-examples/pom.xml
          • /hbase/branches/0.95/hbase-hadoop1-compat/pom.xml
          • /hbase/branches/0.95/hbase-hadoop2-compat/pom.xml
          • /hbase/branches/0.95/hbase-it/pom.xml
          • /hbase/branches/0.95/hbase-prefix-tree/pom.xml
          • /hbase/branches/0.95/hbase-server/pom.xml
          • /hbase/branches/0.95/hbase-server/src/main/java/org/apache/hadoop/hbase/constraint/package-info.java
          • /hbase/branches/0.95/hbase-server/src/main/java/org/apache/hadoop/hbase/thrift/HThreadedSelectorServerArgs.java
          • /hbase/branches/0.95/hbase-server/src/test/java/org/apache/hadoop/hbase/master/TestMasterNoCluster.java
          • /hbase/branches/0.95/pom.xml
          stack added a comment -

          Thanks for the reviews, lads, and for trying out the product. The mvn release struggle is still an issue, and I'll update the refguide as I run 0.95.2 so it is informed by an actual release.

          As to doing the same for 0.94, I commented over on the 0.94-specific issue. It should be easier to do the same thing there since it is not modularized.

          On packaging for bigtop, this should help. Let's open a new issue for any specifics needed.

          Nick Dimiduk added a comment -

          Green lights from me. I don't know the release plugin well enough to test that side out. Also, I saw your earlier comment about the other changes, so don't mind me.

          +1.

          Nick Dimiduk added a comment -

          I'm giving this a spin too. Will let you know what I find.

          What's with sneaking in the unrelated formatting changes in the v5 patch?

          Brock Noland added a comment -

          Stack yes it does seem to be working! Thx!!

          Hadoop QA added a comment -

          +1 overall. Here are the results of testing the latest attachment
          http://issues.apache.org/jira/secure/attachment/12595528/8224v5.txt
          against trunk revision .

          +1 @author. The patch does not contain any @author tags.

          +1 tests included. The patch appears to include 3 new or modified tests.

          +1 hadoop1.0. The patch compiles against the hadoop 1.0 profile.

          +1 javadoc. The javadoc tool did not generate any warning messages.

          +1 javac. The applied patch does not increase the total number of javac compiler warnings.

          +1 findbugs. The patch does not introduce any new Findbugs (version 1.3.9) warnings.

          +1 release audit. The applied patch does not increase the total number of release audit warnings.

          +1 lineLengths. The patch does not introduce lines longer than 100

          +1 site. The mvn site goal succeeds with this patch.

          +1 core tests. The patch passed unit tests in .

          Test results: https://builds.apache.org/job/PreCommit-HBASE-Build/6563//testReport/
          Findbugs warnings: https://builds.apache.org/job/PreCommit-HBASE-Build/6563//artifact/trunk/patchprocess/newPatchFindbugsWarningshbase-protocol.html
          Findbugs warnings: https://builds.apache.org/job/PreCommit-HBASE-Build/6563//artifact/trunk/patchprocess/newPatchFindbugsWarningshbase-client.html
          Findbugs warnings: https://builds.apache.org/job/PreCommit-HBASE-Build/6563//artifact/trunk/patchprocess/newPatchFindbugsWarningshbase-examples.html
          Findbugs warnings: https://builds.apache.org/job/PreCommit-HBASE-Build/6563//artifact/trunk/patchprocess/newPatchFindbugsWarningshbase-hadoop1-compat.html
          Findbugs warnings: https://builds.apache.org/job/PreCommit-HBASE-Build/6563//artifact/trunk/patchprocess/newPatchFindbugsWarningshbase-prefix-tree.html
          Findbugs warnings: https://builds.apache.org/job/PreCommit-HBASE-Build/6563//artifact/trunk/patchprocess/newPatchFindbugsWarningshbase-common.html
          Findbugs warnings: https://builds.apache.org/job/PreCommit-HBASE-Build/6563//artifact/trunk/patchprocess/newPatchFindbugsWarningshbase-server.html
          Findbugs warnings: https://builds.apache.org/job/PreCommit-HBASE-Build/6563//artifact/trunk/patchprocess/newPatchFindbugsWarningshbase-hadoop-compat.html
          Console output: https://builds.apache.org/job/PreCommit-HBASE-Build/6563//console

          This message is automatically generated.

          stack added a comment -

          Brock Noland Do the SNAPSHOTs work for you boss?

          stack added a comment -

          Same as v4, except the script now also writes the name of the generated pom into the pom itself.

          stack added a comment -

          Played w/ the maven release plugin. Updating my patch (for the release plugin I have to set the name of the pom itself into the pom!). Basically works. Making a release, I'll operate on a copy of the release branch. I'll need to replace the pom.xml w/ the pom.xml.hadoop1 or pom.xml.hadoop2 and check them into the tag (else the release plugin's checkin of the updated release version at the end of release:prepare fails – we'll see). Here are the commands I used for clean, prepare, and perform. Will add the below to the updated doc in the refguide.

           2246  ~/bin/mvn/bin/mvn -f pom.xml.hadoop2  release:clean -Dproject.scm.developerConnection="scm:svn:https://svn.apache.org/repos/asf/hbase/branches/testing_remove" -DtagBase="https://svn.apache.org/repos/asf/hbase/tags" -DreleaseVersion=0.95.2-hadoop2 -DremoteTagging=false -DsuppressCommitBeforeTag=true  -DautoVersionSubmodules=true
           2251  ~/bin/mvn/bin/mvn -f pom.xml.hadoop2  release:prepare -Dproject.scm.developerConnection="scm:svn:https://svn.apache.org/repos/asf/hbase/branches/testing_remove" -DtagBase="https://svn.apache.org/repos/asf/hbase/tags" -DreleaseVersion=0.95.2-hadoop2 -DremoteTagging=false -DsuppressCommitBeforeTag=true  -DautoVersionSubmodules=true -DcheckModificationExcludeList="**pom.xml.*"
           2252  ~/bin/mvn/bin/mvn -f pom.xml.hadoop2  release:perform -Dproject.scm.developerConnection="scm:svn:https://svn.apache.org/repos/asf/hbase/branches/testing_remove" -DtagBase="https://svn.apache.org/repos/asf/hbase/tags" -DreleaseVersion=0.95.2-hadoop2 -DremoteTagging=false -DsuppressCommitBeforeTag=true  -DautoVersionSubmodules=true -DcheckModificationExcludeList="**pom.xml.*"
          

          Some of the above config is redundant. TBD.

          I'd like to check in what I have so far.
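
          The pom swap described at the top of this comment might look roughly like this (a sketch assuming the generator takes the current and target versions as in the usage shown elsewhere in this issue; paths, versions, and the commit message are illustrative):

           # generate the hadoop2 poms, copy them over the originals, and commit so the tag carries real poms
           ./dev-support/generate-hadoopX-poms.sh 0.95.2-SNAPSHOT 0.95.2-hadoop2
           for p in $(find . -name pom.xml.hadoop2); do cp "$p" "$(dirname "$p")/pom.xml"; done
           svn commit -m "Swap in hadoop2 poms for the 0.95.2-hadoop2 tag"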

          Hadoop QA added a comment -

          +1 overall. Here are the results of testing the latest attachment
          http://issues.apache.org/jira/secure/attachment/12595489/8224.gen.scriptv3.txt
          against trunk revision .

          +1 @author. The patch does not contain any @author tags.

          +1 tests included. The patch appears to include 3 new or modified tests.

          +1 hadoop1.0. The patch compiles against the hadoop 1.0 profile.

          +1 javadoc. The javadoc tool did not generate any warning messages.

          +1 javac. The applied patch does not increase the total number of javac compiler warnings.

          +1 findbugs. The patch does not introduce any new Findbugs (version 1.3.9) warnings.

          +1 release audit. The applied patch does not increase the total number of release audit warnings.

          +1 lineLengths. The patch does not introduce lines longer than 100

          +1 site. The mvn site goal succeeds with this patch.

          +1 core tests. The patch passed unit tests in .

          Test results: https://builds.apache.org/job/PreCommit-HBASE-Build/6560//testReport/
          Findbugs warnings: https://builds.apache.org/job/PreCommit-HBASE-Build/6560//artifact/trunk/patchprocess/newPatchFindbugsWarningshbase-protocol.html
          Findbugs warnings: https://builds.apache.org/job/PreCommit-HBASE-Build/6560//artifact/trunk/patchprocess/newPatchFindbugsWarningshbase-client.html
          Findbugs warnings: https://builds.apache.org/job/PreCommit-HBASE-Build/6560//artifact/trunk/patchprocess/newPatchFindbugsWarningshbase-examples.html
          Findbugs warnings: https://builds.apache.org/job/PreCommit-HBASE-Build/6560//artifact/trunk/patchprocess/newPatchFindbugsWarningshbase-hadoop1-compat.html
          Findbugs warnings: https://builds.apache.org/job/PreCommit-HBASE-Build/6560//artifact/trunk/patchprocess/newPatchFindbugsWarningshbase-prefix-tree.html
          Findbugs warnings: https://builds.apache.org/job/PreCommit-HBASE-Build/6560//artifact/trunk/patchprocess/newPatchFindbugsWarningshbase-common.html
          Findbugs warnings: https://builds.apache.org/job/PreCommit-HBASE-Build/6560//artifact/trunk/patchprocess/newPatchFindbugsWarningshbase-server.html
          Findbugs warnings: https://builds.apache.org/job/PreCommit-HBASE-Build/6560//artifact/trunk/patchprocess/newPatchFindbugsWarningshbase-hadoop-compat.html
          Console output: https://builds.apache.org/job/PreCommit-HBASE-Build/6560//console

          This message is automatically generated.

          stack added a comment -

          I tested building assemblies. They seem to work. Now let me try and use the mvn release plugin. See what it thinks.

          stack added a comment -

          Trying against hadoopqa.

          stack added a comment -

          Adds script to generate hadoop1 and hadoop2 poms that have modules
          and profiles appropriately set.

          Also cleaned up our dependency settings after review of dependency:tree
          and dependency:analyze w/ hadoop1 and hadoop2 setups.

          Purged our use of slf4j. We don't need it.

          A dev-support/generate-hadoopX-poms.sh
          Script to generate pom.xml.hadoop1 or pom.xml.hadoop2 everywhere
          which we pass to maven w/ -f flag when we want to publish coherent
          hbase-hadoop1 and hbase-hadoop2.

          M hbase-client/pom.xml
          Set marker string under hadoop1.1 profile. Ditto for hadoop2 profile
          for use by above script so it can find where to set profiles.

          Declare dependencies we were using anyway.

          M hbase-common/pom.xml
          M hbase-examples/pom.xml
          M hbase-prefix-tree/pom.xml
          M hbase-it/pom.xml
          M hbase-server/pom.xml
          Purge unused slf4j usages.

          Declare dependencies we were using anyway.

          Set marker string under hadoop1.1 profile. Ditto for hadoop2 profile
          for use by above script so it can find where to set profiles.

          M hbase-common/src/main/java/org/apache/hadoop/hbase/util/JVM.java
          Remove unwarranted use of slf4j. Use our usual logging instead.

          M hbase-hadoop1-compat/pom.xml
          M hbase-hadoop2-compat/pom.xml
          Moved the dependency up into parent pom rather than have it repeat
          twice, once here and once in hadoop2-compat.

          Declare dependencies we were using anyway.

          M hbase-server/src/main/java/org/apache/hadoop/hbase/thrift/HThreadedSelectorServerArgs.java
          M hbase-server/src/test/java/org/apache/hadoop/hbase/master/TestMasterNoCluster.java
          Use commons logging instead of slf4j.

          M pom.xml
          Added to dependencyManagement libs we were already using but had not declared.
          Removed our depending on an explicit slf4j version – let the transitive includes
          sort it out since we don't use it anymore.

          Declare dependencies we were using anyway.

          Add some excludes for stuff we don't need.

          Set marker string under hadoop1.1 profile. Ditto for hadoop2 profile
          for use by above script so it can find where to set profiles.

          Brock Noland added a comment -

          Ignore that last comment. Wrong JIRA

          Brock Noland added a comment -

          The patch is not complete; there is still cleanup and some items pending HBASE-8224, but reviewers can provide feedback on the meat of the patch.

          https://reviews.facebook.net/D11913

          stack added a comment -

          I updated my downstreamer w/ notes on dependencies. Some should be coming in transitively, and actually are if I poke w/ dependency:tree (e.g. hbase-hadoop-compat), but when building there is a strange metrics failure if I don't include them explicitly. The poms have notes on the listed dependencies. See https://github.com/saintstack/hbase-downstreamer

          stack added a comment -

          This patch includes latest over on hbase-8488 so ignore the changes to do w/ dependency inclusion and purge of slf4j for now (I'll fix up the patch later).

          Patch adds a script into dev-support called generate-hadoopX-poms.sh

          Run the script to generate a hadoop1 or hadoop2 pom from the original pom. The new pom shows up beside the original. Next, build telling mvn to use this new pom. This seems to generate artifacts that are w/o pollution; i.e. there is no need for a downstreamer to add -Dhadoop.profile=2.0, etc.

          I used this script to deploy new hbase-hadoop1 and hbase-hadoop2 snapshots up at https://repository.apache.org/content/repositories/snapshots/org/apache/hbase/hbase/ (Brock Noland – you have a chance to take a look at them?).

          Here is roughly what you do to build artifacts to publish:

          $ bash -x ./dev-support/generate-hadoopX-poms.sh 0.95.2-SNAPSHOT 0.95.2-hadoop1-SNAPSHOT
           $ mvn clean install deploy -DskipTests -Pgpg -f pom.xml.hadoop1

          The head of the script has more on how it works.

          I'll write it up better in the manual after I've played some more (need to look at assemblies and at maven release).
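
          To make the version-string effect concrete, a rough before/after of what the generator rewrites in each pom (illustrative only; the head of generate-hadoopX-poms.sh and the refguide are the authoritative description, and per the release note the module list is also adjusted for the chosen Hadoop line):

           -  <version>0.95.2-SNAPSHOT</version>
           +  <version>0.95.2-hadoop1-SNAPSHOT</version>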

          stack added a comment -

          Nick Dimiduk Short answer, yes. Our published poms have uninterpolated variables and a selection of profiles to choose from, defaulting to hadoop1 if none is chosen. Profiles are how you customize your build. We want to customize the customization.

          Brock Noland Thanks boss. Let me see if can make it so default profile matches the -hadoop1 or -hadoop2 ending.

          Enis Soztutar Yeah, that is the road I took last night, using a script to generate pom.xml.hadoop1 and pom.xml.hadoop2 and then building, passing the particular pom w/ the -f flag. Tricky parts are that the modules refer to the parent and they also need to use the custom pom, there are lots of changes to make, and we are polluting our src tree. Working on it.

          Enis Soztutar added a comment -

          The other approach would be to have two different poms for hadoop1 and hadoop2. There will be some duplication, but it will be clean in terms of profiles and dependencies.

          Brock Noland added a comment -

          FWIW, from a hive perspective, it looks like it would be easier for us if the automatic profile activation were removed. Today when we want to build with hadoop2, we need to set a system property, which in ant, unlike maven, is not trivial. Since we are using the maven ant plugin I believe we could use artifact:pom:profile to simply activate the profile we'd like at the appropriate time.

          Nick Dimiduk added a comment -

          Is it too little too late to ask "why do we do profile activation this way?" Are we using profiles in a way that Maven doesn't intend?

          stack added a comment -

          The profile activation mess is biting us. We will always pull in the hadoop1 default in a downstream project, even when built w/ a custom hadoop2 pom.xml which hardcodes our compatibility module at hadoop2, etc., because the resultant module poms have two profiles and no profile selected means the default hadoop.profile=1.0.
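
          For readers outside the JIRA, the default activation in question is roughly of this shape (a paraphrase, not the exact HBase pom): the hadoop1 profile turns on whenever hadoop.profile is unset, so consumers silently inherit hadoop1 unless they pass -Dhadoop.profile=2.0.

            <profile>
              <id>hadoop-1.1</id>
              <activation>
                <!-- active when no -Dhadoop.profile is given, i.e. the silent default -->
                <property><name>!hadoop.profile</name></property>
              </activation>
              <!-- hadoop1 dependencies and compat module go here -->
            </profile>
            <profile>
              <id>hadoop-2.0</id>
              <activation>
                <property><name>hadoop.profile</name><value>2.0</value></property>
              </activation>
              <!-- hadoop2 dependencies and compat module go here -->
            </profile>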

          Trying to hack about in downstream projects w/ excludes is awful; we shouldn't make downstreamers do this (we could exclude hadoop1 everywhere and do an explicit include of hadoop2, but it is awfully arcane).

          We cannot expect downstreamers to specify a profile when building: e.g. downstream projects 'work' if we add -Dhadoop.profile=2.0, but that is again asking too much of downstreamers; they should just select the hbase jars they want and we should transitively include the rest (caveat: maven's transitive include mechanism, especially as regards test-jars, seems random).

          Given maven poms do not support xinclude and given that pom element duplication is rife already, I think I have to script -hadoop1 and -hadoop2 build/publication (unless anyone has better ideas).

          stack added a comment -

          So our poms in the repo, even the local repo, include the compat.module variable. It is not being interpolated. Here is a build of hbase against the hadoop2 profile:

          durruti:hbase stack$ grep -r 'compat.module' .
          ./hbase/0.95.2-SNAPSHOT/hbase-0.95.2-SNAPSHOT.pom:        <artifactId>${compat.module}</artifactId>
          ./hbase/0.95.2-SNAPSHOT/hbase-0.95.2-SNAPSHOT.pom:        <artifactId>${compat.module}</artifactId>
          ./hbase/0.95.2-SNAPSHOT/hbase-0.95.2-SNAPSHOT.pom:        <compat.module>hbase-hadoop1-compat</compat.module>
          ./hbase/0.95.2-SNAPSHOT/hbase-0.95.2-SNAPSHOT.pom:        <!-- Need to set this for the Hadoop 1 compat module -->
          ./hbase/0.95.2-SNAPSHOT/hbase-0.95.2-SNAPSHOT.pom:        <compat.module>hbase-hadoop1-compat</compat.module>
          ./hbase/0.95.2-SNAPSHOT/hbase-0.95.2-SNAPSHOT.pom:        <compat.module>hbase-hadoop2-compat</compat.module>
          ./hbase-assembly/0.95.2-SNAPSHOT/hbase-assembly-0.95.2-SNAPSHOT.pom:        <artifactId>${compat.module}</artifactId>
          ./hbase-assembly/0.95.2-SNAPSHOT/hbase-assembly-0.95.2-SNAPSHOT.pom:        <artifactId>${compat.module}</artifactId>
          ./hbase-it/0.95.2-SNAPSHOT/hbase-it-0.95.2-SNAPSHOT.pom:          <artifactId>${compat.module}</artifactId>
          ./hbase-it/0.95.2-SNAPSHOT/hbase-it-0.95.2-SNAPSHOT.pom:          <artifactId>${compat.module}</artifactId>
          ./hbase-prefix-tree/0.95.2-SNAPSHOT/hbase-prefix-tree-0.95.2-SNAPSHOT.pom:      <artifactId>${compat.module}</artifactId>
          ./hbase-server/0.95.2-SNAPSHOT/hbase-server-0.95.2-SNAPSHOT.pom:      <artifactId>${compat.module}</artifactId>
          ./hbase-server/0.95.2-SNAPSHOT/hbase-server-0.95.2-SNAPSHOT.pom:      <artifactId>${compat.module}</artifactId>
          

          I've been playing w/ replacing the above variable and then building w/ new poms. Here is my little script:

          $ hadoop=hadoop2; for i in `find . -name pom.xml`; do echo $i; fn="pom.xml.${hadoop}"; sed "s/\${compat.module}/hbase-${hadoop}-compat/;s/relativePath>\.\./..\/relativePatch>$fn/;s/\(module>[^<]*\)/\1\/$fn/"  $i > $i.${hadoop}; done
          

          This puts a file named pom.xml.hadoop2 beside the original pom.xml. This file has a couple of substitutions done on the original pom (I tried to put it into the target dir so I did not mod the original srcs, but mvn wants the modules under it – and then child modules want to ref the parent – I suppose I could make this work w/ more hackery).
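
          Restated with the three substitutions called out (an annotation of the prototype one-liner above; the checked-in generate-hadoopX-poms.sh is the authoritative version):

           hadoop=hadoop2
           for i in $(find . -name pom.xml); do
             fn="pom.xml.${hadoop}"
             # 1) hardcode ${compat.module} to hbase-hadoop2-compat
             # 2) point the parent pom reference at the generated pom (the prototype's
             #    "relativePatch" spelling is preserved verbatim)
             # 3) append /pom.xml.hadoop2 to each <module> entry so the reactor reads
             #    the generated child poms
             sed "s/\${compat.module}/hbase-${hadoop}-compat/;s/relativePath>\.\./..\/relativePatch>$fn/;s/\(module>[^<]*\)/\1\/$fn/" "$i" > "$i.${hadoop}"
           done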

          Now building I do this:

          mvn clean install -DskipTests -Dhadoop.profile=2.0 -f pom.xml.hadoop2

          And repo seems better.
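
          A quick way to re-check the installed poms (a suggestion mirroring the grep above, run against the local repository; the path assumes the default ~/.m2 location):

           grep -rl 'compat.module' ~/.m2/repository/org/apache/hbase/ || echo "no uninterpolated compat.module references"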

          My downstream project is still failing, though it no longer references hadoop1 or fails because it wants a hadoop1 dependency.

          It is failing here:

            -------------------------------------------------------------------------------
            Tests run: 1, Failures: 0, Errors: 1, Skipped: 0, Time elapsed: 0.67 sec <<< FAILURE!
            testSpinUpMiniHBaseCluster(org.hbase.downstreamer.TestHBaseMiniCluster)  Time elapsed: 0.631 sec  <<< ERROR!
            java.lang.UnsupportedOperationException: Not implemented by the DistributedFileSystem FileSystem implementation
          ...
          

          ... which usually means I was compiled against the wrong version... Looking.

          Andrew Purtell added a comment -

          We should simultaneously develop Bigtop packaging for HBase 0.96+. There are overlapping issues with what would be convenient for packaging (and user installs) and Maven publishing mechanics. Let's use BIGTOP-1029 and this issue as a bridge.

          stack added a comment -

          I am using this issue for figuring out building against h1 and h2 and then publishing to the mvn repo.

          Brock Noland added a comment -

          Like you did with 0.95.X-SNAPSHOT, I don't suppose it would be possible to publish a hadoop1 and hadoop2 artifact for 0.94.X-SNAPSHOT?

          As hive uses Ivy, the pom issues discussed above do not cause us problems, but 0.95 causes compile errors for Hive due to Put/Result no longer being Writable and some exception classes having moved around.

          Brock Noland added a comment -

          I agree that ideally specifying hbase-0.95.1-hadoop2 should be enough. If we can do that it'd be awesome!

          Just throwing out ideas here... in the absence of that solution, maybe we should have no default hadoop.profile, so that users of hadoop2 know they must specify -Dhadoop.profile=2.0 while users of hadoop1 must specify -Dhadoop.profile=1.0? It's a slight downgrade in terms of usability for 1.0 users but a step up in terms of usability for hadoop2 users.

          stack added a comment -

          Brock Noland Good to know. Thank you for trying. Please explain more what you mean by 'Should make hadoop.profile required?'? You are saying you should not have to add this when building? Specifying version hbase-0.95.1-hadoop2 should be enough? I agree.

          Brock Noland added a comment -

          I have a pom which depends on the hadoop2 client jar:

            <dependency>
              <groupId>org.apache.hbase</groupId>
              <artifactId>hbase-client</artifactId>
              <version>0.95.1-hadoop2-SNAPSHOT</version>
            </dependency>
          

          and it works fine if I specify -Dhadoop.profile=2.0

          [brock@bigboy hbase-client-hadoop2-example]$ mvn dependency:tree -Dhadoop.profile=2.0
          ...
          [INFO] --- maven-dependency-plugin:2.1:tree (default-cli) @ hbase-client-hadoop2-example ---
          [INFO] org.apache.hbase:hbase-client-hadoop2-example:jar:1.0
          [INFO] \- org.apache.hbase:hbase-client:jar:0.95.1-hadoop2-SNAPSHOT:compile
          ...
          [INFO]    +- org.apache.hadoop:hadoop-client:jar:2.0.2-alpha:compile
          

          However, if hadoop.profile is left off, then it pulls in 1.X. Should we make hadoop.profile required?

          stack added a comment -

          Brock Noland No sir. I have deployed an hbase-hadoop2 SNAPSHOT but do not think it will work. I believe downstreamers will still have the included hbase transitively pull in hadoop1 rather than hadoop2, going by the pom (see http://search-hadoop.com/m/PHo512tP1po/pom+hadoop2+hbase&subj=Re+VOTE+hbase+0+95+1+release+candidate+1+is+available+for+download for Nicolas's report of this and Elliott's workaround). Do you have any input/opinion, sir?

          Brock Noland added a comment -

          Stack, have you received any feedback on the hadoop2 artifact?

          stack added a comment -

          I just published snapshots again using the above technique. Let me see if I can get downstreamers to take a look.

          stack added a comment -

          Thank you for sharing the pain Enis Soztutar.

          Here is how I did snapshot upload to maven repo:

            svn revert -R .
            svn cleanup
            mvn clean org.codehaus.mojo:versions-maven-plugin:1.3.1:set -DnewVersion=0.95.0-hadoop1-SNAPSHOT
            mvn -DskipTests install deploy
            svn revert -R .
            svn cleanup
            mvn clean org.codehaus.mojo:versions-maven-plugin:1.3.1:set -DnewVersion=0.95.0-hadoop2-SNAPSHOT
            mvn -DskipTests -Dhadoop.profile=2.0 install deploy
          

          It looks OK... https://repository.apache.org/content/repositories/snapshots/org/apache/hbase/hbase-client/ The POMs don't explicitly reference a hadoop version, though, which is a little disconcerting.

          Enis Soztutar added a comment -

          I did a bit more digging, but ran into a fundamental problem. Overriding the artifactId from a property coming from the profile works, but when a project depends on the installed artifact:

              <dependency>
                <groupId>org.apache.hbase</groupId>
                <artifactId>hbase-server-hadoop2</artifactId>
                <version>0.97-SNAPSHOT</version>
              </dependency>
            </dependencies>
          

          then there is no profile information, so it will just try to fetch the hadoop1 versions of the jars. I tried to circumvent this by putting in a parent pom for the hadoop2 profile (pom-hadoop2.xml) which is a child of pom.xml but also declares the child modules hbase-client, etc. This also fails because the parent artifactId does not accept variables.
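
          For illustration, this is the kind of parameterized parent reference Maven rejects (hadoop.suffix is a hypothetical property, not something in the actual poms):

            <!-- fails: the parent artifactId must be a literal, not a property -->
            <parent>
              <groupId>org.apache.hbase</groupId>
              <artifactId>hbase-${hadoop.suffix}</artifactId>
              <version>0.97-SNAPSHOT</version>
            </parent>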

          stack added a comment -

          Here is what I did making the release just now:

            rm -rf ~/Downloads/hbase-0.95.0/*
            mvn clean -Prelease -Dassembly.file=src/assembly/src.xml install -DskipTests assembly:single
            cp hbase-assembly/target/hbase-0.95.0-src.tar.gz ~/Downloads/hbase-0.95.0/
            svn revert -R .
            svn cleanup
            mvn clean org.codehaus.mojo:versions-maven-plugin:1.3.1:set -DnewVersion=0.95.0-hadoop1
            mvn clean -Prelease install -DskipTests javadoc:aggregate site assembly:single
            cp hbase-assembly/target/hbase-0.95.0-hadoop1-bin.tar.gz ~/Downloads/hbase-0.95.0/
            svn revert -R .
            svn cleanup
            mvn clean -Prelease -Dhadoop.profile=2.0 install -DskipTests javadoc:aggregate site assembly:single
            mv hbase-assembly/target/hbase-0.95.0-hadoop2-bin.tar.gz ~/Downloads/hbase-0.95.0/
            tail -f logs/hbase-stack-master-durruti.local.log.
            svn copy https://svn.apache.org/repos/asf/hbase/branches/0.95 https://svn.apache.org/repos/asf/hbase/tags/0.95.0RC0 -m "Tag 0.95.0RC0"
            cd Downloads/hbase-0.95.0/
            gpg --print-mds hbase-0.95.0-src.tar.gz > hbase-0.95.0-src.tar.gz.mds
            gpg --print-mds hbase-0.95.0-hadoop1-bin.tar.gz > hbase-0.95.0-hadoop1-bin.tar.gz.mds
            gpg --print-mds hbase-0.95.0-hadoop2-bin.tar.gz > hbase-0.95.0-hadoop2-bin.tar.gz.mds
            gpg --armor --output hbase-0.95.0-src.tar.gz.asc --detach-sig hbase-0.95.0-src.tar.gz
            gpg --armor --output hbase-0.95.0-hadoop1-bin.tar.gz.asc --detach-sig hbase-0.95.0-hadoop1-bin.tar.gz
            gpg --armor --output hbase-0.95.0-hadoop2-bin.tar.gz.asc --detach-sig hbase-0.95.0-hadoop2-bin.tar.gz
            cd ../
            mv hbase-0.95.0 hbase-0.95.0RC0
            tar cfz hbase-0.95.0RC0.tgz hbase-0.95.0RC0
            scp hbase-0.95.0RC0.tgz people.apache.org:

          I set the version using mvn versions and then did my build of the src tgz, copied it over to a 0.95 dir, then cleaned up the changes made by mvn versions with svn revert, then ran it again to set hadoop1, and did the same for hadoop2.

          Let me push up to the mvn repo too.

          stack added a comment -

          Oh, interpolating into the artifactId works? Let me mess with it... will report back. Thanks Enis Soztutar

          Enis Soztutar added a comment -

          stack, could you please take a quick look at the prototype patch?

           mvn clean install -DskipTests  
          

          builds with hadoop1 suffix:

          [INFO] Installing /Users/enis/projects/svn-repos/hbase/hbase-server/target/hbase-server-hadoop1-0.97-SNAPSHOT.jar to /Users/enis/.m2/repository/org/apache/hbase/hbase-server-hadoop1/0.97-SNAPSHOT/hbase-server-hadoop1-0.97-SNAPSHOT.jar
          [INFO] Installing /Users/enis/projects/svn-repos/hbase/hbase-server/pom.xml to /Users/enis/.m2/repository/org/apache/hbase/hbase-server-hadoop1/0.97-SNAPSHOT/hbase-server-hadoop1-0.97-SNAPSHOT.pom
          [INFO] Installing /Users/enis/projects/svn-repos/hbase/hbase-server/target/hbase-server-hadoop1-0.97-SNAPSHOT-tests.jar to /Users/enis/.m2/repository/org/apache/hbase/hbase-server-hadoop1/0.97-SNAPSHOT/hbase-server-hadoop1-0.97-SNAPSHOT-tests.jar
          [INFO] Installing /Users/enis/projects/svn-repos/hbase/hbase-server/target/hbase-server-hadoop1-0.97-SNAPSHOT-sources.jar to /Users/enis/.m2/repository/org/apache/hbase/hbase-server-hadoop1/0.97-SNAPSHOT/hbase-server-hadoop1-0.97-SNAPSHOT-sources.jar
          

          For hadoop2 profile:

          mvn install -DskipTests -Dhadoop.profile=2.0 
          

          results in

          [INFO] Installing /Users/enis/projects/svn-repos/hbase/hbase-server/target/hbase-server-hadoop2-0.97-SNAPSHOT.jar to /Users/enis/.m2/repository/org/apache/hbase/hbase-server-hadoop2/0.97-SNAPSHOT/hbase-server-hadoop2-0.97-SNAPSHOT.jar
          [INFO] Installing /Users/enis/projects/svn-repos/hbase/hbase-server/pom.xml to /Users/enis/.m2/repository/org/apache/hbase/hbase-server-hadoop2/0.97-SNAPSHOT/hbase-server-hadoop2-0.97-SNAPSHOT.pom
          [INFO] Installing /Users/enis/projects/svn-repos/hbase/hbase-server/target/hbase-server-hadoop2-0.97-SNAPSHOT-tests.jar to /Users/enis/.m2/repository/org/apache/hbase/hbase-server-hadoop2/0.97-SNAPSHOT/hbase-server-hadoop2-0.97-SNAPSHOT-tests.jar
          [INFO] Installing /Users/enis/projects/svn-repos/hbase/hbase-server/target/hbase-server-hadoop2-0.97-SNAPSHOT-sources.jar to /Users/enis/.m2/repository/org/apache/hbase/hbase-server-hadoop2/0.97-SNAPSHOT/hbase-server-hadoop2-0.97-SNAPSHOT-sources.jar
          

          Not sure whether this will work with the assembly or maven publish, but install seems fine.

          Enis Soztutar added a comment -

          He was talking about the artifact name containing the hadoop1 or hadoop2 suffix instead of the version. It is close to what I was talking about above, where we have two poms per module.

          The module artifacts would be named hbase-client-hadoop1, and the version would still be 0.95.0. So the final artifact name would be hbase-client-hadoop1-0.95.0.jar.

          We can supply an alternate pom file using the mvn -f flag. But since the build is already modular, we would have to test it.
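
          A hypothetical invocation of that flag might look like the following (pom-hadoop2.xml is an assumed file name, mirroring the naming discussed above):

            mvn -f pom-hadoop2.xml clean install -DskipTests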

          stack added a comment -

          Enis Soztutar How would that work? What would the artifact be? Would we have two poms, or would we branch to do hadoop2 builds, making the artifact name change in the branch? Thanks.

          Enis Soztutar added a comment -

          I've talked with one of our build guys, Giridharan Kesavan, about this. He says that we should change the artifact name rather than the version string.

          stack added a comment -

          The plugin examples are all about command-line usage. I tried to add a 'configuration' element after adding the plugin to the hadoop2 profile and got this:

          [ERROR]   The project org.apache.hbase:hbase:0.97.0-SNAPSHOT (/Users/stack/checkouts/hbase/pom.xml) has 1 error
          [ERROR]     Malformed POM /Users/stack/checkouts/hbase/pom.xml: Unrecognised tag: 'configuration' (position: START_TAG seen ...</version>\n            <configuration>... @1380:28)  @ /Users/stack/checkouts/hbase/pom.xml, line 1380, column 28 -> [Help 2]
          

          ... so it doesn't seem to like it.

          It might be dodgy doing this automatically, given it changes all the poms and leaves them changed... (this is all so dirty!) Thanks Enis.
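
          For reference, a minimal sketch of the kind of profile-scoped plugin configuration being attempted – the placement below (plugin under build/plugins inside the profile, with configuration nested in the plugin) is the layout Maven normally expects, though whether this would achieve the automatic renaming is exactly what was in question here:

            <profile>
              <id>hadoop-2.0</id>
              <build>
                <plugins>
                  <plugin>
                    <groupId>org.codehaus.mojo</groupId>
                    <artifactId>versions-maven-plugin</artifactId>
                    <version>1.3.1</version>
                    <configuration>
                      <!-- hypothetical: suffix the version when the hadoop2 profile is on -->
                      <newVersion>0.97.0-hadoop2-SNAPSHOT</newVersion>
                    </configuration>
                  </plugin>
                </plugins>
              </build>
            </profile>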

          Enis Soztutar added a comment -

          Sounds good to me. Can we automagically invoke this thing when the hadoop2 profile is on?

          stack added a comment -

          Parent version must be hard-coded throughout (http://jira.codehaus.org/browse/MNG-624).

          The versions plugin rewrites the pom version in the parent and in the submodules for you (you have to use 1.3.1; 2.0 is broken, NPE'ing – https://jira.codehaus.org/browse/MVERSIONS-201).

          $ mvn clean org.codehaus.mojo:versions-maven-plugin:1.3.1:set -DnewVersion=1.2.3-hadoop2-SNAPSHOT -Dhadoop.profile=2.0 install -DskipTests assembly:single
          

          So, unless someone has a better idea, I'm thinking that when we build to publish, we use this ugly versions plugin to create jars with the -hadoop1 and -hadoop2 suffixes. I suppose we will have to commit what the versions plugin writes and then tag, doing it once for hadoop1 and once for hadoop2.

          Let me try this for the first 0.95 RC.

          stack added a comment -

          Enis Soztutar We have 11 modules, of which at least 5 would need the shadow module you suggest. That is a bunch of modules. And then there will be hadoop3 one day?

          I tried again to make the classifiers work (if they worked, it'd be 'elegant' – caveat: we are talking maven world here). I tried adding the dependencies to hbase-assembly so we could then do a dependencySet in the assembly descriptor. I could get the hbase-*-hadoop?.jars to show up, but they would come without their dependencies. I tried getting artful with maven coordinate specifications and dependencySet, but it started to turn erratic on me.

          Things I've learned (not sure if they are hard and fast rules – but I have not been able to get past them):

          + We cannot read the pom.version off the command-line. It has to be hard-coded in the top-level pom and in all submodules. I cannot interpolate a property into the pom.version either (to add a hadoop1 or hadoop2 suffix); mvn just complains (see the sketch after this list).
          + We could add hadoop? to the version, but there is no connection between the version and the hadoop profile, so we could have hadoop1 added to the jar names even though we build the assembly with the hadoop2 profile (maybe this is not too bad... serves you right if you end up with this combo).
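
          Sketch of the disallowed interpolation from the first point above (hadoop.suffix is a hypothetical property, not something in the actual poms):

            <!-- mvn complains: the project version should be a constant, not an expression -->
            <version>0.97.0-${hadoop.suffix}-SNAPSHOT</version>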

          How are we going to change the version, though, given the above? Making the assembly package is not too bad: we could just name the assembly/package for the hadoop version and leave the jars unadorned by a hadoop version. For publishing to maven, we'd do something crazy like change the version number, commit, build and upload for hadoop1, then commit again, build, and upload for hadoop2?

          Any other ideas?

          Enis Soztutar added a comment -

          In one of the comments here: http://maven.40175.n5.nabble.com/How-to-deploy-with-classifier-td5523009.html , they basically say that one module should correspond to one artifact; there is also a bunch of discussion around the jdk14/jdk15 classifiers.
          How about creating one empty module per hbase module and putting the hadoop2 poms there?

          hbase-client
          hbase-hadoop-2-support/hbase-client-hadoop2
          hbase-server
          hbase-hadoop-2-support/hbase-server-hadoop2
          ....
          

          Then we do not even have to change the version or have a profile for Hadoop2. All will be built. Wdyt, boss? Overkill?
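
          A rough sketch of what the aggregator for those shadow modules might declare (module names mirror the layout above; this is a guess at the shape, not a tested pom):

            <!-- hypothetical hbase-hadoop-2-support/pom.xml -->
            <modules>
              <module>hbase-client-hadoop2</module>
              <module>hbase-server-hadoop2</module>
              <!-- ...one shadow module per hbase module built against hadoop2 -->
            </modules>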

          stack added a comment -

          Enis turned up the old issue where we fought classifiers and lost.

          stack added a comment -

          Or, what am I going to do if the version is 0.97.0-hadoop1-SNAPSHOT and on the command-line you specify -Dhadoop.profile=2.0? You'll get a jar that has the -hadoop1 suffix though it was built against hadoop2. (I tried having the hadoop.profile property included in pom.version, but it just says it is not allowed and fails.)

          stack added a comment -

          I tried adding classifiers. Was fine till I got to the assembly step:

          [INFO] ------------------------------------------------------------------------
          [ERROR] Failed to execute goal org.apache.maven.plugins:maven-assembly-plugin:2.4:single (default-cli) on project hbase-assembly: Failed to create assembly: Error adding file 'org.apache.hbase:hbase-common:jar:0.97.0-SNAPSHOT' to archive: /Users/stack/checkouts/hbase/hbase-common/target/classes isn't a file. -> [Help 1]
          

          I couldn't get past the above. There is no vocabulary around moduleSet and dependencySet to deal with classifiers; they seem blind to them unless I call out what to include individually (I failed trying to do this too).
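
          For context, attaching a classifier to a module's jar is usually done through the maven-jar-plugin; a minimal sketch of that kind of setup (not the actual patch tried here) would be:

            <plugin>
              <groupId>org.apache.maven.plugins</groupId>
              <artifactId>maven-jar-plugin</artifactId>
              <configuration>
                <!-- publishes the module jar as, e.g., hbase-common-0.97.0-SNAPSHOT-hadoop2.jar -->
                <classifier>hadoop2</classifier>
              </configuration>
            </plugin>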

          Giving up on this avenue. Going to add to our version.

          stack added a comment -

          Wing Yew, up on the mailing lists, suggests it could work with classifiers:

          you could take a look at
          http://svn.apache.org/repos/asf/avro/tags/release-1.7.3/lang/java/mapred/pom.xml
          to see how they do it with classifiers.

          My understanding was that we could not use classifiers, but I could try the above suggestion; it would be easier than changing the version.


            People

            • Assignee: stack
            • Reporter: stack
            • Votes: 0
            • Watchers: 10
