Giraph / GIRAPH-168

Simplify munge directive usage with new munge flag HADOOP_SECURE (rather than HADOOP_FACEBOOK) and remove usage of HADOOP

    Details

    • Type: Improvement
    • Status: Resolved
    • Priority: Major
    • Resolution: Fixed
    • Affects Version/s: 1.0.0
    • Fix Version/s: None
    • Component/s: None
    • Labels:
      None

      Description

      This JIRA relates to the mail thread here:

      http://mail-archives.apache.org/mod_mbox/incubator-giraph-dev/201203.mbox/browser

      Currently we check for the munge flags HADOOP, HADOOP_FACEBOOK and HADOOP_NON_SECURE when using munge in a few places. Hopefully we can eliminate usage of munge in the future, but until then, we can mitigate the complexity by consolidating the number of flags checked. This JIRA renames HADOOP_FACEBOOK to HADOOP_SECURE, and removes usages of HADOOP, to handle the same conditional compilation requirements. It also makes it easier to add more maven profiles so that we can easily increase our hadoop version coverage.

      This patch modifies the existing hadoop_facebook profile to use the new HADOOP_SECURE munge flag, rather than HADOOP_FACEBOOK.

      It also adds a new hadoop maven profile, hadoop_trunk, which also sets HADOOP_SECURE.

      Finally, it adds a default profile, hadoop_0.20.203. This is needed so that we can specify its dependencies separately from hadoop_trunk, because the hadoop dependencies have changed between trunk and 0.205.0 - the former requires hadoop-common, hadoop-mapreduce-client-core, and hadoop-mapreduce-client-common, whereas the latter requires hadoop-core.
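      To make the dependency split concrete, here is a hedged sketch of what the two profiles' dependency sections might look like. The artifact names come from the paragraph above; the element layout and versions are illustrative, not copied from the actual Giraph pom.xml:

      ```xml
      <!-- Sketch only; layout and versions are illustrative. -->
      <profile>
        <id>hadoop_0.20.203</id>
        <dependencies>
          <!-- Pre-split Hadoop ships as a single artifact. -->
          <dependency>
            <groupId>org.apache.hadoop</groupId>
            <artifactId>hadoop-core</artifactId>
            <version>0.20.203.0</version>
          </dependency>
        </dependencies>
      </profile>
      <profile>
        <id>hadoop_trunk</id>
        <dependencies>
          <!-- Post-split Hadoop needs three separate artifacts. -->
          <dependency>
            <groupId>org.apache.hadoop</groupId>
            <artifactId>hadoop-common</artifactId>
            <version>${hadoop.version}</version>
          </dependency>
          <dependency>
            <groupId>org.apache.hadoop</groupId>
            <artifactId>hadoop-mapreduce-client-core</artifactId>
            <version>${hadoop.version}</version>
          </dependency>
          <dependency>
            <groupId>org.apache.hadoop</groupId>
            <artifactId>hadoop-mapreduce-client-common</artifactId>
            <version>${hadoop.version}</version>
          </dependency>
        </dependencies>
      </profile>
      ```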

      With this patch, the following passes:

      mvn clean verify && mvn -Phadoop_trunk clean verify && mvn -Phadoop_0.20.203 clean verify
      

      Current problems:

      • I couldn't get -Phadoop_facebook to work; does this work outside of Facebook?
      1. GIRAPH-168.patch
        23 kB
        Eugene Koontz
      2. GIRAPH-168.patch
        25 kB
        Eugene Koontz
      3. GIRAPH-168.patch
        19 kB
        Eugene Koontz
      4. GIRAPH-168.patch
        19 kB
        Eugene Koontz
      5. GIRAPH-168.patch
        19 kB
        Eugene Koontz
      6. GIRAPH-168.patch
        16 kB
        Eugene Koontz


          Activity

          Jakob Homan added a comment -

          Avery, can you or another FBer verify that the only API changes FB made to their distro have to do with security? I was under the impression there were others as well, but I don't know if they hit any code areas we use...

          Avery Ching added a comment -

          I'll verify the FB version.

          Avery Ching added a comment -

          It's mostly the RPC changes for the FB version. I'm having some trouble applying this patch against trunk:

          patch -p0 < ~/Desktop/GIRAPH-168.patch
          patching file pom.xml
          Hunk #1 succeeded at 455 (offset 1 line).
          Hunk #2 succeeded at 578 (offset 14 lines).
          Hunk #3 succeeded at 596 (offset 14 lines).
          Hunk #4 succeeded at 671 (offset 14 lines).
          patching file src/main/java/org/apache/giraph/GiraphRunner.java
          Reversed (or previously applied) patch detected! Assume -R? [n] n
          Apply anyway? [n] n
          Skipping patch.
          2 out of 2 hunks ignored – saving rejects to file src/main/java/org/apache/giraph/GiraphRunner.java.rej
          patching file src/main/java/org/apache/giraph/comm/BasicRPCCommunications.java
          patching file src/main/java/org/apache/giraph/comm/RPCCommunications.java
          Hunk #6 FAILED at 165.
          1 out of 6 hunks FAILED – saving rejects to file src/main/java/org/apache/giraph/comm/RPCCommunications.java.rej
          patching file src/test/java/org/apache/giraph/TestBspBasic.java

          Eugene Koontz added a comment -

          As Avery mentioned, there are changes in the RPC function signatures that vary across Hadoop versions.

          Therefore, besides HADOOP_SECURE, this patch also adds a HADOOP_NEWRPC munge flag. The flags are set in pom.xml as follows ('x' means the flag is set):

          profile            HADOOP_SECURE  HADOOP_NEWRPC
          hadoop_non_secure
          hadoop_0.20.203    x
          hadoop_0.23        x              x
          hadoop_trunk       x              x
          hadoop_facebook    x              x

          In my tests, "mvn -Phadoop_X clean verify" works for all X in {0.20.203, 0.23, trunk, non_secure}.
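          As a hedged sketch of how such a flag matrix can be wired up (illustrative only; the 'symbols' parameter name matches the munge-maven-plugin error message quoted later in this thread, but the property name and exact layout here are invented, not taken from the actual Giraph pom.xml):

          ```xml
          <!-- Illustrative fragment: each profile sets a property listing
               its munge flags, and the plugin configuration reads it. -->
          <profile>
            <id>hadoop_trunk</id>
            <properties>
              <munge.symbols>HADOOP_SECURE,HADOOP_NEWRPC</munge.symbols>
            </properties>
          </profile>
          <!-- ... -->
          <plugin>
            <groupId>org.sonatype.plugins</groupId>
            <artifactId>munge-maven-plugin</artifactId>
            <version>1.0</version>
            <configuration>
              <symbols>${munge.symbols}</symbols>
            </configuration>
          </plugin>
          ```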

          Eugene Koontz added a comment -

          Latest patch "flips" the set of munge directives from {HADOOP_NEWRPC, HADOOP_SECURE} to {HADOOP_OLDRPC, HADOOP_NON_SECURE}. HADOOP_NON_SECURE is a flag used currently in trunk, so this is a return to the current trunk state.

          Making the old RPC signature and non-secure builds the exceptional cases seems better to me, because once we remove support for older Hadoop versions, we will also have removed the need for any munge directives.

          Please see the flag/profile matrix for this patch below:

          profile            HADOOP_OLDRPC  HADOOP_NON_SECURE
          hadoop_non_secure  x              x
          hadoop_0.20.203    x
          hadoop_0.23
          hadoop_trunk
          hadoop_facebook
          Jakob Homan added a comment -

          My understanding was that the RPC changes FB had made were backports of changes that are in later versions, so I'm not sure if OldRPC is the correct description. Also, within the Hadoop world there's not really talk of old versus new RPC (except for the PB-based stuff, which will make this really confusing...). Hadoop security is API-incompatible with Hadoop non-security (due to changes in UGI) and FB's distro is insecure and API incompatible due to new APIs backported from more modern versions.

          Jakob Homan added a comment -

          except for the PB-based stuff

          Where PB = ProtocolBuffers and != FB because this isn't quite confusing enough.

          Eugene Koontz added a comment -

          HADOOP-6419 seems to be the source of the divergence in RPC usage that necessitates the munge directives.

          Eugene Koontz added a comment -

          Hi Jakob, I wonder if HADOOP_NO_SASL might be better than HADOOP_OLDRPC (since the divergence in RPC has to do with HADOOP-6419 ("Change RPC layer to support SASL based mutual authentication"))?

          Eugene Koontz added a comment -

          - Update hadoop_trunk to look for Hadoop version 3.0.0-SNAPSHOT
          - Change the HADOOP_OLDRPC munge flag to the more descriptive HADOOP_NON_SASL_RPC

          Avery Ching added a comment -

          Eugene, can you also please update the README with your changes?

          I was able to apply your patch, but ran into the following error:

          mvn verify -DskipTests -Dhadoop=facebook -Dhadoop.jar.path=/Users/aching/Avery/FacebookJars/hadoop-0.20-core.jar
          [INFO] Scanning for projects...
          [INFO]
          [INFO] ------------------------------------------------------------------------
          [INFO] Building Apache Incubator Giraph 0.2-SNAPSHOT
          [INFO] ------------------------------------------------------------------------
          [INFO]
          [INFO] --- maven-enforcer-plugin:1.0.1:enforce (enforce-maven) @ giraph ---
          [INFO]
          [INFO] --- munge-maven-plugin:1.0:munge (munge) @ giraph ---
          [INFO] ------------------------------------------------------------------------
          [INFO] BUILD FAILURE
          [INFO] ------------------------------------------------------------------------
          [INFO] Total time: 1.225s
          [INFO] Finished at: Fri Apr 06 11:21:29 PDT 2012
          [INFO] Final Memory: 6M/81M
          [INFO] ------------------------------------------------------------------------
          [ERROR] Failed to execute goal org.sonatype.plugins:munge-maven-plugin:1.0:munge (munge) on project giraph: The parameters 'symbols' for goal org.sonatype.plugins:munge-maven-plugin:1.0:munge are missing or invalid -> [Help 1]
          [ERROR]
          [ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
          [ERROR] Re-run Maven using the -X switch to enable full debug logging.
          [ERROR]
          [ERROR] For more information about the errors and possible solutions, please read the following articles:
          [ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/PluginParameterException

          Eugene Koontz added a comment -

          HADOOP-6904 introduced the interface method VersionedProtocol.getProtocolSignature(), which requires munge support in src/main/java/org/apache/giraph/comm/BasicRPCCommunications.java.
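          To illustrate the kind of conditional block this requires (a sketch only, not the actual Giraph source; the directive spelling follows the munge plugin's comment-based if/end convention, so consult the plugin documentation for the exact syntax, and the flag and class names here are borrowed from this thread for illustration):

          ```java
          // Sketch: with the flag unset (modern Hadoop), munge keeps the
          // getProtocolSignature override; with it set (pre-HADOOP-6904
          // versions that lack ProtocolSignature), munge strips it out.
          public class CommunicationsServer implements VersionedProtocol {
            public long getProtocolVersion(String protocol, long clientVersion) {
              return 1L;
            }

            /*if_not[HADOOP_NON_INTERVERSIONED_RPC]*/
            public ProtocolSignature getProtocolSignature(
                String protocol, long clientVersion, int clientMethodsHash)
                throws IOException {
              return ProtocolSignature.getProtocolSignature(
                  this, protocol, clientVersion, clientMethodsHash);
            }
            /*end[HADOOP_NON_INTERVERSIONED_RPC]*/
          }
          ```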

          Eugene Koontz added a comment -

          Avery, thanks a lot for your comments and trying out the patch. This latest patch should work with the Facebook branch. I did:

          git clone https://github.com/facebook/hadoop-20.git
          

          and then ant package; then, in my giraph directory (with the latest patch), I ran:

           mvn -Phadoop_facebook -Dhadoop.jar.path=/Users/ekoontz/hadoop-20/build/hadoop-0.20.1-dev-core.jar clean verify
          

          and this succeeded.

          I noticed that although the Facebook branch does not have SASL or security, it does seem to have the inter-versioning support provided by HADOOP-6904. So, I added another munge flag, called HADOOP_NON_INTERVERSIONED_RPC, per the following updated matrix:

          profile            HADOOP_NON_SECURE  HADOOP_NON_SASL_RPC  HADOOP_NON_INTERVERSIONED_RPC
          hadoop_non_secure  x                  x                    x
          hadoop_facebook    x                  x
          hadoop_0.20.203                       x                    x
          hadoop_0.23
          hadoop_trunk

          I also updated the README in the patch.

          This patch works for me with mvn -PX clean verify for all of the above versions, and also with no -P given (which is equivalent to -Phadoop_0.20.203).

          Avery Ching added a comment -

          Nice that you got it working with all the versions! One question though, why is the line below needed in pom.xml?

          <org.apache.hadoop.giraph.zkJar>giraph-0.2-SNAPSHOT-jar-with-dependencies.jar</org.apache.hadoop.giraph.zkJar>

          Eugene Koontz added a comment -

          Avery, thanks for spotting that line. I may have been testing something and left it in accidentally. I've removed it and mvn -Phadoop_facebook -Dhadoop.jar.path=/Users/ekoontz/hadoop-20/build/hadoop-0.20.1-dev-core.jar clean verify still works. I'm attaching a new patch now.

          Eugene Koontz added a comment -

          - Removes unneeded <org.apache.hadoop.giraph.zkJar> from the Facebook profile
          - Additional README content regarding Maven profile usage

          mvn -Phadoop_non_secure clean verify && 
          mvn -Phadoop_facebook -Dhadoop.jar.path=/Users/ekoontz/hadoop-20/build/hadoop-0.20.1-dev-core.jar clean verify && 
          mvn -Phadoop_0.20.203 clean verify && 
          mvn clean verify && 
          mvn -Phadoop_0.23 clean verify && 
          mvn -Phadoop_trunk clean verify
          

          succeeds.

          Avery Ching added a comment -

          +1. Given this is a somewhat large change, I'll wait until tonight to see if anyone opposes it. If not, I'll commit.

          Jakob Homan added a comment -
          mvn -Phadoop_non_secure clean verify && 
          mvn -Phadoop_facebook -Dhadoop.jar.path=/Users/ekoontz/hadoop-20/build/hadoop-0.20.1-dev-core.jar clean verify && 
          mvn -Phadoop_0.20.203 clean verify && 
          mvn clean verify && 
          mvn -Phadoop_0.23 clean verify && 
          mvn -Phadoop_trunk clean verify

          Probably not for this patch, but is there a way in Maven to cycle through these settings, testing each one, that doesn't require this bash chaining? Something that will cycle through all the profiles?
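          Lacking a built-in Maven mechanism, one workaround is a small wrapper script (a sketch; the profile names come from this thread, and the MVN override is an invented convenience so the script can dry-run without Maven installed):

          ```shell
          #!/usr/bin/env bash
          # Sketch: run "clean verify" under each Giraph Maven profile in
          # turn, stopping at the first failure. MVN defaults to echo so
          # this is a harmless dry run; set MVN=mvn to actually build.
          set -e
          MVN="${MVN:-echo}"
          profiles=(hadoop_non_secure hadoop_0.20.203 hadoop_0.23 hadoop_trunk)
          for p in "${profiles[@]}"; do
            "$MVN" -P"$p" clean verify
          done
          ```

          The hadoop_facebook profile is omitted because it additionally needs -Dhadoop.jar.path pointing at a locally built jar.
          
          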

          Eugene Koontz added a comment -

          Jakob, that would be nice to have. I found this: http://stackoverflow.com/questions/4932944/maven-build-multiple-profiles-in-one-go
          Hudson added a comment -

          Integrated in Giraph-trunk-Commit #99 (See https://builds.apache.org/job/Giraph-trunk-Commit/99/)
          GIRAPH-168: Simplify munge directive usage with new munge flag
          HADOOP_SECURE (rather than HADOOP_FACEBOOK) and remove usage of
          HADOOP (ekoontz via aching). (Revision 1311583)

          Result = FAILURE
          aching : http://svn.apache.org/viewcvs.cgi/?root=Apache-SVN&view=rev&rev=1311583
          Files :

          • /incubator/giraph/trunk/CHANGELOG
          • /incubator/giraph/trunk/README
          • /incubator/giraph/trunk/pom.xml
          • /incubator/giraph/trunk/src/main/java/org/apache/giraph/bsp/ImmutableOutputCommitter.java
          • /incubator/giraph/trunk/src/main/java/org/apache/giraph/comm/BasicRPCCommunications.java
          • /incubator/giraph/trunk/src/main/java/org/apache/giraph/comm/CommunicationsInterface.java
          • /incubator/giraph/trunk/src/main/java/org/apache/giraph/comm/RPCCommunications.java
          • /incubator/giraph/trunk/src/test/java/org/apache/giraph/TestBspBasic.java
          Avery Ching added a comment -

          Eugene, I committed your patch, which passed 'mvn verify'; however, it seems to have changed the way the JUnit test report is generated somehow.

          Here's the result after your patch (build 99):

          Recording test results
          No test report files were found. Configuration error?
          Build step 'Publish JUnit test result report' changed build result to FAILURE
          Updating GIRAPH-168
          Finished: FAILURE

          https://builds.apache.org/job/Giraph-trunk-Commit/99/

          The last commit seemed to have the JUnit test result reports just fine (https://builds.apache.org/job/Giraph-trunk-Commit/98/).

          Can you please take a look?

          Avery Ching added a comment -

          Actually, I figured it out. Hudson tries to look for the test files in

          trunk/target/surefire-reports/TEST-*.xml

          I just changed the path to

          trunk/target/munged/surefire-reports/TEST-*.xml

          and reran the build successfully.

          Closing! Nice job!

          Eugene Koontz added a comment -

          Avery, thanks a lot for figuring it out. What you said makes sense now. Hudson must be building the default profile, hadoop_0.20.203, which is using the munge plugin now. This is causing the reports to be placed in trunk/target/munged/surefire-reports.

          It would be nice, though, to be able to have Hudson run multiple profiles, as Jakob mentioned above.

          Avery Ching added a comment -

          I can modify Hudson to execute the commands you used above. Any thoughts/comments?

          Eugene Koontz added a comment -

          +1, it sounds great to me if we can run all the profiles through Hudson for every commit and patch submission. I'm sure it will catch some Hadoop inter-version incompatibilities that would not be seen if we only build the default profile.

          Eugene Koontz added a comment -

          Although I wonder if Hudson will be able to find the Facebook Hadoop jar? Would we need to add some repo information to the pom.xml to tell Hudson where it can find this jar?

          Jakob Homan added a comment -

          Can we add the bash script above wherever Maven decrees such things should go, so it's quick and reproducible for Hudson or contributors? It's not as good as having the build itself test each profile, but it'll make things easier.

          Avery Ching added a comment -

          I would ignore the facebook one for now (we can add it later), but I can try

          mvn -Phadoop_non_secure clean verify &&
          mvn -Phadoop_0.20.203 clean verify &&
          mvn clean verify &&
          mvn -Phadoop_0.23 clean verify &&
          mvn -Phadoop_trunk clean verify

          Hudson added a comment -

          Integrated in Giraph-trunk-Commit #220 (See https://builds.apache.org/job/Giraph-trunk-Commit/220/)
          GIRAPH-212: Security is busted since GIRAPH-168 (Revision 1393814)

          Result = FAILURE
          aching : http://svn.apache.org/viewcvs.cgi/?root=Apache-SVN&view=rev&rev=1393814
          Files :

          • /giraph/trunk/CHANGELOG
          • /giraph/trunk/pom.xml
          • /giraph/trunk/src/main/java/org/apache/giraph/comm/BasicRPCCommunications.java
          • /giraph/trunk/src/main/java/org/apache/giraph/comm/RPCCommunications.java
          • /giraph/trunk/src/main/java/org/apache/giraph/comm/SecureRPCCommunications.java
          • /giraph/trunk/src/main/java/org/apache/giraph/graph/BspServiceWorker.java
          • /giraph/trunk/src/test/java/org/apache/giraph/TestBspBasic.java
          Eugene Koontz added a comment -

          Interesting tabular comparison of Hadoop API differences:

          http://dbeech.github.com/hadoop-api-evolution.html


  People

  • Assignee: Eugene Koontz
  • Reporter: Eugene Koontz
  • Votes: 0
  • Watchers: 2