  Hadoop Common / HADOOP-9481

Broken conditional logic with HADOOP_SNAPPY_LIBRARY

Details

    • Type: Bug
    • Status: Closed
    • Priority: Minor
    • Resolution: Fixed
    • 2.1.0-beta, 3.0.0-alpha1
    • 2.1.0-beta
    • None
    • None
    • Hadoop Flags: Reviewed

    Description

      The problem is a regression introduced by the recent fix https://issues.apache.org/jira/browse/HADOOP-8562 .
      That fix makes some improvements for the Windows platform, but breaks the native code on Unix.
      Namely, consider the HADOOP-8562 diff of the file hadoop-common-project/hadoop-common/src/main/native/src/org/apache/hadoop/io/compress/snappy/SnappyCompressor.c:

      --- hadoop-common-project/hadoop-common/src/main/native/src/org/apache/hadoop/io/compress/snappy/SnappyCompressor.c
      +++ hadoop-common-project/hadoop-common/src/main/native/src/org/apache/hadoop/io/compress/snappy/SnappyCompressor.c
      @@ -16,12 +16,18 @@
        * limitations under the License.
        */
      
      -#include <dlfcn.h>
      +
      +#if defined HADOOP_SNAPPY_LIBRARY
      +
       #include <stdio.h>
       #include <stdlib.h>
       #include <string.h>
      
      +#ifdef UNIX
      +#include <dlfcn.h>
       #include "config.h"
      +#endif // UNIX
      +
       #include "org_apache_hadoop_io_compress_snappy.h"
       #include "org_apache_hadoop_io_compress_snappy_SnappyCompressor.h"
      
      @@ -81,7 +87,7 @@ JNIEXPORT jint JNICALL Java_org_apache_hadoop_io_compress_snappy_SnappyCompresso
         UNLOCK_CLASS(env, clazz, "SnappyCompressor");
      
         if (uncompressed_bytes == 0) {
      -    return 0;
      +    return (jint)0;
         }
      
         // Get the output direct buffer
      @@ -90,7 +96,7 @@ JNIEXPORT jint JNICALL Java_org_apache_hadoop_io_compress_snappy_SnappyCompresso
         UNLOCK_CLASS(env, clazz, "SnappyCompressor");
      
         if (compressed_bytes == 0) {
      -    return 0;
      +    return (jint)0;
         }
      
         /* size_t should always be 4 bytes or larger. */
      @@ -109,3 +115,5 @@ JNIEXPORT jint JNICALL Java_org_apache_hadoop_io_compress_snappy_SnappyCompresso
         (*env)->SetIntField(env, thisj, SnappyCompressor_uncompressedDirectBufLen, 0);
         return (jint)buf_len;
       }
      +
      +#endif //define HADOOP_SNAPPY_LIBRARY
      

      Here we see that the entire class implementation got enclosed in an "#if defined HADOOP_SNAPPY_LIBRARY" directive, and the point is that "HADOOP_SNAPPY_LIBRARY" is never defined.
      This makes the class implementation effectively empty, which, in turn, causes an UnsatisfiedLinkError to be thrown at runtime upon any attempt to invoke the native methods implemented there.
      The actual intention of the authors of HADOOP-8562 was (we suppose) to include "config.h", where "HADOOP_SNAPPY_LIBRARY" is defined. But currently it is not included, because that include itself resides inside the "#if defined HADOOP_SNAPPY_LIBRARY" block.

      The situation is similar with "#ifdef UNIX": the UNIX and WINDOWS macros are defined in "org_apache_hadoop.h", which is included only indirectly through "#include "org_apache_hadoop_io_compress_snappy.h"". In the current code that include comes after the "#ifdef UNIX" block, so on UNIX the "#ifdef UNIX" block is never compiled.

      The suggested patch fixes the described problems by reordering the "#include" and "#if" preprocessor directives accordingly, bringing the methods of class org.apache.hadoop.io.compress.snappy.SnappyCompressor back into working order.
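
      For illustration, here is a rough sketch of the corrected ordering (an approximation of the idea only, not the exact content of the attached patch; it uses only the headers and macros already shown in the diff above):

      #include "org_apache_hadoop.h"   /* processed first, so UNIX or WINDOWS gets defined */

      #ifdef UNIX
      #include <dlfcn.h>
      #include "config.h"              /* defines HADOOP_SNAPPY_LIBRARY when Snappy is available */
      #endif

      #if defined HADOOP_SNAPPY_LIBRARY

      #include <stdio.h>
      #include <stdlib.h>
      #include <string.h>

      #include "org_apache_hadoop_io_compress_snappy.h"
      #include "org_apache_hadoop_io_compress_snappy_SnappyCompressor.h"

      /* ... JNI method implementations ... */

      #endif /* HADOOP_SNAPPY_LIBRARY */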

      Of course, the Snappy native libraries must be installed in order to build and invoke the Snappy native methods.

      (Note: there was a typo in the commit message: 8952 was written in place of 8562:
      HADOOP-8952. Enhancements to support Hadoop on Windows Server and Windows Azure environments. Contributed by Ivan Mitic, Chuan Liu, Ramya Sunil, Bikas Saha, Kanna Karanam, John Gordon, Brandon Li, Chris Nauroth, David Lao, Sumadhur Reddy Bolli, Arpit Agarwal, Ahmed El Baz, Mike Liddell, Jing Zhao, Thejas Nair, Steve Maine, Ganeshan Iyer, Raja Aluri, Giridharan Kesavan, Ramya Bharathi Nimmagadda.
      git-svn-id: https://svn.apache.org/repos/asf/hadoop/common/trunk@1453486 13f79535-47bb-0310-9956-ffa450edef68
      )

      Attachments

        1. HADOOP-9481-trunk--N1.patch
          2 kB
          Vadim Bondarev
        2. HADOOP-9481-trunk--N4.patch
          2 kB
          Vadim Bondarev

        Issue Links

          Activity

            vbondarev Vadim Bondarev added a comment -

            Commit 102f2e3ea007f8cda041f9df7358098482e5f3e2 added includes and conditional logic ("#if defined") to SnappyCompressor.c/SnappyDecompressor.c,
            but their ordering leaves the classes (SnappyCompressor/SnappyDecompressor) with empty bodies after compilation into the .so file.
            This is easy to demonstrate with the following test code:

            SnappyCompressor compressor = new SnappyCompressor();
            compressor.compress(compressed, 0, compressed.length);

            The call to compress() fails with java.lang.UnsatisfiedLinkError: org.apache.hadoop.io.compress.snappy.SnappyCompressor.compressBytesDirect()I

            vbondarev Vadim Bondarev added a comment - This also affects the patch at https://issues.apache.org/jira/browse/HADOOP-9225
            hadoopqa Hadoop QA added a comment -

            -1 overall. Here are the results of testing the latest attachment
            http://issues.apache.org/jira/secure/attachment/12579129/HADOOP-9481-trunk--N1.patch
            against trunk revision .

            +1 @author. The patch does not contain any @author tags.

            -1 tests included. The patch doesn't appear to include any new or modified tests.
            Please justify why no new tests are needed for this patch.
            Also please list what manual steps were performed to verify this patch.

            +1 javac. The applied patch does not increase the total number of javac compiler warnings.

            +1 javadoc. The javadoc tool did not generate any warning messages.

            +1 eclipse:eclipse. The patch built with eclipse:eclipse.

            +1 findbugs. The patch does not introduce any new Findbugs (version 1.3.9) warnings.

            +1 release audit. The applied patch does not increase the total number of release audit warnings.

            +1 core tests. The patch passed unit tests in hadoop-common-project/hadoop-common.

            +1 contrib tests. The patch passed contrib unit tests.

            Test results: https://builds.apache.org/job/PreCommit-HADOOP-Build/2452//testReport/
            Console output: https://builds.apache.org/job/PreCommit-HADOOP-Build/2452//console

            This message is automatically generated.

            cmccabe Colin McCabe added a comment -

            I don't understand what problem is being fixed here, even after looking at the patch. Can you add a description?

            Also, have you installed snappy locally? If you have not, compiling without snappy is the correct course of action, and not a problem.


            iveselovsky Ivan A. Veselovsky added a comment -

            The patch verification complains about no new or changed tests. We don't add tests in this patch, since comprehensive tests for Snappy are proposed in another patch: https://issues.apache.org/jira/browse/HADOOP-9225 . So, these two fixes can be reviewed and applied together.

            cnauroth Chris Nauroth added a comment -

            Hi, Vadim. I thought cmake was taking care of defining HADOOP_SNAPPY_LIBRARY when building with Snappy. (See src/CMakeLists.txt and src/config.h.cmake.) I'm curious if the cmake piece is somehow not working correctly for your builds.
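
            (For context, an illustrative sketch of that mechanism; the exact contents of src/config.h.cmake and the generated value below are assumptions, not quoted from the source tree:)

            /* src/config.h.cmake -- template processed by cmake: */
            /* #cmakedefine HADOOP_SNAPPY_LIBRARY "@HADOOP_SNAPPY_LIBRARY@" */

            /* config.h as generated when cmake detects Snappy (value assumed): */
            #define HADOOP_SNAPPY_LIBRARY "libsnappy.so.1"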


            iveselovsky Ivan A. Veselovsky added a comment -

            Hi, Chris,
            what version of cmake do you use? We use cmake 2.8.8.

            iveselovsky Ivan A. Veselovsky added a comment -

            In our experiments "config.h" is generated by cmake and its content is okay.
            The problem is that "config.h" is not included in "hadoop-common-project/hadoop-common/src/main/native/src/org/apache/hadoop/io/compress/snappy/SnappyCompressor.c", because the "#if defined HADOOP_SNAPPY_LIBRARY" condition is false (see description).

            cnauroth Chris Nauroth added a comment -

            Ivan, thank you for the further explanation. I understand the problem now.

            I think a simpler fix would be to add #include "org_apache_hadoop.h" as the first line of SnappyCompressor.c and SnappyDecompressor.c. Can you check if that would work?

            Right now, this gets included transitively through #include "org_apache_hadoop_io_compress_snappy.h", but that's too late. Including it explicitly as the first line would guarantee that UNIX or WINDOWS gets defined before any other header or code processing. This also would guarantee that config.h gets included for UNIX, and therefore HADOOP_SNAPPY_LIBRARY would be defined. This is the same approach used in NativeIO.c.

            In general, both the .h and .c files are structured such that they expect UNIX, WINDOWS, HADOOP_SNAPPY_LIBRARY, and other build configuration dependencies to be defined before the preprocessor handles anything else. Therefore, I think it's best that we always #include "org_apache_hadoop.h" as the first line in any file.
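
            For example, the top of SnappyCompressor.c would then look roughly like this (a sketch of the idea, not the exact committed change):

            #include "org_apache_hadoop.h"   /* first line: defines UNIX or WINDOWS, and pulls in config.h on UNIX */

            #if defined HADOOP_SNAPPY_LIBRARY
            /* ... the existing includes and JNI method implementations, unchanged ... */
            #endif /* HADOOP_SNAPPY_LIBRARY */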

            BTW, I have build environments for both Linux and Windows ready to go, so I can volunteer to test the patch cross-platform after we resolve this feedback. I don't have any test jobs ready to go that use Snappy, so I can't fully verify that, but I assume you already tested that part before submitting the patch.

            Thank you for addressing this!

            vbondarev Vadim Bondarev added a comment -

            Chris, yes, that looks simpler. I added #include "org_apache_hadoop.h" as the first line of SnappyCompressor.c/SnappyDecompressor.c and checked that it works.
            The test methods in TestSnappyCompressorDecompressor passed. The new version of the patch is in the attachments.

            vbondarev Vadim Bondarev added a comment -

            TestSnappyCompressorDecompressor is a class from https://issues.apache.org/jira/browse/HADOOP-9225

            hadoopqa Hadoop QA added a comment -

            -1 overall. Here are the results of testing the latest attachment
            http://issues.apache.org/jira/secure/attachment/12579801/HADOOP-9481-trunk--N4.patch
            against trunk revision .

            +1 @author. The patch does not contain any @author tags.

            -1 tests included. The patch doesn't appear to include any new or modified tests.
            Please justify why no new tests are needed for this patch.
            Also please list what manual steps were performed to verify this patch.

            +1 javac. The applied patch does not increase the total number of javac compiler warnings.

            +1 javadoc. The javadoc tool did not generate any warning messages.

            +1 eclipse:eclipse. The patch built with eclipse:eclipse.

            +1 findbugs. The patch does not introduce any new Findbugs (version 1.3.9) warnings.

            +1 release audit. The applied patch does not increase the total number of release audit warnings.

            +1 core tests. The patch passed unit tests in hadoop-common-project/hadoop-common.

            +1 contrib tests. The patch passed contrib unit tests.

            Test results: https://builds.apache.org/job/PreCommit-HADOOP-Build/2465//testReport/
            Console output: https://builds.apache.org/job/PreCommit-HADOOP-Build/2465//console

            This message is automatically generated.

            cnauroth Chris Nauroth added a comment -

            +1 for the patch. (Including org_apache_hadoop_io_compress_snappy.h first transitively includes org_apache_hadoop.h first, so it has the same effect as my earlier suggestion.) I verified successful builds on Ubuntu and Windows. Thank you for the pointer to the new test on HADOOP-9225. I ran that successfully on the Ubuntu machine.

            Thank you, Vadim!

            cmccabe Colin McCabe added a comment -

            Thanks for the explanation. I would prefer that you explicitly include config.h prior to the test for HADOOP_SNAPPY_LIBRARY, rather than relying on a chain of includes. Aside from that, it looks good to me.

            vbondarev Vadim Bondarev added a comment -

            The first patch version does explicitly include config.h prior to the test for HADOOP_SNAPPY_LIBRARY; you can use that one if you find it preferable.

            atm Aaron Myers added a comment -

            Colin told me offline that he doesn't mind the way the current patch is re: the chain of includes.

            +1, the latest patch looks good to me. I confirmed that this fixes the Hadoop snappy build on my box.

            I'm going to commit this momentarily.

            atm Aaron Myers added a comment -

            I've just committed this to trunk. Thanks a lot for the contribution, Vadim, and thanks also to Colin for the reviews.

            atm Aaron Myers added a comment -

            (Whoops, thanks also to Chris for the review as well.)

            hudson Hudson added a comment -

            Integrated in Hadoop-trunk-Commit #3740 (See https://builds.apache.org/job/Hadoop-trunk-Commit/3740/)
            HADOOP-9481. Broken conditional logic with HADOOP_SNAPPY_LIBRARY. Contributed by Vadim Bondarev. (Revision 1481191)

            Result = SUCCESS
            atm : http://svn.apache.org/viewcvs.cgi/?root=Apache-SVN&view=rev&rev=1481191
            Files :

            • /hadoop/common/trunk/hadoop-common-project/hadoop-common/CHANGES.txt
            • /hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/native/src/org/apache/hadoop/io/compress/snappy/SnappyCompressor.c
            • /hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/native/src/org/apache/hadoop/io/compress/snappy/SnappyDecompressor.c
            hudson Hudson added a comment -

            Integrated in Hadoop-Yarn-trunk #206 (See https://builds.apache.org/job/Hadoop-Yarn-trunk/206/)
            HADOOP-9481. Broken conditional logic with HADOOP_SNAPPY_LIBRARY. Contributed by Vadim Bondarev. (Revision 1481191)

            Result = SUCCESS
            atm : http://svn.apache.org/viewcvs.cgi/?root=Apache-SVN&view=rev&rev=1481191
            Files :

            • /hadoop/common/trunk/hadoop-common-project/hadoop-common/CHANGES.txt
            • /hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/native/src/org/apache/hadoop/io/compress/snappy/SnappyCompressor.c
            • /hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/native/src/org/apache/hadoop/io/compress/snappy/SnappyDecompressor.c
            hudson Hudson added a comment -

            Integrated in Hadoop-Hdfs-trunk #1395 (See https://builds.apache.org/job/Hadoop-Hdfs-trunk/1395/)
            HADOOP-9481. Broken conditional logic with HADOOP_SNAPPY_LIBRARY. Contributed by Vadim Bondarev. (Revision 1481191)

            Result = FAILURE
            atm : http://svn.apache.org/viewcvs.cgi/?root=Apache-SVN&view=rev&rev=1481191
            Files :

            • /hadoop/common/trunk/hadoop-common-project/hadoop-common/CHANGES.txt
            • /hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/native/src/org/apache/hadoop/io/compress/snappy/SnappyCompressor.c
            • /hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/native/src/org/apache/hadoop/io/compress/snappy/SnappyDecompressor.c
            hudson Hudson added a comment -

            Integrated in Hadoop-Mapreduce-trunk #1422 (See https://builds.apache.org/job/Hadoop-Mapreduce-trunk/1422/)
            HADOOP-9481. Broken conditional logic with HADOOP_SNAPPY_LIBRARY. Contributed by Vadim Bondarev. (Revision 1481191)

            Result = SUCCESS
            atm : http://svn.apache.org/viewcvs.cgi/?root=Apache-SVN&view=rev&rev=1481191
            Files :

            • /hadoop/common/trunk/hadoop-common-project/hadoop-common/CHANGES.txt
            • /hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/native/src/org/apache/hadoop/io/compress/snappy/SnappyCompressor.c
            • /hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/native/src/org/apache/hadoop/io/compress/snappy/SnappyDecompressor.c
            hudson Hudson added a comment -

            Integrated in Hadoop-trunk-Commit #3851 (See https://builds.apache.org/job/Hadoop-trunk-Commit/3851/)
            HADOOP-9481. Move from trunk to release 2.1.0 section (Revision 1489261)

            Result = SUCCESS
            suresh : http://svn.apache.org/viewcvs.cgi/?root=Apache-SVN&view=rev&rev=1489261
            Files :

            • /hadoop/common/trunk/hadoop-common-project/hadoop-common/CHANGES.txt
            hudson Hudson added a comment -

            Integrated in Hadoop-Yarn-trunk #230 (See https://builds.apache.org/job/Hadoop-Yarn-trunk/230/)
            HADOOP-9481. Move from trunk to release 2.1.0 section (Revision 1489261)

            Result = FAILURE
            suresh : http://svn.apache.org/viewcvs.cgi/?root=Apache-SVN&view=rev&rev=1489261
            Files :

            • /hadoop/common/trunk/hadoop-common-project/hadoop-common/CHANGES.txt
            hudson Hudson added a comment -

            Integrated in Hadoop-Hdfs-trunk #1420 (See https://builds.apache.org/job/Hadoop-Hdfs-trunk/1420/)
            HADOOP-9481. Move from trunk to release 2.1.0 section (Revision 1489261)

            Result = FAILURE
            suresh : http://svn.apache.org/viewcvs.cgi/?root=Apache-SVN&view=rev&rev=1489261
            Files :

            • /hadoop/common/trunk/hadoop-common-project/hadoop-common/CHANGES.txt
            hudson Hudson added a comment -

            Integrated in Hadoop-Mapreduce-trunk #1446 (See https://builds.apache.org/job/Hadoop-Mapreduce-trunk/1446/)
            HADOOP-9481. Move from trunk to release 2.1.0 section (Revision 1489261)

            Result = SUCCESS
            suresh : http://svn.apache.org/viewcvs.cgi/?root=Apache-SVN&view=rev&rev=1489261
            Files :

            • /hadoop/common/trunk/hadoop-common-project/hadoop-common/CHANGES.txt

            People

              Assignee: vbondarev Vadim Bondarev
              Reporter: vbondarev Vadim Bondarev
              Votes: 0
              Watchers: 12

              Dates

                Created:
                Updated:
                Resolved: