HBase / HBASE-3873

Mavenize Hadoop Snappy JAR/SOs project dependencies

    Details

    • Type: Improvement
    • Status: Resolved
    • Priority: Major
    • Resolution: Fixed
    • Affects Version/s: 0.90.2
    • Fix Version/s: 0.92.0
    • Component/s: build
    • Labels: (none)
    • Environment: Linux

    • Hadoop Flags:
      Reviewed
    • Release Note:
      Add support to hbase for snappy compression.

      Description

      (This JIRA builds on HBASE-3691)

      I'm working on simplifying how to use Hadoop Snappy from other Maven-based projects. The idea is that the hadoop-snappy JAR and the SOs (snappy and hadoop-snappy) would be picked up from a Maven repository, like any other dependency. The SO files would be picked up based on the architecture where the build is running (32 or 64 bits).

      For HBase this would remove the need to manually copy the snappy JAR and SOs (snappy and hadoop-snappy) into HADOOP_HOME/lib or HBASE_HOME/lib; hadoop-snappy would be handled as a regular Maven dependency (with a trick for the SO files).

      The changes would affect only the pom.xml and would live in a 'snappy' profile, thus requiring the '-Dsnappy' option in Maven invocations to trigger the inclusion of the snappy JAR and SOs.
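      A minimal sketch of what such a pom.xml profile could look like is below. The hadoop-snappy groupId, artifactId, and version shown are illustrative assumptions, not the actual patch contents; only the activation-by-property mechanism is the point.

```xml
<!-- Hypothetical 'snappy' profile sketch; activated by `mvn ... -Dsnappy`.
     The hadoop-snappy coordinates here are illustrative assumptions. -->
<profile>
  <id>snappy</id>
  <activation>
    <property>
      <name>snappy</name>
    </property>
  </activation>
  <dependencies>
    <dependency>
      <groupId>org.apache.hadoop</groupId>
      <artifactId>hadoop-snappy</artifactId>
      <version>0.0.1-SNAPSHOT</version>
    </dependency>
  </dependencies>
</profile>
```

      With a profile of this shape, a plain `mvn package` skips the snappy bits entirely, while `mvn package -Dsnappy` pulls them in.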

      Because hadoop-snappy (JAR and SOs) is not currently available in public Maven repos, until that happens HBase developers will have to check out and 'mvn install' hadoop-snappy. That is (IMO) simpler than what would have to be done once HBASE-3691 is committed.

      1. HBASE-3873.patch
        5 kB
        Alejandro Abdelnur
      2. HBASE-3873.patch
        6 kB
        Alejandro Abdelnur

        Issue Links

          Activity

          Francisco Cruz added a comment -

          Hi all, I am having the following problem when building hadoop-snappy:

          [exec] config.status: executing libtool commands
          [exec] depbase=`echo src/org/apache/hadoop/io/compress/snappy/SnappyCompressor.lo | sed 's|[^/]*$|.deps/&|;s|\.lo$||'`;\
          [exec] /bin/bash ./libtool --tag=CC --mode=compile gcc -DHAVE_CONFIG_H -I. -I/usr/lib/jvm/java-7-oracle/include -I/usr/lib/jvm/java-7-oracle/include/linux -I/home/gsd/hadoop-snappy-read-only/src/main/native/src -Isrc/org/apache/hadoop/io/compress/snappy -I/usr/local//include -g -Wall -fPIC -O2 -m64 -g -O2 -MT src/org/apache/hadoop/io/compress/snappy/SnappyCompressor.lo -MD -MP -MF $depbase.Tpo -c -o src/org/apache/hadoop/io/compress/snappy/SnappyCompressor.lo src/org/apache/hadoop/io/compress/snappy/SnappyCompressor.c &&\
          [exec] mv -f $depbase.Tpo $depbase.Plo
          [exec] libtool: compile: gcc -DHAVE_CONFIG_H -I. -I/usr/lib/jvm/java-7-oracle/include -I/usr/lib/jvm/java-7-oracle/include/linux -I/home/gsd/hadoop-snappy-read-only/src/main/native/src -Isrc/org/apache/hadoop/io/compress/snappy -I/usr/local//include -g -Wall -fPIC -O2 -m64 -g -O2 -MT src/org/apache/hadoop/io/compress/snappy/SnappyCompressor.lo -MD -MP -MF src/org/apache/hadoop/io/compress/snappy/.deps/SnappyCompressor.Tpo -c src/org/apache/hadoop/io/compress/snappy/SnappyCompressor.c -fPIC -DPIC -o src/org/apache/hadoop/io/compress/snappy/.libs/SnappyCompressor.o
          [exec] src/org/apache/hadoop/io/compress/snappy/SnappyCompressor.c: In function 'Java_org_apache_hadoop_io_compress_snappy_SnappyCompressor_initIDs':
          [exec] src/org/apache/hadoop/io/compress/snappy/SnappyCompressor.c:64:49: error: expected expression before ',' token
          [exec] src/org/apache/hadoop/io/compress/snappy/SnappyCompressor.c: In function 'Java_org_apache_hadoop_io_compress_snappy_SnappyCompressor_compressBytesDirect':
          [exec] src/org/apache/hadoop/io/compress/snappy/SnappyCompressor.c:117:3: warning: passing argument 4 of 'dlsym_snappy_compress' from incompatible pointer type [enabled by default]
          [exec] src/org/apache/hadoop/io/compress/snappy/SnappyCompressor.c:117:3: note: expected 'size_t *' but argument is of type 'jint *'
          [exec] make: *** [src/org/apache/hadoop/io/compress/snappy/SnappyCompressor.lo] Error 1

          Does anyone have any idea why this is happening?
          Thanks for your help.

          stack added a comment -

          @Chul Can you make a new issue with a patch? I'll do doc changes... Thanks.

          Chul Kwon added a comment -

          I figured out the problem.
          When you give the -Dsnappy.prefix=$SNAPPY_BUILD_DIR option, the builder tries to find libjvm.so in the Snappy directory, not at $JAVA_HOME/jre/lib/${OS_ARCH}/server.
          Therefore, you need to either fix the source code or simply make a symlink in $SNAPPY_BUILD_DIR to your libjvm.so.
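          The symlink workaround could be sketched like this; JAVA_HOME, SNAPPY_BUILD_DIR, and OS_ARCH below are example values, not canonical paths, so adjust them to your own JDK and Snappy build locations:

```shell
# Workaround sketch for the libjvm.so lookup problem described above.
# All three paths are assumptions -- substitute your own.
JAVA_HOME=${JAVA_HOME:-/usr/lib/jvm/java-7-oracle}
SNAPPY_BUILD_DIR=${SNAPPY_BUILD_DIR:-$HOME/snappy-1.0.2/build/usr/local}
OS_ARCH=${OS_ARCH:-amd64}   # e.g. amd64 or i386, matching your JVM

mkdir -p "$SNAPPY_BUILD_DIR/lib"
# Point the hadoop-snappy build at the real libjvm.so via a symlink
# placed inside the Snappy prefix it actually searches:
ln -sf "$JAVA_HOME/jre/lib/$OS_ARCH/server/libjvm.so" "$SNAPPY_BUILD_DIR/lib/libjvm.so"
```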

          Chul Kwon added a comment -

          After running

           mvn install -Dsnappy.prefix=(absolute-path to Snappy) 

          I get the following error:

          
          compilenative:
          
               ...
          
               [exec] configure: creating ./config.status
               [exec] config.status: creating Makefile
               [exec] config.status: creating config.h
               [exec] config.status: config.h is unchanged
               [exec] config.status: executing depfiles commands
               [exec] config.status: executing libtool commands
               [exec] /bin/bash ./libtool --tag=CC   --mode=link gcc -g -Wall -fPIC -O2 -m64 -g -O2 -version-info 0:1:0 -L/home/ngc/Char/snap102/snappy-1.0.2/build/usr/local/lib -o libhadoopsnappy.la -rpath /usr/local/lib src/org/apache/hadoop/io/compress/snappy/SnappyCompressor.lo src/org/apache/hadoop/io/compress/snappy/SnappyDecompressor.lo  -ljvm -ldl 
               [exec] libtool: link: gcc -shared  src/org/apache/hadoop/io/compress/snappy/.libs/SnappyCompressor.o src/org/apache/hadoop/io/compress/snappy/.libs/SnappyDecompressor.o   -L/home/ngc/Char/snap102/snappy-1.0.2/build/usr/local/lib -ljvm -ldl  -m64   -Wl,-soname -Wl,libhadoopsnappy.so.0 -o .libs/libhadoopsnappy.so.0.0.1
               [exec] /usr/bin/ld: cannot find -ljvm
               [exec] collect2: ld returned 1 exit status
               [exec] make: *** [libhadoopsnappy.la] Error 1
          [INFO] ------------------------------------------------------------------------
          [INFO] BUILD FAILURE
          [INFO] ------------------------------------------------------------------------
          [INFO] Total time: 14.624s
          [INFO] Finished at: Wed Aug 10 11:51:09 EDT 2011
          [INFO] Final Memory: 6M/361M
          [INFO] ------------------------------------------------------------------------
          [ERROR] Failed to execute goal org.apache.maven.plugins:maven-antrun-plugin:1.6:run (compile) on project hadoop-snappy: An Ant BuildException has occured: The following error occurred while executing this line:
          [ERROR] /home/ngc/Char/snap102/snappy-1.0.2/hadoop-snappy-read-only/maven/build-compilenative.xml:75: exec returned: 2
          [ERROR] -> [Help 1]
          
          

          I'm aware that this issue is similar to what Stack experienced (03/Jun/11 03:29), so I've tried everything that you and others at p/hadoop-snappy suggested. However, my hadoop-snappy is the latest Subversion checkout, and my problem persists even after I replaced

           LDFLAGS=$ldflags_bak 

          with

           #LDFLAGS=$ldflags_bak 

          Unlike many of the experts here in JIRA, I'm an undergraduate who just got into the world of Hadoop and HBase, and therefore I don't have extensive knowledge of what goes on in the background. I'd really appreciate it if you could help me with this issue.

          Hudson added a comment -

          Integrated in HBase-TRUNK #1954 (See https://builds.apache.org/job/HBase-TRUNK/1954/)
          HBASE-3873 Mavenize Hadoop Snappy JAR/SOs project dependencies

          stack :
          Files :

          • /hbase/trunk/src/test/java/org/apache/hadoop/hbase/util/TestCompressionTest.java
          • /hbase/trunk/CHANGES.txt
          • /hbase/trunk/pom.xml
          • /hbase/trunk/src/docbkx/build.xml
          • /hbase/trunk/src/assembly/all.xml
          • /hbase/trunk/src/docbkx/book.xml
          stack added a comment -

          Committed to TRUNK. Thanks for the patch Alejandro (I updated doc. in book to suit). Sweet.

          Alejandro Abdelnur added a comment -

          Thanks Stack. The cdh reference is a typo; yes, please remove it before committing. About bundling it with the HBase distro: yes, it makes sense. And good that you are taking on the docs.

          I'll look at that error on Monday.

          stack added a comment -

          Much smoother running. This is great. Do you think we should actually ship with snappy binaries, now that it's so easy to produce them? (Your patch has a cdh reference in it, but that's an easy fix which I can make on commit.) Asking because I want to commit a bit of documentation on this new feature at the same time. Thanks A.

          Here is something I ran into that you might be interested in while building hadoop-snappy.

          After a clean build, I tried to rerun it, and it's weird that clean can't delete the target dir... do you know what that is about? (I can do a rm -rf on the target dir as myself)

          mvn clean install:

          [INFO] ------------------------------------------------------------------------
          [ERROR] BUILD ERROR
          [INFO] ------------------------------------------------------------------------
          [INFO] Failed to delete directory: /home/stack/hadoop-snappy-read-only/target. Reason: Unable to delete directory /home/stack/hadoop-snappy-read-only/target/hadoop-snappy-0.0.1-SNAPSHOT-tar/hadoop-snappy-0.0.1-SNAPSHOT/lib/native/Linux-amd64-64
          
          Alejandro Abdelnur added a comment -

          The attached patch takes care of including the snappy & hadoop-snappy SO files in the TARBALL if the -Dsnappy option was given to Maven. For example:

          $ mvn package -Dsnappy
          

          The SO files are in the lib/native/${ARCH} directory (the hbase scripts add that dir to the LD_PATH, same as Hadoop; thanks for the tip Todd).

          The assembly has been changed to generate an expanded directory instead of the TARBALL, and the antrun plugin has been wired to the package phase to create the TARBALL using the Unix tar command.

          This change is needed to preserve symlinks (Maven assembly is dumb when it comes to symlinks).
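          A quick demonstration (not from the patch itself) of why going through the Unix tar command matters: tar round-trips symlinks intact, which is what the .so version links in lib/native need. The libfoo names below are made up for the demo.

```shell
# Demo: a symlink survives a tar round trip as a symlink,
# rather than being flattened into a copy of its target.
workdir=$(mktemp -d)
cd "$workdir"
mkdir -p dist/lib
echo 'fake library' > dist/lib/libfoo.so.1.0.0
ln -s libfoo.so.1.0.0 dist/lib/libfoo.so   # version symlink, as in lib/native

tar czf dist.tar.gz dist                    # pack
mkdir extract
tar xzf dist.tar.gz -C extract              # unpack elsewhere

ls -l extract/dist/lib                      # libfoo.so -> libfoo.so.1.0.0
```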

          Alejandro Abdelnur added a comment -

          Stack,

          I don't know how you ended up with an .svn directory under the hadoop-snappy target directory.

          My steps to build/M2-install hadoop-snappy are:

          • Download & expand snappy 1.0.2 tar
          • cd to snappy expanded dir (./snappy-1.0.2)
          • Build snappy 1.0.2 (./configure;make install DESTDIR=`pwd`/build) (make sure to use a full path in DESTDIR)
          • svn checkout hadoop-snappy
          • cd to hadoop-snappy dir (./hadoop-snappy-read-only)
          • mvn install -Dsnappy.prefix=${SNAPPY_SRC}/build/usr/local

          Alejandro Abdelnur added a comment -

          The patch is missing the addition of the snappy/hadoop-snappy native libs to lib/native/$ARCH/.

          Alejandro Abdelnur added a comment -

          Stack,

          I'll look into those .svn files under build.

          On your second error, please try updating your hadoop-snappy; this has been fixed yesterday.

          Thxs.

          stack added a comment -

          @Alejandro

          Yeah, I already have them installed – that's what's odd:

          stack@sv4borg231:~/hadoop-snappy-read-only$ sudo apt-get install autotools-dev
          Reading package lists... Done
          Building dependency tree       
          Reading state information... Done
          autotools-dev is already the newest version.
          0 upgraded, 0 newly installed, 0 to remove and 2 not upgraded.
          stack@sv4borg231:~/hadoop-snappy-read-only$ mvn package
          [INFO] Scanning for projects...
          [INFO] ------------------------------------------------------------------------
          [INFO] Building Hadoop Snappy
          [INFO]    task-segment: [package]
          [INFO] ------------------------------------------------------------------------
          [INFO] [resources:resources]
          [INFO] Using default encoding to copy filtered resources.
          [INFO] [compiler:compile]
          [INFO] Compiling 4 source files to /home/stack/hadoop-snappy-read-only/target/classes
          [INFO] [antrun:run {execution: compile}]
          [INFO] Executing tasks
          
          main:
          
          checkpreconditions:
          
          compilenative:
              [mkdir] Created dir: /home/stack/hadoop-snappy-read-only/target/native-src/config
              [mkdir] Created dir: /home/stack/hadoop-snappy-read-only/target/native-src/m4
               [exec] Can't exec "libtoolize": No such file or directory at /usr/bin/autoreconf line 188.
               [exec] Use of uninitialized value $libtoolize in pattern match (m//) at /usr/bin/autoreconf line 188.
               [exec] Can't exec "aclocal": No such file or directory at /usr/share/autoconf/Autom4te/FileUtils.pm line 326.
               [exec] autoreconf: failed to run aclocal: No such file or directory
          [INFO] ------------------------------------------------------------------------
          [ERROR] BUILD ERROR
          [INFO] ------------------------------------------------------------------------
          [INFO] An Ant BuildException has occured: The following error occurred while executing this line:
          /home/stack/hadoop-snappy-read-only/maven/build-compilenative.xml:62: exec returned: 1
          
          [INFO] ------------------------------------------------------------------------
          [INFO] For more information, run Maven with the -e switch
          [INFO] ------------------------------------------------------------------------
          [INFO] Total time: 6 seconds
          [INFO] Finished at: Thu Jun 02 20:11:01 PDT 2011
          [INFO] Final Memory: 32M/677M
          [INFO] ------------------------------------------------------------------------
          

          If I install libtool I get further... If I reinstall automake I get further still...

          It seems I have to set JAVA_HOME...

          If I redo the build I run into this issue:

          ...
          compilenative:
               [exec] cp: cannot create regular file `/home/stack/hadoop-snappy-read-only/target/native-src/native/src/.svn/format': Permission denied
               [exec] cp: cannot create regular file `/home/stack/hadoop-snappy-read-only/target/native-src/native/src/.svn/entries': Permission denied
               [exec] cp: cannot create regular file `/home/stack/hadoop-snappy-read-only/target/native-src/native/src/.svn/text-base/SnappyCompressor.cc.svn-base': Permission denied
               [exec] cp: cannot create regular file `/home/stack/hadoop-snappy-read-only/target/native-src/native/src/.svn/text-base/org_apache_hadoop_io_compress_snappy_SnappyCompressor.h.svn-base': Permission denied
               [exec] cp: cannot create regular file `/home/stack/hadoop-snappy-read-only/target/native-src/native/src/.svn/text-base/SnappyDecompressor.cc.svn-base': Permission denied
               [exec] cp: cannot create regular file `/home/stack/hadoop-snappy-read-only/target/native-src/native/src/.svn/text-base/hadoop_snappy.h.svn-base': Permission denied
               [exec] cp: cannot create regular file `/home/stack/hadoop-snappy-read-only/target/native-src/native/src/.svn/text-base/org_apache_hadoop_io_compress_snappy_SnappyDecompressor.h.svn-base': Permission denied
               [exec] cp: cannot create regular file `/home/stack/hadoop-snappy-read-only/target/native-src/native/src/.svn/all-wcprops': Permission denied
               [exec] cp: cannot create regular file `/home/stack/hadoop-snappy-read-only/target/native-src/native/.svn/format': Permission denied
               [exec] cp: cannot create regular file `/home/stack/hadoop-snappy-read-only/target/native-src/native/.svn/entries': Permission denied
               [exec] cp: cannot create regular file `/home/stack/hadoop-snappy-read-only/target/native-src/native/.svn/text-base/configure.ac.svn-base': Permission denied
               [exec] cp: cannot create regular file `/home/stack/hadoop-snappy-read-only/target/native-src/native/.svn/text-base/packageNativeHadoop.sh.svn-base': Permission denied
               [exec] cp: cannot create regular file `/home/stack/hadoop-snappy-read-only/target/native-src/native/.svn/text-base/Makefile.am.svn-base': Permission denied
               [exec] cp: cannot create regular file `/home/stack/hadoop-snappy-read-only/target/native-src/native/.svn/all-wcprops': Permission denied
          [INFO] ------------------------------------------------------------------------
          [ERROR] BUILD ERROR
          [INFO] ------------------------------------------------------------------------
          [INFO] An Ant BuildException has occured: The following error occurred while executing this line:
          /home/stack/hadoop-snappy-read-only/maven/build-compilenative.xml:54: exec returned: 1
          

          So I have to manually remove the target dir...

          Now I'm having same as this issue: http://code.google.com/p/hadoop-snappy/issues/detail?id=2

               [exec] /usr/bin/ld: cannot find -ljvm
               [exec] collect2: ld returned 1 exit status
               [exec] make: *** [libhadoopsnappy.la] Error 1
               [exec] libtool: link: g++ -shared -nostdlib /usr/lib/gcc/x86_64-linux-gnu/4.3.3/../../../../lib/crti.o /usr/lib/gcc/x86_64-linux-gnu/4.3.3/crtbeginS.o  src/.libs/SnappyCompressor.o src/.libs/SnappyDecompressor.o   -L/usr/local/lib /usr/local/lib/libsnappy.so -ljvm -L/usr/lib/gcc/x86_64-linux-gnu/4.3.3 -L/usr/lib/gcc/x86_64-linux-gnu/4.3.3/../../../../lib -L/lib/../lib -L/usr/lib/../lib -L/usr/lib/gcc/x86_64-linux-gnu/4.3.3/../../.. -lstdc++ -lm -lc -lgcc_s /usr/lib/gcc/x86_64-linux-gnu/4.3.3/crtendS.o /usr/lib/gcc/x86_64-linux-gnu/4.3.3/../../../../lib/crtn.o    -Wl,-soname -Wl,libhadoopsnappy.so.0 -o .libs/libhadoopsnappy.so.0.0.1
          [INFO] ------------------------------------------------------------------------
          [ERROR] BUILD ERROR
          [INFO] ------------------------------------------------------------------------
          [INFO] An Ant BuildException has occured: The following error occurred while executing this line:
          /home/stack/hadoop-snappy-read-only/maven/build-compilenative.xml:91: exec returned: 2
          ...
          
          

          Thought you might be interested in my experience Alejandro.

          Good stuff.

          Show
          stack added a comment - @Alejandro Yeah, I already have them installed – thats whats odd: stack@sv4borg231:~/hadoop-snappy-read-only$ sudo apt-get install autotools-dev Reading package lists... Done Building dependency tree Reading state information... Done autotools-dev is already the newest version. 0 upgraded, 0 newly installed, 0 to remove and 2 not upgraded. stack@sv4borg231:~/hadoop-snappy-read-only$ mvn package [INFO] Scanning for projects... [INFO] ------------------------------------------------------------------------ [INFO] Building Hadoop Snappy [INFO] task-segment: [ package ] [INFO] ------------------------------------------------------------------------ [INFO] [resources:resources] [INFO] Using default encoding to copy filtered resources. [INFO] [compiler:compile] [INFO] Compiling 4 source files to /home/stack/hadoop-snappy-read-only/target/classes [INFO] [antrun:run {execution: compile}] [INFO] Executing tasks main: checkpreconditions: compilenative: [mkdir] Created dir: /home/stack/hadoop-snappy-read-only/target/ native -src/config [mkdir] Created dir: /home/stack/hadoop-snappy-read-only/target/ native -src/m4 [exec] Can't exec "libtoolize" : No such file or directory at /usr/bin/autoreconf line 188. [exec] Use of uninitialized value $libtoolize in pattern match (m //) at /usr/bin/autoreconf line 188. [exec] Can't exec "aclocal" : No such file or directory at /usr/share/autoconf/Autom4te/FileUtils.pm line 326. 
     [exec] autoreconf: failed to run aclocal: No such file or directory
[INFO] ------------------------------------------------------------------------
[ERROR] BUILD ERROR
[INFO] ------------------------------------------------------------------------
[INFO] An Ant BuildException has occured: The following error occurred while executing this line:
/home/stack/hadoop-snappy-read-only/maven/build-compilenative.xml:62: exec returned: 1
[INFO] ------------------------------------------------------------------------
[INFO] For more information, run Maven with the -e switch
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 6 seconds
[INFO] Finished at: Thu Jun 02 20:11:01 PDT 2011
[INFO] Final Memory: 32M/677M
[INFO] ------------------------------------------------------------------------

If I install libtool I get further. If I reinstall automake I get further still. I have to set JAVA_HOME, it seems. If I redo the build I run into this issue:

...
compilenative:
     [exec] cp: cannot create regular file `/home/stack/hadoop-snappy-read-only/target/native-src/native/src/.svn/format': Permission denied
     [exec] cp: cannot create regular file `/home/stack/hadoop-snappy-read-only/target/native-src/native/src/.svn/entries': Permission denied
     [exec] cp: cannot create regular file `/home/stack/hadoop-snappy-read-only/target/native-src/native/src/.svn/text-base/SnappyCompressor.cc.svn-base': Permission denied
     [exec] cp: cannot create regular file `/home/stack/hadoop-snappy-read-only/target/native-src/native/src/.svn/text-base/org_apache_hadoop_io_compress_snappy_SnappyCompressor.h.svn-base': Permission denied
     [exec] cp: cannot create regular file `/home/stack/hadoop-snappy-read-only/target/native-src/native/src/.svn/text-base/SnappyDecompressor.cc.svn-base': Permission denied
     [exec] cp: cannot create regular file `/home/stack/hadoop-snappy-read-only/target/native-src/native/src/.svn/text-base/hadoop_snappy.h.svn-base': Permission denied
     [exec] cp: cannot create regular file `/home/stack/hadoop-snappy-read-only/target/native-src/native/src/.svn/text-base/org_apache_hadoop_io_compress_snappy_SnappyDecompressor.h.svn-base': Permission denied
     [exec] cp: cannot create regular file `/home/stack/hadoop-snappy-read-only/target/native-src/native/src/.svn/all-wcprops': Permission denied
     [exec] cp: cannot create regular file `/home/stack/hadoop-snappy-read-only/target/native-src/native/.svn/format': Permission denied
     [exec] cp: cannot create regular file `/home/stack/hadoop-snappy-read-only/target/native-src/native/.svn/entries': Permission denied
     [exec] cp: cannot create regular file `/home/stack/hadoop-snappy-read-only/target/native-src/native/.svn/text-base/configure.ac.svn-base': Permission denied
     [exec] cp: cannot create regular file `/home/stack/hadoop-snappy-read-only/target/native-src/native/.svn/text-base/packageNativeHadoop.sh.svn-base': Permission denied
     [exec] cp: cannot create regular file `/home/stack/hadoop-snappy-read-only/target/native-src/native/.svn/text-base/Makefile.am.svn-base': Permission denied
     [exec] cp: cannot create regular file `/home/stack/hadoop-snappy-read-only/target/native-src/native/.svn/all-wcprops': Permission denied
[INFO] ------------------------------------------------------------------------
[ERROR] BUILD ERROR
[INFO] ------------------------------------------------------------------------
[INFO] An Ant BuildException has occured: The following error occurred while executing this line:
/home/stack/hadoop-snappy-read-only/maven/build-compilenative.xml:54: exec returned: 1

So I have to manually remove the target dir. Now I'm hitting the same problem as this issue: http://code.google.com/p/hadoop-snappy/issues/detail?id=2

     [exec] /usr/bin/ld: cannot find -ljvm
     [exec] collect2: ld returned 1 exit status
     [exec] make: *** [libhadoopsnappy.la] Error 1
     [exec] libtool: link: g++ -shared -nostdlib /usr/lib/gcc/x86_64-linux-gnu/4.3.3/../../../../lib/crti.o /usr/lib/gcc/x86_64-linux-gnu/4.3.3/crtbeginS.o src/.libs/SnappyCompressor.o src/.libs/SnappyDecompressor.o -L/usr/local/lib /usr/local/lib/libsnappy.so -ljvm -L/usr/lib/gcc/x86_64-linux-gnu/4.3.3 -L/usr/lib/gcc/x86_64-linux-gnu/4.3.3/../../../../lib -L/lib/../lib -L/usr/lib/../lib -L/usr/lib/gcc/x86_64-linux-gnu/4.3.3/../../.. -lstdc++ -lm -lc -lgcc_s /usr/lib/gcc/x86_64-linux-gnu/4.3.3/crtendS.o /usr/lib/gcc/x86_64-linux-gnu/4.3.3/../../../../lib/crtn.o -Wl,-soname -Wl,libhadoopsnappy.so.0 -o .libs/libhadoopsnappy.so.0.0.1
[INFO] ------------------------------------------------------------------------
[ERROR] BUILD ERROR
[INFO] ------------------------------------------------------------------------
[INFO] An Ant BuildException has occured: The following error occurred while executing this line:
/home/stack/hadoop-snappy-read-only/maven/build-compilenative.xml:91: exec returned: 2
...

Thought you might be interested in my experience Alejandro. Good stuff.
          Alejandro Abdelnur added a comment -

          Thanks.

          Stack, you need to have autotools installed (autoconf, automake & libtool).
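A quick way to confirm the toolchain is in place before retrying 'mvn package' (a sketch; the tool names are the standard autotools binaries that autoreconf invokes, and the Debian/Ubuntu package names in the comment are an assumption):

```shell
# Check that the autotools chain autoreconf needs is on the PATH.
# If any are reported MISSING, install them first, e.g. on Debian/Ubuntu
# (assumed package names): sudo apt-get install autoconf automake libtool
for tool in autoconf automake libtool libtoolize aclocal; do
  if command -v "$tool" >/dev/null 2>&1; then
    echo "$tool: found"
  else
    echo "$tool: MISSING"
  fi
done
```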

          Andrew, it is possible to do the same for LZO. We'd have to 'copy' the Mavenization done in hadoop-snappy to hadoop-lzo. Still, on the HBase side this would have to be done using a profile disabled by default (due to license issues we cannot bundle LZO in the HBase distro).
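A disabled-by-default profile of the kind described could look roughly like this in the HBase pom.xml (a minimal sketch only; the hadoop-lzo coordinates below are hypothetical placeholders, not published artifacts):

```xml
<!-- Sketch: activates only when Maven is invoked with -Dlzo, mirroring
     the -Dsnappy profile. The coordinates are placeholders. -->
<profile>
  <id>lzo</id>
  <activation>
    <property>
      <name>lzo</name>
    </property>
  </activation>
  <dependencies>
    <dependency>
      <groupId>com.example</groupId>
      <artifactId>hadoop-lzo</artifactId>
      <version>${hadoop-lzo.version}</version>
    </dependency>
  </dependencies>
</profile>
```

Profiles activated by a bare property keep the default build untouched, which is what the licensing constraint requires.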

          Andrew Purtell added a comment -

          +1 I agree this is nice. As someone who recently integrated snappy and HBase support for it into an internal build, this would have saved me that time. Likewise if we do similar for LZO. I might be interested in contributing similar for LZMA, though admittedly it is a specialist case.

          stack added a comment -

          @Alejandro

          This looks great.

          I tried it on a fresh machine where I had to set up a build environment. I installed snappy. I then moved to hadoop-snappy and tried doing mvn package. It seems like I have to run the command as root? Is that so for you? I then got stuck here:

          tack@sv4borg231:~/hadoop-snappy-read-only$ sudo ~/bin/mvn/bin/mvn package
          Warning: JAVA_HOME environment variable is not set.
          [INFO] Scanning for projects...
          [INFO] ------------------------------------------------------------------------
          [INFO] Building Hadoop Snappy
          [INFO]    task-segment: [package]
          [INFO] ------------------------------------------------------------------------
          [INFO] [resources:resources {execution: default-resources}]
          [INFO] Using 'UTF-8' encoding to copy filtered resources.
          [INFO] skip non existing resourceDirectory /home/stack/hadoop-snappy-read-only/src/main/resources
          [INFO] [compiler:compile {execution: default-compile}]
          [INFO] Nothing to compile - all classes are up to date
          [INFO] [antrun:run {execution: compile}]
          [INFO] Executing tasks
          
          main:
          
          checkpreconditions:
          
          compilenative:
               [exec] Can't exec "libtoolize": No such file or directory at /usr/bin/autoreconf line 188.
               [exec] Use of uninitialized value $libtoolize in pattern match (m//) at /usr/bin/autoreconf line 188.
               [exec] Can't exec "aclocal": No such file or directory at /usr/share/autoconf/Autom4te/FileUtils.pm line 326.
               [exec] autoreconf: failed to run aclocal: No such file or directory
          

          This seems to be saying that I should have run a configure here in hadoop-snappy-read-only first? Or that I'm running 'mvn package' when I should have done something else first?

          I'm asking because i want to copy your instructions above into our manual here: http://hbase.apache.org/book/snappy.compression.html (Maybe you have suggestions on what to add here?)

          Thanks for the nice work Alejandro.

          Alejandro Abdelnur added a comment -

          The attached patch modifies the HBase POM to consume hadoop-snappy (JAR & SOs) as regular Maven dependencies (including the hadoop-snappy and snappy SO files).

          By default HBase builds without including snappy; to enable snappy for testing/packaging, invoke Maven with the '-Dsnappy' option.

          The TestCompressionTest testcase will assert SNAPPY compression based on the availability of the Snappy codec class (we should do the same for LZO).

          Regarding building/installing Hadoop-Snappy locally:

          Because hadoop-snappy is not currently available in public Maven repositories, you need to build and install hadoop-snappy into your local Maven cache. The hadoop-snappy build is Mavenized, so building/installing it locally is done with a 'mvn install'. To build hadoop-snappy you must have the snappy SO installed in /usr/local/lib (or in an alternate directory, in which case use -Dsnappy.lib= when building/installing hadoop-snappy).
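Concretely, the local-install flow described above amounts to something like the following (a sketch under assumptions: the checkout directory name is taken from the logs in this thread, the checkout URL is omitted since it is not given here, and the -Dsnappy.lib path is illustrative):

```shell
# 1. Check out hadoop-snappy (use the URL from the hadoop-snappy project
#    page) and install it into the local Maven cache.
cd hadoop-snappy-read-only
mvn install                    # assumes the snappy SO is in /usr/local/lib
# ...or, if snappy was installed elsewhere (illustrative path):
# mvn install -Dsnappy.lib=/opt/snappy/lib

# 2. Build HBase with the snappy profile enabled.
cd ../hbase
mvn package -Dsnappy
```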

          Alejandro Abdelnur added a comment -

          snappy-java only provides snappy IO streams, it does not provide a Hadoop CompressionCodec for Snappy.

          Nicolas Spiegelberg added a comment -

          Could we use snappy-java, which has Maven repo support already? http://code.google.com/p/snappy-java/#Using_with_Maven I'm not sure what the maturity of that project is.
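If snappy-java were usable here, depending on it would be a one-liner in the POM (groupId/artifactId as published by the snappy-java project; the version shown is illustrative, not a tested recommendation):

```xml
<!-- Illustrative: snappy-java's published Maven coordinates;
     pick a current version rather than the example below. -->
<dependency>
  <groupId>org.xerial.snappy</groupId>
  <artifactId>snappy-java</artifactId>
  <version>1.0.3</version>
</dependency>
```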


            People

            • Assignee:
              Alejandro Abdelnur
            • Reporter:
              Alejandro Abdelnur
            • Votes:
              0
            • Watchers:
              4