Details
- Type: Bug
- Status: Resolved
- Priority: Major
- Resolution: Duplicate
- Affects Version/s: 0.24.0
- Fix Version/s: None
- Component/s: None
- Environment: Ubuntu 11.10+
Description
I noticed that the build of Hadoop trunk (0.24) and the 1.0/0.20.20x branches fail on Ubuntu 11.10 when trying to include the native code in the build. The reason is that the default behavior of ld was changed in Ubuntu 11.10.
Background
From Ubuntu 11.10 Release Notes:
The compiler passes by default two additional flags to the linker: [...snipp...] -Wl,--as-needed: with this option the linker will only add a DT_NEEDED tag for a dynamic library mentioned on the command line if the library is actually used.
This change was apparently already planned for 11.04 but was reverted in the final release. From 11.04 Toolchain Transition:
Also in Natty, ld runs with the --as-needed option enabled by default. This means that, in the example above, if no symbols from libwheel were needed by racetrack, then libwheel would not be linked even if it was explicitly included in the command-line compiler flags. NOTE: The ld --as-needed default was reverted for the final natty release, and will be re-enabled in the o-series.
I already ran into the same issue with Hadoop-LZO (https://github.com/kevinweil/hadoop-lzo/issues/33). See the link and the patch for more details. For Hadoop, the problematic configure script is native/configure.
How to reproduce
There are two ways to reproduce, depending on the OS you have at hand.
1. Use a stock Ubuntu 11.10 box and run a build that also compiles the native libs:
# in the top level directory of the 'hadoop-common' repo,
# i.e. where the BUILDING.txt file resides
$ mvn -Pnative compile
2. If you do not have Ubuntu 11.10 at hand, simply add -Wl,--as-needed explicitly to LDFLAGS. This makes ld behave like Ubuntu 11.10's default.
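For the second approach, the flag can be exported before invoking the build. A minimal sketch (the mvn command is from step 1 above):

```shell
# Simulate Ubuntu 11.10's linker default on another system by
# appending --as-needed to LDFLAGS before building.
export LDFLAGS="${LDFLAGS:+$LDFLAGS }-Wl,--as-needed"
echo "$LDFLAGS"
# Then run the native build as in step 1:
#   mvn -Pnative compile
```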
Error message (for trunk/0.24)
Running the above build command will produce the following output (I added -e -X switches to mvn).
[DEBUG] Executing: /bin/sh -l -c cd /home/mnoll/programming/git/hadoop/hadoop-common/hadoop-common-project/hadoop-common/target/native && make DESTDIR=/home/mnoll/programming/git/hadoop/hadoop-common/hadoop-common-project/hadoop-common/target/native/target install
[INFO] /bin/bash ./libtool --tag=CC --mode=compile gcc -DHAVE_CONFIG_H -I. -I/usr/lib/jvm/default-java/include -I/usr/lib/jvm/default-java/include/linux -I/home/mnoll/programming/git/hadoop/hadoop-common/hadoop-common-project/hadoop-common/target/native/src -I/home/mnoll/programming/git/hadoop/hadoop-common/hadoop-common-project/hadoop-common/target/native/javah -I/usr/local/include -g -Wall -fPIC -O2 -m64 -g -O2 -MT ZlibCompressor.lo -MD -MP -MF .deps/ZlibCompressor.Tpo -c -o ZlibCompressor.lo `test -f 'src/org/apache/hadoop/io/compress/zlib/ZlibCompressor.c' || echo './'`src/org/apache/hadoop/io/compress/zlib/ZlibCompressor.c
[INFO] libtool: compile: gcc -DHAVE_CONFIG_H -I. -I/usr/lib/jvm/default-java/include -I/usr/lib/jvm/default-java/include/linux -I/home/mnoll/programming/git/hadoop/hadoop-common/hadoop-common-project/hadoop-common/target/native/src -I/home/mnoll/programming/git/hadoop/hadoop-common/hadoop-common-project/hadoop-common/target/native/javah -I/usr/local/include -g -Wall -fPIC -O2 -m64 -g -O2 -MT ZlibCompressor.lo -MD -MP -MF .deps/ZlibCompressor.Tpo -c src/org/apache/hadoop/io/compress/zlib/ZlibCompressor.c -fPIC -DPIC -o .libs/ZlibCompressor.o
[INFO] src/org/apache/hadoop/io/compress/zlib/ZlibCompressor.c: In function 'Java_org_apache_hadoop_io_compress_zlib_ZlibCompressor_initIDs':
[INFO] src/org/apache/hadoop/io/compress/zlib/ZlibCompressor.c:71:41: error: expected expression before ',' token
[INFO] make: *** [ZlibCompressor.lo] Error 1
How to fix
The fix involves adding proper settings for LDFLAGS to the build config. In trunk, this is hadoop-common-project/hadoop-common/pom.xml. In branches 1.0 and 0.20.20x, this is build.xml.
Basically, the fix explicitly adds -Wl,--no-as-needed to LDFLAGS. Special care must be taken not to add this option when running on Mac OS, whose version of ld does not support it (and does not need it, since it already behaves as desired by default).
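As a sketch of the platform check (not the actual patch applied to pom.xml/build.xml), the logic could look like this in shell, with the Darwin branch skipping the unsupported flag:

```shell
#!/bin/sh
# Append --no-as-needed to LDFLAGS, but only on platforms whose ld
# supports it; Mac OS (Darwin) neither supports nor needs the flag.
case "$(uname -s)" in
  Darwin)
    EXTRA_LDFLAGS=""
    ;;
  *)
    EXTRA_LDFLAGS="-Wl,--no-as-needed"
    ;;
esac
LDFLAGS="${LDFLAGS:+$LDFLAGS }$EXTRA_LDFLAGS"
echo "$LDFLAGS"
```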
Attachments
Issue Links
- relates to HADOOP-7868 Hadoop native fails to compile when default linker option is -Wl,--as-needed (Closed)