Details
Description
The makefile for libhdfs is hard-coded to compile 32-bit libraries. It should instead compile for whichever Java (32-bit or 64-bit) is configured.
The relevant lines are:
LDFLAGS = -L$(JAVA_HOME)/jre/lib/$(OS_ARCH)/server -ljvm -shared -m32 -Wl,-x
CPPFLAGS = -m32 -I$(JAVA_HOME)/include -I$(JAVA_HOME)/include/$(PLATFORM)
$(OS_ARCH) can be, e.g., amd64 if you are using a 64-bit Java on the x86_64 platform. So while gcc tries to link against the correct libjvm.so, the link fails because libhdfs itself is being built 32-bit (because of -m32):
[exec] /usr/bin/ld: skipping incompatible /usr/java64/latest/jre/lib/amd64/server/libjvm.so when searching for -ljvm
[exec] /usr/bin/ld: cannot find -ljvm
[exec] collect2: ld returned 1 exit status
[exec] make: *** [/root/def/hadoop-0.16.3/build/libhdfs/libhdfs.so.1] Error 1
The solution should be to pass -m32 or -m64 depending on the detected os.arch, as sketched after the list below.
There are three cases to cover:
- 32-bit OS, 32-bit Java => libhdfs should be built 32-bit, pass -m32
- 64-bit OS, 32-bit Java => libhdfs should be built 32-bit, pass -m32
- 64-bit OS, 64-bit Java => libhdfs should be built 64-bit, pass -m64
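A minimal sketch of such a conditional, assuming GNU make and that OS_ARCH is already passed in from os.arch as in the current makefile. JVM_ARCH is a hypothetical variable name, and only the amd64 value seen in the report above is handled explicitly. Since os.arch reflects the JVM rather than the OS, a 32-bit Java on a 64-bit OS still reports a 32-bit arch and falls into the -m32 branch, which covers all three cases above.
ifeq ($(OS_ARCH),amd64)
  JVM_ARCH = -m64
else
  JVM_ARCH = -m32
endif
# Existing flags, with the hard-coded -m32 replaced by $(JVM_ARCH)
LDFLAGS = -L$(JAVA_HOME)/jre/lib/$(OS_ARCH)/server -ljvm -shared $(JVM_ARCH) -Wl,-x
CPPFLAGS = $(JVM_ARCH) -I$(JAVA_HOME)/include -I$(JAVA_HOME)/include/$(PLATFORM)
An autoconf-based check (see HADOOP-2318 under Issue Links) would cover other 64-bit architectures more cleanly.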
Issue Links
- is part of HADOOP-2318: All C++ builds should use the autoconf tools (Closed)