Affects Version/s: None
Fix Version/s: 0.20.0
Environment: x86_64 Linux, x86_64 Java installed
Hadoop Flags: Incompatible change, Reviewed
Release Note: Changed build procedure for libhdfs to build correctly for different platforms. Build instructions are in the Jira item.
The makefile for libhdfs is hard-coded to compile 32-bit libraries. It should instead choose the word size based on which Java is installed.
The relevant lines are:
LDFLAGS = -L$(JAVA_HOME)/jre/lib/$(OS_ARCH)/server -ljvm -shared -m32 -Wl,-x
CPPFLAGS = -m32 -I$(JAVA_HOME)/include -I$(JAVA_HOME)/include/$(PLATFORM)
$(OS_ARCH) can be, e.g., amd64 if you are using a 64-bit Java on the x86_64 platform. In that case gcc tries to link against the correct libjvm.so but fails, because libhdfs is forced to a 32-bit build by the hard-coded -m32.
The solution should be to pass -m32 or -m64 depending on the detected os.arch.
There are 3 cases to check:
- 32bit OS, 32bit java => libhdfs should be built 32bit, specify -m32
- 64bit OS, 32bit java => libhdfs should be built 32bit, specify -m32
- 64bit OS, 64bit java => libhdfs should be built 64bit, specify -m64
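The three cases above could be handled in the Makefile with a conditional on OS_ARCH, along these lines (a sketch only, in GNU make syntax; the ARCH_FLAG variable name is invented here, and the check assumes amd64 is the os.arch value reported by a 64-bit Java on x86_64 Linux):

```makefile
# Sketch: derive the word size from the same os.arch value already used
# to locate libjvm.so. A 32-bit Java on a 64-bit OS reports a 32-bit
# os.arch (e.g. i386), so that case also falls through to -m32.
ifeq ($(OS_ARCH),amd64)
  ARCH_FLAG = -m64
else
  ARCH_FLAG = -m32
endif

LDFLAGS = -L$(JAVA_HOME)/jre/lib/$(OS_ARCH)/server -ljvm -shared $(ARCH_FLAG) -Wl,-x
CPPFLAGS = $(ARCH_FLAG) -I$(JAVA_HOME)/include -I$(JAVA_HOME)/include/$(PLATFORM)
```

This keeps the build keyed to the JVM rather than the OS, which is what matters: the library must match the bitness of the libjvm.so it links against.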