Details
- Type: Bug
- Status: Resolved
- Priority: Major
- Resolution: Duplicate
- Affects Version/s: 0.22.0
- Fix Version/s: None
- Component/s: None
Description
Execution of test-c++-libhdfs always fails.
Running
% ant test-c++-libhdfs -Dcompile.c++=yes -Dlibhdfs=yes
fails with the following diagnostic:
test-c++-libhdfs:
    [mkdir] Created dir: /homes/xxx/work/Hdfs.trunk/build/test/libhdfs
    ...
     [exec] /homes/xxx/work/Hdfs.trunk/src/c++/libhdfs/tests/test-libhdfs.sh
     [exec] ++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
     [exec] LIB_JVM_DIR = /usr/java/latest/jre/lib/i386/server
     [exec] ++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
     [exec] /homes/xxx/work/Hdfs.trunk/src/c++/libhdfs/tests/test-libhdfs.sh: line 118: /homes/xxx/work/Hdfs.trunk/bin/hadoop: No such file or directory
     [exec] CLASSPATH=/homes/xxx/work/Hdfs.trunk/src/c++/libhdfs/tests/conf:/homes/xxx/work/Hdfs.trunk/conf:/homes/xxx/work/Hdfs.trunk/src/c++/libhdfs/tests/conf:/homes/cot
     [exec] Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/hadoop/conf/Configuration
     [exec] Can't construct instance of class org.apache.hadoop.conf.Configuration
     [exec] Oops! Failed to connect to hdfs!
     [exec] exiting with 255
     [exec] /homes/xxx/work/Hdfs.trunk/src/c++/libhdfs/tests/test-libhdfs.sh: line 126: /homes/xxx/work/Hdfs.trunk/bin/hadoop-daemon.sh: No such file or directory
     [exec] make: *** [test] Error 255
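The two "No such file or directory" lines point at the root cause: test-libhdfs.sh expects bin/hadoop and bin/hadoop-daemon.sh to exist in the HDFS source tree, and when the launcher script is absent the CLASSPATH is likely left without the Hadoop common jars, which would explain the NoClassDefFoundError for org.apache.hadoop.conf.Configuration. A minimal sketch of a defensive pre-check (illustrative only, not the actual test-libhdfs.sh logic; the temp directory stands in for a checkout missing the launchers) that would fail fast with a clear message:

```shell
# Stand-in for an Hdfs.trunk checkout that lacks the launcher scripts.
TREE=$(mktemp -d)

# Check for the two scripts the test harness shells out to; report each
# one that is missing or not executable instead of failing deep in the JVM.
missing=0
for f in bin/hadoop bin/hadoop-daemon.sh; do
    if [ ! -x "$TREE/$f" ]; then
        echo "missing: $f"
        missing=1
    fi
done

rm -rf "$TREE"
```

Run against an empty tree this prints a "missing:" line for each script, so the environment problem surfaces before any daemon is started.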
Issue Links
- duplicates HDFS-756: libhdfs unit tests do not run (Closed)