Details
- Type: Bug
- Status: Resolved
- Priority: Major
- Resolution: Fixed
- Affects Version/s: 0.9.0, 0.11.0, 0.12.0
- Fix Version/s: None
- Environment: Hadoop 23.X
- Component/s: hbase-handler, hbase
Description
After upgrading to Hadoop 23 and HBase 0.94.5 compiled for Hadoop 23, TestHBaseMinimrCliDriver fails after performing the following steps.
Update "hbase_bulk.m" with the following properties:
set mapreduce.totalorderpartitioner.naturalorder=false;
set mapreduce.totalorderpartitioner.path=/tmp/hbpartition.lst;
Otherwise I keep seeing a "_partition.lst" not found exception in the mappers, even though total.order.partitioner.path=/tmp/hbpartition.lst is set.
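For reference, a minimal sketch (my own, not part of the test) of where the "_partition.lst" default comes from on Hadoop 23: the mapreduce-side TotalOrderPartitioner resolves its partition file from the key mapreduce.totalorderpartitioner.path and falls back to the relative name "_partition.lst" when that key is absent, which matches the exception seen in the mappers.
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.mapreduce.lib.partition.TotalOrderPartitioner;

public class PartitionFileKeyCheck {
    public static void main(String[] args) {
        Configuration conf = new Configuration();
        // With no partitioner path configured, the lookup falls back to the
        // default relative name "_partition.lst".
        System.out.println(TotalOrderPartitioner.getPartitionFile(conf));
        // setPartitionFile stores the location under the Hadoop 23 key name,
        // the same key the test script had to switch to.
        TotalOrderPartitioner.setPartitionFile(conf, new Path("/tmp/hbpartition.lst"));
        System.out.println(conf.get("mapreduce.totalorderpartitioner.path"));
    }
}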
When the test runs, the 3-reducer phase of the second query (shown below) fails with the error that follows, but the MiniMRCluster keeps spinning up new reducers and the test hangs indefinitely.
insert overwrite table hbsort select distinct value, case when key=103 then cast(null as string) else key end, case when key=103 then '' else cast(key+1 as string) end from src cluster by value;
The stack trace I see in the syslog for the Node Manager is the following:
==============================================================
13-03-20 16:26:48,942 FATAL [IPC Server handler 17 on 55996] org.apache.hadoop.mapred.TaskAttemptListenerImpl: Task: attempt_1363821864968_0003_r_000002_0 - exited : java.lang.RuntimeException: org.apache.hadoop.hive.ql.metadata.HiveException: Hive Runtime Error while processing row (tag=0) {"key":,"value":{"_col0":"val_200","_col1":"200","_col2":"201.0"},"alias":0}
at org.apache.hadoop.hive.ql.exec.ExecReducer.reduce(ExecReducer.java:268)
at org.apache.hadoop.mapred.ReduceTask.runOldReducer(ReduceTask.java:448)
at org.apache.hadoop.mapred.ReduceTask.run(ReduceTask.java:399)
at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:157)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:396)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1212)
at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:152)
Caused by: org.apache.hadoop.hive.ql.metadata.HiveException: Hive Runtime Error while processing row (tag=0) {"key":,"value":{"_col0":"val_200","_col1":"200","_col2":"201.0"},"alias":0}
at org.apache.hadoop.hive.ql.exec.ExecReducer.reduce(ExecReducer.java:256)
... 7 more
Caused by: org.apache.hadoop.hive.ql.metadata.HiveException: java.lang.NullPointerException
at org.apache.hadoop.hive.ql.io.HiveFileFormatUtils.getHiveRecordWriter(HiveFileFormatUtils.java:237)
at org.apache.hadoop.hive.ql.exec.FileSinkOperator.createBucketFiles(FileSinkOperator.java:477)
at org.apache.hadoop.hive.ql.exec.FileSinkOperator.processOp(FileSinkOperator.java:525)
at org.apache.hadoop.hive.ql.exec.Operator.process(Operator.java:471)
at org.apache.hadoop.hive.ql.exec.Operator.forward(Operator.java:762)
at org.apache.hadoop.hive.ql.exec.ExtractOperator.processOp(ExtractOperator.java:45)
at org.apache.hadoop.hive.ql.exec.Operator.process(Operator.java:471)
at org.apache.hadoop.hive.ql.exec.ExecReducer.reduce(ExecReducer.java:247)
... 7 more
Caused by: java.lang.NullPointerException
at org.apache.hadoop.mapreduce.TaskID$CharTaskTypeMaps.getRepresentingCharacter(TaskID.java:265)
at org.apache.hadoop.mapreduce.TaskID.appendTo(TaskID.java:153)
at org.apache.hadoop.mapreduce.TaskAttemptID.appendTo(TaskAttemptID.java:119)
at org.apache.hadoop.mapreduce.TaskAttemptID.toString(TaskAttemptID.java:151)
at java.lang.String.valueOf(String.java:2826)
at org.apache.hadoop.mapreduce.lib.output.FileOutputCommitter.getTaskAttemptPath(FileOutputCommitter.java:209)
at org.apache.hadoop.mapreduce.lib.output.FileOutputCommitter.<init>(FileOutputCommitter.java:69)
at org.apache.hadoop.hbase.mapreduce.HFileOutputFormat.getRecordWriter(HFileOutputFormat.java:90)
at org.apache.hadoop.hive.hbase.HiveHFileOutputFormat.getFileWriter(HiveHFileOutputFormat.java:67)
at org.apache.hadoop.hive.hbase.HiveHFileOutputFormat.getHiveRecordWriter(HiveHFileOutputFormat.java:104)
at org.apache.hadoop.hive.ql.io.HiveFileFormatUtils.getRecordWriter(HiveFileFormatUtils.java:246)
at org.apache.hadoop.hive.ql.io.HiveFileFormatUtils.getHiveRecordWriter(HiveFileFormatUtils.java:234)
... 14 more
==============================================================
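The root cause appears to be the one the linked MAPREDUCE-5452 describes: a TaskAttemptID built with its no-arg constructor carries a null TaskType, so CharTaskTypeMaps.getRepresentingCharacter unboxes a null Character and toString() throws. A minimal sketch (my own, assuming the Hadoop 23/2.x mapreduce client jars) that should hit the same NPE:
import org.apache.hadoop.mapreduce.TaskAttemptID;

public class TaskAttemptIdToStringNpe {
    public static void main(String[] args) {
        // The Writable-style no-arg constructor leaves the task type null.
        TaskAttemptID id = new TaskAttemptID();
        try {
            // Expected to throw the same NullPointerException that
            // FileOutputCommitter hits in the stack trace above.
            System.out.println(id.toString());
        } catch (NullPointerException e) {
            e.printStackTrace();
        }
    }
}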
Attachments
Issue Links
- is related to
  - MAPREDUCE-5452 NPE in TaskID toString when default constructor is used (Open)
  - HIVE-3949 Some test failures in hadoop 23 (Resolved)