hive> create table abc(strct struct)
    > row format delimited
    > fields terminated by '\t'
    > collection items terminated by '\001';
OK
Time taken: 0.293 seconds
hive>
    > load data local inpath '../data/files/kv1.txt'
    > overwrite into table abc;
Copying data from file:/home/mwagner/git/hive/data/files/kv1.txt
Copying file: file:/home/mwagner/git/hive/data/files/kv1.txt
Loading data to table default.abc
Deleted hdfs://localhost:9000/user/hive/warehouse/abc
Table default.abc stats: [num_partitions: 0, num_files: 1, num_rows: 0, total_size: 5812, raw_data_size: 0]
OK
Time taken: 0.339 seconds
hive> describe abc;
OK
# col_name            data_type             comment

strct                 struct                None
Time taken: 0.086 seconds, Fetched: 3 row(s)
hive> select strct, count(strct) from abc group by strct;
Total MapReduce jobs = 1
Launching Job 1 out of 1
Number of reduce tasks not specified. Estimated from input data size: 1
In order to change the average load for a reducer (in bytes):
  set hive.exec.reducers.bytes.per.reducer=<number>
In order to limit the maximum number of reducers:
  set hive.exec.reducers.max=<number>
In order to set a constant number of reducers:
  set mapred.reduce.tasks=<number>
Starting Job = job_201305011323_0008, Tracking URL = http://localhost:50030/jobdetails.jsp?jobid=job_201305011323_0008
Kill Command = /home/mwagner/hadoop/hadoop-1.0.4/libexec/../bin/hadoop job -kill job_201305011323_0008
Hadoop job information for Stage-1: number of mappers: 1; number of reducers: 1
2013-05-03 11:44:39,296 Stage-1 map = 0%, reduce = 0%
2013-05-03 11:45:09,428 Stage-1 map = 100%, reduce = 100%
Ended Job = job_201305011323_0008 with errors
Error during job, obtaining debugging information...
Job Tracking URL: http://localhost:50030/jobdetails.jsp?jobid=job_201305011323_0008
Examining task ID: task_201305011323_0008_m_000002 (and more) from job job_201305011323_0008

Task with the most failures(4):
-----
Task ID:
  task_201305011323_0008_m_000000

URL:
  http://localhost:50030/taskdetails.jsp?jobid=job_201305011323_0008&tipid=task_201305011323_0008_m_000000
-----
Diagnostic Messages for this Task:
java.lang.RuntimeException: Hive Runtime Error while closing operators
	at org.apache.hadoop.hive.ql.exec.ExecMapper.close(ExecMapper.java:230)
	at org.apache.hadoop.mapred.MapRunner.run(MapRunner.java:57)
	at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:436)
	at org.apache.hadoop.mapred.MapTask.run(MapTask.java:372)
	at org.apache.hadoop.mapred.Child$4.run(Child.java:255)
	at java.security.AccessController.doPrivileged(Native Method)
	at javax.security.auth.Subject.doAs(Subject.java:396)
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1121)
	at org.apache.hadoop.mapred.Child.main(Child.java:249)
Caused by: org.apache.hadoop.hive.ql.metadata.HiveException: java.lang.RuntimeException: Hash code on complex types not supported yet.
	at org.apache.hadoop.hive.ql.exec.GroupByOperator.closeOp(GroupByOperator.java:1137)
	at org.apache.hadoop.hive.ql.exec.Operator.close(Operator.java:617)
	at org.apache.hadoop.hive.ql.exec.Operator.close(Operator.java:626)
	at org.apache.hadoop.hive.ql.exec.Operator.close(Operator.java:626)
	at org.apache.hadoop.hive.ql.exec.Operator.close(Operator.java:626)
	at org.apache.hadoop.hive.ql.exec.ExecMapper.close(ExecMapper.java:197)
	... 8 more
Caused by: java.lang.RuntimeException: Hash code on complex types not supported yet.
	at org.apache.hadoop.hive.serde2.objectinspector.ObjectInspectorUtils.hashCode(ObjectInspectorUtils.java:528)
	at org.apache.hadoop.hive.ql.exec.ReduceSinkOperator.processOp(ReduceSinkOperator.java:226)
	at org.apache.hadoop.hive.ql.exec.Operator.process(Operator.java:531)
	at org.apache.hadoop.hive.ql.exec.Operator.forward(Operator.java:859)
	at org.apache.hadoop.hive.ql.exec.GroupByOperator.forward(GroupByOperator.java:1066)
	at org.apache.hadoop.hive.ql.exec.GroupByOperator.closeOp(GroupByOperator.java:1118)
	... 13 more

FAILED: Execution Error, return code 2 from org.apache.hadoop.hive.ql.exec.MapRedTask
MapReduce Jobs Launched:
Job 0: Map: 1  Reduce: 1  HDFS Read: 0  HDFS Write: 0  FAIL
Total MapReduce CPU Time Spent: 0 msec
hive>
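The job dies because ObjectInspectorUtils.hashCode throws when the ReduceSinkOperator tries to hash the struct-typed GROUP BY key. Until grouping on complex types is supported, one possible workaround on affected versions is to group on the struct's individual fields instead of the struct column itself. The sketch below is only illustrative: the field names a and b are assumptions (the struct's type parameters are not visible in the DDL above) and should be replaced with the real field names.

-- Hypothetical workaround: group by the struct's scalar fields rather than the struct value.
-- Field names a and b are placeholders for the actual fields of the strct column.
select strct.a, strct.b, count(1)
from abc
group by strct.a, strct.b;

Because the keys sent to the reducer are then primitive types, the hash-code limitation on complex types is never hit.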