hive> add jar hdfs://localhost:8020/user/hive-udf-geo-ip-jtg.jar
    > ;
Resource hdfs://localhost:8020/user/hive-udf-geo-ip-jtg.jar already added.
hive> add jar hdfs://localhost:8020/user/geo-ip-java.jar
    > ;
Resource hdfs://localhost:8020/user/geo-ip-java.jar already added.
hive> create temporary function geoip as 'com.jointhegrid.hive.udf.GenericUDFGeoIP';
hive> select geoip(theIp, 'COUNTRY_NAME', './GeoLiteCity.dat.gz') from ip;
java.lang.ClassNotFoundException: com.jointhegrid.hive.udf.GenericUDFGeoIP
Continuing ...
java.lang.NullPointerException: target should not be null
Continuing ...
Total MapReduce jobs = 1
Launching Job 1 out of 1
plan = /tmp/hive-edward/plan6312598599158489653.xml
Number of reduce tasks is set to 0 since there's no reduce operator
10/03/23 12:30:13 INFO exec.ExecDriver: Number of reduce tasks is set to 0 since there's no reduce operator
10/03/23 12:30:13 WARN fs.FileSystem: "localhost:8020" is a deprecated filesystem name. Use "hdfs://localhost:8020/" instead.
10/03/23 12:30:13 WARN fs.FileSystem: "localhost:8020" is a deprecated filesystem name. Use "hdfs://localhost:8020/" instead.
10/03/23 12:30:13 INFO exec.ExecDriver: Using org.apache.hadoop.hive.ql.io.HiveInputFormat
10/03/23 12:30:13 INFO exec.ExecDriver: adding libjars: file:///tmp/hive-edward/hive-udf-geo-ip-jtg.jar1350631202064072065.jar,file:///tmp/hive-edward/geo-ip-java.jar1060610261672922418.jar
10/03/23 12:30:13 INFO exec.ExecDriver: Processing alias ip
10/03/23 12:30:13 INFO exec.ExecDriver: Adding input file hdfs://localhost/user/hive/warehouse/ip
10/03/23 12:30:13 WARN fs.FileSystem: "localhost:8020" is a deprecated filesystem name. Use "hdfs://localhost:8020/" instead.
10/03/23 12:30:14 WARN fs.FileSystem: "localhost:8020" is a deprecated filesystem name. Use "hdfs://localhost:8020/" instead.
10/03/23 12:30:14 INFO jvm.JvmMetrics: Initializing JVM Metrics with processName=JobTracker, sessionId=
10/03/23 12:30:14 WARN mapred.JobClient: Use GenericOptionsParser for parsing the arguments. Applications should implement Tool for the same.
10/03/23 12:30:14 WARN fs.FileSystem: "localhost:8020" is a deprecated filesystem name. Use "hdfs://localhost:8020/" instead.
10/03/23 12:30:15 WARN fs.FileSystem: "localhost:8020" is a deprecated filesystem name. Use "hdfs://localhost:8020/" instead.
10/03/23 12:30:15 WARN fs.FileSystem: "localhost:8020" is a deprecated filesystem name. Use "hdfs://localhost:8020/" instead.
10/03/23 12:30:15 WARN fs.FileSystem: "localhost:8020" is a deprecated filesystem name. Use "hdfs://localhost:8020/" instead.
10/03/23 12:30:15 WARN fs.FileSystem: "localhost:8020" is a deprecated filesystem name. Use "hdfs://localhost:8020/" instead.
10/03/23 12:30:15 WARN fs.FileSystem: "localhost:8020" is a deprecated filesystem name. Use "hdfs://localhost:8020/" instead.
10/03/23 12:30:15 WARN fs.FileSystem: "localhost:8020" is a deprecated filesystem name. Use "hdfs://localhost:8020/" instead.
10/03/23 12:30:15 WARN fs.FileSystem: "localhost:8020" is a deprecated filesystem name. Use "hdfs://localhost:8020/" instead.
10/03/23 12:30:15 WARN fs.FileSystem: "localhost:8020" is a deprecated filesystem name. Use "hdfs://localhost:8020/" instead.
10/03/23 12:30:15 INFO mapred.FileInputFormat: Total input paths to process : 1
10/03/23 12:30:15 WARN fs.FileSystem: "localhost:8020" is a deprecated filesystem name. Use "hdfs://localhost:8020/" instead.
10/03/23 12:30:15 WARN fs.FileSystem: "localhost:8020" is a deprecated filesystem name. Use "hdfs://localhost:8020/" instead.
10/03/23 12:30:16 WARN fs.FileSystem: "localhost:8020" is a deprecated filesystem name. Use "hdfs://localhost:8020/" instead.
Job running in-process (local Hadoop)
10/03/23 12:30:16 INFO exec.ExecDriver: Job running in-process (local Hadoop)
10/03/23 12:30:16 INFO mapred.FileInputFormat: Total input paths to process : 1
10/03/23 12:30:16 WARN fs.FileSystem: "localhost:8020" is a deprecated filesystem name. Use "hdfs://localhost:8020/" instead.
10/03/23 12:30:16 INFO mapred.MapTask: numReduceTasks: 0
10/03/23 12:30:16 WARN fs.FileSystem: "localhost:8020" is a deprecated filesystem name. Use "hdfs://localhost:8020/" instead.
10/03/23 12:30:16 INFO ExecMapper: maximum memory = 932118528
10/03/23 12:30:16 INFO ExecMapper: conf classpath = [file:/tmp/hadoop-edward/hadoop-unjar3680751082130582389/, file:/home/edward/hive/hive/build/dist/lib/hive-exec-0.6.0.jar, file:/tmp/hadoop-edward/hadoop-unjar3680751082130582389/classes, file:/tmp/hive-edward/hive-udf-geo-ip-jtg.jar1350631202064072065.jar, file:/tmp/hive-edward/geo-ip-java.jar1060610261672922418.jar]
10/03/23 12:30:16 INFO ExecMapper: thread classpath = [file:/tmp/hadoop-edward/hadoop-unjar3680751082130582389/, file:/home/edward/hive/hive/build/dist/lib/hive-exec-0.6.0.jar, file:/tmp/hadoop-edward/hadoop-unjar3680751082130582389/classes, file:/tmp/hive-edward/hive-udf-geo-ip-jtg.jar1350631202064072065.jar, file:/tmp/hive-edward/geo-ip-java.jar1060610261672922418.jar]
10/03/23 12:30:16 INFO exec.MapOperator: Adding alias ip to work list for file /user/hive/warehouse/ip/ips.txt
10/03/23 12:30:16 INFO exec.MapOperator: dump TS struct
10/03/23 12:30:16 INFO ExecMapper: <MAP>Id =6 <Children><TS>Id =0 <Children><SEL>Id =1 <Children><FS>Id =2 <Parent>Id = 1 null<\Parent> <\FS> <\Children> <Parent>Id = 0 null<\Parent> <\SEL> <\Children> <Parent>Id = 6 null<\Parent> <\TS> <\Children> <\MAP>
10/03/23 12:30:16 INFO exec.MapOperator: Initializing Self 6 MAP
10/03/23 12:30:16 INFO exec.TableScanOperator: Initializing Self 0 TS
10/03/23 12:30:16 INFO exec.TableScanOperator: Operator 0 TS initialized
10/03/23 12:30:16 INFO exec.TableScanOperator: Initializing children of 0 TS
10/03/23 12:30:16 INFO exec.SelectOperator: Initializing child 1 SEL
10/03/23 12:30:16 INFO exec.SelectOperator: Initializing Self 1 SEL
10/03/23 12:30:16 INFO exec.SelectOperator: SELECT struct
10/03/23 12:30:16 WARN mapred.LocalJobRunner: job_local_0001
java.lang.RuntimeException: Error in configuring object
    at org.apache.hadoop.util.ReflectionUtils.setJobConf(ReflectionUtils.java:93)
    at org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:64)
    at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:117)
    at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:354)
    at org.apache.hadoop.mapred.MapTask.run(MapTask.java:307)
    at org.apache.hadoop.mapred.LocalJobRunner$Job.run(LocalJobRunner.java:177)
Caused by: java.lang.reflect.InvocationTargetException
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    at java.lang.reflect.Method.invoke(Method.java:597)
    at org.apache.hadoop.util.ReflectionUtils.setJobConf(ReflectionUtils.java:88)
    ... 5 more
Caused by: java.lang.RuntimeException: Error in configuring object
    at org.apache.hadoop.util.ReflectionUtils.setJobConf(ReflectionUtils.java:93)
    at org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:64)
    at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:117)
    at org.apache.hadoop.mapred.MapRunner.configure(MapRunner.java:34)
    ... 10 more
Caused by: java.lang.reflect.InvocationTargetException
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    at java.lang.reflect.Method.invoke(Method.java:597)
    at org.apache.hadoop.util.ReflectionUtils.setJobConf(ReflectionUtils.java:88)
    ... 13 more
Caused by: java.lang.RuntimeException: Map operator initialization failed
    at org.apache.hadoop.hive.ql.exec.ExecMapper.configure(ExecMapper.java:160)
    ... 18 more
Caused by: java.lang.NullPointerException
    at org.apache.hadoop.hive.ql.exec.ExprNodeGenericFuncEvaluator.initialize(ExprNodeGenericFuncEvaluator.java:80)
    at org.apache.hadoop.hive.ql.exec.Operator.initEvaluators(Operator.java:820)
    at org.apache.hadoop.hive.ql.exec.Operator.initEvaluatorsAndReturnStruct(Operator.java:832)
    at org.apache.hadoop.hive.ql.exec.SelectOperator.initializeOp(SelectOperator.java:60)
    at org.apache.hadoop.hive.ql.exec.Operator.initialize(Operator.java:350)
    at org.apache.hadoop.hive.ql.exec.Operator.initialize(Operator.java:426)
    at org.apache.hadoop.hive.ql.exec.Operator.initializeChildren(Operator.java:382)
    at org.apache.hadoop.hive.ql.exec.Operator.initializeOp(Operator.java:367)
    at org.apache.hadoop.hive.ql.exec.Operator.initialize(Operator.java:350)
    at org.apache.hadoop.hive.ql.exec.MapOperator.initializeOp(MapOperator.java:358)
    at org.apache.hadoop.hive.ql.exec.Operator.initialize(Operator.java:350)
    at org.apache.hadoop.hive.ql.exec.ExecMapper.configure(ExecMapper.java:126)
    ... 18 more
2010-03-23 12:30:17,212 null map = 0%, reduce = 0%
10/03/23 12:30:17 INFO exec.ExecDriver: 2010-03-23 12:30:17,212 null map = 0%, reduce = 0%
Ended Job = job_local_0001 with errors
10/03/23 12:30:17 ERROR exec.ExecDriver: Ended Job = job_local_0001 with errors
10/03/23 12:30:17 WARN fs.FileSystem: "localhost:8020" is a deprecated filesystem name. Use "hdfs://localhost:8020/" instead.
10/03/23 12:30:17 WARN fs.FileSystem: "localhost:8020" is a deprecated filesystem name. Use "hdfs://localhost:8020/" instead.
Job Failed
10/03/23 12:30:17 ERROR hdfs.DFSClient: Exception closing file /tmp/hive-edward/-1304534476/_temporary/_attempt_local_0001_m_000000_0/part-00000 : org.apache.hadoop.ipc.RemoteException: java.io.IOException: Could not complete write to file /tmp/hive-edward/-1304534476/_temporary/_attempt_local_0001_m_000000_0/part-00000 by DFSClient_-913439547
    at org.apache.hadoop.hdfs.server.namenode.NameNode.complete(NameNode.java:449)
    at sun.reflect.GeneratedMethodAccessor9.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    at java.lang.reflect.Method.invoke(Method.java:597)
    at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:508)
    at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:959)
    at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:955)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:396)
    at org.apache.hadoop.ipc.Server$Handler.run(Server.java:953)
org.apache.hadoop.ipc.RemoteException: java.io.IOException: Could not complete write to file /tmp/hive-edward/-1304534476/_temporary/_attempt_local_0001_m_000000_0/part-00000 by DFSClient_-913439547
    at org.apache.hadoop.hdfs.server.namenode.NameNode.complete(NameNode.java:449)
    at sun.reflect.GeneratedMethodAccessor9.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    at java.lang.reflect.Method.invoke(Method.java:597)
    at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:508)
    at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:959)
    at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:955)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:396)
    at org.apache.hadoop.ipc.Server$Handler.run(Server.java:953)
    at org.apache.hadoop.ipc.Client.call(Client.java:740)
    at org.apache.hadoop.ipc.RPC$Invoker.invoke(RPC.java:220)
    at $Proxy0.complete(Unknown Source)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    at java.lang.reflect.Method.invoke(Method.java:597)
    at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:82)
    at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:59)
    at $Proxy0.complete(Unknown Source)
    at org.apache.hadoop.hdfs.DFSClient$DFSOutputStream.closeInternal(DFSClient.java:3264)
    at org.apache.hadoop.hdfs.DFSClient$DFSOutputStream.close(DFSClient.java:3188)
    at org.apache.hadoop.hdfs.DFSClient$LeaseChecker.close(DFSClient.java:1043)
    at org.apache.hadoop.hdfs.DFSClient.close(DFSClient.java:237)
    at org.apache.hadoop.hdfs.DistributedFileSystem.close(DistributedFileSystem.java:269)
    at org.apache.hadoop.fs.FileSystem$Cache.closeAll(FileSystem.java:1424)
    at org.apache.hadoop.fs.FileSystem.closeAll(FileSystem.java:217)
    at org.apache.hadoop.fs.FileSystem$ClientFinalizer.run(FileSystem.java:202)
FAILED: Execution Error, return code 2 from org.apache.hadoop.hive.ql.exec.MapRedTask
hive>
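
The ClassNotFoundException above is thrown while resolving com.jointhegrid.hive.udf.GenericUDFGeoIP, even though both jars were registered with add jar using hdfs:// URIs, which suggests the HDFS-backed resources never reach the CLI's classloader. For comparison, here is a minimal sketch of the same session with the jars first copied to the local filesystem; the /tmp paths are hypothetical, and the add file line assumes the GeoIP database should be shipped as a distributed-cache resource for the UDF to read:

hadoop fs -get hdfs://localhost:8020/user/hive-udf-geo-ip-jtg.jar /tmp/
hadoop fs -get hdfs://localhost:8020/user/geo-ip-java.jar /tmp/

hive> add jar /tmp/hive-udf-geo-ip-jtg.jar;
hive> add jar /tmp/geo-ip-java.jar;
hive> add file /tmp/GeoLiteCity.dat.gz;
hive> create temporary function geoip as 'com.jointhegrid.hive.udf.GenericUDFGeoIP';
hive> select geoip(theIp, 'COUNTRY_NAME', './GeoLiteCity.dat.gz') from ip;

With locally added jars the create temporary function step can typically resolve the class, so if this variant succeeds it narrows the failure down to how hdfs:// resources passed to add jar are handled.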