Log Type: syslog
Log Length: 73054

2013-12-06 14:04:51,771 INFO [main] org.apache.hadoop.metrics2.impl.MetricsConfig: loaded properties from hadoop-metrics2.properties
2013-12-06 14:04:51,827 INFO [main] org.apache.hadoop.metrics2.impl.MetricsSystemImpl: Scheduled snapshot period at 10 second(s).
2013-12-06 14:04:51,828 INFO [main] org.apache.hadoop.metrics2.impl.MetricsSystemImpl: ReduceTask metrics system started
2013-12-06 14:04:51,837 INFO [main] org.apache.hadoop.mapred.YarnChild: Executing with tokens:
2013-12-06 14:04:51,837 INFO [main] org.apache.hadoop.mapred.YarnChild: Kind: mapreduce.job, Service: job_1384857622207_297273, Ident: (org.apache.hadoop.mapreduce.security.token.JobTokenIdentifier@6632060c)
2013-12-06 14:04:51,897 INFO [main] org.apache.hadoop.mapred.YarnChild: Sleeping for 0ms before retrying again. Got null now.
2013-12-06 14:04:52,156 INFO [main] org.apache.hadoop.mapred.YarnChild: mapreduce.cluster.local.dir for child: /home1/hadoop/yarn_nm2/local-dir/usercache/kpi/appcache/application_1384857622207_297273
2013-12-06 14:04:52,304 WARN [main] org.apache.hadoop.conf.Configuration: mapred.task.id is deprecated. Instead, use mapreduce.task.attempt.id
2013-12-06 14:04:52,304 WARN [main] org.apache.hadoop.conf.Configuration: mapred.task.is.map is deprecated. Instead, use mapreduce.task.ismap
2013-12-06 14:04:52,305 WARN [main] org.apache.hadoop.conf.Configuration: mapred.tip.id is deprecated. Instead, use mapreduce.task.id
2013-12-06 14:04:52,306 WARN [main] org.apache.hadoop.conf.Configuration: mapred.task.partition is deprecated. Instead, use mapreduce.task.partition
2013-12-06 14:04:52,307 WARN [main] org.apache.hadoop.conf.Configuration: mapred.local.dir is deprecated. Instead, use mapreduce.cluster.local.dir
2013-12-06 14:04:52,307 WARN [main] org.apache.hadoop.conf.Configuration: job.local.dir is deprecated. Instead, use mapreduce.job.local.dir
2013-12-06 14:04:52,307 WARN [main] org.apache.hadoop.conf.Configuration: mapred.map.tasks is deprecated. Instead, use mapreduce.job.maps
2013-12-06 14:04:52,308 WARN [main] org.apache.hadoop.conf.Configuration: mapred.cache.localFiles is deprecated. Instead, use mapreduce.job.cache.local.files
2013-12-06 14:04:52,308 WARN [main] org.apache.hadoop.conf.Configuration: mapred.job.id is deprecated. Instead, use mapreduce.job.id
2013-12-06 14:04:52,512 WARN [main] org.apache.hadoop.conf.Configuration: session.id is deprecated. Instead, use dfs.metrics.session-id
2013-12-06 14:04:52,803 INFO [main] org.apache.hadoop.mapred.Task: Using ResourceCalculatorPlugin : org.apache.hadoop.yarn.util.LinuxResourceCalculatorPlugin@6e848ecc
2013-12-06 14:04:52,976 INFO [main] com.hadoop.compression.lzo.GPLNativeCodeLoader: Loaded native gpl library
2013-12-06 14:04:52,978 INFO [main] com.hadoop.compression.lzo.LzoCodec: Successfully loaded & initialized native-lzo library [hadoop-lzo rev eac5ee2101c76aab53e1f6eded7350e1915cf161]
2013-12-06 14:04:52,981 INFO [main] org.apache.hadoop.mapred.ReduceTask: Using ShuffleConsumerPlugin: org.apache.hadoop.mapreduce.task.reduce.Shuffle@6ac67a88
2013-12-06 14:04:52,999 INFO [main] org.apache.hadoop.mapreduce.task.reduce.MergeManagerImpl: MergerManager: memoryLimit=233046016, maxSingleShuffleLimit=58261504, mergeThreshold=153810384, ioSortFactor=20, memToMemMergeOutputsThreshold=20
2013-12-06 14:04:53,001 INFO [EventFetcher for fetching Map Completion Events] org.apache.hadoop.mapreduce.task.reduce.EventFetcher: attempt_1384857622207_297273_r_000000_0 Thread started: EventFetcher for fetching Map Completion Events
2013-12-06 14:04:53,015 INFO [fetcher#10] org.apache.hadoop.mapreduce.task.reduce.ShuffleScheduler: Assiging kpi25:8080 with 1 to fetcher#10
2013-12-06 14:04:53,016 INFO [fetcher#10] org.apache.hadoop.mapreduce.task.reduce.ShuffleScheduler: assigned 1 of 1 to kpi25:8080 to fetcher#10
2013-12-06 14:04:53,016 INFO [fetcher#1] org.apache.hadoop.mapreduce.task.reduce.ShuffleScheduler: Assiging kpi86:8080 with 1 to fetcher#1
2013-12-06 14:04:53,016 INFO [EventFetcher for fetching Map Completion Events] org.apache.hadoop.mapreduce.task.reduce.EventFetcher: attempt_1384857622207_297273_r_000000_0: Got 9 new map-outputs
2013-12-06 14:04:53,017 INFO [fetcher#2] org.apache.hadoop.mapreduce.task.reduce.ShuffleScheduler: Assiging kpi94:8080 with 5 to fetcher#2
2013-12-06 14:04:53,017 INFO [fetcher#2] org.apache.hadoop.mapreduce.task.reduce.ShuffleScheduler: assigned 5 of 5 to kpi94:8080 to fetcher#2
2013-12-06 14:04:53,017 INFO [fetcher#3] org.apache.hadoop.mapreduce.task.reduce.ShuffleScheduler: Assiging kpi18:8080 with 1 to fetcher#3
2013-12-06 14:04:53,017 INFO [fetcher#3] org.apache.hadoop.mapreduce.task.reduce.ShuffleScheduler: assigned 1 of 1 to kpi18:8080 to fetcher#3
2013-12-06 14:04:53,017 INFO [fetcher#4] org.apache.hadoop.mapreduce.task.reduce.ShuffleScheduler: Assiging kpi71:8080 with 1 to fetcher#4
2013-12-06 14:04:53,017 INFO [fetcher#4] org.apache.hadoop.mapreduce.task.reduce.ShuffleScheduler: assigned 1 of 1 to kpi71:8080 to fetcher#4
2013-12-06 14:04:53,017 INFO [fetcher#1] org.apache.hadoop.mapreduce.task.reduce.ShuffleScheduler: assigned 1 of 1 to kpi86:8080 to fetcher#1
2013-12-06 14:04:53,239 INFO [fetcher#4] org.apache.hadoop.mapreduce.task.reduce.Fetcher: for url=8080/mapOutput?job=job_1384857622207_297273&reduce=0&map=attempt_1384857622207_297273_m_000007_0 sent hash and receievd reply
2013-12-06 14:04:53,239 INFO [fetcher#1] org.apache.hadoop.mapreduce.task.reduce.Fetcher: for url=8080/mapOutput?job=job_1384857622207_297273&reduce=0&map=attempt_1384857622207_297273_m_000008_0 sent hash and receievd reply
2013-12-06 14:04:53,239 INFO [fetcher#2] org.apache.hadoop.mapreduce.task.reduce.Fetcher: for url=8080/mapOutput?job=job_1384857622207_297273&reduce=0&map=attempt_1384857622207_297273_m_000005_0,attempt_1384857622207_297273_m_000003_0,attempt_1384857622207_297273_m_000001_0,attempt_1384857622207_297273_m_000004_0,attempt_1384857622207_297273_m_000002_0 sent hash and receievd reply
2013-12-06 14:04:53,239 INFO [fetcher#3] org.apache.hadoop.mapreduce.task.reduce.Fetcher: for url=8080/mapOutput?job=job_1384857622207_297273&reduce=0&map=attempt_1384857622207_297273_m_000006_0 sent hash and receievd reply
2013-12-06 14:04:53,239 INFO [fetcher#10] org.apache.hadoop.mapreduce.task.reduce.Fetcher: for url=8080/mapOutput?job=job_1384857622207_297273&reduce=0&map=attempt_1384857622207_297273_m_000009_0 sent hash and receievd reply
2013-12-06 14:04:53,242 WARN [fetcher#10] org.apache.hadoop.conf.Configuration: hadoop.native.lib is deprecated. Instead, use io.native.lib.available
2013-12-06 14:04:53,246 INFO [fetcher#10] org.apache.hadoop.io.compress.CodecPool: Got brand-new decompressor [.lzo]
2013-12-06 14:04:53,246 INFO [fetcher#10] org.apache.hadoop.mapreduce.task.reduce.Fetcher: fetcher#10 about to shuffle output of map attempt_1384857622207_297273_m_000009_0 decomp: 149576 len: 43556 to MEMORY
2013-12-06 14:04:53,252 INFO [fetcher#10] org.apache.hadoop.mapreduce.task.reduce.InMemoryMapOutput: Read 149576 bytes from map-output for attempt_1384857622207_297273_m_000009_0
2013-12-06 14:04:53,252 INFO [fetcher#1] org.apache.hadoop.io.compress.CodecPool: Got brand-new decompressor [.lzo]
2013-12-06 14:04:53,252 INFO [fetcher#1] org.apache.hadoop.mapreduce.task.reduce.Fetcher: fetcher#1 about to shuffle output of map attempt_1384857622207_297273_m_000008_0 decomp: 45857494 len: 12693010 to MEMORY
2013-12-06 14:04:53,252 INFO [fetcher#3] org.apache.hadoop.mapreduce.task.reduce.MergeManagerImpl: attempt_1384857622207_297273_m_000006_0: Shuffling to disk since 67352895 is greater than maxSingleShuffleLimit (58261504)
2013-12-06 14:04:53,258 INFO [fetcher#4] org.apache.hadoop.mapreduce.task.reduce.MergeManagerImpl: attempt_1384857622207_297273_m_000007_0: Shuffling to disk since 67386772 is greater than maxSingleShuffleLimit (58261504)
2013-12-06 14:04:53,258 INFO [fetcher#3] org.apache.hadoop.mapreduce.task.reduce.Fetcher: fetcher#3 about to shuffle output of map attempt_1384857622207_297273_m_000006_0 decomp: 67352895 len: 21823814 to DISK
2013-12-06 14:04:53,262 INFO [fetcher#4] org.apache.hadoop.mapreduce.task.reduce.Fetcher: fetcher#4 about to shuffle output of map attempt_1384857622207_297273_m_000007_0 decomp: 67386772 len: 17528951 to DISK
2013-12-06 14:04:53,262 INFO [fetcher#2] org.apache.hadoop.mapreduce.task.reduce.MergeManagerImpl: attempt_1384857622207_297273_m_000005_0: Shuffling to disk since 108532983 is greater than maxSingleShuffleLimit (58261504)
2013-12-06 14:04:53,267 INFO [fetcher#2] org.apache.hadoop.mapreduce.task.reduce.Fetcher: fetcher#2 about to shuffle output of map attempt_1384857622207_297273_m_000005_0 decomp: 108532983 len: 33501394 to DISK
2013-12-06 14:04:53,267 INFO [fetcher#10] org.apache.hadoop.mapreduce.task.reduce.MergeManagerImpl: closeInMemoryFile -> map-output of size: 149576, inMemoryMapOutputs.size() -> 1, commitMemory -> 0, usedMemory ->46007070
2013-12-06 14:04:53,268 INFO [fetcher#10] org.apache.hadoop.mapreduce.task.reduce.ShuffleScheduler: kpi25:8080 freed by fetcher#10 in 253s
2013-12-06 14:04:53,730 INFO [fetcher#4] org.apache.hadoop.mapreduce.task.reduce.OnDiskMapOutput: Read 17528951 bytes from map-output for attempt_1384857622207_297273_m_000007_0
2013-12-06 14:04:53,732 INFO [fetcher#4] org.apache.hadoop.mapreduce.task.reduce.ShuffleScheduler: kpi71:8080 freed by fetcher#4 in 715s
2013-12-06 14:04:53,850 INFO [fetcher#3] org.apache.hadoop.mapreduce.task.reduce.OnDiskMapOutput: Read 21823814 bytes from map-output for attempt_1384857622207_297273_m_000006_0
2013-12-06 14:04:53,851 INFO [fetcher#3] org.apache.hadoop.mapreduce.task.reduce.ShuffleScheduler: kpi18:8080 freed by fetcher#3 in 834s
2013-12-06 14:04:53,924 INFO [fetcher#1] org.apache.hadoop.mapreduce.task.reduce.InMemoryMapOutput: Read 45857494 bytes from map-output for attempt_1384857622207_297273_m_000008_0
2013-12-06 14:04:53,924 INFO [fetcher#1] org.apache.hadoop.mapreduce.task.reduce.MergeManagerImpl: closeInMemoryFile -> map-output of size: 45857494, inMemoryMapOutputs.size() -> 2, commitMemory -> 149576, usedMemory ->46007070
2013-12-06 14:04:53,924 INFO [fetcher#1] org.apache.hadoop.mapreduce.task.reduce.ShuffleScheduler: kpi86:8080 freed by fetcher#1 in 908s
2013-12-06 14:04:54,399 INFO [fetcher#2] org.apache.hadoop.mapreduce.task.reduce.OnDiskMapOutput: Read 33501394 bytes from map-output for attempt_1384857622207_297273_m_000005_0
2013-12-06 14:04:54,425 INFO [fetcher#2] org.apache.hadoop.mapreduce.task.reduce.MergeManagerImpl: attempt_1384857622207_297273_m_000003_0: Shuffling to disk since 134465355 is greater than maxSingleShuffleLimit (58261504)
2013-12-06 14:04:54,432 INFO [fetcher#2] org.apache.hadoop.mapreduce.task.reduce.Fetcher: fetcher#2 about to shuffle output of map attempt_1384857622207_297273_m_000003_0 decomp: 134465355 len: 40341884 to DISK
2013-12-06 14:04:55,215 INFO [fetcher#2] org.apache.hadoop.mapreduce.task.reduce.OnDiskMapOutput: Read 40341884 bytes from map-output for attempt_1384857622207_297273_m_000003_0
2013-12-06 14:04:55,220 INFO [fetcher#2] org.apache.hadoop.mapreduce.task.reduce.MergeManagerImpl: attempt_1384857622207_297273_m_000001_0: Shuffling to disk since 134411062 is greater than maxSingleShuffleLimit (58261504)
2013-12-06 14:04:55,224 INFO [fetcher#2] org.apache.hadoop.mapreduce.task.reduce.Fetcher: fetcher#2 about to shuffle output of map attempt_1384857622207_297273_m_000001_0 decomp: 134411062 len: 40218165 to DISK
2013-12-06 14:04:56,310 INFO [fetcher#2] org.apache.hadoop.mapreduce.task.reduce.OnDiskMapOutput: Read 40218165 bytes from map-output for attempt_1384857622207_297273_m_000001_0
2013-12-06 14:04:56,316 INFO [fetcher#2] org.apache.hadoop.mapreduce.task.reduce.MergeManagerImpl: attempt_1384857622207_297273_m_000004_0: Shuffling to disk since 134609556 is greater than maxSingleShuffleLimit (58261504)
2013-12-06 14:04:56,320 INFO [fetcher#2] org.apache.hadoop.mapreduce.task.reduce.Fetcher: fetcher#2 about to shuffle output of map attempt_1384857622207_297273_m_000004_0 decomp: 134609556 len: 40548607 to DISK
2013-12-06 14:04:56,834 INFO [fetcher#2] org.apache.hadoop.mapreduce.task.reduce.OnDiskMapOutput: Read 40548607 bytes from map-output for attempt_1384857622207_297273_m_000004_0
2013-12-06 14:04:56,841 INFO [fetcher#2] org.apache.hadoop.mapreduce.task.reduce.MergeManagerImpl: attempt_1384857622207_297273_m_000002_0: Shuffling to disk since 134543452 is greater than maxSingleShuffleLimit (58261504)
2013-12-06 14:04:56,846 INFO [fetcher#2] org.apache.hadoop.mapreduce.task.reduce.Fetcher: fetcher#2 about to shuffle output of map attempt_1384857622207_297273_m_000002_0 decomp: 134543452 len: 40558092 to DISK
2013-12-06 14:04:57,580 INFO [fetcher#2] org.apache.hadoop.mapreduce.task.reduce.OnDiskMapOutput: Read 40558092 bytes from map-output for attempt_1384857622207_297273_m_000002_0
2013-12-06 14:04:57,581 INFO [fetcher#2] org.apache.hadoop.mapreduce.task.reduce.ShuffleScheduler: kpi94:8080 freed by fetcher#2 in 4564s
2013-12-06 14:04:58,028 INFO [fetcher#2] org.apache.hadoop.mapreduce.task.reduce.ShuffleScheduler: Assiging kpi94:8080 with 1 to fetcher#2
2013-12-06 14:04:58,028 INFO [EventFetcher for fetching Map Completion Events] org.apache.hadoop.mapreduce.task.reduce.EventFetcher: attempt_1384857622207_297273_r_000000_0: Got 1 new map-outputs
2013-12-06 14:04:58,028 INFO [fetcher#2] org.apache.hadoop.mapreduce.task.reduce.ShuffleScheduler: assigned 1 of 1 to kpi94:8080 to fetcher#2
2013-12-06 14:04:58,029 INFO [fetcher#2] org.apache.hadoop.mapreduce.task.reduce.Fetcher: for url=8080/mapOutput?job=job_1384857622207_297273&reduce=0&map=attempt_1384857622207_297273_m_000000_0 sent hash and receievd reply
2013-12-06 14:04:58,030 INFO [fetcher#2] org.apache.hadoop.mapreduce.task.reduce.MergeManagerImpl: attempt_1384857622207_297273_m_000000_0: Shuffling to disk since 175494839 is greater than maxSingleShuffleLimit (58261504)
2013-12-06 14:04:58,033 INFO [fetcher#2] org.apache.hadoop.mapreduce.task.reduce.Fetcher: fetcher#2 about to shuffle output of map attempt_1384857622207_297273_m_000000_0 decomp: 175494839 len: 51552435 to DISK
2013-12-06 14:04:58,591 INFO [fetcher#2] org.apache.hadoop.mapreduce.task.reduce.OnDiskMapOutput: Read 51552435 bytes from map-output for attempt_1384857622207_297273_m_000000_0
2013-12-06 14:04:58,592 INFO [fetcher#2] org.apache.hadoop.mapreduce.task.reduce.ShuffleScheduler: kpi94:8080 freed by fetcher#2 in 564s
2013-12-06 14:04:58,593 INFO [EventFetcher for fetching Map Completion Events] org.apache.hadoop.mapreduce.task.reduce.EventFetcher: EventFetcher is interrupted.. Returning
2013-12-06 14:04:58,598 INFO [main] org.apache.hadoop.mapreduce.task.reduce.MergeManagerImpl: finalMerge called with 2 in-memory map-outputs and 8 on-disk map-outputs
2013-12-06 14:04:58,665 INFO [main] org.apache.hadoop.mapred.Merger: Merging 2 sorted segments
2013-12-06 14:04:58,665 INFO [main] org.apache.hadoop.mapred.Merger: Down to the last merge-pass, with 2 segments left of total size: 46007035 bytes
2013-12-06 14:04:58,669 INFO [main] org.apache.hadoop.io.compress.CodecPool: Got brand-new compressor [.lzo]
2013-12-06 14:04:59,640 INFO [main] org.apache.hadoop.mapreduce.task.reduce.MergeManagerImpl: Merged 2 segments, 46007070 bytes to disk to satisfy reduce memory limit
2013-12-06 14:04:59,641 INFO [main] org.apache.hadoop.mapreduce.task.reduce.MergeManagerImpl: Merging 9 files, 298821204 bytes from disk
2013-12-06 14:04:59,642 INFO [main] org.apache.hadoop.mapreduce.task.reduce.MergeManagerImpl: Merging 0 segments, 0 bytes from memory into reduce
2013-12-06 14:04:59,642 INFO [main] org.apache.hadoop.mapred.Merger: Merging 9 sorted segments
2013-12-06 14:04:59,648 INFO [main] org.apache.hadoop.io.compress.CodecPool: Got brand-new decompressor [.lzo]
2013-12-06 14:04:59,650 INFO [main] org.apache.hadoop.io.compress.CodecPool: Got brand-new decompressor [.lzo]
2013-12-06 14:04:59,651 INFO [main] org.apache.hadoop.io.compress.CodecPool: Got brand-new decompressor [.lzo]
2013-12-06 14:04:59,653 INFO [main] org.apache.hadoop.io.compress.CodecPool: Got brand-new decompressor [.lzo]
2013-12-06 14:04:59,654 INFO [main] org.apache.hadoop.io.compress.CodecPool: Got brand-new decompressor [.lzo]
2013-12-06 14:04:59,656 INFO [main] org.apache.hadoop.io.compress.CodecPool: Got brand-new decompressor [.lzo]
2013-12-06 14:04:59,657 INFO [main] org.apache.hadoop.io.compress.CodecPool: Got brand-new decompressor [.lzo]
2013-12-06 14:04:59,658 INFO [main] org.apache.hadoop.mapred.Merger: Down to the last merge-pass, with 9 segments left of total size: 1002220907
bytes 2013-12-06 14:04:59,669 INFO [main] ExecReducer: maximum memory = 466092032 2013-12-06 14:04:59,669 INFO [main] ExecReducer: conf classpath = [file:/home1/hadoop/yarn_nm2/local-dir/usercache/kpi/appcache/application_1384857622207_297273/container_1384857622207_297273_01_000012/, file:/var/run/cloudera-scm-agent/process/7065-yarn-NODEMANAGER/, file:/etc/hbase/conf.cloudera.hbase1/, file:/opt/cloudera/parcels/CDH-4.2.1-1.cdh4.2.1.p0.5/lib/hadoop/hadoop-annotations-2.0.0-cdh4.2.1.jar, file:/opt/cloudera/parcels/CDH-4.2.1-1.cdh4.2.1.p0.5/lib/hadoop/hadoop-auth-2.0.0-cdh4.2.1.jar, file:/opt/cloudera/parcels/CDH-4.2.1-1.cdh4.2.1.p0.5/lib/hadoop/hadoop-annotations-2.0.0-cdh4.2.1.jar, file:/opt/cloudera/parcels/CDH-4.2.1-1.cdh4.2.1.p0.5/lib/hadoop/hadoop-common-2.0.0-cdh4.2.1.jar, file:/opt/cloudera/parcels/CDH-4.2.1-1.cdh4.2.1.p0.5/lib/hadoop/hadoop-auth-2.0.0-cdh4.2.1.jar, file:/opt/cloudera/parcels/CDH-4.2.1-1.cdh4.2.1.p0.5/lib/hadoop/hadoop-common-2.0.0-cdh4.2.1-tests.jar, file:/opt/cloudera/parcels/CDH-4.2.1-1.cdh4.2.1.p0.5/lib/hadoop/hadoop-common-2.0.0-cdh4.2.1.jar, file:/opt/cloudera/parcels/CDH-4.2.1-1.cdh4.2.1.p0.5/lib/hadoop/lib/jaxb-api-2.2.2.jar, file:/opt/cloudera/parcels/CDH-4.2.1-1.cdh4.2.1.p0.5/lib/hadoop/lib/jackson-mapper-asl-1.8.8.jar, file:/opt/cloudera/parcels/CDH-4.2.1-1.cdh4.2.1.p0.5/lib/hadoop/lib/junit-4.8.2.jar, file:/opt/cloudera/parcels/CDH-4.2.1-1.cdh4.2.1.p0.5/lib/hadoop/lib/commons-lang-2.5.jar, file:/opt/cloudera/parcels/CDH-4.2.1-1.cdh4.2.1.p0.5/lib/hadoop/lib/jackson-jaxrs-1.8.8.jar, file:/opt/cloudera/parcels/CDH-4.2.1-1.cdh4.2.1.p0.5/lib/hadoop/lib/jetty-util-6.1.26.cloudera.2.jar, file:/opt/cloudera/parcels/CDH-4.2.1-1.cdh4.2.1.p0.5/lib/hadoop/lib/hue-plugins-2.2.0-cdh4.2.1.jar, file:/opt/cloudera/parcels/CDH-4.2.1-1.cdh4.2.1.p0.5/lib/hadoop/lib/commons-math-2.1.jar, file:/opt/cloudera/parcels/CDH-4.2.1-1.cdh4.2.1.p0.5/lib/hadoop/lib/kfs-0.3.jar, 
file:/opt/cloudera/parcels/CDH-4.2.1-1.cdh4.2.1.p0.5/lib/hadoop/lib/jetty-6.1.26.cloudera.2.jar, file:/opt/cloudera/parcels/CDH-4.2.1-1.cdh4.2.1.p0.5/lib/hadoop/lib/commons-configuration-1.6.jar, file:/opt/cloudera/parcels/CDH-4.2.1-1.cdh4.2.1.p0.5/lib/hadoop/lib/commons-codec-1.4.jar, file:/opt/cloudera/parcels/CDH-4.2.1-1.cdh4.2.1.p0.5/lib/hadoop/lib/asm-3.2.jar, file:/opt/cloudera/parcels/CDH-4.2.1-1.cdh4.2.1.p0.5/lib/hadoop/lib/commons-digester-1.8.jar, file:/opt/cloudera/parcels/CDH-4.2.1-1.cdh4.2.1.p0.5/lib/hadoop/lib/jline-0.9.94.jar, file:/opt/cloudera/parcels/CDH-4.2.1-1.cdh4.2.1.p0.5/lib/hadoop/lib/commons-io-2.1.jar, file:/opt/cloudera/parcels/CDH-4.2.1-1.cdh4.2.1.p0.5/lib/zookeeper/lib/slf4j-log4j12-1.6.1.jar, file:/opt/cloudera/parcels/CDH-4.2.1-1.cdh4.2.1.p0.5/lib/hadoop/lib/jasper-compiler-5.5.23.jar, file:/opt/cloudera/parcels/CDH-4.2.1-1.cdh4.2.1.p0.5/lib/hadoop/lib/log4j-1.2.17.jar, file:/opt/cloudera/parcels/CDH-4.2.1-1.cdh4.2.1.p0.5/lib/hadoop/lib/jackson-core-asl-1.8.8.jar, file:/opt/cloudera/parcels/CDH-4.2.1-1.cdh4.2.1.p0.5/lib/hadoop/lib/activation-1.1.jar, file:/opt/cloudera/parcels/CDH-4.2.1-1.cdh4.2.1.p0.5/lib/hadoop/lib/snappy-java-1.0.4.1.jar, file:/opt/cloudera/parcels/CDH-4.2.1-1.cdh4.2.1.p0.5/lib/hadoop/lib/jersey-server-1.8.jar, file:/opt/cloudera/parcels/CDH-4.2.1-1.cdh4.2.1.p0.5/lib/hadoop/lib/commons-net-3.1.jar, file:/opt/cloudera/parcels/CDH-4.2.1-1.cdh4.2.1.p0.5/lib/hadoop/lib/jaxb-impl-2.2.3-1.jar, file:/opt/cloudera/parcels/CDH-4.2.1-1.cdh4.2.1.p0.5/lib/hadoop/lib/slf4j-api-1.6.1.jar, file:/opt/cloudera/parcels/CDH-4.2.1-1.cdh4.2.1.p0.5/lib/hadoop/lib/xmlenc-0.52.jar, file:/opt/cloudera/parcels/CDH-4.2.1-1.cdh4.2.1.p0.5/lib/hadoop/lib/commons-beanutils-core-1.8.0.jar, file:/opt/cloudera/parcels/CDH-4.2.1-1.cdh4.2.1.p0.5/lib/hadoop/lib/jets3t-0.6.1.jar, file:/opt/cloudera/parcels/CDH-4.2.1-1.cdh4.2.1.p0.5/lib/hadoop/lib/jettison-1.1.jar, 
file:/opt/cloudera/parcels/CDH-4.2.1-1.cdh4.2.1.p0.5/lib/hadoop/lib/protobuf-java-2.4.0a.jar, file:/opt/cloudera/parcels/CDH-4.2.1-1.cdh4.2.1.p0.5/lib/hadoop/lib/commons-httpclient-3.1.jar, file:/opt/cloudera/parcels/CDH-4.2.1-1.cdh4.2.1.p0.5/lib/zookeeper/zookeeper-3.4.5-cdh4.2.1.jar, file:/opt/cloudera/parcels/CDH-4.2.1-1.cdh4.2.1.p0.5/lib/hadoop/lib/jsp-api-2.1.jar, file:/opt/cloudera/parcels/CDH-4.2.1-1.cdh4.2.1.p0.5/lib/hadoop/lib/jersey-core-1.8.jar, file:/opt/cloudera/parcels/CDH-4.2.1-1.cdh4.2.1.p0.5/lib/hadoop/lib/servlet-api-2.5.jar, file:/opt/cloudera/parcels/CDH-4.2.1-1.cdh4.2.1.p0.5/lib/hadoop/lib/commons-cli-1.2.jar, file:/opt/cloudera/parcels/CDH-4.2.1-1.cdh4.2.1.p0.5/lib/hadoop/lib/commons-collections-3.2.1.jar, file:/opt/cloudera/parcels/CDH-4.2.1-1.cdh4.2.1.p0.5/lib/hadoop/lib/commons-beanutils-1.7.0.jar, file:/opt/cloudera/parcels/CDH-4.2.1-1.cdh4.2.1.p0.5/lib/hadoop/lib/stax-api-1.0.1.jar, file:/opt/cloudera/parcels/CDH-4.2.1-1.cdh4.2.1.p0.5/lib/hadoop/lib/guava-11.0.2.jar, file:/opt/cloudera/parcels/CDH-4.2.1-1.cdh4.2.1.p0.5/lib/hadoop/lib/avro-1.7.3.jar, file:/opt/cloudera/parcels/CDH-4.2.1-1.cdh4.2.1.p0.5/lib/hadoop/lib/paranamer-2.3.jar, file:/opt/cloudera/parcels/CDH-4.2.1-1.cdh4.2.1.p0.5/lib/hadoop/lib/jsch-0.1.42.jar, file:/opt/cloudera/parcels/CDH-4.2.1-1.cdh4.2.1.p0.5/lib/hadoop/lib/jasper-runtime-5.5.23.jar, file:/opt/cloudera/parcels/CDH-4.2.1-1.cdh4.2.1.p0.5/lib/hadoop/lib/mockito-all-1.8.5.jar, file:/opt/cloudera/parcels/CDH-4.2.1-1.cdh4.2.1.p0.5/lib/hadoop/lib/commons-el-1.0.jar, file:/opt/cloudera/parcels/CDH-4.2.1-1.cdh4.2.1.p0.5/lib/hadoop/lib/jsr305-1.3.9.jar, file:/opt/cloudera/parcels/CDH-4.2.1-1.cdh4.2.1.p0.5/lib/hadoop/lib/jersey-json-1.8.jar, file:/opt/cloudera/parcels/CDH-4.2.1-1.cdh4.2.1.p0.5/lib/hadoop/lib/commons-logging-1.1.1.jar, file:/opt/cloudera/parcels/CDH-4.2.1-1.cdh4.2.1.p0.5/lib/hadoop/lib/jackson-xc-1.8.8.jar, 
file:/opt/cloudera/parcels/CDH-4.2.1-1.cdh4.2.1.p0.5/lib/hadoop-hdfs/hadoop-hdfs-2.0.0-cdh4.2.1-tests.jar, file:/opt/cloudera/parcels/CDH-4.2.1-1.cdh4.2.1.p0.5/lib/hadoop-hdfs/hadoop-hdfs-2.0.0-cdh4.2.1-1009.jar, file:/opt/cloudera/parcels/CDH-4.2.1-1.cdh4.2.1.p0.5/lib/hadoop-hdfs/hadoop-hdfs-2.0.0-cdh4.2.1-1009.jar, file:/opt/cloudera/parcels/CDH-4.2.1-1.cdh4.2.1.p0.5/lib/hadoop-hdfs/lib/jackson-mapper-asl-1.8.8.jar, file:/opt/cloudera/parcels/CDH-4.2.1-1.cdh4.2.1.p0.5/lib/hadoop-hdfs/lib/commons-lang-2.5.jar, file:/opt/cloudera/parcels/CDH-4.2.1-1.cdh4.2.1.p0.5/lib/hadoop-hdfs/lib/jetty-util-6.1.26.cloudera.2.jar, file:/opt/cloudera/parcels/CDH-4.2.1-1.cdh4.2.1.p0.5/lib/hadoop-hdfs/lib/jetty-6.1.26.cloudera.2.jar, file:/opt/cloudera/parcels/CDH-4.2.1-1.cdh4.2.1.p0.5/lib/hadoop-hdfs/lib/commons-codec-1.4.jar, file:/opt/cloudera/parcels/CDH-4.2.1-1.cdh4.2.1.p0.5/lib/hadoop-hdfs/lib/asm-3.2.jar, file:/opt/cloudera/parcels/CDH-4.2.1-1.cdh4.2.1.p0.5/lib/hadoop-hdfs/lib/jline-0.9.94.jar, file:/opt/cloudera/parcels/CDH-4.2.1-1.cdh4.2.1.p0.5/lib/hadoop-hdfs/lib/commons-io-2.1.jar, file:/opt/cloudera/parcels/CDH-4.2.1-1.cdh4.2.1.p0.5/lib/hadoop-hdfs/lib/log4j-1.2.17.jar, file:/opt/cloudera/parcels/CDH-4.2.1-1.cdh4.2.1.p0.5/lib/hadoop-hdfs/lib/jackson-core-asl-1.8.8.jar, file:/opt/cloudera/parcels/CDH-4.2.1-1.cdh4.2.1.p0.5/lib/hadoop-hdfs/lib/jersey-server-1.8.jar, file:/opt/cloudera/parcels/CDH-4.2.1-1.cdh4.2.1.p0.5/lib/hadoop-hdfs/lib/xmlenc-0.52.jar, file:/opt/cloudera/parcels/CDH-4.2.1-1.cdh4.2.1.p0.5/lib/hadoop-hdfs/lib/protobuf-java-2.4.0a.jar, file:/opt/cloudera/parcels/CDH-4.2.1-1.cdh4.2.1.p0.5/lib/hadoop-hdfs/lib/zookeeper-3.4.5-cdh4.2.1.jar, file:/opt/cloudera/parcels/CDH-4.2.1-1.cdh4.2.1.p0.5/lib/hadoop-hdfs/lib/jsp-api-2.1.jar, file:/opt/cloudera/parcels/CDH-4.2.1-1.cdh4.2.1.p0.5/lib/hadoop-hdfs/lib/jersey-core-1.8.jar, file:/opt/cloudera/parcels/CDH-4.2.1-1.cdh4.2.1.p0.5/lib/hadoop-hdfs/lib/commons-daemon-1.0.3.jar, 
file:/opt/cloudera/parcels/CDH-4.2.1-1.cdh4.2.1.p0.5/lib/hadoop-hdfs/lib/servlet-api-2.5.jar, file:/opt/cloudera/parcels/CDH-4.2.1-1.cdh4.2.1.p0.5/lib/hadoop-hdfs/lib/commons-cli-1.2.jar, file:/opt/cloudera/parcels/CDH-4.2.1-1.cdh4.2.1.p0.5/lib/hadoop-hdfs/lib/guava-11.0.2.jar, file:/opt/cloudera/parcels/CDH-4.2.1-1.cdh4.2.1.p0.5/lib/hadoop-hdfs/lib/jasper-runtime-5.5.23.jar, file:/opt/cloudera/parcels/CDH-4.2.1-1.cdh4.2.1.p0.5/lib/hadoop-hdfs/lib/commons-el-1.0.jar, file:/opt/cloudera/parcels/CDH-4.2.1-1.cdh4.2.1.p0.5/lib/hadoop-hdfs/lib/jsr305-1.3.9.jar, file:/opt/cloudera/parcels/CDH-4.2.1-1.cdh4.2.1.p0.5/lib/hadoop-hdfs/lib/commons-logging-1.1.1.jar, file:/opt/cloudera/parcels/CDH-4.2.1-1.cdh4.2.1.p0.5/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-2.0.0-cdh4.2.1.jar, file:/opt/cloudera/parcels/CDH-4.2.1-1.cdh4.2.1.p0.5/lib/hadoop-mapreduce/hadoop-streaming-2.0.0-cdh4.2.1.jar, file:/opt/cloudera/parcels/CDH-4.2.1-1.cdh4.2.1.p0.5/lib/hadoop-mapreduce/hadoop-distcp-2.0.0-cdh4.2.1.jar, file:/opt/cloudera/parcels/CDH-4.2.1-1.cdh4.2.1.p0.5/lib/hadoop-mapreduce/hadoop-gridmix-2.0.0-cdh4.2.1.jar, file:/opt/cloudera/parcels/CDH-4.2.1-1.cdh4.2.1.p0.5/lib/hadoop-mapreduce/hadoop-mapreduce-client-app-2.0.0-cdh4.2.1.jar, file:/opt/cloudera/parcels/CDH-4.2.1-1.cdh4.2.1.p0.5/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient-2.0.0-cdh4.2.1.jar, file:/opt/cloudera/parcels/CDH-4.2.1-1.cdh4.2.1.p0.5/lib/hadoop-mapreduce/hadoop-mapreduce-client-app-2.0.0-cdh4.2.1.jar, file:/opt/cloudera/parcels/CDH-4.2.1-1.cdh4.2.1.p0.5/lib/hadoop-mapreduce/hadoop-datajoin-2.0.0-cdh4.2.1.jar, file:/opt/cloudera/parcels/CDH-4.2.1-1.cdh4.2.1.p0.5/lib/hadoop-mapreduce/hadoop-mapreduce-client-common-2.0.0-cdh4.2.1.jar, file:/opt/cloudera/parcels/CDH-4.2.1-1.cdh4.2.1.p0.5/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient-2.0.0-cdh4.2.1.jar, file:/opt/cloudera/parcels/CDH-4.2.1-1.cdh4.2.1.p0.5/lib/hadoop-mapreduce/hadoop-rumen-2.0.0-cdh4.2.1.jar, 
file:/opt/cloudera/parcels/CDH-4.2.1-1.cdh4.2.1.p0.5/lib/hadoop-mapreduce/hadoop-mapreduce-examples-2.0.0-cdh4.2.1.jar, file:/opt/cloudera/parcels/CDH-4.2.1-1.cdh4.2.1.p0.5/lib/hadoop-mapreduce/hadoop-archives-2.0.0-cdh4.2.1.jar, file:/opt/cloudera/parcels/CDH-4.2.1-1.cdh4.2.1.p0.5/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-2.0.0-cdh4.2.1.jar, file:/opt/cloudera/parcels/CDH-4.2.1-1.cdh4.2.1.p0.5/lib/hadoop-mapreduce/hadoop-extras-2.0.0-cdh4.2.1.jar, file:/opt/cloudera/parcels/CDH-4.2.1-1.cdh4.2.1.p0.5/lib/hadoop-mapreduce/hadoop-mapreduce-client-shuffle-2.0.0-cdh4.2.1.jar, file:/opt/cloudera/parcels/CDH-4.2.1-1.cdh4.2.1.p0.5/lib/hadoop-mapreduce/hadoop-gridmix-2.0.0-cdh4.2.1.jar, file:/opt/cloudera/parcels/CDH-4.2.1-1.cdh4.2.1.p0.5/lib/hadoop-mapreduce/hadoop-mapreduce-client-core-2.0.0-cdh4.2.1.jar, file:/opt/cloudera/parcels/CDH-4.2.1-1.cdh4.2.1.p0.5/lib/hadoop-mapreduce/hadoop-mapreduce-client-core-2.0.0-cdh4.2.1.jar, file:/opt/cloudera/parcels/CDH-4.2.1-1.cdh4.2.1.p0.5/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient-2.0.0-cdh4.2.1-tests.jar, file:/opt/cloudera/parcels/CDH-4.2.1-1.cdh4.2.1.p0.5/lib/hadoop-mapreduce/hadoop-archives-2.0.0-cdh4.2.1.jar, file:/opt/cloudera/parcels/CDH-4.2.1-1.cdh4.2.1.p0.5/lib/hadoop-mapreduce/hadoop-mapreduce-examples-2.0.0-cdh4.2.1.jar, file:/opt/cloudera/parcels/CDH-4.2.1-1.cdh4.2.1.p0.5/lib/hadoop-mapreduce/hadoop-extras-2.0.0-cdh4.2.1.jar, file:/opt/cloudera/parcels/CDH-4.2.1-1.cdh4.2.1.p0.5/lib/hadoop-mapreduce/hadoop-distcp-2.0.0-cdh4.2.1.jar, file:/opt/cloudera/parcels/CDH-4.2.1-1.cdh4.2.1.p0.5/lib/hadoop-mapreduce/hadoop-streaming-2.0.0-cdh4.2.1.jar, file:/opt/cloudera/parcels/CDH-4.2.1-1.cdh4.2.1.p0.5/lib/hadoop-mapreduce/hadoop-mapreduce-client-shuffle-2.0.0-cdh4.2.1.jar, file:/opt/cloudera/parcels/CDH-4.2.1-1.cdh4.2.1.p0.5/lib/hadoop-mapreduce/hadoop-rumen-2.0.0-cdh4.2.1.jar, file:/opt/cloudera/parcels/CDH-4.2.1-1.cdh4.2.1.p0.5/lib/hadoop-mapreduce/hadoop-mapreduce-client-common-2.0.0-cdh4.2.1.jar, 
file:/opt/cloudera/parcels/CDH-4.2.1-1.cdh4.2.1.p0.5/lib/hadoop-mapreduce/hadoop-datajoin-2.0.0-cdh4.2.1.jar, file:/opt/cloudera/parcels/CDH-4.2.1-1.cdh4.2.1.p0.5/lib/hadoop-mapreduce/lib/jackson-mapper-asl-1.8.8.jar, file:/opt/cloudera/parcels/CDH-4.2.1-1.cdh4.2.1.p0.5/lib/hadoop-mapreduce/lib/javax.inject-1.jar, file:/opt/cloudera/parcels/CDH-4.2.1-1.cdh4.2.1.p0.5/lib/hadoop-mapreduce/lib/guice-servlet-3.0.jar, file:/opt/cloudera/parcels/CDH-4.2.1-1.cdh4.2.1.p0.5/lib/hadoop-mapreduce/lib/aopalliance-1.0.jar, file:/opt/cloudera/parcels/CDH-4.2.1-1.cdh4.2.1.p0.5/lib/hadoop-mapreduce/lib/asm-3.2.jar, file:/opt/cloudera/parcels/CDH-4.2.1-1.cdh4.2.1.p0.5/lib/hadoop-mapreduce/lib/commons-io-2.1.jar, file:/opt/cloudera/parcels/CDH-4.2.1-1.cdh4.2.1.p0.5/lib/hadoop-mapreduce/lib/log4j-1.2.17.jar, file:/opt/cloudera/parcels/CDH-4.2.1-1.cdh4.2.1.p0.5/lib/hadoop-mapreduce/lib/jackson-core-asl-1.8.8.jar, file:/opt/cloudera/parcels/CDH-4.2.1-1.cdh4.2.1.p0.5/lib/hadoop-mapreduce/lib/snappy-java-1.0.4.1.jar, file:/opt/cloudera/parcels/CDH-4.2.1-1.cdh4.2.1.p0.5/lib/hadoop-mapreduce/lib/jersey-server-1.8.jar, file:/opt/cloudera/parcels/CDH-4.2.1-1.cdh4.2.1.p0.5/lib/hadoop-mapreduce/lib/jersey-guice-1.8.jar, file:/opt/cloudera/parcels/CDH-4.2.1-1.cdh4.2.1.p0.5/lib/hadoop-mapreduce/lib/protobuf-java-2.4.0a.jar, file:/opt/cloudera/parcels/CDH-4.2.1-1.cdh4.2.1.p0.5/lib/hadoop-mapreduce/lib/jersey-core-1.8.jar, file:/opt/cloudera/parcels/CDH-4.2.1-1.cdh4.2.1.p0.5/lib/hadoop-mapreduce/lib/netty-3.2.4.Final.jar, file:/opt/cloudera/parcels/CDH-4.2.1-1.cdh4.2.1.p0.5/lib/hadoop-mapreduce/lib/avro-1.7.3.jar, file:/opt/cloudera/parcels/CDH-4.2.1-1.cdh4.2.1.p0.5/lib/hadoop-mapreduce/lib/paranamer-2.3.jar, file:/opt/cloudera/parcels/CDH-4.2.1-1.cdh4.2.1.p0.5/lib/hadoop-mapreduce/lib/guice-3.0.jar, file:/opt/cloudera/parcels/CDH-4.2.1-1.cdh4.2.1.p0.5/lib/hadoop-yarn/hadoop-yarn-server-common-2.0.0-cdh4.2.1.jar, 
file:/opt/cloudera/parcels/CDH-4.2.1-1.cdh4.2.1.p0.5/lib/hadoop-yarn/hadoop-yarn-server-tests-2.0.0-cdh4.2.1.jar, file:/opt/cloudera/parcels/CDH-4.2.1-1.cdh4.2.1.p0.5/lib/hadoop-yarn/hadoop-yarn-server-web-proxy-2.0.0-cdh4.2.1.jar, file:/opt/cloudera/parcels/CDH-4.2.1-1.cdh4.2.1.p0.5/lib/hadoop-yarn/hadoop-yarn-server-nodemanager-2.0.0-cdh4.2.1.jar, file:/opt/cloudera/parcels/CDH-4.2.1-1.cdh4.2.1.p0.5/lib/hadoop-yarn/hadoop-yarn-applications-unmanaged-am-launcher-2.0.0-cdh4.2.1.jar, file:/opt/cloudera/parcels/CDH-4.2.1-1.cdh4.2.1.p0.5/lib/hadoop-yarn/hadoop-yarn-server-resourcemanager-2.0.0-cdh4.2.1.jar, file:/opt/cloudera/parcels/CDH-4.2.1-1.cdh4.2.1.p0.5/lib/hadoop-yarn/hadoop-yarn-client-2.0.0-cdh4.2.1.jar, file:/opt/cloudera/parcels/CDH-4.2.1-1.cdh4.2.1.p0.5/lib/hadoop-yarn/hadoop-yarn-server-tests-2.0.0-cdh4.2.1-tests.jar, file:/opt/cloudera/parcels/CDH-4.2.1-1.cdh4.2.1.p0.5/lib/hadoop-yarn/hadoop-yarn-server-tests-2.0.0-cdh4.2.1.jar, file:/opt/cloudera/parcels/CDH-4.2.1-1.cdh4.2.1.p0.5/lib/hadoop-yarn/hadoop-yarn-server-web-proxy-2.0.0-cdh4.2.1.jar, file:/opt/cloudera/parcels/CDH-4.2.1-1.cdh4.2.1.p0.5/lib/hadoop-yarn/hadoop-yarn-common-2.0.0-cdh4.2.1.jar, file:/opt/cloudera/parcels/CDH-4.2.1-1.cdh4.2.1.p0.5/lib/hadoop-yarn/hadoop-yarn-site-2.0.0-cdh4.2.1.jar, file:/opt/cloudera/parcels/CDH-4.2.1-1.cdh4.2.1.p0.5/lib/hadoop-yarn/hadoop-yarn-client-2.0.0-cdh4.2.1.jar, file:/opt/cloudera/parcels/CDH-4.2.1-1.cdh4.2.1.p0.5/lib/hadoop-yarn/hadoop-yarn-server-resourcemanager-2.0.0-cdh4.2.1.jar, file:/opt/cloudera/parcels/CDH-4.2.1-1.cdh4.2.1.p0.5/lib/hadoop-yarn/hadoop-yarn-server-common-2.0.0-cdh4.2.1.jar, file:/opt/cloudera/parcels/CDH-4.2.1-1.cdh4.2.1.p0.5/lib/hadoop-yarn/hadoop-yarn-api-2.0.0-cdh4.2.1.jar, file:/opt/cloudera/parcels/CDH-4.2.1-1.cdh4.2.1.p0.5/lib/hadoop-yarn/hadoop-yarn-api-2.0.0-cdh4.2.1.jar, file:/opt/cloudera/parcels/CDH-4.2.1-1.cdh4.2.1.p0.5/lib/hadoop-yarn/hadoop-yarn-site-2.0.0-cdh4.2.1.jar, 
file:/opt/cloudera/parcels/CDH-4.2.1-1.cdh4.2.1.p0.5/lib/hadoop-yarn/hadoop-yarn-applications-distributedshell-2.0.0-cdh4.2.1.jar, file:/opt/cloudera/parcels/CDH-4.2.1-1.cdh4.2.1.p0.5/lib/hadoop-yarn/hadoop-yarn-applications-distributedshell-2.0.0-cdh4.2.1.jar, file:/opt/cloudera/parcels/CDH-4.2.1-1.cdh4.2.1.p0.5/lib/hadoop-yarn/hadoop-yarn-common-2.0.0-cdh4.2.1.jar, file:/opt/cloudera/parcels/CDH-4.2.1-1.cdh4.2.1.p0.5/lib/hadoop-yarn/hadoop-yarn-server-nodemanager-2.0.0-cdh4.2.1.jar, file:/opt/cloudera/parcels/CDH-4.2.1-1.cdh4.2.1.p0.5/lib/hadoop-yarn/hadoop-yarn-applications-unmanaged-am-launcher-2.0.0-cdh4.2.1.jar, file:/opt/cloudera/parcels/CDH-4.2.1-1.cdh4.2.1.p0.5/lib/hadoop-yarn/lib/jackson-mapper-asl-1.8.8.jar, file:/opt/cloudera/parcels/CDH-4.2.1-1.cdh4.2.1.p0.5/lib/hadoop-yarn/lib/javax.inject-1.jar, file:/opt/cloudera/parcels/CDH-4.2.1-1.cdh4.2.1.p0.5/lib/hadoop-yarn/lib/guice-servlet-3.0.jar, file:/opt/cloudera/parcels/CDH-4.2.1-1.cdh4.2.1.p0.5/lib/hadoop-yarn/lib/aopalliance-1.0.jar, file:/opt/cloudera/parcels/CDH-4.2.1-1.cdh4.2.1.p0.5/lib/hadoop-yarn/lib/asm-3.2.jar, file:/opt/cloudera/parcels/CDH-4.2.1-1.cdh4.2.1.p0.5/lib/hadoop-yarn/lib/commons-io-2.1.jar, file:/opt/cloudera/parcels/CDH-4.2.1-1.cdh4.2.1.p0.5/lib/hadoop-yarn/lib/log4j-1.2.17.jar, file:/opt/cloudera/parcels/CDH-4.2.1-1.cdh4.2.1.p0.5/lib/hadoop-yarn/lib/jackson-core-asl-1.8.8.jar, file:/opt/cloudera/parcels/CDH-4.2.1-1.cdh4.2.1.p0.5/lib/hadoop-yarn/lib/snappy-java-1.0.4.1.jar, file:/opt/cloudera/parcels/CDH-4.2.1-1.cdh4.2.1.p0.5/lib/hadoop-yarn/lib/jersey-server-1.8.jar, file:/opt/cloudera/parcels/CDH-4.2.1-1.cdh4.2.1.p0.5/lib/hadoop-yarn/lib/jersey-guice-1.8.jar, file:/opt/cloudera/parcels/CDH-4.2.1-1.cdh4.2.1.p0.5/lib/hadoop-yarn/lib/protobuf-java-2.4.0a.jar, file:/opt/cloudera/parcels/CDH-4.2.1-1.cdh4.2.1.p0.5/lib/hadoop-yarn/lib/jersey-core-1.8.jar, file:/opt/cloudera/parcels/CDH-4.2.1-1.cdh4.2.1.p0.5/lib/hadoop-yarn/lib/netty-3.2.4.Final.jar, 
file:/opt/cloudera/parcels/CDH-4.2.1-1.cdh4.2.1.p0.5/lib/hadoop-yarn/lib/avro-1.7.3.jar, file:/opt/cloudera/parcels/CDH-4.2.1-1.cdh4.2.1.p0.5/lib/hadoop-yarn/lib/paranamer-2.3.jar, file:/opt/cloudera/parcels/CDH-4.2.1-1.cdh4.2.1.p0.5/lib/hadoop-yarn/lib/guice-3.0.jar, file:/opt/cloudera/parcels/HADOOP_LZO-0.4.15-1.gplextras.p0.24/lib/hadoop/lib/hadoop-lzo-cdh4-0.4.15-gplextras.jar, file:/opt/cloudera/parcels/HADOOP_LZO-0.4.15-1.gplextras.p0.24/lib/hadoop/lib/native/, file:/home1/hadoop/yarn_nm2/local-dir/usercache/kpi/appcache/application_1384857622207_297273/filecache/2983273881884082399/job.jar/job.jar, file:/home1/hadoop/yarn_nm2/local-dir/usercache/kpi/appcache/application_1384857622207_297273/filecache/2983273881884082399/job.jar/classes, file:/home1/hadoop/yarn_nm2/local-dir/usercache/kpi/appcache/application_1384857622207_297273/filecache/2983273881884082399/job.jar/lib/*, file:/home1/hadoop/yarn_nm2/local-dir/usercache/kpi/filecache/-5305035836914605796/hiveutil.jar, file:/home1/hadoop/yarn_nm2/local-dir/usercache/kpi/appcache/application_1384857622207_297273/filecache/2983273881884082399/job.jar/, file:/home1/hadoop/yarn_nm2/local-dir/usercache/kpi/filecache/845548897392055020/hive-builtins-0.10.0-cdh4.2.1.jar] 2013-12-06 14:04:59,670 INFO [main] ExecReducer: thread classpath = [file:/home1/hadoop/yarn_nm2/local-dir/usercache/kpi/appcache/application_1384857622207_297273/container_1384857622207_297273_01_000012/, file:/var/run/cloudera-scm-agent/process/7065-yarn-NODEMANAGER/, file:/etc/hbase/conf.cloudera.hbase1/, file:/opt/cloudera/parcels/CDH-4.2.1-1.cdh4.2.1.p0.5/lib/hadoop/hadoop-annotations-2.0.0-cdh4.2.1.jar, file:/opt/cloudera/parcels/CDH-4.2.1-1.cdh4.2.1.p0.5/lib/hadoop/hadoop-auth-2.0.0-cdh4.2.1.jar, file:/opt/cloudera/parcels/CDH-4.2.1-1.cdh4.2.1.p0.5/lib/hadoop/hadoop-annotations-2.0.0-cdh4.2.1.jar, file:/opt/cloudera/parcels/CDH-4.2.1-1.cdh4.2.1.p0.5/lib/hadoop/hadoop-common-2.0.0-cdh4.2.1.jar, 
file:/opt/cloudera/parcels/CDH-4.2.1-1.cdh4.2.1.p0.5/lib/hadoop/hadoop-auth-2.0.0-cdh4.2.1.jar, file:/opt/cloudera/parcels/CDH-4.2.1-1.cdh4.2.1.p0.5/lib/hadoop/hadoop-common-2.0.0-cdh4.2.1-tests.jar, file:/opt/cloudera/parcels/CDH-4.2.1-1.cdh4.2.1.p0.5/lib/hadoop/hadoop-common-2.0.0-cdh4.2.1.jar, file:/opt/cloudera/parcels/CDH-4.2.1-1.cdh4.2.1.p0.5/lib/hadoop/lib/jaxb-api-2.2.2.jar, file:/opt/cloudera/parcels/CDH-4.2.1-1.cdh4.2.1.p0.5/lib/hadoop/lib/jackson-mapper-asl-1.8.8.jar, file:/opt/cloudera/parcels/CDH-4.2.1-1.cdh4.2.1.p0.5/lib/hadoop/lib/junit-4.8.2.jar, file:/opt/cloudera/parcels/CDH-4.2.1-1.cdh4.2.1.p0.5/lib/hadoop/lib/commons-lang-2.5.jar, file:/opt/cloudera/parcels/CDH-4.2.1-1.cdh4.2.1.p0.5/lib/hadoop/lib/jackson-jaxrs-1.8.8.jar, file:/opt/cloudera/parcels/CDH-4.2.1-1.cdh4.2.1.p0.5/lib/hadoop/lib/jetty-util-6.1.26.cloudera.2.jar, file:/opt/cloudera/parcels/CDH-4.2.1-1.cdh4.2.1.p0.5/lib/hadoop/lib/hue-plugins-2.2.0-cdh4.2.1.jar, file:/opt/cloudera/parcels/CDH-4.2.1-1.cdh4.2.1.p0.5/lib/hadoop/lib/commons-math-2.1.jar, file:/opt/cloudera/parcels/CDH-4.2.1-1.cdh4.2.1.p0.5/lib/hadoop/lib/kfs-0.3.jar, file:/opt/cloudera/parcels/CDH-4.2.1-1.cdh4.2.1.p0.5/lib/hadoop/lib/jetty-6.1.26.cloudera.2.jar, file:/opt/cloudera/parcels/CDH-4.2.1-1.cdh4.2.1.p0.5/lib/hadoop/lib/commons-configuration-1.6.jar, file:/opt/cloudera/parcels/CDH-4.2.1-1.cdh4.2.1.p0.5/lib/hadoop/lib/commons-codec-1.4.jar, file:/opt/cloudera/parcels/CDH-4.2.1-1.cdh4.2.1.p0.5/lib/hadoop/lib/asm-3.2.jar, file:/opt/cloudera/parcels/CDH-4.2.1-1.cdh4.2.1.p0.5/lib/hadoop/lib/commons-digester-1.8.jar, file:/opt/cloudera/parcels/CDH-4.2.1-1.cdh4.2.1.p0.5/lib/hadoop/lib/jline-0.9.94.jar, file:/opt/cloudera/parcels/CDH-4.2.1-1.cdh4.2.1.p0.5/lib/hadoop/lib/commons-io-2.1.jar, file:/opt/cloudera/parcels/CDH-4.2.1-1.cdh4.2.1.p0.5/lib/zookeeper/lib/slf4j-log4j12-1.6.1.jar, file:/opt/cloudera/parcels/CDH-4.2.1-1.cdh4.2.1.p0.5/lib/hadoop/lib/jasper-compiler-5.5.23.jar, 
file:/opt/cloudera/parcels/CDH-4.2.1-1.cdh4.2.1.p0.5/lib/hadoop/lib/log4j-1.2.17.jar, file:/opt/cloudera/parcels/CDH-4.2.1-1.cdh4.2.1.p0.5/lib/hadoop/lib/jackson-core-asl-1.8.8.jar, file:/opt/cloudera/parcels/CDH-4.2.1-1.cdh4.2.1.p0.5/lib/hadoop/lib/activation-1.1.jar, file:/opt/cloudera/parcels/CDH-4.2.1-1.cdh4.2.1.p0.5/lib/hadoop/lib/snappy-java-1.0.4.1.jar, file:/opt/cloudera/parcels/CDH-4.2.1-1.cdh4.2.1.p0.5/lib/hadoop/lib/jersey-server-1.8.jar, file:/opt/cloudera/parcels/CDH-4.2.1-1.cdh4.2.1.p0.5/lib/hadoop/lib/commons-net-3.1.jar, file:/opt/cloudera/parcels/CDH-4.2.1-1.cdh4.2.1.p0.5/lib/hadoop/lib/jaxb-impl-2.2.3-1.jar, file:/opt/cloudera/parcels/CDH-4.2.1-1.cdh4.2.1.p0.5/lib/hadoop/lib/slf4j-api-1.6.1.jar, file:/opt/cloudera/parcels/CDH-4.2.1-1.cdh4.2.1.p0.5/lib/hadoop/lib/xmlenc-0.52.jar, file:/opt/cloudera/parcels/CDH-4.2.1-1.cdh4.2.1.p0.5/lib/hadoop/lib/commons-beanutils-core-1.8.0.jar, file:/opt/cloudera/parcels/CDH-4.2.1-1.cdh4.2.1.p0.5/lib/hadoop/lib/jets3t-0.6.1.jar, file:/opt/cloudera/parcels/CDH-4.2.1-1.cdh4.2.1.p0.5/lib/hadoop/lib/jettison-1.1.jar, file:/opt/cloudera/parcels/CDH-4.2.1-1.cdh4.2.1.p0.5/lib/hadoop/lib/protobuf-java-2.4.0a.jar, file:/opt/cloudera/parcels/CDH-4.2.1-1.cdh4.2.1.p0.5/lib/hadoop/lib/commons-httpclient-3.1.jar, file:/opt/cloudera/parcels/CDH-4.2.1-1.cdh4.2.1.p0.5/lib/zookeeper/zookeeper-3.4.5-cdh4.2.1.jar, file:/opt/cloudera/parcels/CDH-4.2.1-1.cdh4.2.1.p0.5/lib/hadoop/lib/jsp-api-2.1.jar, file:/opt/cloudera/parcels/CDH-4.2.1-1.cdh4.2.1.p0.5/lib/hadoop/lib/jersey-core-1.8.jar, file:/opt/cloudera/parcels/CDH-4.2.1-1.cdh4.2.1.p0.5/lib/hadoop/lib/servlet-api-2.5.jar, file:/opt/cloudera/parcels/CDH-4.2.1-1.cdh4.2.1.p0.5/lib/hadoop/lib/commons-cli-1.2.jar, file:/opt/cloudera/parcels/CDH-4.2.1-1.cdh4.2.1.p0.5/lib/hadoop/lib/commons-collections-3.2.1.jar, file:/opt/cloudera/parcels/CDH-4.2.1-1.cdh4.2.1.p0.5/lib/hadoop/lib/commons-beanutils-1.7.0.jar, 
file:/opt/cloudera/parcels/CDH-4.2.1-1.cdh4.2.1.p0.5/lib/hadoop/lib/stax-api-1.0.1.jar, file:/opt/cloudera/parcels/CDH-4.2.1-1.cdh4.2.1.p0.5/lib/hadoop/lib/guava-11.0.2.jar, file:/opt/cloudera/parcels/CDH-4.2.1-1.cdh4.2.1.p0.5/lib/hadoop/lib/avro-1.7.3.jar, file:/opt/cloudera/parcels/CDH-4.2.1-1.cdh4.2.1.p0.5/lib/hadoop/lib/paranamer-2.3.jar, file:/opt/cloudera/parcels/CDH-4.2.1-1.cdh4.2.1.p0.5/lib/hadoop/lib/jsch-0.1.42.jar, file:/opt/cloudera/parcels/CDH-4.2.1-1.cdh4.2.1.p0.5/lib/hadoop/lib/jasper-runtime-5.5.23.jar, file:/opt/cloudera/parcels/CDH-4.2.1-1.cdh4.2.1.p0.5/lib/hadoop/lib/mockito-all-1.8.5.jar, file:/opt/cloudera/parcels/CDH-4.2.1-1.cdh4.2.1.p0.5/lib/hadoop/lib/commons-el-1.0.jar, file:/opt/cloudera/parcels/CDH-4.2.1-1.cdh4.2.1.p0.5/lib/hadoop/lib/jsr305-1.3.9.jar, file:/opt/cloudera/parcels/CDH-4.2.1-1.cdh4.2.1.p0.5/lib/hadoop/lib/jersey-json-1.8.jar, file:/opt/cloudera/parcels/CDH-4.2.1-1.cdh4.2.1.p0.5/lib/hadoop/lib/commons-logging-1.1.1.jar, file:/opt/cloudera/parcels/CDH-4.2.1-1.cdh4.2.1.p0.5/lib/hadoop/lib/jackson-xc-1.8.8.jar, file:/opt/cloudera/parcels/CDH-4.2.1-1.cdh4.2.1.p0.5/lib/hadoop-hdfs/hadoop-hdfs-2.0.0-cdh4.2.1-tests.jar, file:/opt/cloudera/parcels/CDH-4.2.1-1.cdh4.2.1.p0.5/lib/hadoop-hdfs/hadoop-hdfs-2.0.0-cdh4.2.1-1009.jar, file:/opt/cloudera/parcels/CDH-4.2.1-1.cdh4.2.1.p0.5/lib/hadoop-hdfs/hadoop-hdfs-2.0.0-cdh4.2.1-1009.jar, file:/opt/cloudera/parcels/CDH-4.2.1-1.cdh4.2.1.p0.5/lib/hadoop-hdfs/lib/jackson-mapper-asl-1.8.8.jar, file:/opt/cloudera/parcels/CDH-4.2.1-1.cdh4.2.1.p0.5/lib/hadoop-hdfs/lib/commons-lang-2.5.jar, file:/opt/cloudera/parcels/CDH-4.2.1-1.cdh4.2.1.p0.5/lib/hadoop-hdfs/lib/jetty-util-6.1.26.cloudera.2.jar, file:/opt/cloudera/parcels/CDH-4.2.1-1.cdh4.2.1.p0.5/lib/hadoop-hdfs/lib/jetty-6.1.26.cloudera.2.jar, file:/opt/cloudera/parcels/CDH-4.2.1-1.cdh4.2.1.p0.5/lib/hadoop-hdfs/lib/commons-codec-1.4.jar, file:/opt/cloudera/parcels/CDH-4.2.1-1.cdh4.2.1.p0.5/lib/hadoop-hdfs/lib/asm-3.2.jar, 
file:/opt/cloudera/parcels/CDH-4.2.1-1.cdh4.2.1.p0.5/lib/hadoop-hdfs/lib/jline-0.9.94.jar, file:/opt/cloudera/parcels/CDH-4.2.1-1.cdh4.2.1.p0.5/lib/hadoop-hdfs/lib/commons-io-2.1.jar, file:/opt/cloudera/parcels/CDH-4.2.1-1.cdh4.2.1.p0.5/lib/hadoop-hdfs/lib/log4j-1.2.17.jar, file:/opt/cloudera/parcels/CDH-4.2.1-1.cdh4.2.1.p0.5/lib/hadoop-hdfs/lib/jackson-core-asl-1.8.8.jar, file:/opt/cloudera/parcels/CDH-4.2.1-1.cdh4.2.1.p0.5/lib/hadoop-hdfs/lib/jersey-server-1.8.jar, file:/opt/cloudera/parcels/CDH-4.2.1-1.cdh4.2.1.p0.5/lib/hadoop-hdfs/lib/xmlenc-0.52.jar, file:/opt/cloudera/parcels/CDH-4.2.1-1.cdh4.2.1.p0.5/lib/hadoop-hdfs/lib/protobuf-java-2.4.0a.jar, file:/opt/cloudera/parcels/CDH-4.2.1-1.cdh4.2.1.p0.5/lib/hadoop-hdfs/lib/zookeeper-3.4.5-cdh4.2.1.jar, file:/opt/cloudera/parcels/CDH-4.2.1-1.cdh4.2.1.p0.5/lib/hadoop-hdfs/lib/jsp-api-2.1.jar, file:/opt/cloudera/parcels/CDH-4.2.1-1.cdh4.2.1.p0.5/lib/hadoop-hdfs/lib/jersey-core-1.8.jar, file:/opt/cloudera/parcels/CDH-4.2.1-1.cdh4.2.1.p0.5/lib/hadoop-hdfs/lib/commons-daemon-1.0.3.jar, file:/opt/cloudera/parcels/CDH-4.2.1-1.cdh4.2.1.p0.5/lib/hadoop-hdfs/lib/servlet-api-2.5.jar, file:/opt/cloudera/parcels/CDH-4.2.1-1.cdh4.2.1.p0.5/lib/hadoop-hdfs/lib/commons-cli-1.2.jar, file:/opt/cloudera/parcels/CDH-4.2.1-1.cdh4.2.1.p0.5/lib/hadoop-hdfs/lib/guava-11.0.2.jar, file:/opt/cloudera/parcels/CDH-4.2.1-1.cdh4.2.1.p0.5/lib/hadoop-hdfs/lib/jasper-runtime-5.5.23.jar, file:/opt/cloudera/parcels/CDH-4.2.1-1.cdh4.2.1.p0.5/lib/hadoop-hdfs/lib/commons-el-1.0.jar, file:/opt/cloudera/parcels/CDH-4.2.1-1.cdh4.2.1.p0.5/lib/hadoop-hdfs/lib/jsr305-1.3.9.jar, file:/opt/cloudera/parcels/CDH-4.2.1-1.cdh4.2.1.p0.5/lib/hadoop-hdfs/lib/commons-logging-1.1.1.jar, file:/opt/cloudera/parcels/CDH-4.2.1-1.cdh4.2.1.p0.5/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-2.0.0-cdh4.2.1.jar, file:/opt/cloudera/parcels/CDH-4.2.1-1.cdh4.2.1.p0.5/lib/hadoop-mapreduce/hadoop-streaming-2.0.0-cdh4.2.1.jar, 
file:/opt/cloudera/parcels/CDH-4.2.1-1.cdh4.2.1.p0.5/lib/hadoop-mapreduce/hadoop-distcp-2.0.0-cdh4.2.1.jar, file:/opt/cloudera/parcels/CDH-4.2.1-1.cdh4.2.1.p0.5/lib/hadoop-mapreduce/hadoop-gridmix-2.0.0-cdh4.2.1.jar, file:/opt/cloudera/parcels/CDH-4.2.1-1.cdh4.2.1.p0.5/lib/hadoop-mapreduce/hadoop-mapreduce-client-app-2.0.0-cdh4.2.1.jar, file:/opt/cloudera/parcels/CDH-4.2.1-1.cdh4.2.1.p0.5/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient-2.0.0-cdh4.2.1.jar, file:/opt/cloudera/parcels/CDH-4.2.1-1.cdh4.2.1.p0.5/lib/hadoop-mapreduce/hadoop-mapreduce-client-app-2.0.0-cdh4.2.1.jar, file:/opt/cloudera/parcels/CDH-4.2.1-1.cdh4.2.1.p0.5/lib/hadoop-mapreduce/hadoop-datajoin-2.0.0-cdh4.2.1.jar, file:/opt/cloudera/parcels/CDH-4.2.1-1.cdh4.2.1.p0.5/lib/hadoop-mapreduce/hadoop-mapreduce-client-common-2.0.0-cdh4.2.1.jar, file:/opt/cloudera/parcels/CDH-4.2.1-1.cdh4.2.1.p0.5/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient-2.0.0-cdh4.2.1.jar, file:/opt/cloudera/parcels/CDH-4.2.1-1.cdh4.2.1.p0.5/lib/hadoop-mapreduce/hadoop-rumen-2.0.0-cdh4.2.1.jar, file:/opt/cloudera/parcels/CDH-4.2.1-1.cdh4.2.1.p0.5/lib/hadoop-mapreduce/hadoop-mapreduce-examples-2.0.0-cdh4.2.1.jar, file:/opt/cloudera/parcels/CDH-4.2.1-1.cdh4.2.1.p0.5/lib/hadoop-mapreduce/hadoop-archives-2.0.0-cdh4.2.1.jar, file:/opt/cloudera/parcels/CDH-4.2.1-1.cdh4.2.1.p0.5/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-2.0.0-cdh4.2.1.jar, file:/opt/cloudera/parcels/CDH-4.2.1-1.cdh4.2.1.p0.5/lib/hadoop-mapreduce/hadoop-extras-2.0.0-cdh4.2.1.jar, file:/opt/cloudera/parcels/CDH-4.2.1-1.cdh4.2.1.p0.5/lib/hadoop-mapreduce/hadoop-mapreduce-client-shuffle-2.0.0-cdh4.2.1.jar, file:/opt/cloudera/parcels/CDH-4.2.1-1.cdh4.2.1.p0.5/lib/hadoop-mapreduce/hadoop-gridmix-2.0.0-cdh4.2.1.jar, file:/opt/cloudera/parcels/CDH-4.2.1-1.cdh4.2.1.p0.5/lib/hadoop-mapreduce/hadoop-mapreduce-client-core-2.0.0-cdh4.2.1.jar, file:/opt/cloudera/parcels/CDH-4.2.1-1.cdh4.2.1.p0.5/lib/hadoop-mapreduce/hadoop-mapreduce-client-core-2.0.0-cdh4.2.1.jar, 
file:/opt/cloudera/parcels/CDH-4.2.1-1.cdh4.2.1.p0.5/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient-2.0.0-cdh4.2.1-tests.jar, file:/opt/cloudera/parcels/CDH-4.2.1-1.cdh4.2.1.p0.5/lib/hadoop-mapreduce/hadoop-archives-2.0.0-cdh4.2.1.jar, file:/opt/cloudera/parcels/CDH-4.2.1-1.cdh4.2.1.p0.5/lib/hadoop-mapreduce/hadoop-mapreduce-examples-2.0.0-cdh4.2.1.jar, file:/opt/cloudera/parcels/CDH-4.2.1-1.cdh4.2.1.p0.5/lib/hadoop-mapreduce/hadoop-extras-2.0.0-cdh4.2.1.jar, file:/opt/cloudera/parcels/CDH-4.2.1-1.cdh4.2.1.p0.5/lib/hadoop-mapreduce/hadoop-distcp-2.0.0-cdh4.2.1.jar, file:/opt/cloudera/parcels/CDH-4.2.1-1.cdh4.2.1.p0.5/lib/hadoop-mapreduce/hadoop-streaming-2.0.0-cdh4.2.1.jar, file:/opt/cloudera/parcels/CDH-4.2.1-1.cdh4.2.1.p0.5/lib/hadoop-mapreduce/hadoop-mapreduce-client-shuffle-2.0.0-cdh4.2.1.jar, file:/opt/cloudera/parcels/CDH-4.2.1-1.cdh4.2.1.p0.5/lib/hadoop-mapreduce/hadoop-rumen-2.0.0-cdh4.2.1.jar, file:/opt/cloudera/parcels/CDH-4.2.1-1.cdh4.2.1.p0.5/lib/hadoop-mapreduce/hadoop-mapreduce-client-common-2.0.0-cdh4.2.1.jar, file:/opt/cloudera/parcels/CDH-4.2.1-1.cdh4.2.1.p0.5/lib/hadoop-mapreduce/hadoop-datajoin-2.0.0-cdh4.2.1.jar, file:/opt/cloudera/parcels/CDH-4.2.1-1.cdh4.2.1.p0.5/lib/hadoop-mapreduce/lib/jackson-mapper-asl-1.8.8.jar, file:/opt/cloudera/parcels/CDH-4.2.1-1.cdh4.2.1.p0.5/lib/hadoop-mapreduce/lib/javax.inject-1.jar, file:/opt/cloudera/parcels/CDH-4.2.1-1.cdh4.2.1.p0.5/lib/hadoop-mapreduce/lib/guice-servlet-3.0.jar, file:/opt/cloudera/parcels/CDH-4.2.1-1.cdh4.2.1.p0.5/lib/hadoop-mapreduce/lib/aopalliance-1.0.jar, file:/opt/cloudera/parcels/CDH-4.2.1-1.cdh4.2.1.p0.5/lib/hadoop-mapreduce/lib/asm-3.2.jar, file:/opt/cloudera/parcels/CDH-4.2.1-1.cdh4.2.1.p0.5/lib/hadoop-mapreduce/lib/commons-io-2.1.jar, file:/opt/cloudera/parcels/CDH-4.2.1-1.cdh4.2.1.p0.5/lib/hadoop-mapreduce/lib/log4j-1.2.17.jar, file:/opt/cloudera/parcels/CDH-4.2.1-1.cdh4.2.1.p0.5/lib/hadoop-mapreduce/lib/jackson-core-asl-1.8.8.jar, 
file:/opt/cloudera/parcels/CDH-4.2.1-1.cdh4.2.1.p0.5/lib/hadoop-mapreduce/lib/snappy-java-1.0.4.1.jar, file:/opt/cloudera/parcels/CDH-4.2.1-1.cdh4.2.1.p0.5/lib/hadoop-mapreduce/lib/jersey-server-1.8.jar, file:/opt/cloudera/parcels/CDH-4.2.1-1.cdh4.2.1.p0.5/lib/hadoop-mapreduce/lib/jersey-guice-1.8.jar, file:/opt/cloudera/parcels/CDH-4.2.1-1.cdh4.2.1.p0.5/lib/hadoop-mapreduce/lib/protobuf-java-2.4.0a.jar, file:/opt/cloudera/parcels/CDH-4.2.1-1.cdh4.2.1.p0.5/lib/hadoop-mapreduce/lib/jersey-core-1.8.jar, file:/opt/cloudera/parcels/CDH-4.2.1-1.cdh4.2.1.p0.5/lib/hadoop-mapreduce/lib/netty-3.2.4.Final.jar, file:/opt/cloudera/parcels/CDH-4.2.1-1.cdh4.2.1.p0.5/lib/hadoop-mapreduce/lib/avro-1.7.3.jar, file:/opt/cloudera/parcels/CDH-4.2.1-1.cdh4.2.1.p0.5/lib/hadoop-mapreduce/lib/paranamer-2.3.jar, file:/opt/cloudera/parcels/CDH-4.2.1-1.cdh4.2.1.p0.5/lib/hadoop-mapreduce/lib/guice-3.0.jar, file:/opt/cloudera/parcels/CDH-4.2.1-1.cdh4.2.1.p0.5/lib/hadoop-yarn/hadoop-yarn-server-common-2.0.0-cdh4.2.1.jar, file:/opt/cloudera/parcels/CDH-4.2.1-1.cdh4.2.1.p0.5/lib/hadoop-yarn/hadoop-yarn-server-tests-2.0.0-cdh4.2.1.jar, file:/opt/cloudera/parcels/CDH-4.2.1-1.cdh4.2.1.p0.5/lib/hadoop-yarn/hadoop-yarn-server-web-proxy-2.0.0-cdh4.2.1.jar, file:/opt/cloudera/parcels/CDH-4.2.1-1.cdh4.2.1.p0.5/lib/hadoop-yarn/hadoop-yarn-server-nodemanager-2.0.0-cdh4.2.1.jar, file:/opt/cloudera/parcels/CDH-4.2.1-1.cdh4.2.1.p0.5/lib/hadoop-yarn/hadoop-yarn-applications-unmanaged-am-launcher-2.0.0-cdh4.2.1.jar, file:/opt/cloudera/parcels/CDH-4.2.1-1.cdh4.2.1.p0.5/lib/hadoop-yarn/hadoop-yarn-server-resourcemanager-2.0.0-cdh4.2.1.jar, file:/opt/cloudera/parcels/CDH-4.2.1-1.cdh4.2.1.p0.5/lib/hadoop-yarn/hadoop-yarn-client-2.0.0-cdh4.2.1.jar, file:/opt/cloudera/parcels/CDH-4.2.1-1.cdh4.2.1.p0.5/lib/hadoop-yarn/hadoop-yarn-server-tests-2.0.0-cdh4.2.1-tests.jar, file:/opt/cloudera/parcels/CDH-4.2.1-1.cdh4.2.1.p0.5/lib/hadoop-yarn/hadoop-yarn-server-tests-2.0.0-cdh4.2.1.jar, 
file:/opt/cloudera/parcels/CDH-4.2.1-1.cdh4.2.1.p0.5/lib/hadoop-yarn/hadoop-yarn-server-web-proxy-2.0.0-cdh4.2.1.jar, file:/opt/cloudera/parcels/CDH-4.2.1-1.cdh4.2.1.p0.5/lib/hadoop-yarn/hadoop-yarn-common-2.0.0-cdh4.2.1.jar, file:/opt/cloudera/parcels/CDH-4.2.1-1.cdh4.2.1.p0.5/lib/hadoop-yarn/hadoop-yarn-site-2.0.0-cdh4.2.1.jar, file:/opt/cloudera/parcels/CDH-4.2.1-1.cdh4.2.1.p0.5/lib/hadoop-yarn/hadoop-yarn-client-2.0.0-cdh4.2.1.jar, file:/opt/cloudera/parcels/CDH-4.2.1-1.cdh4.2.1.p0.5/lib/hadoop-yarn/hadoop-yarn-server-resourcemanager-2.0.0-cdh4.2.1.jar, file:/opt/cloudera/parcels/CDH-4.2.1-1.cdh4.2.1.p0.5/lib/hadoop-yarn/hadoop-yarn-server-common-2.0.0-cdh4.2.1.jar, file:/opt/cloudera/parcels/CDH-4.2.1-1.cdh4.2.1.p0.5/lib/hadoop-yarn/hadoop-yarn-api-2.0.0-cdh4.2.1.jar, file:/opt/cloudera/parcels/CDH-4.2.1-1.cdh4.2.1.p0.5/lib/hadoop-yarn/hadoop-yarn-api-2.0.0-cdh4.2.1.jar, file:/opt/cloudera/parcels/CDH-4.2.1-1.cdh4.2.1.p0.5/lib/hadoop-yarn/hadoop-yarn-site-2.0.0-cdh4.2.1.jar, file:/opt/cloudera/parcels/CDH-4.2.1-1.cdh4.2.1.p0.5/lib/hadoop-yarn/hadoop-yarn-applications-distributedshell-2.0.0-cdh4.2.1.jar, file:/opt/cloudera/parcels/CDH-4.2.1-1.cdh4.2.1.p0.5/lib/hadoop-yarn/hadoop-yarn-applications-distributedshell-2.0.0-cdh4.2.1.jar, file:/opt/cloudera/parcels/CDH-4.2.1-1.cdh4.2.1.p0.5/lib/hadoop-yarn/hadoop-yarn-common-2.0.0-cdh4.2.1.jar, file:/opt/cloudera/parcels/CDH-4.2.1-1.cdh4.2.1.p0.5/lib/hadoop-yarn/hadoop-yarn-server-nodemanager-2.0.0-cdh4.2.1.jar, file:/opt/cloudera/parcels/CDH-4.2.1-1.cdh4.2.1.p0.5/lib/hadoop-yarn/hadoop-yarn-applications-unmanaged-am-launcher-2.0.0-cdh4.2.1.jar, file:/opt/cloudera/parcels/CDH-4.2.1-1.cdh4.2.1.p0.5/lib/hadoop-yarn/lib/jackson-mapper-asl-1.8.8.jar, file:/opt/cloudera/parcels/CDH-4.2.1-1.cdh4.2.1.p0.5/lib/hadoop-yarn/lib/javax.inject-1.jar, file:/opt/cloudera/parcels/CDH-4.2.1-1.cdh4.2.1.p0.5/lib/hadoop-yarn/lib/guice-servlet-3.0.jar, 
file:/opt/cloudera/parcels/CDH-4.2.1-1.cdh4.2.1.p0.5/lib/hadoop-yarn/lib/aopalliance-1.0.jar, file:/opt/cloudera/parcels/CDH-4.2.1-1.cdh4.2.1.p0.5/lib/hadoop-yarn/lib/asm-3.2.jar, file:/opt/cloudera/parcels/CDH-4.2.1-1.cdh4.2.1.p0.5/lib/hadoop-yarn/lib/commons-io-2.1.jar, file:/opt/cloudera/parcels/CDH-4.2.1-1.cdh4.2.1.p0.5/lib/hadoop-yarn/lib/log4j-1.2.17.jar, file:/opt/cloudera/parcels/CDH-4.2.1-1.cdh4.2.1.p0.5/lib/hadoop-yarn/lib/jackson-core-asl-1.8.8.jar, file:/opt/cloudera/parcels/CDH-4.2.1-1.cdh4.2.1.p0.5/lib/hadoop-yarn/lib/snappy-java-1.0.4.1.jar, file:/opt/cloudera/parcels/CDH-4.2.1-1.cdh4.2.1.p0.5/lib/hadoop-yarn/lib/jersey-server-1.8.jar, file:/opt/cloudera/parcels/CDH-4.2.1-1.cdh4.2.1.p0.5/lib/hadoop-yarn/lib/jersey-guice-1.8.jar, file:/opt/cloudera/parcels/CDH-4.2.1-1.cdh4.2.1.p0.5/lib/hadoop-yarn/lib/protobuf-java-2.4.0a.jar, file:/opt/cloudera/parcels/CDH-4.2.1-1.cdh4.2.1.p0.5/lib/hadoop-yarn/lib/jersey-core-1.8.jar, file:/opt/cloudera/parcels/CDH-4.2.1-1.cdh4.2.1.p0.5/lib/hadoop-yarn/lib/netty-3.2.4.Final.jar, file:/opt/cloudera/parcels/CDH-4.2.1-1.cdh4.2.1.p0.5/lib/hadoop-yarn/lib/avro-1.7.3.jar, file:/opt/cloudera/parcels/CDH-4.2.1-1.cdh4.2.1.p0.5/lib/hadoop-yarn/lib/paranamer-2.3.jar, file:/opt/cloudera/parcels/CDH-4.2.1-1.cdh4.2.1.p0.5/lib/hadoop-yarn/lib/guice-3.0.jar, file:/opt/cloudera/parcels/HADOOP_LZO-0.4.15-1.gplextras.p0.24/lib/hadoop/lib/hadoop-lzo-cdh4-0.4.15-gplextras.jar, file:/opt/cloudera/parcels/HADOOP_LZO-0.4.15-1.gplextras.p0.24/lib/hadoop/lib/native/, file:/home1/hadoop/yarn_nm2/local-dir/usercache/kpi/appcache/application_1384857622207_297273/filecache/2983273881884082399/job.jar/job.jar, file:/home1/hadoop/yarn_nm2/local-dir/usercache/kpi/appcache/application_1384857622207_297273/filecache/2983273881884082399/job.jar/classes, file:/home1/hadoop/yarn_nm2/local-dir/usercache/kpi/appcache/application_1384857622207_297273/filecache/2983273881884082399/job.jar/lib/*, 
file:/home1/hadoop/yarn_nm2/local-dir/usercache/kpi/filecache/-5305035836914605796/hiveutil.jar, file:/home1/hadoop/yarn_nm2/local-dir/usercache/kpi/appcache/application_1384857622207_297273/filecache/2983273881884082399/job.jar/, file:/home1/hadoop/yarn_nm2/local-dir/usercache/kpi/filecache/845548897392055020/hive-builtins-0.10.0-cdh4.2.1.jar] 2013-12-06 14:04:59,697 WARN [main] org.apache.hadoop.hive.conf.HiveConf: hive-site.xml not found on CLASSPATH 2013-12-06 14:04:59,923 INFO [main] ExecReducer: Id =6 Id =7 Id =8 Id = 7 null<\Parent> <\FS> <\Children> Id = 6 null<\Parent> <\SEL> <\Children> <\JOIN> 2013-12-06 14:04:59,923 INFO [main] org.apache.hadoop.hive.ql.exec.JoinOperator: Initializing Self 6 JOIN 2013-12-06 14:04:59,937 INFO [main] org.apache.hadoop.hive.ql.exec.CommonJoinOperator: JOIN struct<_col2:string,_col3:string> totalsz = 2 2013-12-06 14:04:59,937 INFO [main] org.apache.hadoop.hive.ql.exec.JoinOperator: Operator 6 JOIN initialized 2013-12-06 14:04:59,937 INFO [main] org.apache.hadoop.hive.ql.exec.JoinOperator: Initializing children of 6 JOIN 2013-12-06 14:04:59,937 INFO [main] org.apache.hadoop.hive.ql.exec.SelectOperator: Initializing child 7 SEL 2013-12-06 14:04:59,937 INFO [main] org.apache.hadoop.hive.ql.exec.SelectOperator: Initializing Self 7 SEL 2013-12-06 14:04:59,937 INFO [main] org.apache.hadoop.hive.ql.exec.SelectOperator: SELECT struct<_col2:string,_col3:string> 2013-12-06 14:04:59,937 INFO [main] org.apache.hadoop.hive.ql.exec.SelectOperator: Operator 7 SEL initialized 2013-12-06 14:04:59,937 INFO [main] org.apache.hadoop.hive.ql.exec.SelectOperator: Initializing children of 7 SEL 2013-12-06 14:04:59,937 INFO [main] org.apache.hadoop.hive.ql.exec.FileSinkOperator: Initializing child 8 FS 2013-12-06 14:04:59,937 INFO [main] org.apache.hadoop.hive.ql.exec.FileSinkOperator: Initializing Self 8 FS 2013-12-06 14:04:59,938 WARN [main] org.apache.hadoop.conf.Configuration: mapred.healthChecker.script.timeout is deprecated. 
Instead, use mapreduce.tasktracker.healthchecker.script.timeout 2013-12-06 14:04:59,940 INFO [main] org.apache.hadoop.hive.ql.exec.FileSinkOperator: Operator 8 FS initialized 2013-12-06 14:04:59,940 INFO [main] org.apache.hadoop.hive.ql.exec.FileSinkOperator: Initialization Done 8 FS 2013-12-06 14:04:59,940 INFO [main] org.apache.hadoop.hive.ql.exec.SelectOperator: Initialization Done 7 SEL 2013-12-06 14:04:59,967 INFO [main] org.apache.hadoop.hive.ql.exec.JoinOperator: Initialization Done 6 JOIN 2013-12-06 14:04:59,972 INFO [main] ExecReducer: ExecReducer: processing 1 rows: used memory = 85817552 2013-12-06 14:04:59,978 INFO [main] ExecReducer: ExecReducer: processing 10 rows: used memory = 85817552 2013-12-06 14:04:59,984 INFO [main] ExecReducer: ExecReducer: processing 100 rows: used memory = 85817552 2013-12-06 14:05:00,054 INFO [main] ExecReducer: ExecReducer: processing 1000 rows: used memory = 87120008 2013-12-06 14:05:00,294 INFO [main] org.apache.hadoop.hive.ql.exec.CommonJoinOperator: table 0 has 1000 rows for join key [00000000-0000-0000-0000-000000000000] 2013-12-06 14:05:00,389 INFO [main] org.apache.hadoop.hive.ql.exec.JoinOperator: 6 forwarding 1 rows 2013-12-06 14:05:00,389 INFO [main] org.apache.hadoop.hive.ql.exec.SelectOperator: 7 forwarding 1 rows 2013-12-06 14:05:00,390 INFO [main] org.apache.hadoop.hive.ql.exec.FileSinkOperator: Final Path: FS hdfs://mycluster/tmp/hive-beeswax-kpi/hive_2013-12-06_13-56-49_404_7454927883229301634/_tmp.-mr-10006/000000_0 2013-12-06 14:05:00,390 INFO [main] org.apache.hadoop.hive.ql.exec.FileSinkOperator: Writing to temp file: FS hdfs://mycluster/tmp/hive-beeswax-kpi/hive_2013-12-06_13-56-49_404_7454927883229301634/_task_tmp.-mr-10006/_tmp.000000_0 2013-12-06 14:05:00,390 INFO [main] org.apache.hadoop.hive.ql.exec.FileSinkOperator: New Final Path: FS hdfs://mycluster/tmp/hive-beeswax-kpi/hive_2013-12-06_13-56-49_404_7454927883229301634/_tmp.-mr-10006/000000_0 2013-12-06 14:05:00,494 INFO [main] 
org.apache.hadoop.hive.ql.exec.JoinOperator: 6 forwarding 10 rows 2013-12-06 14:05:00,494 INFO [main] org.apache.hadoop.hive.ql.exec.SelectOperator: 7 forwarding 10 rows 2013-12-06 14:05:00,497 INFO [main] org.apache.hadoop.hive.ql.exec.JoinOperator: 6 forwarding 100 rows 2013-12-06 14:05:00,498 INFO [main] org.apache.hadoop.hive.ql.exec.SelectOperator: 7 forwarding 100 rows 2013-12-06 14:05:00,529 INFO [main] org.apache.hadoop.hive.ql.exec.JoinOperator: 6 forwarding 1000 rows 2013-12-06 14:05:00,529 INFO [main] org.apache.hadoop.hive.ql.exec.SelectOperator: 7 forwarding 1000 rows 2013-12-06 14:05:00,614 INFO [main] ExecReducer: ExecReducer: processing 10000 rows: used memory = 102248760 2013-12-06 14:05:00,627 INFO [main] org.apache.hadoop.hive.ql.exec.CommonJoinOperator: table 0 has 1000 rows for join key [00:00:00:00:00:00] 2013-12-06 14:05:00,708 INFO [main] org.apache.hadoop.hive.ql.exec.persistence.RowContainer: RowContainer created temp file /home1/hadoop/yarn_nm2/local-dir/usercache/kpi/appcache/application_1384857622207_297273/container_1384857622207_297273_01_000012/tmp/hive-rowcontainer1099388164015854003/RowContainer2188029813644287132.[00:00:00:00:00:00].tmp 2013-12-06 14:05:00,710 WARN [main] org.apache.hadoop.conf.Configuration: fs.default.name is deprecated. 
Instead, use fs.defaultFS 2013-12-06 14:05:00,713 ERROR [main] org.apache.hadoop.hive.ql.exec.persistence.RowContainer: java.lang.IllegalArgumentException: java.net.URISyntaxException: Relative path in absolute URI: .RowContainer2188029813644287132.[00:00:00:00:00:00%5D.tmp.crc java.lang.IllegalArgumentException: java.net.URISyntaxException: Relative path in absolute URI: .RowContainer2188029813644287132.[00:00:00:00:00:00%5D.tmp.crc at org.apache.hadoop.fs.Path.initialize(Path.java:157) at org.apache.hadoop.fs.Path.(Path.java:135) at org.apache.hadoop.fs.Path.(Path.java:58) at org.apache.hadoop.fs.ChecksumFileSystem.getChecksumFile(ChecksumFileSystem.java:83) at org.apache.hadoop.fs.ChecksumFileSystem$ChecksumFSOutputSummer.(ChecksumFileSystem.java:394) at org.apache.hadoop.fs.ChecksumFileSystem.create(ChecksumFileSystem.java:439) at org.apache.hadoop.fs.ChecksumFileSystem.create(ChecksumFileSystem.java:420) at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:886) at org.apache.hadoop.io.SequenceFile$Writer.(SequenceFile.java:1060) at org.apache.hadoop.io.SequenceFile.createWriter(SequenceFile.java:270) at org.apache.hadoop.io.SequenceFile.createWriter(SequenceFile.java:369) at org.apache.hadoop.hive.ql.exec.Utilities.createSequenceWriter(Utilities.java:989) at org.apache.hadoop.hive.ql.io.HiveSequenceFileOutputFormat.getHiveRecordWriter(HiveSequenceFileOutputFormat.java:64) at org.apache.hadoop.hive.ql.io.HiveFileFormatUtils.getRecordWriter(HiveFileFormatUtils.java:250) at org.apache.hadoop.hive.ql.exec.persistence.RowContainer.spillBlock(RowContainer.java:318) at org.apache.hadoop.hive.ql.exec.persistence.RowContainer.add(RowContainer.java:164) at org.apache.hadoop.hive.ql.exec.persistence.RowContainer.add(RowContainer.java:74) at org.apache.hadoop.hive.ql.exec.JoinOperator.processOp(JoinOperator.java:131) at org.apache.hadoop.hive.ql.exec.Operator.process(Operator.java:474) at org.apache.hadoop.hive.ql.exec.ExecReducer.reduce(ExecReducer.java:249) at 
org.apache.hadoop.mapred.ReduceTask.runOldReducer(ReduceTask.java:460)
	at org.apache.hadoop.mapred.ReduceTask.run(ReduceTask.java:407)
	at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:157)
	at java.security.AccessController.doPrivileged(Native Method)
	at javax.security.auth.Subject.doAs(Subject.java:396)
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1408)
	at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:152)
Caused by: java.net.URISyntaxException: Relative path in absolute URI: .RowContainer2188029813644287132.[00:00:00:00:00:00%5D.tmp.crc
	at java.net.URI.checkPath(URI.java:1788)
	at java.net.URI.<init>(URI.java:734)
	at org.apache.hadoop.fs.Path.initialize(Path.java:154)
	... 26 more
2013-12-06 14:05:00,716 FATAL [main] ExecReducer: org.apache.hadoop.hive.ql.metadata.HiveException: Hive Runtime Error while processing row (tag=0) {"key":{"joinkey0":"00:00:00:00:00:00"},"value":{"_col2":"a10000277b93d8","_col3":"00:00:00:00:00:00"},"alias":0}
	at org.apache.hadoop.hive.ql.exec.ExecReducer.reduce(ExecReducer.java:258)
	at org.apache.hadoop.mapred.ReduceTask.runOldReducer(ReduceTask.java:460)
	at org.apache.hadoop.mapred.ReduceTask.run(ReduceTask.java:407)
	at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:157)
	at java.security.AccessController.doPrivileged(Native Method)
	at javax.security.auth.Subject.doAs(Subject.java:396)
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1408)
	at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:152)
Caused by: org.apache.hadoop.hive.ql.metadata.HiveException: org.apache.hadoop.hive.ql.metadata.HiveException: java.lang.IllegalArgumentException: java.net.URISyntaxException: Relative path in absolute URI: .RowContainer2188029813644287132.[00:00:00:00:00:00%5D.tmp.crc
	at org.apache.hadoop.hive.ql.exec.JoinOperator.processOp(JoinOperator.java:134)
	at org.apache.hadoop.hive.ql.exec.Operator.process(Operator.java:474)
	at org.apache.hadoop.hive.ql.exec.ExecReducer.reduce(ExecReducer.java:249)
	... 7 more
Caused by: org.apache.hadoop.hive.ql.metadata.HiveException: java.lang.IllegalArgumentException: java.net.URISyntaxException: Relative path in absolute URI: .RowContainer2188029813644287132.[00:00:00:00:00:00%5D.tmp.crc
	at org.apache.hadoop.hive.ql.exec.persistence.RowContainer.spillBlock(RowContainer.java:353)
	at org.apache.hadoop.hive.ql.exec.persistence.RowContainer.add(RowContainer.java:164)
	at org.apache.hadoop.hive.ql.exec.persistence.RowContainer.add(RowContainer.java:74)
	at org.apache.hadoop.hive.ql.exec.JoinOperator.processOp(JoinOperator.java:131)
	... 9 more
Caused by: java.lang.IllegalArgumentException: java.net.URISyntaxException: Relative path in absolute URI: .RowContainer2188029813644287132.[00:00:00:00:00:00%5D.tmp.crc
	at org.apache.hadoop.fs.Path.initialize(Path.java:157)
	at org.apache.hadoop.fs.Path.<init>(Path.java:135)
	at org.apache.hadoop.fs.Path.<init>(Path.java:58)
	at org.apache.hadoop.fs.ChecksumFileSystem.getChecksumFile(ChecksumFileSystem.java:83)
	at org.apache.hadoop.fs.ChecksumFileSystem$ChecksumFSOutputSummer.<init>(ChecksumFileSystem.java:394)
	at org.apache.hadoop.fs.ChecksumFileSystem.create(ChecksumFileSystem.java:439)
	at org.apache.hadoop.fs.ChecksumFileSystem.create(ChecksumFileSystem.java:420)
	at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:886)
	at org.apache.hadoop.io.SequenceFile$Writer.<init>(SequenceFile.java:1060)
	at org.apache.hadoop.io.SequenceFile.createWriter(SequenceFile.java:270)
	at org.apache.hadoop.io.SequenceFile.createWriter(SequenceFile.java:369)
	at org.apache.hadoop.hive.ql.exec.Utilities.createSequenceWriter(Utilities.java:989)
	at org.apache.hadoop.hive.ql.io.HiveSequenceFileOutputFormat.getHiveRecordWriter(HiveSequenceFileOutputFormat.java:64)
	at org.apache.hadoop.hive.ql.io.HiveFileFormatUtils.getRecordWriter(HiveFileFormatUtils.java:250)
	at org.apache.hadoop.hive.ql.exec.persistence.RowContainer.spillBlock(RowContainer.java:318)
	... 12 more
Caused by: java.net.URISyntaxException: Relative path in absolute URI: .RowContainer2188029813644287132.[00:00:00:00:00:00%5D.tmp.crc
	at java.net.URI.checkPath(URI.java:1788)
	at java.net.URI.<init>(URI.java:734)
	at org.apache.hadoop.fs.Path.initialize(Path.java:154)
	... 26 more
2013-12-06 14:05:00,716 WARN [main] org.apache.hadoop.mapred.YarnChild: Exception running child : java.lang.RuntimeException: org.apache.hadoop.hive.ql.metadata.HiveException: Hive Runtime Error while processing row (tag=0) {"key":{"joinkey0":"00:00:00:00:00:00"},"value":{"_col2":"a10000277b93d8","_col3":"00:00:00:00:00:00"},"alias":0}
	at org.apache.hadoop.hive.ql.exec.ExecReducer.reduce(ExecReducer.java:270)
	at org.apache.hadoop.mapred.ReduceTask.runOldReducer(ReduceTask.java:460)
	at org.apache.hadoop.mapred.ReduceTask.run(ReduceTask.java:407)
	at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:157)
	at java.security.AccessController.doPrivileged(Native Method)
	at javax.security.auth.Subject.doAs(Subject.java:396)
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1408)
	at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:152)
Caused by: org.apache.hadoop.hive.ql.metadata.HiveException: Hive Runtime Error while processing row (tag=0) {"key":{"joinkey0":"00:00:00:00:00:00"},"value":{"_col2":"a10000277b93d8","_col3":"00:00:00:00:00:00"},"alias":0}
	at org.apache.hadoop.hive.ql.exec.ExecReducer.reduce(ExecReducer.java:258)
	... 7 more
Caused by: org.apache.hadoop.hive.ql.metadata.HiveException: org.apache.hadoop.hive.ql.metadata.HiveException: java.lang.IllegalArgumentException: java.net.URISyntaxException: Relative path in absolute URI: .RowContainer2188029813644287132.[00:00:00:00:00:00%5D.tmp.crc
	at org.apache.hadoop.hive.ql.exec.JoinOperator.processOp(JoinOperator.java:134)
	at org.apache.hadoop.hive.ql.exec.Operator.process(Operator.java:474)
	at org.apache.hadoop.hive.ql.exec.ExecReducer.reduce(ExecReducer.java:249)
	... 7 more
Caused by: org.apache.hadoop.hive.ql.metadata.HiveException: java.lang.IllegalArgumentException: java.net.URISyntaxException: Relative path in absolute URI: .RowContainer2188029813644287132.[00:00:00:00:00:00%5D.tmp.crc
	at org.apache.hadoop.hive.ql.exec.persistence.RowContainer.spillBlock(RowContainer.java:353)
	at org.apache.hadoop.hive.ql.exec.persistence.RowContainer.add(RowContainer.java:164)
	at org.apache.hadoop.hive.ql.exec.persistence.RowContainer.add(RowContainer.java:74)
	at org.apache.hadoop.hive.ql.exec.JoinOperator.processOp(JoinOperator.java:131)
	... 9 more
Caused by: java.lang.IllegalArgumentException: java.net.URISyntaxException: Relative path in absolute URI: .RowContainer2188029813644287132.[00:00:00:00:00:00%5D.tmp.crc
	at org.apache.hadoop.fs.Path.initialize(Path.java:157)
	at org.apache.hadoop.fs.Path.<init>(Path.java:135)
	at org.apache.hadoop.fs.Path.<init>(Path.java:58)
	at org.apache.hadoop.fs.ChecksumFileSystem.getChecksumFile(ChecksumFileSystem.java:83)
	at org.apache.hadoop.fs.ChecksumFileSystem$ChecksumFSOutputSummer.<init>(ChecksumFileSystem.java:394)
	at org.apache.hadoop.fs.ChecksumFileSystem.create(ChecksumFileSystem.java:439)
	at org.apache.hadoop.fs.ChecksumFileSystem.create(ChecksumFileSystem.java:420)
	at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:886)
	at org.apache.hadoop.io.SequenceFile$Writer.<init>(SequenceFile.java:1060)
	at org.apache.hadoop.io.SequenceFile.createWriter(SequenceFile.java:270)
	at org.apache.hadoop.io.SequenceFile.createWriter(SequenceFile.java:369)
	at org.apache.hadoop.hive.ql.exec.Utilities.createSequenceWriter(Utilities.java:989)
	at org.apache.hadoop.hive.ql.io.HiveSequenceFileOutputFormat.getHiveRecordWriter(HiveSequenceFileOutputFormat.java:64)
	at org.apache.hadoop.hive.ql.io.HiveFileFormatUtils.getRecordWriter(HiveFileFormatUtils.java:250)
	at org.apache.hadoop.hive.ql.exec.persistence.RowContainer.spillBlock(RowContainer.java:318)
	... 12 more
Caused by: java.net.URISyntaxException: Relative path in absolute URI: .RowContainer2188029813644287132.[00:00:00:00:00:00%5D.tmp.crc
	at java.net.URI.checkPath(URI.java:1788)
	at java.net.URI.<init>(URI.java:734)
	at org.apache.hadoop.fs.Path.initialize(Path.java:154)
	... 26 more
2013-12-06 14:05:00,721 INFO [main] org.apache.hadoop.mapred.Task: Runnning cleanup for the task
2013-12-06 14:05:00,757 ERROR [Thread-3] org.apache.hadoop.hdfs.DFSClient: Failed to close file /tmp/hive-beeswax-kpi/hive_2013-12-06_13-56-49_404_7454927883229301634/_task_tmp.-mr-10006/_tmp.000000_0
org.apache.hadoop.ipc.RemoteException(org.apache.hadoop.hdfs.server.namenode.LeaseExpiredException): No lease on /tmp/hive-beeswax-kpi/hive_2013-12-06_13-56-49_404_7454927883229301634/_task_tmp.-mr-10006/_tmp.000000_0 File does not exist. Holder DFSClient_attempt_1384857622207_297273_r_000000_0_1249751644_1 does not have any open files.
	at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.checkLease(FSNamesystem.java:2419)
	at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.checkLease(FSNamesystem.java:2410)
	at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.completeFileInternal(FSNamesystem.java:2478)
	at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.completeFile(FSNamesystem.java:2455)
	at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.complete(NameNodeRpcServer.java:535)
	at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.complete(ClientNamenodeProtocolServerSideTranslatorPB.java:335)
	at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java:44084)
	at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:453)
	at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:1002)
	at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:1695)
	at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:1691)
	at java.security.AccessController.doPrivileged(Native Method)
	at javax.security.auth.Subject.doAs(Subject.java:396)
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1408)
	at org.apache.hadoop.ipc.Server$Handler.run(Server.java:1689)
	at org.apache.hadoop.ipc.Client.call(Client.java:1225)
	at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:202)
	at $Proxy10.complete(Unknown Source)
	at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.complete(ClientNamenodeProtocolTranslatorPB.java:329)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
	at java.lang.reflect.Method.invoke(Method.java:597)
	at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:164)
	at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:83)
	at $Proxy11.complete(Unknown Source)
	at org.apache.hadoop.hdfs.DFSOutputStream.completeFile(DFSOutputStream.java:1795)
	at org.apache.hadoop.hdfs.DFSOutputStream.close(DFSOutputStream.java:1782)
	at org.apache.hadoop.hdfs.DFSClient.closeAllFilesBeingWritten(DFSClient.java:709)
	at org.apache.hadoop.hdfs.DFSClient.close(DFSClient.java:726)
	at org.apache.hadoop.hdfs.DistributedFileSystem.close(DistributedFileSystem.java:562)
	at org.apache.hadoop.fs.FileSystem$Cache.closeAll(FileSystem.java:2387)
	at org.apache.hadoop.fs.FileSystem$Cache$ClientFinalizer.run(FileSystem.java:2403)
	at org.apache.hadoop.util.ShutdownHookManager$1.run(ShutdownHookManager.java:54)
2013-12-06 14:05:00,825 INFO [main] org.apache.hadoop.metrics2.impl.MetricsSystemImpl: Stopping ReduceTask metrics system...
2013-12-06 14:05:00,826 INFO [main] org.apache.hadoop.metrics2.impl.MetricsSystemImpl: ReduceTask metrics system stopped.
2013-12-06 14:05:00,826 INFO [main] org.apache.hadoop.metrics2.impl.MetricsSystemImpl: ReduceTask metrics system shutdown complete.
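
Note (not part of the log): every exception chain above bottoms out in `java.net.URI.checkPath` rejecting a "Relative path in absolute URI". The sketch below reproduces that JDK check in isolation: once a URI has a scheme, `java.net.URI` refuses a path component that does not begin with `/`. Hive's RowContainer spill file here is named after a join key containing `:` (the MAC address `00:00:00:00:00:00`), and `org.apache.hadoop.fs.Path` ends up passing such a name through this constructor family. The class name `UriCheckDemo` and the sample file name are hypothetical, chosen only for illustration.

```java
import java.net.URI;
import java.net.URISyntaxException;

public class UriCheckDemo {
    // Returns the URISyntaxException reason for a given path component,
    // or null if java.net.URI accepts it. Mirrors the failing call shape:
    // a scheme ("file") plus a path that may or may not start with '/'.
    static String reasonFor(String path) {
        try {
            // URI(String scheme, String host, String path, String fragment):
            // URI.checkPath throws when a scheme is present and the path
            // is non-empty but does not start with '/'.
            new URI("file", null, path, null);
            return null;
        } catch (URISyntaxException e) {
            return e.getReason();
        }
    }

    public static void main(String[] args) {
        // Hypothetical spill-file name, shaped like the one in the log.
        System.out.println(reasonFor(".RowContainer12345.tmp"));
        // prints: Relative path in absolute URI
        System.out.println(reasonFor("/tmp/ok.tmp"));
        // prints: null
    }
}
```

This is why the failure only appears once the join spills to disk: the relative temp-file name is fine as a local file name, but becomes invalid the moment it is wrapped in a scheme-qualified URI.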