visualmeta@test-232:~/hbase_export$ /var/vm/apps/hbase-0.98.9-hadoop2/bin/hbase org.apache.hadoop.hbase.mapreduce.Import item_restore /data/item_backup > error
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/var/vm/apps/hbase-0.98.9-hadoop2/lib/slf4j-log4j12-1.6.4.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/var/vm/apps/hadoop-2.6.0/share/hadoop/common/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
2015-06-15 10:36:25,989 WARN [main] util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
2015-06-15 10:36:26,573 INFO [main] Configuration.deprecation: mapred.job.name is deprecated. Instead, use mapreduce.job.name
2015-06-15 10:36:26,574 INFO [main] Configuration.deprecation: mapreduce.map.class is deprecated. Instead, use mapreduce.job.map.class
2015-06-15 10:36:26,575 INFO [main] Configuration.deprecation: mapred.input.dir is deprecated. Instead, use mapreduce.input.fileinputformat.inputdir
2015-06-15 10:36:26,576 INFO [main] Configuration.deprecation: mapreduce.inputformat.class is deprecated. Instead, use mapreduce.job.inputformat.class
2015-06-15 10:36:26,577 INFO [main] Configuration.deprecation: mapred.jar is deprecated. Instead, use mapreduce.job.jar
2015-06-15 10:36:26,579 INFO [main] Configuration.deprecation: mapred.job.tracker.persist.jobstatus.hours is deprecated. Instead, use mapreduce.jobtracker.persist.jobstatus.hours
2015-06-15 10:36:26,579 INFO [main] Configuration.deprecation: mapred.heartbeats.in.second is deprecated. Instead, use mapreduce.jobtracker.heartbeats.in.second
2015-06-15 10:36:26,579 INFO [main] Configuration.deprecation: topology.node.switch.mapping.impl is deprecated. Instead, use net.topology.node.switch.mapping.impl
2015-06-15 10:36:26,579 INFO [main] Configuration.deprecation: dfs.access.time.precision is deprecated. Instead, use dfs.namenode.accesstime.precision
2015-06-15 10:36:26,580 INFO [main] Configuration.deprecation: mapred.skip.map.max.skip.records is deprecated. Instead, use mapreduce.map.skip.maxrecords
2015-06-15 10:36:26,580 INFO [main] Configuration.deprecation: mapred.job.tracker.jobhistory.lru.cache.size is deprecated. Instead, use mapreduce.jobtracker.jobhistory.lru.cache.size
2015-06-15 10:36:26,580 INFO [main] Configuration.deprecation: mapred.skip.attempts.to.start.skipping is deprecated. Instead, use mapreduce.task.skip.start.attempts
2015-06-15 10:36:26,580 INFO [main] Configuration.deprecation: dfs.safemode.threshold.pct is deprecated. Instead, use dfs.namenode.safemode.threshold-pct
2015-06-15 10:36:26,580 INFO [main] Configuration.deprecation: mapred.tasktracker.map.tasks.maximum is deprecated. Instead, use mapreduce.tasktracker.map.tasks.maximum
2015-06-15 10:36:26,580 INFO [main] Configuration.deprecation: dfs.datanode.max.xcievers is deprecated. Instead, use dfs.datanode.max.transfer.threads
2015-06-15 10:36:26,580 INFO [main] Configuration.deprecation: mapred.map.child.log.level is deprecated. Instead, use mapreduce.map.log.level
2015-06-15 10:36:26,581 INFO [main] Configuration.deprecation: mapred.local.dir.minspacestart is deprecated. Instead, use mapreduce.tasktracker.local.dir.minspacestart
2015-06-15 10:36:26,581 INFO [main] Configuration.deprecation: mapred.used.genericoptionsparser is deprecated.
Instead, use mapreduce.client.genericoptionsparser.used 2015-06-15 10:36:26,581 INFO [main] Configuration.deprecation: dfs.safemode.extension is deprecated. Instead, use dfs.namenode.safemode.extension 2015-06-15 10:36:26,581 INFO [main] Configuration.deprecation: tasktracker.http.threads is deprecated. Instead, use mapreduce.tasktracker.http.threads 2015-06-15 10:36:26,581 INFO [main] Configuration.deprecation: hadoop.native.lib is deprecated. Instead, use io.native.lib.available 2015-06-15 10:36:26,581 INFO [main] Configuration.deprecation: mapred.job.reduce.input.buffer.percent is deprecated. Instead, use mapreduce.reduce.input.buffer.percent 2015-06-15 10:36:26,582 INFO [main] Configuration.deprecation: mapred.job.shuffle.input.buffer.percent is deprecated. Instead, use mapreduce.reduce.shuffle.input.buffer.percent 2015-06-15 10:36:26,582 INFO [main] Configuration.deprecation: job.end.retry.interval is deprecated. Instead, use mapreduce.job.end-notification.retry.interval 2015-06-15 10:36:26,582 INFO [main] Configuration.deprecation: mapred.temp.dir is deprecated. Instead, use mapreduce.cluster.temp.dir 2015-06-15 10:36:26,582 INFO [main] Configuration.deprecation: mapred.acls.enabled is deprecated. Instead, use mapreduce.cluster.acls.enabled 2015-06-15 10:36:26,582 INFO [main] Configuration.deprecation: io.sort.spill.percent is deprecated. Instead, use mapreduce.map.sort.spill.percent 2015-06-15 10:36:26,582 INFO [main] Configuration.deprecation: mapred.tasktracker.instrumentation is deprecated. Instead, use mapreduce.tasktracker.instrumentation 2015-06-15 10:36:26,582 INFO [main] Configuration.deprecation: mapred.reduce.tasks.speculative.execution is deprecated. Instead, use mapreduce.reduce.speculative 2015-06-15 10:36:26,582 INFO [main] Configuration.deprecation: mapred.committer.job.setup.cleanup.needed is deprecated. Instead, use mapreduce.job.committer.setup.cleanup.needed 2015-06-15 10:36:26,582 INFO [main] Configuration.deprecation: mapred.system.dir is deprecated. Instead, use mapreduce.jobtracker.system.dir 2015-06-15 10:36:26,582 INFO [main] Configuration.deprecation: mapred.jobtracker.instrumentation is deprecated. Instead, use mapreduce.jobtracker.instrumentation 2015-06-15 10:36:26,583 INFO [main] Configuration.deprecation: mapred.jobtracker.maxtasks.per.job is deprecated. Instead, use mapreduce.jobtracker.maxtasks.perjob 2015-06-15 10:36:26,583 INFO [main] Configuration.deprecation: mapred.task.tracker.task-controller is deprecated. Instead, use mapreduce.tasktracker.taskcontroller 2015-06-15 10:36:26,583 INFO [main] Configuration.deprecation: mapred.reduce.parallel.copies is deprecated. Instead, use mapreduce.reduce.shuffle.parallelcopies 2015-06-15 10:36:26,583 INFO [main] Configuration.deprecation: io.sort.factor is deprecated. Instead, use mapreduce.task.io.sort.factor 2015-06-15 10:36:26,583 INFO [main] Configuration.deprecation: mapred.shuffle.read.timeout is deprecated. Instead, use mapreduce.reduce.shuffle.read.timeout 2015-06-15 10:36:26,583 INFO [main] Configuration.deprecation: mapred.tasktracker.indexcache.mb is deprecated. Instead, use mapreduce.tasktracker.indexcache.mb 2015-06-15 10:36:26,583 INFO [main] Configuration.deprecation: mapred.max.tracker.blacklists is deprecated. Instead, use mapreduce.jobtracker.tasktracker.maxblacklists 2015-06-15 10:36:26,583 INFO [main] Configuration.deprecation: io.sort.mb is deprecated. 
Instead, use mapreduce.task.io.sort.mb 2015-06-15 10:36:26,583 INFO [main] Configuration.deprecation: topology.script.number.args is deprecated. Instead, use net.topology.script.number.args 2015-06-15 10:36:26,584 INFO [main] Configuration.deprecation: dfs.http.address is deprecated. Instead, use dfs.namenode.http-address 2015-06-15 10:36:26,584 INFO [main] Configuration.deprecation: mapred.tasktracker.dns.interface is deprecated. Instead, use mapreduce.tasktracker.dns.interface 2015-06-15 10:36:26,584 INFO [main] Configuration.deprecation: mapred.job.tracker.persist.jobstatus.active is deprecated. Instead, use mapreduce.jobtracker.persist.jobstatus.active 2015-06-15 10:36:26,584 INFO [main] Configuration.deprecation: mapred.compress.map.output is deprecated. Instead, use mapreduce.map.output.compress 2015-06-15 10:36:26,584 INFO [main] Configuration.deprecation: mapred.task.cache.levels is deprecated. Instead, use mapreduce.jobtracker.taskcache.levels 2015-06-15 10:36:26,584 INFO [main] Configuration.deprecation: topology.script.file.name is deprecated. Instead, use net.topology.script.file.name 2015-06-15 10:36:26,584 INFO [main] Configuration.deprecation: mapred.min.split.size is deprecated. Instead, use mapreduce.input.fileinputformat.split.minsize 2015-06-15 10:36:26,584 INFO [main] Configuration.deprecation: mapred.tasktracker.taskmemorymanager.monitoring-interval is deprecated. Instead, use mapreduce.tasktracker.taskmemorymanager.monitoringinterval 2015-06-15 10:36:26,584 INFO [main] Configuration.deprecation: dfs.umaskmode is deprecated. Instead, use fs.permissions.umask-mode 2015-06-15 10:36:26,584 INFO [main] Configuration.deprecation: mapred.output.compression.codec is deprecated. Instead, use mapreduce.output.fileoutputformat.compress.codec 2015-06-15 10:36:26,585 INFO [main] Configuration.deprecation: io.bytes.per.checksum is deprecated. Instead, use dfs.bytes-per-checksum 2015-06-15 10:36:26,585 INFO [main] Configuration.deprecation: mapred.reduce.tasks is deprecated. Instead, use mapreduce.job.reduces 2015-06-15 10:36:26,585 INFO [main] Configuration.deprecation: jobclient.output.filter is deprecated. Instead, use mapreduce.client.output.filter 2015-06-15 10:36:26,585 INFO [main] Configuration.deprecation: mapred.output.compression.type is deprecated. Instead, use mapreduce.output.fileoutputformat.compress.type 2015-06-15 10:36:26,585 INFO [main] Configuration.deprecation: mapred.job.reuse.jvm.num.tasks is deprecated. Instead, use mapreduce.job.jvm.numtasks 2015-06-15 10:36:26,585 INFO [main] Configuration.deprecation: mapred.reduce.max.attempts is deprecated. Instead, use mapreduce.reduce.maxattempts 2015-06-15 10:36:26,585 INFO [main] Configuration.deprecation: mapred.healthChecker.script.timeout is deprecated. Instead, use mapreduce.tasktracker.healthchecker.script.timeout 2015-06-15 10:36:26,585 INFO [main] Configuration.deprecation: mapred.userlog.limit.kb is deprecated. Instead, use mapreduce.task.userlog.limit.kb 2015-06-15 10:36:26,585 INFO [main] Configuration.deprecation: mapred.userlog.retain.hours is deprecated. Instead, use mapreduce.job.userlog.retain.hours 2015-06-15 10:36:26,585 INFO [main] Configuration.deprecation: dfs.max.objects is deprecated. Instead, use dfs.namenode.max.objects 2015-06-15 10:36:26,585 INFO [main] Configuration.deprecation: dfs.name.edits.dir is deprecated. Instead, use dfs.namenode.edits.dir 2015-06-15 10:36:26,586 INFO [main] Configuration.deprecation: dfs.replication.interval is deprecated. 
Instead, use dfs.namenode.replication.interval 2015-06-15 10:36:26,586 INFO [main] Configuration.deprecation: jobclient.completion.poll.interval is deprecated. Instead, use mapreduce.client.completion.pollinterval 2015-06-15 10:36:26,586 INFO [main] Configuration.deprecation: mapred.tasktracker.tasks.sleeptime-before-sigkill is deprecated. Instead, use mapreduce.tasktracker.tasks.sleeptimebeforesigkill 2015-06-15 10:36:26,586 INFO [main] Configuration.deprecation: mapred.job.tracker.http.address is deprecated. Instead, use mapreduce.jobtracker.http.address 2015-06-15 10:36:26,586 INFO [main] Configuration.deprecation: dfs.replication.considerLoad is deprecated. Instead, use dfs.namenode.replication.considerLoad 2015-06-15 10:36:26,586 INFO [main] Configuration.deprecation: dfs.block.size is deprecated. Instead, use dfs.blocksize 2015-06-15 10:36:26,586 INFO [main] Configuration.deprecation: dfs.permissions is deprecated. Instead, use dfs.permissions.enabled 2015-06-15 10:36:26,586 INFO [main] Configuration.deprecation: mapred.submit.replication is deprecated. Instead, use mapreduce.client.submit.file.replication 2015-06-15 10:36:26,586 INFO [main] Configuration.deprecation: dfs.https.client.keystore.resource is deprecated. Instead, use dfs.client.https.keystore.resource 2015-06-15 10:36:26,586 INFO [main] Configuration.deprecation: job.end.retry.attempts is deprecated. Instead, use mapreduce.job.end-notification.retry.attempts 2015-06-15 10:36:26,586 INFO [main] Configuration.deprecation: mapred.job.tracker is deprecated. Instead, use mapreduce.jobtracker.address 2015-06-15 10:36:26,586 INFO [main] Configuration.deprecation: dfs.write.packet.size is deprecated. Instead, use dfs.client-write-packet-size 2015-06-15 10:36:26,586 INFO [main] Configuration.deprecation: mapred.reduce.slowstart.completed.maps is deprecated. Instead, use mapreduce.job.reduce.slowstart.completedmaps 2015-06-15 10:36:26,587 INFO [main] Configuration.deprecation: dfs.backup.http.address is deprecated. Instead, use dfs.namenode.backup.http-address 2015-06-15 10:36:26,587 INFO [main] Configuration.deprecation: mapred.task.profile.reduces is deprecated. Instead, use mapreduce.task.profile.reduces 2015-06-15 10:36:26,587 INFO [main] Configuration.deprecation: mapred.job.queue.name is deprecated. Instead, use mapreduce.job.queuename 2015-06-15 10:36:26,587 INFO [main] Configuration.deprecation: mapred.tasktracker.expiry.interval is deprecated. Instead, use mapreduce.jobtracker.expire.trackers.interval 2015-06-15 10:36:26,587 INFO [main] Configuration.deprecation: keep.failed.task.files is deprecated. Instead, use mapreduce.task.files.preserve.failedtasks 2015-06-15 10:36:26,587 INFO [main] Configuration.deprecation: mapred.max.tracker.failures is deprecated. Instead, use mapreduce.job.maxtaskfailures.per.tracker 2015-06-15 10:36:26,587 INFO [main] Configuration.deprecation: dfs.replication.min is deprecated. Instead, use dfs.namenode.replication.min 2015-06-15 10:36:26,587 INFO [main] Configuration.deprecation: dfs.name.dir is deprecated. Instead, use dfs.namenode.name.dir 2015-06-15 10:36:26,587 INFO [main] Configuration.deprecation: mapred.local.dir is deprecated. Instead, use mapreduce.cluster.local.dir 2015-06-15 10:36:26,587 INFO [main] Configuration.deprecation: mapred.child.tmp is deprecated. Instead, use mapreduce.task.tmp.dir 2015-06-15 10:36:26,587 INFO [main] Configuration.deprecation: fs.checkpoint.period is deprecated. 
Instead, use dfs.namenode.checkpoint.period 2015-06-15 10:36:26,588 INFO [main] Configuration.deprecation: mapred.map.tasks.speculative.execution is deprecated. Instead, use mapreduce.map.speculative 2015-06-15 10:36:26,588 INFO [main] Configuration.deprecation: mapred.jobtracker.restart.recover is deprecated. Instead, use mapreduce.jobtracker.restart.recover 2015-06-15 10:36:26,588 INFO [main] Configuration.deprecation: mapred.jobtracker.taskScheduler is deprecated. Instead, use mapreduce.jobtracker.taskscheduler 2015-06-15 10:36:26,588 INFO [main] Configuration.deprecation: mapreduce.jobtracker.split.metainfo.maxsize is deprecated. Instead, use mapreduce.job.split.metainfo.maxsize 2015-06-15 10:36:26,588 INFO [main] Configuration.deprecation: mapred.task.timeout is deprecated. Instead, use mapreduce.task.timeout 2015-06-15 10:36:26,588 INFO [main] Configuration.deprecation: mapred.task.tracker.report.address is deprecated. Instead, use mapreduce.tasktracker.report.address 2015-06-15 10:36:26,588 INFO [main] Configuration.deprecation: dfs.secondary.http.address is deprecated. Instead, use dfs.namenode.secondary.http-address 2015-06-15 10:36:26,588 INFO [main] Configuration.deprecation: dfs.data.dir is deprecated. Instead, use dfs.datanode.data.dir 2015-06-15 10:36:26,588 INFO [main] Configuration.deprecation: mapred.task.profile is deprecated. Instead, use mapreduce.task.profile 2015-06-15 10:36:26,588 INFO [main] Configuration.deprecation: mapred.job.reduce.markreset.buffer.percent is deprecated. Instead, use mapreduce.reduce.markreset.buffer.percent 2015-06-15 10:36:26,589 INFO [main] Configuration.deprecation: mapred.merge.recordsBeforeProgress is deprecated. Instead, use mapreduce.task.merge.progress.records 2015-06-15 10:36:26,589 INFO [main] Configuration.deprecation: mapred.tasktracker.dns.nameserver is deprecated. Instead, use mapreduce.tasktracker.dns.nameserver 2015-06-15 10:36:26,589 INFO [main] Configuration.deprecation: mapred.speculative.execution.slowTaskThreshold is deprecated. Instead, use mapreduce.job.speculative.slowtaskthreshold 2015-06-15 10:36:26,589 INFO [main] Configuration.deprecation: dfs.name.dir.restore is deprecated. Instead, use dfs.namenode.name.dir.restore 2015-06-15 10:36:26,589 INFO [main] Configuration.deprecation: dfs.df.interval is deprecated. Instead, use fs.df.interval 2015-06-15 10:36:26,589 INFO [main] Configuration.deprecation: mapred.job.tracker.handler.count is deprecated. Instead, use mapreduce.jobtracker.handler.count 2015-06-15 10:36:26,589 INFO [main] Configuration.deprecation: mapred.job.shuffle.merge.percent is deprecated. Instead, use mapreduce.reduce.shuffle.merge.percent 2015-06-15 10:36:26,589 INFO [main] Configuration.deprecation: mapred.job.tracker.retiredjobs.cache.size is deprecated. Instead, use mapreduce.jobtracker.retiredjobs.cache.size 2015-06-15 10:36:26,589 INFO [main] Configuration.deprecation: mapreduce.job.counters.limit is deprecated. Instead, use mapreduce.job.counters.max 2015-06-15 10:36:26,589 INFO [main] Configuration.deprecation: mapred.skip.reduce.max.skip.groups is deprecated. Instead, use mapreduce.reduce.skip.maxgroups 2015-06-15 10:36:26,589 INFO [main] Configuration.deprecation: jobclient.progress.monitor.poll.interval is deprecated. Instead, use mapreduce.client.progressmonitor.pollinterval 2015-06-15 10:36:26,590 INFO [main] Configuration.deprecation: mapred.map.max.attempts is deprecated. 
Instead, use mapreduce.map.maxattempts 2015-06-15 10:36:26,590 INFO [main] Configuration.deprecation: mapred.task.profile.maps is deprecated. Instead, use mapreduce.task.profile.maps 2015-06-15 10:36:26,590 INFO [main] Configuration.deprecation: dfs.permissions.supergroup is deprecated. Instead, use dfs.permissions.superusergroup 2015-06-15 10:36:26,590 INFO [main] Configuration.deprecation: mapred.output.compress is deprecated. Instead, use mapreduce.output.fileoutputformat.compress 2015-06-15 10:36:26,590 INFO [main] Configuration.deprecation: dfs.https.address is deprecated. Instead, use dfs.namenode.https-address 2015-06-15 10:36:26,590 INFO [main] Configuration.deprecation: mapred.map.tasks is deprecated. Instead, use mapreduce.job.maps 2015-06-15 10:36:26,590 INFO [main] Configuration.deprecation: mapred.speculative.execution.slowNodeThreshold is deprecated. Instead, use mapreduce.job.speculative.slownodethreshold 2015-06-15 10:36:26,590 INFO [main] Configuration.deprecation: mapred.job.tracker.persist.jobstatus.dir is deprecated. Instead, use mapreduce.jobtracker.persist.jobstatus.dir 2015-06-15 10:36:26,590 INFO [main] Configuration.deprecation: mapred.task.tracker.http.address is deprecated. Instead, use mapreduce.tasktracker.http.address 2015-06-15 10:36:26,590 INFO [main] Configuration.deprecation: mapred.inmem.merge.threshold is deprecated. Instead, use mapreduce.reduce.merge.inmem.threshold 2015-06-15 10:36:26,590 INFO [main] Configuration.deprecation: dfs.backup.address is deprecated. Instead, use dfs.namenode.backup.address 2015-06-15 10:36:26,590 INFO [main] Configuration.deprecation: mapred.map.output.compression.codec is deprecated. Instead, use mapreduce.map.output.compress.codec 2015-06-15 10:36:26,590 INFO [main] Configuration.deprecation: mapred.reduce.child.log.level is deprecated. Instead, use mapreduce.reduce.log.level 2015-06-15 10:36:26,591 INFO [main] Configuration.deprecation: fs.checkpoint.edits.dir is deprecated. Instead, use dfs.namenode.checkpoint.edits.dir 2015-06-15 10:36:26,591 INFO [main] Configuration.deprecation: dfs.federation.nameservices is deprecated. Instead, use dfs.nameservices 2015-06-15 10:36:26,591 INFO [main] Configuration.deprecation: mapred.healthChecker.interval is deprecated. Instead, use mapreduce.tasktracker.healthchecker.interval 2015-06-15 10:36:26,591 INFO [main] Configuration.deprecation: mapred.speculative.execution.speculativeCap is deprecated. Instead, use mapreduce.job.speculative.speculativecap 2015-06-15 10:36:26,591 INFO [main] Configuration.deprecation: fs.checkpoint.dir is deprecated. Instead, use dfs.namenode.checkpoint.dir 2015-06-15 10:36:26,591 INFO [main] Configuration.deprecation: dfs.balance.bandwidthPerSec is deprecated. Instead, use dfs.datanode.balance.bandwidthPerSec 2015-06-15 10:36:26,591 INFO [main] Configuration.deprecation: mapred.shuffle.connect.timeout is deprecated. Instead, use mapreduce.reduce.shuffle.connect.timeout 2015-06-15 10:36:26,591 INFO [main] Configuration.deprecation: dfs.https.need.client.auth is deprecated. Instead, use dfs.client.https.need-auth 2015-06-15 10:36:26,591 INFO [main] Configuration.deprecation: mapred.jobtracker.job.history.block.size is deprecated. Instead, use mapreduce.jobtracker.jobhistory.block.size 2015-06-15 10:36:26,591 INFO [main] Configuration.deprecation: fs.default.name is deprecated. Instead, use fs.defaultFS 2015-06-15 10:36:26,591 INFO [main] Configuration.deprecation: mapred.tasktracker.reduce.tasks.maximum is deprecated. 
Instead, use mapreduce.tasktracker.reduce.tasks.maximum 2015-06-15 10:36:26,591 INFO [main] Configuration.deprecation: mapred.local.dir.minspacekill is deprecated. Instead, use mapreduce.tasktracker.local.dir.minspacekill 2015-06-15 10:36:26,725 INFO [main] client.RMProxy: Connecting to ResourceManager at /10.1.10.234:8032 2015-06-15 10:36:26,823 INFO [main] Configuration.deprecation: mapreduce.outputformat.class is deprecated. Instead, use mapreduce.job.outputformat.class 2015-06-15 10:36:26,824 INFO [main] Configuration.deprecation: io.bytes.per.checksum is deprecated. Instead, use dfs.bytes-per-checksum 2015-06-15 10:36:26,825 INFO [main] Configuration.deprecation: mapreduce.job.counters.limit is deprecated. Instead, use mapreduce.job.counters.max 2015-06-15 10:36:26,825 INFO [main] Configuration.deprecation: mapred.output.value.class is deprecated. Instead, use mapreduce.job.output.value.class 2015-06-15 10:36:26,825 INFO [main] Configuration.deprecation: fs.default.name is deprecated. Instead, use fs.defaultFS 2015-06-15 10:36:26,825 INFO [main] Configuration.deprecation: mapred.output.key.class is deprecated. Instead, use mapreduce.job.output.key.class 2015-06-15 10:36:26,914 INFO [main] zookeeper.RecoverableZooKeeper: Process identifier=hconnection-0x2b175c00 connecting to ZooKeeper ensemble=10.1.10.234:2181,10.1.10.232:2181,10.1.10.236:2181 2015-06-15 10:36:26,919 INFO [main] zookeeper.ZooKeeper: Client environment:zookeeper.version=3.4.6-1569965, built on 02/20/2014 09:09 GMT 2015-06-15 10:36:26,920 INFO [main] zookeeper.ZooKeeper: Client environment:host.name=test-232 2015-06-15 10:36:26,920 INFO [main] zookeeper.ZooKeeper: Client environment:java.version=1.8.0_31 2015-06-15 10:36:26,920 INFO [main] zookeeper.ZooKeeper: Client environment:java.vendor=Oracle Corporation 2015-06-15 10:36:26,920 INFO [main] zookeeper.ZooKeeper: Client environment:java.home=/opt/java/x64/jdk1.8.0_31/jre 2015-06-15 10:36:26,920 INFO [main] zookeeper.ZooKeeper: Client 
environment:java.class.path=/var/vm/apps/hbase-0.98.9-hadoop2/bin/../conf:/opt/java/x64/jdk1.8.0_31//lib/tools.jar:/var/vm/apps/hbase-0.98.9-hadoop2/bin/..:/var/vm/apps/hbase-0.98.9-hadoop2/bin/../lib/activation-1.1.jar:/var/vm/apps/hbase-0.98.9-hadoop2/bin/../lib/aopalliance-1.0.jar:/var/vm/apps/hbase-0.98.9-hadoop2/bin/../lib/asm-3.1.jar:/var/vm/apps/hbase-0.98.9-hadoop2/bin/../lib/avro-1.7.4.jar:/var/vm/apps/hbase-0.98.9-hadoop2/bin/../lib/commons-beanutils-1.7.0.jar:/var/vm/apps/hbase-0.98.9-hadoop2/bin/../lib/commons-beanutils-core-1.8.0.jar:/var/vm/apps/hbase-0.98.9-hadoop2/bin/../lib/commons-cli-1.2.jar:/var/vm/apps/hbase-0.98.9-hadoop2/bin/../lib/commons-codec-1.7.jar:/var/vm/apps/hbase-0.98.9-hadoop2/bin/../lib/commons-collections-3.2.1.jar:/var/vm/apps/hbase-0.98.9-hadoop2/bin/../lib/commons-compress-1.4.1.jar:/var/vm/apps/hbase-0.98.9-hadoop2/bin/../lib/commons-configuration-1.6.jar:/var/vm/apps/hbase-0.98.9-hadoop2/bin/../lib/commons-daemon-1.0.13.jar:/var/vm/apps/hbase-0.98.9-hadoop2/bin/../lib/commons-digester-1.8.jar:/var/vm/apps/hbase-0.98.9-hadoop2/bin/../lib/commons-el-1.0.jar:/var/vm/apps/hbase-0.98.9-hadoop2/bin/../lib/commons-httpclient-3.1.jar:/var/vm/apps/hbase-0.98.9-hadoop2/bin/../lib/commons-io-2.4.jar:/var/vm/apps/hbase-0.98.9-hadoop2/bin/../lib/commons-lang-2.6.jar:/var/vm/apps/hbase-0.98.9-hadoop2/bin/../lib/commons-logging-1.1.1.jar:/var/vm/apps/hbase-0.98.9-hadoop2/bin/../lib/commons-math-2.1.jar:/var/vm/apps/hbase-0.98.9-hadoop2/bin/../lib/commons-net-3.1.jar:/var/vm/apps/hbase-0.98.9-hadoop2/bin/../lib/findbugs-annotations-1.3.9-1.jar:/var/vm/apps/hbase-0.98.9-hadoop2/bin/../lib/gmbal-api-only-3.0.0-b023.jar:/var/vm/apps/hbase-0.98.9-hadoop2/bin/../lib/grizzly-framework-2.1.2.jar:/var/vm/apps/hbase-0.98.9-hadoop2/bin/../lib/grizzly-http-2.1.2.jar:/var/vm/apps/hbase-0.98.9-hadoop2/bin/../lib/grizzly-http-server-2.1.2.jar:/var/vm/apps/hbase-0.98.9-hadoop2/bin/../lib/grizzly-http-servlet-2.1.2.jar:/var/vm/apps/hbase-0.98.9-hadoop2/bin/../lib/grizzly-rcm-2.1.2.jar:/var/vm/apps/hbase-0.98.9-hadoop2/bin/../lib/guava-12.0.1.jar:/var/vm/apps/hbase-0.98.9-hadoop2/bin/../lib/guice-3.0.jar:/var/vm/apps/hbase-0.98.9-hadoop2/bin/../lib/guice-servlet-3.0.jar:/var/vm/apps/hbase-0.98.9-hadoop2/bin/../lib/hadoop-annotations-2.2.0.jar:/var/vm/apps/hbase-0.98.9-hadoop2/bin/../lib/hadoop-auth-2.2.0.jar:/var/vm/apps/hbase-0.98.9-hadoop2/bin/../lib/hadoop-client-2.2.0.jar:/var/vm/apps/hbase-0.98.9-hadoop2/bin/../lib/hadoop-common-2.2.0.jar:/var/vm/apps/hbase-0.98.9-hadoop2/bin/../lib/hadoop-hdfs-2.2.0.jar:/var/vm/apps/hbase-0.98.9-hadoop2/bin/../lib/hadoop-mapreduce-client-app-2.2.0.jar:/var/vm/apps/hbase-0.98.9-hadoop2/bin/../lib/hadoop-mapreduce-client-common-2.2.0.jar:/var/vm/apps/hbase-0.98.9-hadoop2/bin/../lib/hadoop-mapreduce-client-core-2.2.0.jar:/var/vm/apps/hbase-0.98.9-hadoop2/bin/../lib/hadoop-mapreduce-client-jobclient-2.2.0.jar:/var/vm/apps/hbase-0.98.9-hadoop2/bin/../lib/hadoop-mapreduce-client-shuffle-2.2.0.jar:/var/vm/apps/hbase-0.98.9-hadoop2/bin/../lib/hadoop-yarn-api-2.2.0.jar:/var/vm/apps/hbase-0.98.9-hadoop2/bin/../lib/hadoop-yarn-client-2.2.0.jar:/var/vm/apps/hbase-0.98.9-hadoop2/bin/../lib/hadoop-yarn-common-2.2.0.jar:/var/vm/apps/hbase-0.98.9-hadoop2/bin/../lib/hadoop-yarn-server-common-2.2.0.jar:/var/vm/apps/hbase-0.98.9-hadoop2/bin/../lib/hadoop-yarn-server-nodemanager-2.2.0.jar:/var/vm/apps/hbase-0.98.9-hadoop2/bin/../lib/hamcrest-core-1.3.jar:/var/vm/apps/hbase-0.98.9-hadoop2/bin/../lib/hbase-annotations-0.98.9-hadoop2.jar:/var/vm/apps/hbase-0.98.9-had
oop2/bin/../lib/hbase-checkstyle-0.98.9-hadoop2.jar:/var/vm/apps/hbase-0.98.9-hadoop2/bin/../lib/hbase-client-0.98.9-hadoop2.jar:/var/vm/apps/hbase-0.98.9-hadoop2/bin/../lib/hbase-common-0.98.9-hadoop2.jar:/var/vm/apps/hbase-0.98.9-hadoop2/bin/../lib/hbase-common-0.98.9-hadoop2-tests.jar:/var/vm/apps/hbase-0.98.9-hadoop2/bin/../lib/hbase-examples-0.98.9-hadoop2.jar:/var/vm/apps/hbase-0.98.9-hadoop2/bin/../lib/hbase-hadoop2-compat-0.98.9-hadoop2.jar:/var/vm/apps/hbase-0.98.9-hadoop2/bin/../lib/hbase-hadoop-compat-0.98.9-hadoop2.jar:/var/vm/apps/hbase-0.98.9-hadoop2/bin/../lib/hbase-it-0.98.9-hadoop2.jar:/var/vm/apps/hbase-0.98.9-hadoop2/bin/../lib/hbase-it-0.98.9-hadoop2-tests.jar:/var/vm/apps/hbase-0.98.9-hadoop2/bin/../lib/hbase-prefix-tree-0.98.9-hadoop2.jar:/var/vm/apps/hbase-0.98.9-hadoop2/bin/../lib/hbase-protocol-0.98.9-hadoop2.jar:/var/vm/apps/hbase-0.98.9-hadoop2/bin/../lib/hbase-rest-0.98.9-hadoop2.jar:/var/vm/apps/hbase-0.98.9-hadoop2/bin/../lib/hbase-server-0.98.9-hadoop2.jar:/var/vm/apps/hbase-0.98.9-hadoop2/bin/../lib/hbase-server-0.98.9-hadoop2-tests.jar:/var/vm/apps/hbase-0.98.9-hadoop2/bin/../lib/hbase-shell-0.98.9-hadoop2.jar:/var/vm/apps/hbase-0.98.9-hadoop2/bin/../lib/hbase-testing-util-0.98.9-hadoop2.jar:/var/vm/apps/hbase-0.98.9-hadoop2/bin/../lib/hbase-thrift-0.98.9-hadoop2.jar:/var/vm/apps/hbase-0.98.9-hadoop2/bin/../lib/high-scale-lib-1.1.1.jar:/var/vm/apps/hbase-0.98.9-hadoop2/bin/../lib/htrace-core-2.04.jar:/var/vm/apps/hbase-0.98.9-hadoop2/bin/../lib/httpclient-4.1.3.jar:/var/vm/apps/hbase-0.98.9-hadoop2/bin/../lib/httpcore-4.1.3.jar:/var/vm/apps/hbase-0.98.9-hadoop2/bin/../lib/jackson-core-asl-1.8.8.jar:/var/vm/apps/hbase-0.98.9-hadoop2/bin/../lib/jackson-jaxrs-1.8.8.jar:/var/vm/apps/hbase-0.98.9-hadoop2/bin/../lib/jackson-mapper-asl-1.8.8.jar:/var/vm/apps/hbase-0.98.9-hadoop2/bin/../lib/jackson-xc-1.8.8.jar:/var/vm/apps/hbase-0.98.9-hadoop2/bin/../lib/jamon-runtime-2.3.1.jar:/var/vm/apps/hbase-0.98.9-hadoop2/bin/../lib/jasper-compiler-5.5.23.jar:/var/vm/apps/hbase-0.98.9-hadoop2/bin/../lib/jasper-runtime-5.5.23.jar:/var/vm/apps/hbase-0.98.9-hadoop2/bin/../lib/javax.inject-1.jar:/var/vm/apps/hbase-0.98.9-hadoop2/bin/../lib/javax.servlet-3.1.jar:/var/vm/apps/hbase-0.98.9-hadoop2/bin/../lib/javax.servlet-api-3.0.1.jar:/var/vm/apps/hbase-0.98.9-hadoop2/bin/../lib/jaxb-api-2.2.2.jar:/var/vm/apps/hbase-0.98.9-hadoop2/bin/../lib/jaxb-impl-2.2.3-1.jar:/var/vm/apps/hbase-0.98.9-hadoop2/bin/../lib/jcodings-1.0.8.jar:/var/vm/apps/hbase-0.98.9-hadoop2/bin/../lib/jersey-client-1.9.jar:/var/vm/apps/hbase-0.98.9-hadoop2/bin/../lib/jersey-core-1.8.jar:/var/vm/apps/hbase-0.98.9-hadoop2/bin/../lib/jersey-grizzly2-1.9.jar:/var/vm/apps/hbase-0.98.9-hadoop2/bin/../lib/jersey-guice-1.9.jar:/var/vm/apps/hbase-0.98.9-hadoop2/bin/../lib/jersey-json-1.8.jar:/var/vm/apps/hbase-0.98.9-hadoop2/bin/../lib/jersey-server-1.8.jar:/var/vm/apps/hbase-0.98.9-hadoop2/bin/../lib/jersey-test-framework-core-1.9.jar:/var/vm/apps/hbase-0.98.9-hadoop2/bin/../lib/jersey-test-framework-grizzly2-1.9.jar:/var/vm/apps/hbase-0.98.9-hadoop2/bin/../lib/jets3t-0.6.1.jar:/var/vm/apps/hbase-0.98.9-hadoop2/bin/../lib/jettison-1.3.1.jar:/var/vm/apps/hbase-0.98.9-hadoop2/bin/../lib/jetty-6.1.26.jar:/var/vm/apps/hbase-0.98.9-hadoop2/bin/../lib/jetty-sslengine-6.1.26.jar:/var/vm/apps/hbase-0.98.9-hadoop2/bin/../lib/jetty-util-6.1.26.jar:/var/vm/apps/hbase-0.98.9-hadoop2/bin/../lib/joni-2.1.2.jar:/var/vm/apps/hbase-0.98.9-hadoop2/bin/../lib/jruby-complete-1.6.8.jar:/var/vm/apps/hbase-0.98.9-hadoop2/bin/../lib/jsch-0.1.42
.jar:/var/vm/apps/hbase-0.98.9-hadoop2/bin/../lib/jsp-2.1-6.1.14.jar:/var/vm/apps/hbase-0.98.9-hadoop2/bin/../lib/jsp-api-2.1-6.1.14.jar:/var/vm/apps/hbase-0.98.9-hadoop2/bin/../lib/jsr305-1.3.9.jar:/var/vm/apps/hbase-0.98.9-hadoop2/bin/../lib/junit-4.11.jar:/var/vm/apps/hbase-0.98.9-hadoop2/bin/../lib/libthrift-0.9.0.jar:/var/vm/apps/hbase-0.98.9-hadoop2/bin/../lib/log4j-1.2.17.jar:/var/vm/apps/hbase-0.98.9-hadoop2/bin/../lib/management-api-3.0.0-b012.jar:/var/vm/apps/hbase-0.98.9-hadoop2/bin/../lib/metrics-core-2.2.0.jar:/var/vm/apps/hbase-0.98.9-hadoop2/bin/../lib/netty-3.6.6.Final.jar:/var/vm/apps/hbase-0.98.9-hadoop2/bin/../lib/paranamer-2.3.jar:/var/vm/apps/hbase-0.98.9-hadoop2/bin/../lib/protobuf-java-2.5.0.jar:/var/vm/apps/hbase-0.98.9-hadoop2/bin/../lib/servlet-api-2.5-6.1.14.jar:/var/vm/apps/hbase-0.98.9-hadoop2/bin/../lib/slf4j-api-1.6.4.jar:/var/vm/apps/hbase-0.98.9-hadoop2/bin/../lib/slf4j-log4j12-1.6.4.jar:/var/vm/apps/hbase-0.98.9-hadoop2/bin/../lib/snappy-java-1.0.4.1.jar:/var/vm/apps/hbase-0.98.9-hadoop2/bin/../lib/xmlenc-0.52.jar:/var/vm/apps/hbase-0.98.9-hadoop2/bin/../lib/xz-1.0.jar:/var/vm/apps/hbase-0.98.9-hadoop2/bin/../lib/zookeeper-3.4.6.jar:/var/vm/apps/hadoop-2.6.0/etc/hadoop:/var/vm/apps/hadoop-2.6.0/share/hadoop/common/lib/curator-recipes-2.6.0.jar:/var/vm/apps/hadoop-2.6.0/share/hadoop/common/lib/jets3t-0.9.0.jar:/var/vm/apps/hadoop-2.6.0/share/hadoop/common/lib/commons-el-1.0.jar:/var/vm/apps/hadoop-2.6.0/share/hadoop/common/lib/jasper-runtime-5.5.23.jar:/var/vm/apps/hadoop-2.6.0/share/hadoop/common/lib/protobuf-java-2.5.0.jar:/var/vm/apps/hadoop-2.6.0/share/hadoop/common/lib/api-util-1.0.0-M20.jar:/var/vm/apps/hadoop-2.6.0/share/hadoop/common/lib/apacheds-i18n-2.0.0-M15.jar:/var/vm/apps/hadoop-2.6.0/share/hadoop/common/lib/jackson-xc-1.9.13.jar:/var/vm/apps/hadoop-2.6.0/share/hadoop/common/lib/jsp-api-2.1.jar:/var/vm/apps/hadoop-2.6.0/share/hadoop/common/lib/commons-lang-2.6.jar:/var/vm/apps/hadoop-2.6.0/share/hadoop/common/lib/xz-1.0.jar:/var/vm/apps/hadoop-2.6.0/share/hadoop/common/lib/mockito-all-1.8.5.jar:/var/vm/apps/hadoop-2.6.0/share/hadoop/common/lib/guava-11.0.2.jar:/var/vm/apps/hadoop-2.6.0/share/hadoop/common/lib/htrace-core-3.0.4.jar:/var/vm/apps/hadoop-2.6.0/share/hadoop/common/lib/commons-collections-3.2.1.jar:/var/vm/apps/hadoop-2.6.0/share/hadoop/common/lib/api-asn1-api-1.0.0-M20.jar:/var/vm/apps/hadoop-2.6.0/share/hadoop/common/lib/commons-configuration-1.6.jar:/var/vm/apps/hadoop-2.6.0/share/hadoop/common/lib/xmlenc-0.52.jar:/var/vm/apps/hadoop-2.6.0/share/hadoop/common/lib/curator-client-2.6.0.jar:/var/vm/apps/hadoop-2.6.0/share/hadoop/common/lib/jetty-util-6.1.26.jar:/var/vm/apps/hadoop-2.6.0/share/hadoop/common/lib/servlet-api-2.5.jar:/var/vm/apps/hadoop-2.6.0/share/hadoop/common/lib/commons-beanutils-1.7.0.jar:/var/vm/apps/hadoop-2.6.0/share/hadoop/common/lib/stax-api-1.0-2.jar:/var/vm/apps/hadoop-2.6.0/share/hadoop/common/lib/commons-compress-1.4.1.jar:/var/vm/apps/hadoop-2.6.0/share/hadoop/common/lib/netty-3.6.2.Final.jar:/var/vm/apps/hadoop-2.6.0/share/hadoop/common/lib/asm-3.2.jar:/var/vm/apps/hadoop-2.6.0/share/hadoop/common/lib/jettison-1.1.jar:/var/vm/apps/hadoop-2.6.0/share/hadoop/common/lib/commons-io-2.4.jar:/var/vm/apps/hadoop-2.6.0/share/hadoop/common/lib/commons-digester-1.8.jar:/var/vm/apps/hadoop-2.6.0/share/hadoop/common/lib/httpclient-4.2.5.jar:/var/vm/apps/hadoop-2.6.0/share/hadoop/common/lib/jackson-core-asl-1.9.13.jar:/var/vm/apps/hadoop-2.6.0/share/hadoop/common/lib/curator-framework-2.6.0.jar:/var/vm/apps/hadoop-2.6
.0/share/hadoop/common/lib/avro-1.7.4.jar:/var/vm/apps/hadoop-2.6.0/share/hadoop/common/lib/java-xmlbuilder-0.4.jar:/var/vm/apps/hadoop-2.6.0/share/hadoop/common/lib/activation-1.1.jar:/var/vm/apps/hadoop-2.6.0/share/hadoop/common/lib/slf4j-api-1.7.5.jar:/var/vm/apps/hadoop-2.6.0/share/hadoop/common/lib/gson-2.2.4.jar:/var/vm/apps/hadoop-2.6.0/share/hadoop/common/lib/commons-net-3.1.jar:/var/vm/apps/hadoop-2.6.0/share/hadoop/common/lib/log4j-1.2.17.jar:/var/vm/apps/hadoop-2.6.0/share/hadoop/common/lib/commons-beanutils-core-1.8.0.jar:/var/vm/apps/hadoop-2.6.0/share/hadoop/common/lib/jsch-0.1.42.jar:/var/vm/apps/hadoop-2.6.0/share/hadoop/common/lib/junit-4.11.jar:/var/vm/apps/hadoop-2.6.0/share/hadoop/common/lib/jsr305-1.3.9.jar:/var/vm/apps/hadoop-2.6.0/share/hadoop/common/lib/apacheds-kerberos-codec-2.0.0-M15.jar:/var/vm/apps/hadoop-2.6.0/share/hadoop/common/lib/jaxb-impl-2.2.3-1.jar:/var/vm/apps/hadoop-2.6.0/share/hadoop/common/lib/commons-cli-1.2.jar:/var/vm/apps/hadoop-2.6.0/share/hadoop/common/lib/jersey-json-1.9.jar:/var/vm/apps/hadoop-2.6.0/share/hadoop/common/lib/commons-codec-1.4.jar:/var/vm/apps/hadoop-2.6.0/share/hadoop/common/lib/hamcrest-core-1.3.jar:/var/vm/apps/hadoop-2.6.0/share/hadoop/common/lib/commons-httpclient-3.1.jar:/var/vm/apps/hadoop-2.6.0/share/hadoop/common/lib/jackson-jaxrs-1.9.13.jar:/var/vm/apps/hadoop-2.6.0/share/hadoop/common/lib/commons-logging-1.1.3.jar:/var/vm/apps/hadoop-2.6.0/share/hadoop/common/lib/paranamer-2.3.jar:/var/vm/apps/hadoop-2.6.0/share/hadoop/common/lib/jasper-compiler-5.5.23.jar:/var/vm/apps/hadoop-2.6.0/share/hadoop/common/lib/jersey-server-1.9.jar:/var/vm/apps/hadoop-2.6.0/share/hadoop/common/lib/jetty-6.1.26.jar:/var/vm/apps/hadoop-2.6.0/share/hadoop/common/lib/commons-math3-3.1.1.jar:/var/vm/apps/hadoop-2.6.0/share/hadoop/common/lib/jackson-mapper-asl-1.9.13.jar:/var/vm/apps/hadoop-2.6.0/share/hadoop/common/lib/jaxb-api-2.2.2.jar:/var/vm/apps/hadoop-2.6.0/share/hadoop/common/lib/hadoop-annotations-2.6.0.jar:/var/vm/apps/hadoop-2.6.0/share/hadoop/common/lib/hadoop-auth-2.6.0.jar:/var/vm/apps/hadoop-2.6.0/share/hadoop/common/lib/httpcore-4.2.5.jar:/var/vm/apps/hadoop-2.6.0/share/hadoop/common/lib/snappy-java-1.0.4.1.jar:/var/vm/apps/hadoop-2.6.0/share/hadoop/common/lib/slf4j-log4j12-1.7.5.jar:/var/vm/apps/hadoop-2.6.0/share/hadoop/common/lib/jersey-core-1.9.jar:/var/vm/apps/hadoop-2.6.0/share/hadoop/common/lib/zookeeper-3.4.6.jar:/var/vm/apps/hadoop-2.6.0/share/hadoop/common/hadoop-common-2.6.0.jar:/var/vm/apps/hadoop-2.6.0/share/hadoop/common/hadoop-nfs-2.6.0.jar:/var/vm/apps/hadoop-2.6.0/share/hadoop/common/hadoop-common-2.6.0-tests.jar:/var/vm/apps/hadoop-2.6.0/share/hadoop/hdfs:/var/vm/apps/hadoop-2.6.0/share/hadoop/hdfs/lib/xercesImpl-2.9.1.jar:/var/vm/apps/hadoop-2.6.0/share/hadoop/hdfs/lib/commons-el-1.0.jar:/var/vm/apps/hadoop-2.6.0/share/hadoop/hdfs/lib/jasper-runtime-5.5.23.jar:/var/vm/apps/hadoop-2.6.0/share/hadoop/hdfs/lib/protobuf-java-2.5.0.jar:/var/vm/apps/hadoop-2.6.0/share/hadoop/hdfs/lib/jsp-api-2.1.jar:/var/vm/apps/hadoop-2.6.0/share/hadoop/hdfs/lib/commons-lang-2.6.jar:/var/vm/apps/hadoop-2.6.0/share/hadoop/hdfs/lib/guava-11.0.2.jar:/var/vm/apps/hadoop-2.6.0/share/hadoop/hdfs/lib/htrace-core-3.0.4.jar:/var/vm/apps/hadoop-2.6.0/share/hadoop/hdfs/lib/xmlenc-0.52.jar:/var/vm/apps/hadoop-2.6.0/share/hadoop/hdfs/lib/jetty-util-6.1.26.jar:/var/vm/apps/hadoop-2.6.0/share/hadoop/hdfs/lib/servlet-api-2.5.jar:/var/vm/apps/hadoop-2.6.0/share/hadoop/hdfs/lib/netty-3.6.2.Final.jar:/var/vm/apps/hadoop-2.6.0/share/hadoop/hdfs/lib/asm
-3.2.jar:/var/vm/apps/hadoop-2.6.0/share/hadoop/hdfs/lib/commons-io-2.4.jar:/var/vm/apps/hadoop-2.6.0/share/hadoop/hdfs/lib/jackson-core-asl-1.9.13.jar:/var/vm/apps/hadoop-2.6.0/share/hadoop/hdfs/lib/log4j-1.2.17.jar:/var/vm/apps/hadoop-2.6.0/share/hadoop/hdfs/lib/jsr305-1.3.9.jar:/var/vm/apps/hadoop-2.6.0/share/hadoop/hdfs/lib/commons-cli-1.2.jar:/var/vm/apps/hadoop-2.6.0/share/hadoop/hdfs/lib/commons-codec-1.4.jar:/var/vm/apps/hadoop-2.6.0/share/hadoop/hdfs/lib/commons-logging-1.1.3.jar:/var/vm/apps/hadoop-2.6.0/share/hadoop/hdfs/lib/jersey-server-1.9.jar:/var/vm/apps/hadoop-2.6.0/share/hadoop/hdfs/lib/jetty-6.1.26.jar:/var/vm/apps/hadoop-2.6.0/share/hadoop/hdfs/lib/xml-apis-1.3.04.jar:/var/vm/apps/hadoop-2.6.0/share/hadoop/hdfs/lib/jackson-mapper-asl-1.9.13.jar:/var/vm/apps/hadoop-2.6.0/share/hadoop/hdfs/lib/jersey-core-1.9.jar:/var/vm/apps/hadoop-2.6.0/share/hadoop/hdfs/lib/commons-daemon-1.0.13.jar:/var/vm/apps/hadoop-2.6.0/share/hadoop/hdfs/hadoop-hdfs-2.6.0.jar:/var/vm/apps/hadoop-2.6.0/share/hadoop/hdfs/hadoop-hdfs-2.6.0-tests.jar:/var/vm/apps/hadoop-2.6.0/share/hadoop/hdfs/hadoop-hdfs-nfs-2.6.0.jar:/var/vm/apps/hadoop-2.6.0/share/hadoop/yarn/lib/jersey-client-1.9.jar:/var/vm/apps/hadoop-2.6.0/share/hadoop/yarn/lib/protobuf-java-2.5.0.jar:/var/vm/apps/hadoop-2.6.0/share/hadoop/yarn/lib/jackson-xc-1.9.13.jar:/var/vm/apps/hadoop-2.6.0/share/hadoop/yarn/lib/commons-lang-2.6.jar:/var/vm/apps/hadoop-2.6.0/share/hadoop/yarn/lib/jline-0.9.94.jar:/var/vm/apps/hadoop-2.6.0/share/hadoop/yarn/lib/xz-1.0.jar:/var/vm/apps/hadoop-2.6.0/share/hadoop/yarn/lib/guava-11.0.2.jar:/var/vm/apps/hadoop-2.6.0/share/hadoop/yarn/lib/commons-collections-3.2.1.jar:/var/vm/apps/hadoop-2.6.0/share/hadoop/yarn/lib/guice-3.0.jar:/var/vm/apps/hadoop-2.6.0/share/hadoop/yarn/lib/jetty-util-6.1.26.jar:/var/vm/apps/hadoop-2.6.0/share/hadoop/yarn/lib/servlet-api-2.5.jar:/var/vm/apps/hadoop-2.6.0/share/hadoop/yarn/lib/stax-api-1.0-2.jar:/var/vm/apps/hadoop-2.6.0/share/hadoop/yarn/lib/commons-compress-1.4.1.jar:/var/vm/apps/hadoop-2.6.0/share/hadoop/yarn/lib/netty-3.6.2.Final.jar:/var/vm/apps/hadoop-2.6.0/share/hadoop/yarn/lib/asm-3.2.jar:/var/vm/apps/hadoop-2.6.0/share/hadoop/yarn/lib/jettison-1.1.jar:/var/vm/apps/hadoop-2.6.0/share/hadoop/yarn/lib/commons-io-2.4.jar:/var/vm/apps/hadoop-2.6.0/share/hadoop/yarn/lib/leveldbjni-all-1.8.jar:/var/vm/apps/hadoop-2.6.0/share/hadoop/yarn/lib/jackson-core-asl-1.9.13.jar:/var/vm/apps/hadoop-2.6.0/share/hadoop/yarn/lib/guice-servlet-3.0.jar:/var/vm/apps/hadoop-2.6.0/share/hadoop/yarn/lib/activation-1.1.jar:/var/vm/apps/hadoop-2.6.0/share/hadoop/yarn/lib/log4j-1.2.17.jar:/var/vm/apps/hadoop-2.6.0/share/hadoop/yarn/lib/jsr305-1.3.9.jar:/var/vm/apps/hadoop-2.6.0/share/hadoop/yarn/lib/jaxb-impl-2.2.3-1.jar:/var/vm/apps/hadoop-2.6.0/share/hadoop/yarn/lib/commons-cli-1.2.jar:/var/vm/apps/hadoop-2.6.0/share/hadoop/yarn/lib/jersey-json-1.9.jar:/var/vm/apps/hadoop-2.6.0/share/hadoop/yarn/lib/commons-codec-1.4.jar:/var/vm/apps/hadoop-2.6.0/share/hadoop/yarn/lib/javax.inject-1.jar:/var/vm/apps/hadoop-2.6.0/share/hadoop/yarn/lib/commons-httpclient-3.1.jar:/var/vm/apps/hadoop-2.6.0/share/hadoop/yarn/lib/jersey-guice-1.9.jar:/var/vm/apps/hadoop-2.6.0/share/hadoop/yarn/lib/jackson-jaxrs-1.9.13.jar:/var/vm/apps/hadoop-2.6.0/share/hadoop/yarn/lib/commons-logging-1.1.3.jar:/var/vm/apps/hadoop-2.6.0/share/hadoop/yarn/lib/aopalliance-1.0.jar:/var/vm/apps/hadoop-2.6.0/share/hadoop/yarn/lib/jersey-server-1.9.jar:/var/vm/apps/hadoop-2.6.0/share/hadoop/yarn/lib/jetty-6.1.26.jar:/var/vm/apps/hadoop-2.6.0/s
hare/hadoop/yarn/lib/jackson-mapper-asl-1.9.13.jar:/var/vm/apps/hadoop-2.6.0/share/hadoop/yarn/lib/jaxb-api-2.2.2.jar:/var/vm/apps/hadoop-2.6.0/share/hadoop/yarn/lib/jersey-core-1.9.jar:/var/vm/apps/hadoop-2.6.0/share/hadoop/yarn/lib/zookeeper-3.4.6.jar:/var/vm/apps/hadoop-2.6.0/share/hadoop/yarn/hadoop-yarn-server-resourcemanager-2.6.0.jar:/var/vm/apps/hadoop-2.6.0/share/hadoop/yarn/hadoop-yarn-server-common-2.6.0.jar:/var/vm/apps/hadoop-2.6.0/share/hadoop/yarn/hadoop-yarn-registry-2.6.0.jar:/var/vm/apps/hadoop-2.6.0/share/hadoop/yarn/hadoop-yarn-api-2.6.0.jar:/var/vm/apps/hadoop-2.6.0/share/hadoop/yarn/hadoop-yarn-applications-distributedshell-2.6.0.jar:/var/vm/apps/hadoop-2.6.0/share/hadoop/yarn/hadoop-yarn-common-2.6.0.jar:/var/vm/apps/hadoop-2.6.0/share/hadoop/yarn/hadoop-yarn-client-2.6.0.jar:/var/vm/apps/hadoop-2.6.0/share/hadoop/yarn/hadoop-yarn-server-applicationhistoryservice-2.6.0.jar:/var/vm/apps/hadoop-2.6.0/share/hadoop/yarn/hadoop-yarn-applications-unmanaged-am-launcher-2.6.0.jar:/var/vm/apps/hadoop-2.6.0/share/hadoop/yarn/hadoop-yarn-server-nodemanager-2.6.0.jar:/var/vm/apps/hadoop-2.6.0/share/hadoop/yarn/hadoop-yarn-server-web-proxy-2.6.0.jar:/var/vm/apps/hadoop-2.6.0/share/hadoop/yarn/hadoop-yarn-server-tests-2.6.0.jar:/var/vm/apps/hadoop-2.6.0/share/hadoop/mapreduce/lib/protobuf-java-2.5.0.jar:/var/vm/apps/hadoop-2.6.0/share/hadoop/mapreduce/lib/xz-1.0.jar:/var/vm/apps/hadoop-2.6.0/share/hadoop/mapreduce/lib/guice-3.0.jar:/var/vm/apps/hadoop-2.6.0/share/hadoop/mapreduce/lib/commons-compress-1.4.1.jar:/var/vm/apps/hadoop-2.6.0/share/hadoop/mapreduce/lib/netty-3.6.2.Final.jar:/var/vm/apps/hadoop-2.6.0/share/hadoop/mapreduce/lib/asm-3.2.jar:/var/vm/apps/hadoop-2.6.0/share/hadoop/mapreduce/lib/commons-io-2.4.jar:/var/vm/apps/hadoop-2.6.0/share/hadoop/mapreduce/lib/leveldbjni-all-1.8.jar:/var/vm/apps/hadoop-2.6.0/share/hadoop/mapreduce/lib/jackson-core-asl-1.9.13.jar:/var/vm/apps/hadoop-2.6.0/share/hadoop/mapreduce/lib/avro-1.7.4.jar:/var/vm/apps/hadoop-2.6.0/share/hadoop/mapreduce/lib/guice-servlet-3.0.jar:/var/vm/apps/hadoop-2.6.0/share/hadoop/mapreduce/lib/log4j-1.2.17.jar:/var/vm/apps/hadoop-2.6.0/share/hadoop/mapreduce/lib/junit-4.11.jar:/var/vm/apps/hadoop-2.6.0/share/hadoop/mapreduce/lib/hamcrest-core-1.3.jar:/var/vm/apps/hadoop-2.6.0/share/hadoop/mapreduce/lib/javax.inject-1.jar:/var/vm/apps/hadoop-2.6.0/share/hadoop/mapreduce/lib/jersey-guice-1.9.jar:/var/vm/apps/hadoop-2.6.0/share/hadoop/mapreduce/lib/paranamer-2.3.jar:/var/vm/apps/hadoop-2.6.0/share/hadoop/mapreduce/lib/aopalliance-1.0.jar:/var/vm/apps/hadoop-2.6.0/share/hadoop/mapreduce/lib/jersey-server-1.9.jar:/var/vm/apps/hadoop-2.6.0/share/hadoop/mapreduce/lib/jackson-mapper-asl-1.9.13.jar:/var/vm/apps/hadoop-2.6.0/share/hadoop/mapreduce/lib/hadoop-annotations-2.6.0.jar:/var/vm/apps/hadoop-2.6.0/share/hadoop/mapreduce/lib/snappy-java-1.0.4.1.jar:/var/vm/apps/hadoop-2.6.0/share/hadoop/mapreduce/lib/jersey-core-1.9.jar:/var/vm/apps/hadoop-2.6.0/share/hadoop/mapreduce/hadoop-mapreduce-client-hs-2.6.0.jar:/var/vm/apps/hadoop-2.6.0/share/hadoop/mapreduce/hadoop-mapreduce-examples-2.6.0.jar:/var/vm/apps/hadoop-2.6.0/share/hadoop/mapreduce/hadoop-mapreduce-client-jobclient-2.6.0.jar:/var/vm/apps/hadoop-2.6.0/share/hadoop/mapreduce/hadoop-mapreduce-client-common-2.6.0.jar:/var/vm/apps/hadoop-2.6.0/share/hadoop/mapreduce/hadoop-mapreduce-client-jobclient-2.6.0-tests.jar:/var/vm/apps/hadoop-2.6.0/share/hadoop/mapreduce/hadoop-mapreduce-client-hs-plugins-2.6.0.jar:/var/vm/apps/hadoop-2.6.0/share/hadoop/mapreduce/hadoop-ma
preduce-client-core-2.6.0.jar:/var/vm/apps/hadoop-2.6.0/share/hadoop/mapreduce/hadoop-mapreduce-client-shuffle-2.6.0.jar:/var/vm/apps/hadoop-2.6.0/share/hadoop/mapreduce/hadoop-mapreduce-client-app-2.6.0.jar:/var/vm/apps/hadoop-2.6.0//contrib/capacity-scheduler/*.jar
2015-06-15 10:36:26,920 INFO [main] zookeeper.ZooKeeper: Client environment:java.library.path=/var/vm/apps/hadoop-2.6.0/lib/native
2015-06-15 10:36:26,921 INFO [main] zookeeper.ZooKeeper: Client environment:java.io.tmpdir=/tmp
2015-06-15 10:36:26,921 INFO [main] zookeeper.ZooKeeper: Client environment:java.compiler=
2015-06-15 10:36:26,921 INFO [main] zookeeper.ZooKeeper: Client environment:os.name=Linux
2015-06-15 10:36:26,921 INFO [main] zookeeper.ZooKeeper: Client environment:os.arch=amd64
2015-06-15 10:36:26,921 INFO [main] zookeeper.ZooKeeper: Client environment:os.version=3.2.0-4-amd64
2015-06-15 10:36:26,921 INFO [main] zookeeper.ZooKeeper: Client environment:user.name=visualmeta
2015-06-15 10:36:26,921 INFO [main] zookeeper.ZooKeeper: Client environment:user.home=/home/visualmeta
2015-06-15 10:36:26,921 INFO [main] zookeeper.ZooKeeper: Client environment:user.dir=/home/visualmeta/hbase_export
2015-06-15 10:36:26,922 INFO [main] zookeeper.ZooKeeper: Initiating client connection, connectString=10.1.10.234:2181,10.1.10.232:2181,10.1.10.236:2181 sessionTimeout=90000 watcher=hconnection-0x2b175c00, quorum=10.1.10.234:2181,10.1.10.232:2181,10.1.10.236:2181, baseZNode=/hbase
2015-06-15 10:36:26,943 INFO [main-SendThread(10.1.10.234:2181)] zookeeper.ClientCnxn: Opening socket connection to server 10.1.10.234/10.1.10.234:2181. Will not attempt to authenticate using SASL (unknown error)
2015-06-15 10:36:27,013 INFO [main-SendThread(10.1.10.234:2181)] zookeeper.ClientCnxn: Socket connection established to 10.1.10.234/10.1.10.234:2181, initiating session
2015-06-15 10:36:27,028 INFO [main-SendThread(10.1.10.234:2181)] zookeeper.ClientCnxn: Session establishment complete on server 10.1.10.234/10.1.10.234:2181, sessionid = 0x24de8105ef30317, negotiated timeout = 40000
2015-06-15 10:36:27,142 INFO [main] mapreduce.TableOutputFormat: Created table instance for item_restore
2015-06-15 10:36:29,050 INFO [main] input.FileInputFormat: Total input paths to process : 21
2015-06-15 10:36:29,251 INFO [main] mapreduce.JobSubmitter: number of splits:401
2015-06-15 10:36:29,282 INFO [main] Configuration.deprecation: mapred.cache.files is deprecated. Instead, use mapreduce.job.cache.files
2015-06-15 10:36:29,282 INFO [main] Configuration.deprecation: mapred.working.dir is deprecated. Instead, use mapreduce.job.working.dir
2015-06-15 10:36:29,283 INFO [main] Configuration.deprecation: mapred.job.classpath.files is deprecated. Instead, use mapreduce.job.classpath.files
2015-06-15 10:36:29,283 INFO [main] Configuration.deprecation: io.bytes.per.checksum is deprecated. Instead, use dfs.bytes-per-checksum
2015-06-15 10:36:29,283 INFO [main] Configuration.deprecation: mapred.cache.files.timestamps is deprecated. Instead, use mapreduce.job.cache.files.timestamps
2015-06-15 10:36:29,283 INFO [main] Configuration.deprecation: mapreduce.job.counters.limit is deprecated. Instead, use mapreduce.job.counters.max
2015-06-15 10:36:29,284 INFO [main] Configuration.deprecation: user.name is deprecated. Instead, use mapreduce.job.user.name
2015-06-15 10:36:29,284 INFO [main] Configuration.deprecation: mapred.cache.files.filesizes is deprecated. Instead, use mapreduce.job.cache.files.filesizes
2015-06-15 10:36:29,284 INFO [main] Configuration.deprecation: fs.default.name is deprecated. Instead, use fs.defaultFS
2015-06-15 10:36:29,399 INFO [main] mapreduce.JobSubmitter: Submitting tokens for job: job_1434117507786_0029
2015-06-15 10:36:29,598 INFO [main] impl.YarnClientImpl: Submitted application application_1434117507786_0029 to ResourceManager at /10.1.10.234:8032
2015-06-15 10:36:29,627 INFO [main] mapreduce.Job: The url to track the job: http://http://test-234:8088/proxy/application_1434117507786_0029/
2015-06-15 10:36:29,628 INFO [main] mapreduce.Job: Running job: job_1434117507786_0029
2015-06-15 10:36:35,873 INFO [main] mapreduce.Job: Job job_1434117507786_0029 running in uber mode : false
2015-06-15 10:36:35,874 INFO [main] mapreduce.Job: map 0% reduce 0%
2015-06-15 10:36:43,994 INFO [main] mapreduce.Job: Task Id : attempt_1434117507786_0029_m_000000_0, Status : FAILED
Error: org.apache.hadoop.hbase.client.RetriesExhaustedWithDetailsException: Failed 765 actions: item_restore: 765 times,
    at org.apache.hadoop.hbase.client.AsyncProcess$BatchErrors.makeException(AsyncProcess.java:203)
    at org.apache.hadoop.hbase.client.AsyncProcess$BatchErrors.access$500(AsyncProcess.java:187)
    at org.apache.hadoop.hbase.client.AsyncProcess.getErrors(AsyncProcess.java:922)
    at org.apache.hadoop.hbase.client.HTable.backgroundFlushCommits(HTable.java:1017)
    at org.apache.hadoop.hbase.client.HTable.doPut(HTable.java:980)
    at org.apache.hadoop.hbase.client.HTable.put(HTable.java:941)
    at org.apache.hadoop.hbase.mapreduce.TableOutputFormat$TableRecordWriter.write(TableOutputFormat.java:126)
    at org.apache.hadoop.hbase.mapreduce.TableOutputFormat$TableRecordWriter.write(TableOutputFormat.java:87)
    at org.apache.hadoop.mapred.MapTask$NewDirectOutputCollector.write(MapTask.java:655)
    at org.apache.hadoop.mapreduce.task.TaskInputOutputContextImpl.write(TaskInputOutputContextImpl.java:89)
    at org.apache.hadoop.mapreduce.lib.map.WrappedMapper$Context.write(WrappedMapper.java:112)
    at org.apache.hadoop.hbase.mapreduce.Import$Importer.processKV(Import.java:202)
    at org.apache.hadoop.hbase.mapreduce.Import$Importer.writeResult(Import.java:157)
    at org.apache.hadoop.hbase.mapreduce.Import$Importer.map(Import.java:142)
    at org.apache.hadoop.hbase.mapreduce.Import$Importer.map(Import.java:125)
    at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:145)
    at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:784)
    at org.apache.hadoop.mapred.MapTask.run(MapTask.java:341)
    at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:163)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:422)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1628)
    at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:158)
2015-06-15 10:36:44,017 INFO [main] mapreduce.Job: Task Id : attempt_1434117507786_0029_m_000007_0, Status : FAILED
Error: org.apache.hadoop.hbase.client.RetriesExhaustedWithDetailsException: Failed 450 actions: item_restore: 450 times,
    at org.apache.hadoop.hbase.client.AsyncProcess$BatchErrors.makeException(AsyncProcess.java:203)
    at org.apache.hadoop.hbase.client.AsyncProcess$BatchErrors.access$500(AsyncProcess.java:187)
    at org.apache.hadoop.hbase.client.AsyncProcess.getErrors(AsyncProcess.java:922)
    at org.apache.hadoop.hbase.client.HTable.backgroundFlushCommits(HTable.java:1017)
    at org.apache.hadoop.hbase.client.HTable.doPut(HTable.java:980)
    at org.apache.hadoop.hbase.client.HTable.put(HTable.java:941)
    at org.apache.hadoop.hbase.mapreduce.TableOutputFormat$TableRecordWriter.write(TableOutputFormat.java:126)
    at org.apache.hadoop.hbase.mapreduce.TableOutputFormat$TableRecordWriter.write(TableOutputFormat.java:87)
    at org.apache.hadoop.mapred.MapTask$NewDirectOutputCollector.write(MapTask.java:655)
    at org.apache.hadoop.mapreduce.task.TaskInputOutputContextImpl.write(TaskInputOutputContextImpl.java:89)
    at org.apache.hadoop.mapreduce.lib.map.WrappedMapper$Context.write(WrappedMapper.java:112)
    at org.apache.hadoop.hbase.mapreduce.Import$Importer.processKV(Import.java:202)
    at org.apache.hadoop.hbase.mapreduce.Import$Importer.writeResult(Import.java:157)
    at org.apache.hadoop.hbase.mapreduce.Import$Importer.map(Import.java:142)
    at org.apache.hadoop.hbase.mapreduce.Import$Importer.map(Import.java:125)
    at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:145)
    at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:784)
    at org.apache.hadoop.mapred.MapTask.run(MapTask.java:341)
    at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:163)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:422)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1628)
    at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:158)
2015-06-15 10:36:44,021 INFO [main] mapreduce.Job: Task Id : attempt_1434117507786_0029_m_000026_0, Status : FAILED
Error: org.apache.hadoop.hbase.client.RetriesExhaustedWithDetailsException: Failed 589 actions: item_restore: 589 times,
    at org.apache.hadoop.hbase.client.AsyncProcess$BatchErrors.makeException(AsyncProcess.java:203)
    at org.apache.hadoop.hbase.client.AsyncProcess$BatchErrors.access$500(AsyncProcess.java:187)
    at org.apache.hadoop.hbase.client.AsyncProcess.getErrors(AsyncProcess.java:922)
    at org.apache.hadoop.hbase.client.HTable.backgroundFlushCommits(HTable.java:1017)
    at org.apache.hadoop.hbase.client.HTable.doPut(HTable.java:980)
    at org.apache.hadoop.hbase.client.HTable.put(HTable.java:941)
    at org.apache.hadoop.hbase.mapreduce.TableOutputFormat$TableRecordWriter.write(TableOutputFormat.java:126)
    at org.apache.hadoop.hbase.mapreduce.TableOutputFormat$TableRecordWriter.write(TableOutputFormat.java:87)
    at org.apache.hadoop.mapred.MapTask$NewDirectOutputCollector.write(MapTask.java:655)
    at org.apache.hadoop.mapreduce.task.TaskInputOutputContextImpl.write(TaskInputOutputContextImpl.java:89)
    at org.apache.hadoop.mapreduce.lib.map.WrappedMapper$Context.write(WrappedMapper.java:112)
    at org.apache.hadoop.hbase.mapreduce.Import$Importer.processKV(Import.java:202)
    at org.apache.hadoop.hbase.mapreduce.Import$Importer.writeResult(Import.java:157)
    at org.apache.hadoop.hbase.mapreduce.Import$Importer.map(Import.java:142)
    at org.apache.hadoop.hbase.mapreduce.Import$Importer.map(Import.java:125)
    at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:145)
    at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:784)
    at org.apache.hadoop.mapred.MapTask.run(MapTask.java:341)
    at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:163)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:422)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1628)
    at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:158)
2015-06-15 10:36:44,024 INFO [main] mapreduce.Job: Task Id : attempt_1434117507786_0029_m_000016_0, Status : FAILED
Error: org.apache.hadoop.hbase.client.RetriesExhaustedWithDetailsException: Failed 612 actions: item_restore: 612 times,
    at org.apache.hadoop.hbase.client.AsyncProcess$BatchErrors.makeException(AsyncProcess.java:203)
    at org.apache.hadoop.hbase.client.AsyncProcess$BatchErrors.access$500(AsyncProcess.java:187)
    at org.apache.hadoop.hbase.client.AsyncProcess.getErrors(AsyncProcess.java:922)
    at org.apache.hadoop.hbase.client.HTable.backgroundFlushCommits(HTable.java:1017)
    at org.apache.hadoop.hbase.client.HTable.doPut(HTable.java:980)
    at org.apache.hadoop.hbase.client.HTable.put(HTable.java:941)
    at org.apache.hadoop.hbase.mapreduce.TableOutputFormat$TableRecordWriter.write(TableOutputFormat.java:126)
    at org.apache.hadoop.hbase.mapreduce.TableOutputFormat$TableRecordWriter.write(TableOutputFormat.java:87)
    at org.apache.hadoop.mapred.MapTask$NewDirectOutputCollector.write(MapTask.java:655)
    at org.apache.hadoop.mapreduce.task.TaskInputOutputContextImpl.write(TaskInputOutputContextImpl.java:89)
    at org.apache.hadoop.mapreduce.lib.map.WrappedMapper$Context.write(WrappedMapper.java:112)
    at org.apache.hadoop.hbase.mapreduce.Import$Importer.processKV(Import.java:202)
    at org.apache.hadoop.hbase.mapreduce.Import$Importer.writeResult(Import.java:157)
    at org.apache.hadoop.hbase.mapreduce.Import$Importer.map(Import.java:142)
    at org.apache.hadoop.hbase.mapreduce.Import$Importer.map(Import.java:125)
    at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:145)
    at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:784)
    at org.apache.hadoop.mapred.MapTask.run(MapTask.java:341)
    at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:163)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:422)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1628)
    at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:158)
2015-06-15 10:36:44,026 INFO [main] mapreduce.Job: Task Id : attempt_1434117507786_0029_m_000002_0, Status : FAILED
Error: org.apache.hadoop.hbase.client.RetriesExhaustedWithDetailsException: Failed 634 actions: item_restore: 634 times,
    at org.apache.hadoop.hbase.client.AsyncProcess$BatchErrors.makeException(AsyncProcess.java:203)
    at org.apache.hadoop.hbase.client.AsyncProcess$BatchErrors.access$500(AsyncProcess.java:187)
    at org.apache.hadoop.hbase.client.AsyncProcess.getErrors(AsyncProcess.java:922)
    at org.apache.hadoop.hbase.client.HTable.backgroundFlushCommits(HTable.java:1017)
    at org.apache.hadoop.hbase.client.HTable.doPut(HTable.java:980)
    at org.apache.hadoop.hbase.client.HTable.put(HTable.java:941)
    at org.apache.hadoop.hbase.mapreduce.TableOutputFormat$TableRecordWriter.write(TableOutputFormat.java:126)
    at org.apache.hadoop.hbase.mapreduce.TableOutputFormat$TableRecordWriter.write(TableOutputFormat.java:87)
    at org.apache.hadoop.mapred.MapTask$NewDirectOutputCollector.write(MapTask.java:655)
    at org.apache.hadoop.mapreduce.task.TaskInputOutputContextImpl.write(TaskInputOutputContextImpl.java:89)
    at org.apache.hadoop.mapreduce.lib.map.WrappedMapper$Context.write(WrappedMapper.java:112)
    at org.apache.hadoop.hbase.mapreduce.Import$Importer.processKV(Import.java:202)
    at org.apache.hadoop.hbase.mapreduce.Import$Importer.writeResult(Import.java:157)
    at org.apache.hadoop.hbase.mapreduce.Import$Importer.map(Import.java:142)
    at org.apache.hadoop.hbase.mapreduce.Import$Importer.map(Import.java:125)
    at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:145)
    at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:784)
    at org.apache.hadoop.mapred.MapTask.run(MapTask.java:341)
    at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:163)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:422)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1628)
    at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:158)
2015-06-15 10:36:44,028 INFO [main] mapreduce.Job: Task Id : attempt_1434117507786_0029_m_000003_0, Status : FAILED
Error: org.apache.hadoop.hbase.client.RetriesExhaustedWithDetailsException: Failed 605 actions: item_restore: 605 times,
    at org.apache.hadoop.hbase.client.AsyncProcess$BatchErrors.makeException(AsyncProcess.java:203)
    at org.apache.hadoop.hbase.client.AsyncProcess$BatchErrors.access$500(AsyncProcess.java:187)
    at org.apache.hadoop.hbase.client.AsyncProcess.getErrors(AsyncProcess.java:922)
    at org.apache.hadoop.hbase.client.HTable.backgroundFlushCommits(HTable.java:1017)
    at org.apache.hadoop.hbase.client.HTable.doPut(HTable.java:980)
    at org.apache.hadoop.hbase.client.HTable.put(HTable.java:941)
    at org.apache.hadoop.hbase.mapreduce.TableOutputFormat$TableRecordWriter.write(TableOutputFormat.java:126)
    at org.apache.hadoop.hbase.mapreduce.TableOutputFormat$TableRecordWriter.write(TableOutputFormat.java:87)
    at org.apache.hadoop.mapred.MapTask$NewDirectOutputCollector.write(MapTask.java:655)
    at org.apache.hadoop.mapreduce.task.TaskInputOutputContextImpl.write(TaskInputOutputContextImpl.java:89)
    at org.apache.hadoop.mapreduce.lib.map.WrappedMapper$Context.write(WrappedMapper.java:112)
    at org.apache.hadoop.hbase.mapreduce.Import$Importer.processKV(Import.java:202)
    at org.apache.hadoop.hbase.mapreduce.Import$Importer.writeResult(Import.java:157)
    at org.apache.hadoop.hbase.mapreduce.Import$Importer.map(Import.java:142)
    at org.apache.hadoop.hbase.mapreduce.Import$Importer.map(Import.java:125)
    at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:145)
    at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:784)
    at org.apache.hadoop.mapred.MapTask.run(MapTask.java:341)
    at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:163)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:422)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1628)
    at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:158)
2015-06-15 10:36:44,031 INFO [main] mapreduce.Job: Task Id : attempt_1434117507786_0029_m_000001_0, Status : FAILED
Error: org.apache.hadoop.hbase.client.RetriesExhaustedWithDetailsException: Failed 717 actions: item_restore: 717 times,
    at org.apache.hadoop.hbase.client.AsyncProcess$BatchErrors.makeException(AsyncProcess.java:203)
    at org.apache.hadoop.hbase.client.AsyncProcess$BatchErrors.access$500(AsyncProcess.java:187)
    at org.apache.hadoop.hbase.client.AsyncProcess.getErrors(AsyncProcess.java:922)
    at org.apache.hadoop.hbase.client.HTable.backgroundFlushCommits(HTable.java:1017)
    at org.apache.hadoop.hbase.client.HTable.doPut(HTable.java:980)
    at org.apache.hadoop.hbase.client.HTable.put(HTable.java:941)
    at org.apache.hadoop.hbase.mapreduce.TableOutputFormat$TableRecordWriter.write(TableOutputFormat.java:126)
    at org.apache.hadoop.hbase.mapreduce.TableOutputFormat$TableRecordWriter.write(TableOutputFormat.java:87)
    at org.apache.hadoop.mapred.MapTask$NewDirectOutputCollector.write(MapTask.java:655)
    at org.apache.hadoop.mapreduce.task.TaskInputOutputContextImpl.write(TaskInputOutputContextImpl.java:89)
    at org.apache.hadoop.mapreduce.lib.map.WrappedMapper$Context.write(WrappedMapper.java:112)
    at org.apache.hadoop.hbase.mapreduce.Import$Importer.processKV(Import.java:202)
    at org.apache.hadoop.hbase.mapreduce.Import$Importer.writeResult(Import.java:157)
    at org.apache.hadoop.hbase.mapreduce.Import$Importer.map(Import.java:142)
    at org.apache.hadoop.hbase.mapreduce.Import$Importer.map(Import.java:125)
    at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:145)
    at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:784)
    at org.apache.hadoop.mapred.MapTask.run(MapTask.java:341)
    at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:163)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:422)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1628)
    at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:158)
2015-06-15 10:36:44,033 INFO [main] mapreduce.Job: Task Id : attempt_1434117507786_0029_m_000008_0, Status : FAILED
Error: org.apache.hadoop.hbase.client.RetriesExhaustedWithDetailsException: Failed 561 actions: item_restore: 561 times,
    at org.apache.hadoop.hbase.client.AsyncProcess$BatchErrors.makeException(AsyncProcess.java:203)
    at org.apache.hadoop.hbase.client.AsyncProcess$BatchErrors.access$500(AsyncProcess.java:187)
    at org.apache.hadoop.hbase.client.AsyncProcess.getErrors(AsyncProcess.java:922)
    at org.apache.hadoop.hbase.client.HTable.backgroundFlushCommits(HTable.java:1017)
    at org.apache.hadoop.hbase.client.HTable.doPut(HTable.java:980)
    at org.apache.hadoop.hbase.client.HTable.put(HTable.java:941)
    at org.apache.hadoop.hbase.mapreduce.TableOutputFormat$TableRecordWriter.write(TableOutputFormat.java:126)
    at org.apache.hadoop.hbase.mapreduce.TableOutputFormat$TableRecordWriter.write(TableOutputFormat.java:87)
    at org.apache.hadoop.mapred.MapTask$NewDirectOutputCollector.write(MapTask.java:655)
    at org.apache.hadoop.mapreduce.task.TaskInputOutputContextImpl.write(TaskInputOutputContextImpl.java:89)
    at org.apache.hadoop.mapreduce.lib.map.WrappedMapper$Context.write(WrappedMapper.java:112)
    at org.apache.hadoop.hbase.mapreduce.Import$Importer.processKV(Import.java:202)
    at org.apache.hadoop.hbase.mapreduce.Import$Importer.writeResult(Import.java:157)
    at org.apache.hadoop.hbase.mapreduce.Import$Importer.map(Import.java:142)
    at org.apache.hadoop.hbase.mapreduce.Import$Importer.map(Import.java:125)
    at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:145)
    at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:784)
    at org.apache.hadoop.mapred.MapTask.run(MapTask.java:341)
    at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:163)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:422)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1628)
    at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:158)
2015-06-15 10:36:44,035 INFO [main] mapreduce.Job: Task Id : attempt_1434117507786_0029_m_000046_0, Status : FAILED
Error: org.apache.hadoop.hbase.client.RetriesExhaustedWithDetailsException: Failed 669 actions: item_restore: 669 times,
    at org.apache.hadoop.hbase.client.AsyncProcess$BatchErrors.makeException(AsyncProcess.java:203)
    at org.apache.hadoop.hbase.client.AsyncProcess$BatchErrors.access$500(AsyncProcess.java:187)
    at org.apache.hadoop.hbase.client.AsyncProcess.getErrors(AsyncProcess.java:922)
    at org.apache.hadoop.hbase.client.HTable.backgroundFlushCommits(HTable.java:1017) at
org.apache.hadoop.hbase.client.HTable.doPut(HTable.java:980) at org.apache.hadoop.hbase.client.HTable.put(HTable.java:941) at org.apache.hadoop.hbase.mapreduce.TableOutputFormat$TableRecordWriter.write(TableOutputFormat.java:126) at org.apache.hadoop.hbase.mapreduce.TableOutputFormat$TableRecordWriter.write(TableOutputFormat.java:87) at org.apache.hadoop.mapred.MapTask$NewDirectOutputCollector.write(MapTask.java:655) at org.apache.hadoop.mapreduce.task.TaskInputOutputContextImpl.write(TaskInputOutputContextImpl.java:89) at org.apache.hadoop.mapreduce.lib.map.WrappedMapper$Context.write(WrappedMapper.java:112) at org.apache.hadoop.hbase.mapreduce.Import$Importer.processKV(Import.java:202) at org.apache.hadoop.hbase.mapreduce.Import$Importer.writeResult(Import.java:157) at org.apache.hadoop.hbase.mapreduce.Import$Importer.map(Import.java:142) at org.apache.hadoop.hbase.mapreduce.Import$Importer.map(Import.java:125) at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:145) at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:784) at org.apache.hadoop.mapred.MapTask.run(MapTask.java:341) at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:163) at java.security.AccessController.doPrivileged(Native Method) at javax.security.auth.Subject.doAs(Subject.java:422) at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1628) at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:158) 2015-06-15 10:36:44,044 INFO [main] mapreduce.Job: Task Id : attempt_1434117507786_0029_m_000006_0, Status : FAILED Error: org.apache.hadoop.hbase.client.RetriesExhaustedWithDetailsException: Failed 767 actions: item_restore: 767 times, at org.apache.hadoop.hbase.client.AsyncProcess$BatchErrors.makeException(AsyncProcess.java:203) at org.apache.hadoop.hbase.client.AsyncProcess$BatchErrors.access$500(AsyncProcess.java:187) at org.apache.hadoop.hbase.client.AsyncProcess.getErrors(AsyncProcess.java:922) at org.apache.hadoop.hbase.client.HTable.backgroundFlushCommits(HTable.java:1017) at org.apache.hadoop.hbase.client.HTable.doPut(HTable.java:980) at org.apache.hadoop.hbase.client.HTable.put(HTable.java:941) at org.apache.hadoop.hbase.mapreduce.TableOutputFormat$TableRecordWriter.write(TableOutputFormat.java:126) at org.apache.hadoop.hbase.mapreduce.TableOutputFormat$TableRecordWriter.write(TableOutputFormat.java:87) at org.apache.hadoop.mapred.MapTask$NewDirectOutputCollector.write(MapTask.java:655) at org.apache.hadoop.mapreduce.task.TaskInputOutputContextImpl.write(TaskInputOutputContextImpl.java:89) at org.apache.hadoop.mapreduce.lib.map.WrappedMapper$Context.write(WrappedMapper.java:112) at org.apache.hadoop.hbase.mapreduce.Import$Importer.processKV(Import.java:202) at org.apache.hadoop.hbase.mapreduce.Import$Importer.writeResult(Import.java:157) at org.apache.hadoop.hbase.mapreduce.Import$Importer.map(Import.java:142) at org.apache.hadoop.hbase.mapreduce.Import$Importer.map(Import.java:125) at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:145) at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:784) at org.apache.hadoop.mapred.MapTask.run(MapTask.java:341) at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:163) at java.security.AccessController.doPrivileged(Native Method) at javax.security.auth.Subject.doAs(Subject.java:422) at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1628) at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:158) 2015-06-15 10:36:45,061 INFO [main] mapreduce.Job: Task Id : 
attempt_1434117507786_0029_m_000009_0, Status : FAILED Error: org.apache.hadoop.hbase.client.RetriesExhaustedWithDetailsException: Failed 556 actions: item_restore: 556 times, at org.apache.hadoop.hbase.client.AsyncProcess$BatchErrors.makeException(AsyncProcess.java:203) at org.apache.hadoop.hbase.client.AsyncProcess$BatchErrors.access$500(AsyncProcess.java:187) at org.apache.hadoop.hbase.client.AsyncProcess.getErrors(AsyncProcess.java:922) at org.apache.hadoop.hbase.client.HTable.backgroundFlushCommits(HTable.java:1017) at org.apache.hadoop.hbase.client.HTable.doPut(HTable.java:980) at org.apache.hadoop.hbase.client.HTable.put(HTable.java:941) at org.apache.hadoop.hbase.mapreduce.TableOutputFormat$TableRecordWriter.write(TableOutputFormat.java:126) at org.apache.hadoop.hbase.mapreduce.TableOutputFormat$TableRecordWriter.write(TableOutputFormat.java:87) at org.apache.hadoop.mapred.MapTask$NewDirectOutputCollector.write(MapTask.java:655) at org.apache.hadoop.mapreduce.task.TaskInputOutputContextImpl.write(TaskInputOutputContextImpl.java:89) at org.apache.hadoop.mapreduce.lib.map.WrappedMapper$Context.write(WrappedMapper.java:112) at org.apache.hadoop.hbase.mapreduce.Import$Importer.processKV(Import.java:202) at org.apache.hadoop.hbase.mapreduce.Import$Importer.writeResult(Import.java:157) at org.apache.hadoop.hbase.mapreduce.Import$Importer.map(Import.java:142) at org.apache.hadoop.hbase.mapreduce.Import$Importer.map(Import.java:125) at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:145) at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:784) at org.apache.hadoop.mapred.MapTask.run(MapTask.java:341) at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:163) at java.security.AccessController.doPrivileged(Native Method) at javax.security.auth.Subject.doAs(Subject.java:422) at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1628) at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:158) 2015-06-15 10:36:45,064 INFO [main] mapreduce.Job: Task Id : attempt_1434117507786_0029_m_000036_0, Status : FAILED Error: org.apache.hadoop.hbase.client.RetriesExhaustedWithDetailsException: Failed 864 actions: item_restore: 864 times, at org.apache.hadoop.hbase.client.AsyncProcess$BatchErrors.makeException(AsyncProcess.java:203) at org.apache.hadoop.hbase.client.AsyncProcess$BatchErrors.access$500(AsyncProcess.java:187) at org.apache.hadoop.hbase.client.AsyncProcess.getErrors(AsyncProcess.java:922) at org.apache.hadoop.hbase.client.HTable.backgroundFlushCommits(HTable.java:1017) at org.apache.hadoop.hbase.client.HTable.doPut(HTable.java:980) at org.apache.hadoop.hbase.client.HTable.put(HTable.java:941) at org.apache.hadoop.hbase.mapreduce.TableOutputFormat$TableRecordWriter.write(TableOutputFormat.java:126) at org.apache.hadoop.hbase.mapreduce.TableOutputFormat$TableRecordWriter.write(TableOutputFormat.java:87) at org.apache.hadoop.mapred.MapTask$NewDirectOutputCollector.write(MapTask.java:655) at org.apache.hadoop.mapreduce.task.TaskInputOutputContextImpl.write(TaskInputOutputContextImpl.java:89) at org.apache.hadoop.mapreduce.lib.map.WrappedMapper$Context.write(WrappedMapper.java:112) at org.apache.hadoop.hbase.mapreduce.Import$Importer.processKV(Import.java:202) at org.apache.hadoop.hbase.mapreduce.Import$Importer.writeResult(Import.java:157) at org.apache.hadoop.hbase.mapreduce.Import$Importer.map(Import.java:142) at org.apache.hadoop.hbase.mapreduce.Import$Importer.map(Import.java:125) at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:145) at 
org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:784) at org.apache.hadoop.mapred.MapTask.run(MapTask.java:341) at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:163) at java.security.AccessController.doPrivileged(Native Method) at javax.security.auth.Subject.doAs(Subject.java:422) at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1628) at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:158) Container killed by the ApplicationMaster. 2015-06-15 10:36:45,066 INFO [main] mapreduce.Job: Task Id : attempt_1434117507786_0029_m_000005_0, Status : FAILED Error: org.apache.hadoop.hbase.client.RetriesExhaustedWithDetailsException: Failed 632 actions: item_restore: 632 times, at org.apache.hadoop.hbase.client.AsyncProcess$BatchErrors.makeException(AsyncProcess.java:203) at org.apache.hadoop.hbase.client.AsyncProcess$BatchErrors.access$500(AsyncProcess.java:187) at org.apache.hadoop.hbase.client.AsyncProcess.getErrors(AsyncProcess.java:922) at org.apache.hadoop.hbase.client.HTable.backgroundFlushCommits(HTable.java:1017) at org.apache.hadoop.hbase.client.HTable.doPut(HTable.java:980) at org.apache.hadoop.hbase.client.HTable.put(HTable.java:941) at org.apache.hadoop.hbase.mapreduce.TableOutputFormat$TableRecordWriter.write(TableOutputFormat.java:126) at org.apache.hadoop.hbase.mapreduce.TableOutputFormat$TableRecordWriter.write(TableOutputFormat.java:87) at org.apache.hadoop.mapred.MapTask$NewDirectOutputCollector.write(MapTask.java:655) at org.apache.hadoop.mapreduce.task.TaskInputOutputContextImpl.write(TaskInputOutputContextImpl.java:89) at org.apache.hadoop.mapreduce.lib.map.WrappedMapper$Context.write(WrappedMapper.java:112) at org.apache.hadoop.hbase.mapreduce.Import$Importer.processKV(Import.java:202) at org.apache.hadoop.hbase.mapreduce.Import$Importer.writeResult(Import.java:157) at org.apache.hadoop.hbase.mapreduce.Import$Importer.map(Import.java:142) at org.apache.hadoop.hbase.mapreduce.Import$Importer.map(Import.java:125) at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:145) at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:784) at org.apache.hadoop.mapred.MapTask.run(MapTask.java:341) at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:163) at java.security.AccessController.doPrivileged(Native Method) at javax.security.auth.Subject.doAs(Subject.java:422) at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1628) at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:158) 2015-06-15 10:36:45,068 INFO [main] mapreduce.Job: Task Id : attempt_1434117507786_0029_m_000004_0, Status : FAILED Error: org.apache.hadoop.hbase.client.RetriesExhaustedWithDetailsException: Failed 884 actions: item_restore: 884 times, at org.apache.hadoop.hbase.client.AsyncProcess$BatchErrors.makeException(AsyncProcess.java:203) at org.apache.hadoop.hbase.client.AsyncProcess$BatchErrors.access$500(AsyncProcess.java:187) at org.apache.hadoop.hbase.client.AsyncProcess.getErrors(AsyncProcess.java:922) at org.apache.hadoop.hbase.client.HTable.backgroundFlushCommits(HTable.java:1017) at org.apache.hadoop.hbase.client.HTable.doPut(HTable.java:980) at org.apache.hadoop.hbase.client.HTable.put(HTable.java:941) at org.apache.hadoop.hbase.mapreduce.TableOutputFormat$TableRecordWriter.write(TableOutputFormat.java:126) at org.apache.hadoop.hbase.mapreduce.TableOutputFormat$TableRecordWriter.write(TableOutputFormat.java:87) at org.apache.hadoop.mapred.MapTask$NewDirectOutputCollector.write(MapTask.java:655) at 
org.apache.hadoop.mapreduce.task.TaskInputOutputContextImpl.write(TaskInputOutputContextImpl.java:89) at org.apache.hadoop.mapreduce.lib.map.WrappedMapper$Context.write(WrappedMapper.java:112) at org.apache.hadoop.hbase.mapreduce.Import$Importer.processKV(Import.java:202) at org.apache.hadoop.hbase.mapreduce.Import$Importer.writeResult(Import.java:157) at org.apache.hadoop.hbase.mapreduce.Import$Importer.map(Import.java:142) at org.apache.hadoop.hbase.mapreduce.Import$Importer.map(Import.java:125) at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:145) at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:784) at org.apache.hadoop.mapred.MapTask.run(MapTask.java:341) at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:163) at java.security.AccessController.doPrivileged(Native Method) at javax.security.auth.Subject.doAs(Subject.java:422) at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1628) at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:158) 2015-06-15 10:36:45,069 INFO [main] mapreduce.Job: Task Id : attempt_1434117507786_0029_m_000012_0, Status : FAILED Error: org.apache.hadoop.hbase.client.RetriesExhaustedWithDetailsException: Failed 635 actions: item_restore: 635 times, at org.apache.hadoop.hbase.client.AsyncProcess$BatchErrors.makeException(AsyncProcess.java:203) at org.apache.hadoop.hbase.client.AsyncProcess$BatchErrors.access$500(AsyncProcess.java:187) at org.apache.hadoop.hbase.client.AsyncProcess.getErrors(AsyncProcess.java:922) at org.apache.hadoop.hbase.client.HTable.backgroundFlushCommits(HTable.java:1017) at org.apache.hadoop.hbase.client.HTable.doPut(HTable.java:980) at org.apache.hadoop.hbase.client.HTable.put(HTable.java:941) at org.apache.hadoop.hbase.mapreduce.TableOutputFormat$TableRecordWriter.write(TableOutputFormat.java:126) at org.apache.hadoop.hbase.mapreduce.TableOutputFormat$TableRecordWriter.write(TableOutputFormat.java:87) at org.apache.hadoop.mapred.MapTask$NewDirectOutputCollector.write(MapTask.java:655) at org.apache.hadoop.mapreduce.task.TaskInputOutputContextImpl.write(TaskInputOutputContextImpl.java:89) at org.apache.hadoop.mapreduce.lib.map.WrappedMapper$Context.write(WrappedMapper.java:112) at org.apache.hadoop.hbase.mapreduce.Import$Importer.processKV(Import.java:202) at org.apache.hadoop.hbase.mapreduce.Import$Importer.writeResult(Import.java:157) at org.apache.hadoop.hbase.mapreduce.Import$Importer.map(Import.java:142) at org.apache.hadoop.hbase.mapreduce.Import$Importer.map(Import.java:125) at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:145) at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:784) at org.apache.hadoop.mapred.MapTask.run(MapTask.java:341) at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:163) at java.security.AccessController.doPrivileged(Native Method) at javax.security.auth.Subject.doAs(Subject.java:422) at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1628) at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:158) 2015-06-15 10:36:45,072 INFO [main] mapreduce.Job: Task Id : attempt_1434117507786_0029_m_000057_0, Status : FAILED Error: org.apache.hadoop.hbase.client.RetriesExhaustedWithDetailsException: Failed 659 actions: item_restore: 659 times, at org.apache.hadoop.hbase.client.AsyncProcess$BatchErrors.makeException(AsyncProcess.java:203) at org.apache.hadoop.hbase.client.AsyncProcess$BatchErrors.access$500(AsyncProcess.java:187) at 
org.apache.hadoop.hbase.client.AsyncProcess.getErrors(AsyncProcess.java:922) at org.apache.hadoop.hbase.client.HTable.backgroundFlushCommits(HTable.java:1017) at org.apache.hadoop.hbase.client.HTable.doPut(HTable.java:980) at org.apache.hadoop.hbase.client.HTable.put(HTable.java:941) at org.apache.hadoop.hbase.mapreduce.TableOutputFormat$TableRecordWriter.write(TableOutputFormat.java:126) at org.apache.hadoop.hbase.mapreduce.TableOutputFormat$TableRecordWriter.write(TableOutputFormat.java:87) at org.apache.hadoop.mapred.MapTask$NewDirectOutputCollector.write(MapTask.java:655) at org.apache.hadoop.mapreduce.task.TaskInputOutputContextImpl.write(TaskInputOutputContextImpl.java:89) at org.apache.hadoop.mapreduce.lib.map.WrappedMapper$Context.write(WrappedMapper.java:112) at org.apache.hadoop.hbase.mapreduce.Import$Importer.processKV(Import.java:202) at org.apache.hadoop.hbase.mapreduce.Import$Importer.writeResult(Import.java:157) at org.apache.hadoop.hbase.mapreduce.Import$Importer.map(Import.java:142) at org.apache.hadoop.hbase.mapreduce.Import$Importer.map(Import.java:125) at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:145) at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:784) at org.apache.hadoop.mapred.MapTask.run(MapTask.java:341) at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:163) at java.security.AccessController.doPrivileged(Native Method) at javax.security.auth.Subject.doAs(Subject.java:422) at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1628) at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:158) 2015-06-15 10:36:45,076 INFO [main] mapreduce.Job: Task Id : attempt_1434117507786_0029_m_000027_0, Status : FAILED Error: org.apache.hadoop.hbase.client.RetriesExhaustedWithDetailsException: Failed 617 actions: item_restore: 617 times, at org.apache.hadoop.hbase.client.AsyncProcess$BatchErrors.makeException(AsyncProcess.java:203) at org.apache.hadoop.hbase.client.AsyncProcess$BatchErrors.access$500(AsyncProcess.java:187) at org.apache.hadoop.hbase.client.AsyncProcess.getErrors(AsyncProcess.java:922) at org.apache.hadoop.hbase.client.HTable.backgroundFlushCommits(HTable.java:1017) at org.apache.hadoop.hbase.client.HTable.doPut(HTable.java:980) at org.apache.hadoop.hbase.client.HTable.put(HTable.java:941) at org.apache.hadoop.hbase.mapreduce.TableOutputFormat$TableRecordWriter.write(TableOutputFormat.java:126) at org.apache.hadoop.hbase.mapreduce.TableOutputFormat$TableRecordWriter.write(TableOutputFormat.java:87) at org.apache.hadoop.mapred.MapTask$NewDirectOutputCollector.write(MapTask.java:655) at org.apache.hadoop.mapreduce.task.TaskInputOutputContextImpl.write(TaskInputOutputContextImpl.java:89) at org.apache.hadoop.mapreduce.lib.map.WrappedMapper$Context.write(WrappedMapper.java:112) at org.apache.hadoop.hbase.mapreduce.Import$Importer.processKV(Import.java:202) at org.apache.hadoop.hbase.mapreduce.Import$Importer.writeResult(Import.java:157) at org.apache.hadoop.hbase.mapreduce.Import$Importer.map(Import.java:142) at org.apache.hadoop.hbase.mapreduce.Import$Importer.map(Import.java:125) at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:145) at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:784) at org.apache.hadoop.mapred.MapTask.run(MapTask.java:341) at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:163) at java.security.AccessController.doPrivileged(Native Method) at javax.security.auth.Subject.doAs(Subject.java:422) at 
org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1628) at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:158) 2015-06-15 10:36:46,087 INFO [main] mapreduce.Job: Task Id : attempt_1434117507786_0029_m_000029_0, Status : FAILED Error: org.apache.hadoop.hbase.client.RetriesExhaustedWithDetailsException: Failed 509 actions: item_restore: 509 times, at org.apache.hadoop.hbase.client.AsyncProcess$BatchErrors.makeException(AsyncProcess.java:203) at org.apache.hadoop.hbase.client.AsyncProcess$BatchErrors.access$500(AsyncProcess.java:187) at org.apache.hadoop.hbase.client.AsyncProcess.getErrors(AsyncProcess.java:922) at org.apache.hadoop.hbase.client.HTable.backgroundFlushCommits(HTable.java:1017) at org.apache.hadoop.hbase.client.HTable.doPut(HTable.java:980) at org.apache.hadoop.hbase.client.HTable.put(HTable.java:941) at org.apache.hadoop.hbase.mapreduce.TableOutputFormat$TableRecordWriter.write(TableOutputFormat.java:126) at org.apache.hadoop.hbase.mapreduce.TableOutputFormat$TableRecordWriter.write(TableOutputFormat.java:87) at org.apache.hadoop.mapred.MapTask$NewDirectOutputCollector.write(MapTask.java:655) at org.apache.hadoop.mapreduce.task.TaskInputOutputContextImpl.write(TaskInputOutputContextImpl.java:89) at org.apache.hadoop.mapreduce.lib.map.WrappedMapper$Context.write(WrappedMapper.java:112) at org.apache.hadoop.hbase.mapreduce.Import$Importer.processKV(Import.java:202) at org.apache.hadoop.hbase.mapreduce.Import$Importer.writeResult(Import.java:157) at org.apache.hadoop.hbase.mapreduce.Import$Importer.map(Import.java:142) at org.apache.hadoop.hbase.mapreduce.Import$Importer.map(Import.java:125) at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:145) at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:784) at org.apache.hadoop.mapred.MapTask.run(MapTask.java:341) at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:163) at java.security.AccessController.doPrivileged(Native Method) at javax.security.auth.Subject.doAs(Subject.java:422) at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1628) at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:158) 2015-06-15 10:36:46,089 INFO [main] mapreduce.Job: Task Id : attempt_1434117507786_0029_m_000059_0, Status : FAILED Error: org.apache.hadoop.hbase.client.RetriesExhaustedWithDetailsException: Failed 604 actions: item_restore: 604 times, at org.apache.hadoop.hbase.client.AsyncProcess$BatchErrors.makeException(AsyncProcess.java:203) at org.apache.hadoop.hbase.client.AsyncProcess$BatchErrors.access$500(AsyncProcess.java:187) at org.apache.hadoop.hbase.client.AsyncProcess.getErrors(AsyncProcess.java:922) at org.apache.hadoop.hbase.client.HTable.backgroundFlushCommits(HTable.java:1017) at org.apache.hadoop.hbase.client.HTable.doPut(HTable.java:980) at org.apache.hadoop.hbase.client.HTable.put(HTable.java:941) at org.apache.hadoop.hbase.mapreduce.TableOutputFormat$TableRecordWriter.write(TableOutputFormat.java:126) at org.apache.hadoop.hbase.mapreduce.TableOutputFormat$TableRecordWriter.write(TableOutputFormat.java:87) at org.apache.hadoop.mapred.MapTask$NewDirectOutputCollector.write(MapTask.java:655) at org.apache.hadoop.mapreduce.task.TaskInputOutputContextImpl.write(TaskInputOutputContextImpl.java:89) at org.apache.hadoop.mapreduce.lib.map.WrappedMapper$Context.write(WrappedMapper.java:112) at org.apache.hadoop.hbase.mapreduce.Import$Importer.processKV(Import.java:202) at org.apache.hadoop.hbase.mapreduce.Import$Importer.writeResult(Import.java:157) at 
org.apache.hadoop.hbase.mapreduce.Import$Importer.map(Import.java:142) at org.apache.hadoop.hbase.mapreduce.Import$Importer.map(Import.java:125) at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:145) at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:784) at org.apache.hadoop.mapred.MapTask.run(MapTask.java:341) at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:163) at java.security.AccessController.doPrivileged(Native Method) at javax.security.auth.Subject.doAs(Subject.java:422) at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1628) at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:158) 2015-06-15 10:36:46,095 INFO [main] mapreduce.Job: Task Id : attempt_1434117507786_0029_m_000019_0, Status : FAILED Error: org.apache.hadoop.hbase.client.RetriesExhaustedWithDetailsException: Failed 615 actions: item_restore: 615 times, at org.apache.hadoop.hbase.client.AsyncProcess$BatchErrors.makeException(AsyncProcess.java:203) at org.apache.hadoop.hbase.client.AsyncProcess$BatchErrors.access$500(AsyncProcess.java:187) at org.apache.hadoop.hbase.client.AsyncProcess.getErrors(AsyncProcess.java:922) at org.apache.hadoop.hbase.client.HTable.backgroundFlushCommits(HTable.java:1017) at org.apache.hadoop.hbase.client.HTable.doPut(HTable.java:980) at org.apache.hadoop.hbase.client.HTable.put(HTable.java:941) at org.apache.hadoop.hbase.mapreduce.TableOutputFormat$TableRecordWriter.write(TableOutputFormat.java:126) at org.apache.hadoop.hbase.mapreduce.TableOutputFormat$TableRecordWriter.write(TableOutputFormat.java:87) at org.apache.hadoop.mapred.MapTask$NewDirectOutputCollector.write(MapTask.java:655) at org.apache.hadoop.mapreduce.task.TaskInputOutputContextImpl.write(TaskInputOutputContextImpl.java:89) at org.apache.hadoop.mapreduce.lib.map.WrappedMapper$Context.write(WrappedMapper.java:112) at org.apache.hadoop.hbase.mapreduce.Import$Importer.processKV(Import.java:202) at org.apache.hadoop.hbase.mapreduce.Import$Importer.writeResult(Import.java:157) at org.apache.hadoop.hbase.mapreduce.Import$Importer.map(Import.java:142) at org.apache.hadoop.hbase.mapreduce.Import$Importer.map(Import.java:125) at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:145) at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:784) at org.apache.hadoop.mapred.MapTask.run(MapTask.java:341) at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:163) at java.security.AccessController.doPrivileged(Native Method) at javax.security.auth.Subject.doAs(Subject.java:422) at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1628) at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:158) 2015-06-15 10:36:46,097 INFO [main] mapreduce.Job: Task Id : attempt_1434117507786_0029_m_000013_0, Status : FAILED Error: org.apache.hadoop.hbase.client.RetriesExhaustedWithDetailsException: Failed 624 actions: item_restore: 624 times, at org.apache.hadoop.hbase.client.AsyncProcess$BatchErrors.makeException(AsyncProcess.java:203) at org.apache.hadoop.hbase.client.AsyncProcess$BatchErrors.access$500(AsyncProcess.java:187) at org.apache.hadoop.hbase.client.AsyncProcess.getErrors(AsyncProcess.java:922) at org.apache.hadoop.hbase.client.HTable.backgroundFlushCommits(HTable.java:1017) at org.apache.hadoop.hbase.client.HTable.doPut(HTable.java:980) at org.apache.hadoop.hbase.client.HTable.put(HTable.java:941) at org.apache.hadoop.hbase.mapreduce.TableOutputFormat$TableRecordWriter.write(TableOutputFormat.java:126) at 
org.apache.hadoop.hbase.mapreduce.TableOutputFormat$TableRecordWriter.write(TableOutputFormat.java:87) at org.apache.hadoop.mapred.MapTask$NewDirectOutputCollector.write(MapTask.java:655) at org.apache.hadoop.mapreduce.task.TaskInputOutputContextImpl.write(TaskInputOutputContextImpl.java:89) at org.apache.hadoop.mapreduce.lib.map.WrappedMapper$Context.write(WrappedMapper.java:112) at org.apache.hadoop.hbase.mapreduce.Import$Importer.processKV(Import.java:202) at org.apache.hadoop.hbase.mapreduce.Import$Importer.writeResult(Import.java:157) at org.apache.hadoop.hbase.mapreduce.Import$Importer.map(Import.java:142) at org.apache.hadoop.hbase.mapreduce.Import$Importer.map(Import.java:125) at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:145) at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:784) at org.apache.hadoop.mapred.MapTask.run(MapTask.java:341) at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:163) at java.security.AccessController.doPrivileged(Native Method) at javax.security.auth.Subject.doAs(Subject.java:422) at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1628) at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:158) 2015-06-15 10:36:47,107 INFO [main] mapreduce.Job: Task Id : attempt_1434117507786_0029_m_000010_0, Status : FAILED Error: org.apache.hadoop.hbase.client.RetriesExhaustedWithDetailsException: Failed 604 actions: item_restore: 604 times, at org.apache.hadoop.hbase.client.AsyncProcess$BatchErrors.makeException(AsyncProcess.java:203) at org.apache.hadoop.hbase.client.AsyncProcess$BatchErrors.access$500(AsyncProcess.java:187) at org.apache.hadoop.hbase.client.AsyncProcess.getErrors(AsyncProcess.java:922) at org.apache.hadoop.hbase.client.HTable.backgroundFlushCommits(HTable.java:1017) at org.apache.hadoop.hbase.client.HTable.doPut(HTable.java:980) at org.apache.hadoop.hbase.client.HTable.put(HTable.java:941) at org.apache.hadoop.hbase.mapreduce.TableOutputFormat$TableRecordWriter.write(TableOutputFormat.java:126) at org.apache.hadoop.hbase.mapreduce.TableOutputFormat$TableRecordWriter.write(TableOutputFormat.java:87) at org.apache.hadoop.mapred.MapTask$NewDirectOutputCollector.write(MapTask.java:655) at org.apache.hadoop.mapreduce.task.TaskInputOutputContextImpl.write(TaskInputOutputContextImpl.java:89) at org.apache.hadoop.mapreduce.lib.map.WrappedMapper$Context.write(WrappedMapper.java:112) at org.apache.hadoop.hbase.mapreduce.Import$Importer.processKV(Import.java:202) at org.apache.hadoop.hbase.mapreduce.Import$Importer.writeResult(Import.java:157) at org.apache.hadoop.hbase.mapreduce.Import$Importer.map(Import.java:142) at org.apache.hadoop.hbase.mapreduce.Import$Importer.map(Import.java:125) at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:145) at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:784) at org.apache.hadoop.mapred.MapTask.run(MapTask.java:341) at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:163) at java.security.AccessController.doPrivileged(Native Method) at javax.security.auth.Subject.doAs(Subject.java:422) at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1628) at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:158) 2015-06-15 10:36:47,109 INFO [main] mapreduce.Job: Task Id : attempt_1434117507786_0029_m_000014_0, Status : FAILED Error: org.apache.hadoop.hbase.client.RetriesExhaustedWithDetailsException: Failed 758 actions: item_restore: 758 times, at 
org.apache.hadoop.hbase.client.AsyncProcess$BatchErrors.makeException(AsyncProcess.java:203) at org.apache.hadoop.hbase.client.AsyncProcess$BatchErrors.access$500(AsyncProcess.java:187) at org.apache.hadoop.hbase.client.AsyncProcess.getErrors(AsyncProcess.java:922) at org.apache.hadoop.hbase.client.HTable.backgroundFlushCommits(HTable.java:1017) at org.apache.hadoop.hbase.client.HTable.doPut(HTable.java:980) at org.apache.hadoop.hbase.client.HTable.put(HTable.java:941) at org.apache.hadoop.hbase.mapreduce.TableOutputFormat$TableRecordWriter.write(TableOutputFormat.java:126) at org.apache.hadoop.hbase.mapreduce.TableOutputFormat$TableRecordWriter.write(TableOutputFormat.java:87) at org.apache.hadoop.mapred.MapTask$NewDirectOutputCollector.write(MapTask.java:655) at org.apache.hadoop.mapreduce.task.TaskInputOutputContextImpl.write(TaskInputOutputContextImpl.java:89) at org.apache.hadoop.mapreduce.lib.map.WrappedMapper$Context.write(WrappedMapper.java:112) at org.apache.hadoop.hbase.mapreduce.Import$Importer.processKV(Import.java:202) at org.apache.hadoop.hbase.mapreduce.Import$Importer.writeResult(Import.java:157) at org.apache.hadoop.hbase.mapreduce.Import$Importer.map(Import.java:142) at org.apache.hadoop.hbase.mapreduce.Import$Importer.map(Import.java:125) at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:145) at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:784) at org.apache.hadoop.mapred.MapTask.run(MapTask.java:341) at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:163) at java.security.AccessController.doPrivileged(Native Method) at javax.security.auth.Subject.doAs(Subject.java:422) at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1628) at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:158) 2015-06-15 10:36:47,112 INFO [main] mapreduce.Job: Task Id : attempt_1434117507786_0029_m_000011_0, Status : FAILED Error: org.apache.hadoop.hbase.client.RetriesExhaustedWithDetailsException: Failed 681 actions: item_restore: 681 times, at org.apache.hadoop.hbase.client.AsyncProcess$BatchErrors.makeException(AsyncProcess.java:203) at org.apache.hadoop.hbase.client.AsyncProcess$BatchErrors.access$500(AsyncProcess.java:187) at org.apache.hadoop.hbase.client.AsyncProcess.getErrors(AsyncProcess.java:922) at org.apache.hadoop.hbase.client.HTable.backgroundFlushCommits(HTable.java:1017) at org.apache.hadoop.hbase.client.HTable.doPut(HTable.java:980) at org.apache.hadoop.hbase.client.HTable.put(HTable.java:941) at org.apache.hadoop.hbase.mapreduce.TableOutputFormat$TableRecordWriter.write(TableOutputFormat.java:126) at org.apache.hadoop.hbase.mapreduce.TableOutputFormat$TableRecordWriter.write(TableOutputFormat.java:87) at org.apache.hadoop.mapred.MapTask$NewDirectOutputCollector.write(MapTask.java:655) at org.apache.hadoop.mapreduce.task.TaskInputOutputContextImpl.write(TaskInputOutputContextImpl.java:89) at org.apache.hadoop.mapreduce.lib.map.WrappedMapper$Context.write(WrappedMapper.java:112) at org.apache.hadoop.hbase.mapreduce.Import$Importer.processKV(Import.java:202) at org.apache.hadoop.hbase.mapreduce.Import$Importer.writeResult(Import.java:157) at org.apache.hadoop.hbase.mapreduce.Import$Importer.map(Import.java:142) at org.apache.hadoop.hbase.mapreduce.Import$Importer.map(Import.java:125) at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:145) at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:784) at org.apache.hadoop.mapred.MapTask.run(MapTask.java:341) at 
org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:163) at java.security.AccessController.doPrivileged(Native Method) at javax.security.auth.Subject.doAs(Subject.java:422) at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1628) at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:158) 2015-06-15 10:36:47,113 INFO [main] mapreduce.Job: Task Id : attempt_1434117507786_0029_m_000060_0, Status : FAILED Error: org.apache.hadoop.hbase.client.RetriesExhaustedWithDetailsException: Failed 619 actions: item_restore: 619 times, at org.apache.hadoop.hbase.client.AsyncProcess$BatchErrors.makeException(AsyncProcess.java:203) at org.apache.hadoop.hbase.client.AsyncProcess$BatchErrors.access$500(AsyncProcess.java:187) at org.apache.hadoop.hbase.client.AsyncProcess.getErrors(AsyncProcess.java:922) at org.apache.hadoop.hbase.client.HTable.backgroundFlushCommits(HTable.java:1017) at org.apache.hadoop.hbase.client.HTable.doPut(HTable.java:980) at org.apache.hadoop.hbase.client.HTable.put(HTable.java:941) at org.apache.hadoop.hbase.mapreduce.TableOutputFormat$TableRecordWriter.write(TableOutputFormat.java:126) at org.apache.hadoop.hbase.mapreduce.TableOutputFormat$TableRecordWriter.write(TableOutputFormat.java:87) at org.apache.hadoop.mapred.MapTask$NewDirectOutputCollector.write(MapTask.java:655) at org.apache.hadoop.mapreduce.task.TaskInputOutputContextImpl.write(TaskInputOutputContextImpl.java:89) at org.apache.hadoop.mapreduce.lib.map.WrappedMapper$Context.write(WrappedMapper.java:112) at org.apache.hadoop.hbase.mapreduce.Import$Importer.processKV(Import.java:202) at org.apache.hadoop.hbase.mapreduce.Import$Importer.writeResult(Import.java:157) at org.apache.hadoop.hbase.mapreduce.Import$Importer.map(Import.java:142) at org.apache.hadoop.hbase.mapreduce.Import$Importer.map(Import.java:125) at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:145) at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:784) at org.apache.hadoop.mapred.MapTask.run(MapTask.java:341) at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:163) at java.security.AccessController.doPrivileged(Native Method) at javax.security.auth.Subject.doAs(Subject.java:422) at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1628) at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:158) 2015-06-15 10:36:47,117 INFO [main] mapreduce.Job: Task Id : attempt_1434117507786_0029_m_000022_0, Status : FAILED Error: org.apache.hadoop.hbase.client.RetriesExhaustedWithDetailsException: Failed 695 actions: item_restore: 695 times, at org.apache.hadoop.hbase.client.AsyncProcess$BatchErrors.makeException(AsyncProcess.java:203) at org.apache.hadoop.hbase.client.AsyncProcess$BatchErrors.access$500(AsyncProcess.java:187) at org.apache.hadoop.hbase.client.AsyncProcess.getErrors(AsyncProcess.java:922) at org.apache.hadoop.hbase.client.HTable.backgroundFlushCommits(HTable.java:1017) at org.apache.hadoop.hbase.client.HTable.doPut(HTable.java:980) at org.apache.hadoop.hbase.client.HTable.put(HTable.java:941) at org.apache.hadoop.hbase.mapreduce.TableOutputFormat$TableRecordWriter.write(TableOutputFormat.java:126) at org.apache.hadoop.hbase.mapreduce.TableOutputFormat$TableRecordWriter.write(TableOutputFormat.java:87) at org.apache.hadoop.mapred.MapTask$NewDirectOutputCollector.write(MapTask.java:655) at org.apache.hadoop.mapreduce.task.TaskInputOutputContextImpl.write(TaskInputOutputContextImpl.java:89) at 
org.apache.hadoop.mapreduce.lib.map.WrappedMapper$Context.write(WrappedMapper.java:112) at org.apache.hadoop.hbase.mapreduce.Import$Importer.processKV(Import.java:202) at org.apache.hadoop.hbase.mapreduce.Import$Importer.writeResult(Import.java:157) at org.apache.hadoop.hbase.mapreduce.Import$Importer.map(Import.java:142) at org.apache.hadoop.hbase.mapreduce.Import$Importer.map(Import.java:125) at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:145) at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:784) at org.apache.hadoop.mapred.MapTask.run(MapTask.java:341) at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:163) at java.security.AccessController.doPrivileged(Native Method) at javax.security.auth.Subject.doAs(Subject.java:422) at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1628) at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:158) 2015-06-15 10:36:47,123 INFO [main] mapreduce.Job: Task Id : attempt_1434117507786_0029_m_000034_0, Status : FAILED Error: org.apache.hadoop.hbase.client.RetriesExhaustedWithDetailsException: Failed 769 actions: item_restore: 769 times, at org.apache.hadoop.hbase.client.AsyncProcess$BatchErrors.makeException(AsyncProcess.java:203) at org.apache.hadoop.hbase.client.AsyncProcess$BatchErrors.access$500(AsyncProcess.java:187) at org.apache.hadoop.hbase.client.AsyncProcess.getErrors(AsyncProcess.java:922) at org.apache.hadoop.hbase.client.HTable.backgroundFlushCommits(HTable.java:1017) at org.apache.hadoop.hbase.client.HTable.doPut(HTable.java:980) at org.apache.hadoop.hbase.client.HTable.put(HTable.java:941) at org.apache.hadoop.hbase.mapreduce.TableOutputFormat$TableRecordWriter.write(TableOutputFormat.java:126) at org.apache.hadoop.hbase.mapreduce.TableOutputFormat$TableRecordWriter.write(TableOutputFormat.java:87) at org.apache.hadoop.mapred.MapTask$NewDirectOutputCollector.write(MapTask.java:655) at org.apache.hadoop.mapreduce.task.TaskInputOutputContextImpl.write(TaskInputOutputContextImpl.java:89) at org.apache.hadoop.mapreduce.lib.map.WrappedMapper$Context.write(WrappedMapper.java:112) at org.apache.hadoop.hbase.mapreduce.Import$Importer.processKV(Import.java:202) at org.apache.hadoop.hbase.mapreduce.Import$Importer.writeResult(Import.java:157) at org.apache.hadoop.hbase.mapreduce.Import$Importer.map(Import.java:142) at org.apache.hadoop.hbase.mapreduce.Import$Importer.map(Import.java:125) at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:145) at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:784) at org.apache.hadoop.mapred.MapTask.run(MapTask.java:341) at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:163) at java.security.AccessController.doPrivileged(Native Method) at javax.security.auth.Subject.doAs(Subject.java:422) at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1628) at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:158) 2015-06-15 10:36:48,144 INFO [main] mapreduce.Job: Task Id : attempt_1434117507786_0029_m_000040_0, Status : FAILED Error: org.apache.hadoop.hbase.client.RetriesExhaustedWithDetailsException: Failed 490 actions: item_restore: 490 times, at org.apache.hadoop.hbase.client.AsyncProcess$BatchErrors.makeException(AsyncProcess.java:203) at org.apache.hadoop.hbase.client.AsyncProcess$BatchErrors.access$500(AsyncProcess.java:187) at org.apache.hadoop.hbase.client.AsyncProcess.getErrors(AsyncProcess.java:922) at org.apache.hadoop.hbase.client.HTable.backgroundFlushCommits(HTable.java:1017) at 
org.apache.hadoop.hbase.client.HTable.doPut(HTable.java:980) at org.apache.hadoop.hbase.client.HTable.put(HTable.java:941) at org.apache.hadoop.hbase.mapreduce.TableOutputFormat$TableRecordWriter.write(TableOutputFormat.java:126) at org.apache.hadoop.hbase.mapreduce.TableOutputFormat$TableRecordWriter.write(TableOutputFormat.java:87) at org.apache.hadoop.mapred.MapTask$NewDirectOutputCollector.write(MapTask.java:655) at org.apache.hadoop.mapreduce.task.TaskInputOutputContextImpl.write(TaskInputOutputContextImpl.java:89) at org.apache.hadoop.mapreduce.lib.map.WrappedMapper$Context.write(WrappedMapper.java:112) at org.apache.hadoop.hbase.mapreduce.Import$Importer.processKV(Import.java:202) at org.apache.hadoop.hbase.mapreduce.Import$Importer.writeResult(Import.java:157) at org.apache.hadoop.hbase.mapreduce.Import$Importer.map(Import.java:142) at org.apache.hadoop.hbase.mapreduce.Import$Importer.map(Import.java:125) at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:145) at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:784) at org.apache.hadoop.mapred.MapTask.run(MapTask.java:341) at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:163) at java.security.AccessController.doPrivileged(Native Method) at javax.security.auth.Subject.doAs(Subject.java:422) at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1628) at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:158) 2015-06-15 10:36:48,151 INFO [main] mapreduce.Job: Task Id : attempt_1434117507786_0029_m_000068_0, Status : FAILED Error: org.apache.hadoop.hbase.client.RetriesExhaustedWithDetailsException: Failed 677 actions: item_restore: 677 times, at org.apache.hadoop.hbase.client.AsyncProcess$BatchErrors.makeException(AsyncProcess.java:203) at org.apache.hadoop.hbase.client.AsyncProcess$BatchErrors.access$500(AsyncProcess.java:187) at org.apache.hadoop.hbase.client.AsyncProcess.getErrors(AsyncProcess.java:922) at org.apache.hadoop.hbase.client.HTable.backgroundFlushCommits(HTable.java:1017) at org.apache.hadoop.hbase.client.HTable.doPut(HTable.java:980) at org.apache.hadoop.hbase.client.HTable.put(HTable.java:941) at org.apache.hadoop.hbase.mapreduce.TableOutputFormat$TableRecordWriter.write(TableOutputFormat.java:126) at org.apache.hadoop.hbase.mapreduce.TableOutputFormat$TableRecordWriter.write(TableOutputFormat.java:87) at org.apache.hadoop.mapred.MapTask$NewDirectOutputCollector.write(MapTask.java:655) at org.apache.hadoop.mapreduce.task.TaskInputOutputContextImpl.write(TaskInputOutputContextImpl.java:89) at org.apache.hadoop.mapreduce.lib.map.WrappedMapper$Context.write(WrappedMapper.java:112) at org.apache.hadoop.hbase.mapreduce.Import$Importer.processKV(Import.java:202) at org.apache.hadoop.hbase.mapreduce.Import$Importer.writeResult(Import.java:157) at org.apache.hadoop.hbase.mapreduce.Import$Importer.map(Import.java:142) at org.apache.hadoop.hbase.mapreduce.Import$Importer.map(Import.java:125) at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:145) at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:784) at org.apache.hadoop.mapred.MapTask.run(MapTask.java:341) at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:163) at java.security.AccessController.doPrivileged(Native Method) at javax.security.auth.Subject.doAs(Subject.java:422) at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1628) at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:158) 2015-06-15 10:36:48,155 INFO [main] mapreduce.Job: Task Id : 
attempt_1434117507786_0029_m_000015_0, Status : FAILED Error: org.apache.hadoop.hbase.client.RetriesExhaustedWithDetailsException: Failed 758 actions: item_restore: 758 times, at org.apache.hadoop.hbase.client.AsyncProcess$BatchErrors.makeException(AsyncProcess.java:203) at org.apache.hadoop.hbase.client.AsyncProcess$BatchErrors.access$500(AsyncProcess.java:187) at org.apache.hadoop.hbase.client.AsyncProcess.getErrors(AsyncProcess.java:922) at org.apache.hadoop.hbase.client.HTable.backgroundFlushCommits(HTable.java:1017) at org.apache.hadoop.hbase.client.HTable.doPut(HTable.java:980) at org.apache.hadoop.hbase.client.HTable.put(HTable.java:941) at org.apache.hadoop.hbase.mapreduce.TableOutputFormat$TableRecordWriter.write(TableOutputFormat.java:126) at org.apache.hadoop.hbase.mapreduce.TableOutputFormat$TableRecordWriter.write(TableOutputFormat.java:87) at org.apache.hadoop.mapred.MapTask$NewDirectOutputCollector.write(MapTask.java:655) at org.apache.hadoop.mapreduce.task.TaskInputOutputContextImpl.write(TaskInputOutputContextImpl.java:89) at org.apache.hadoop.mapreduce.lib.map.WrappedMapper$Context.write(WrappedMapper.java:112) at org.apache.hadoop.hbase.mapreduce.Import$Importer.processKV(Import.java:202) at org.apache.hadoop.hbase.mapreduce.Import$Importer.writeResult(Import.java:157) at org.apache.hadoop.hbase.mapreduce.Import$Importer.map(Import.java:142) at org.apache.hadoop.hbase.mapreduce.Import$Importer.map(Import.java:125) at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:145) at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:784) at org.apache.hadoop.mapred.MapTask.run(MapTask.java:341) at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:163) at java.security.AccessController.doPrivileged(Native Method) at javax.security.auth.Subject.doAs(Subject.java:422) at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1628) at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:158) 2015-06-15 10:36:48,157 INFO [main] mapreduce.Job: Task Id : attempt_1434117507786_0029_m_000031_0, Status : FAILED Error: org.apache.hadoop.hbase.client.RetriesExhaustedWithDetailsException: Failed 651 actions: item_restore: 651 times, at org.apache.hadoop.hbase.client.AsyncProcess$BatchErrors.makeException(AsyncProcess.java:203) at org.apache.hadoop.hbase.client.AsyncProcess$BatchErrors.access$500(AsyncProcess.java:187) at org.apache.hadoop.hbase.client.AsyncProcess.getErrors(AsyncProcess.java:922) at org.apache.hadoop.hbase.client.HTable.backgroundFlushCommits(HTable.java:1017) at org.apache.hadoop.hbase.client.HTable.doPut(HTable.java:980) at org.apache.hadoop.hbase.client.HTable.put(HTable.java:941) at org.apache.hadoop.hbase.mapreduce.TableOutputFormat$TableRecordWriter.write(TableOutputFormat.java:126) at org.apache.hadoop.hbase.mapreduce.TableOutputFormat$TableRecordWriter.write(TableOutputFormat.java:87) at org.apache.hadoop.mapred.MapTask$NewDirectOutputCollector.write(MapTask.java:655) at org.apache.hadoop.mapreduce.task.TaskInputOutputContextImpl.write(TaskInputOutputContextImpl.java:89) at org.apache.hadoop.mapreduce.lib.map.WrappedMapper$Context.write(WrappedMapper.java:112) at org.apache.hadoop.hbase.mapreduce.Import$Importer.processKV(Import.java:202) at org.apache.hadoop.hbase.mapreduce.Import$Importer.writeResult(Import.java:157) at org.apache.hadoop.hbase.mapreduce.Import$Importer.map(Import.java:142) at org.apache.hadoop.hbase.mapreduce.Import$Importer.map(Import.java:125) at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:145) at 
org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:784)
	at org.apache.hadoop.mapred.MapTask.run(MapTask.java:341)
	at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:163)
	at java.security.AccessController.doPrivileged(Native Method)
	at javax.security.auth.Subject.doAs(Subject.java:422)
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1628)
	at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:158)
2015-06-15 10:36:49,178 INFO [main] mapreduce.Job: Task Id : attempt_1434117507786_0029_m_000018_0, Status : FAILED
Error: org.apache.hadoop.hbase.client.RetriesExhaustedWithDetailsException: Failed 531 actions: item_restore: 531 times,
	at org.apache.hadoop.hbase.client.AsyncProcess$BatchErrors.makeException(AsyncProcess.java:203)
	at org.apache.hadoop.hbase.client.AsyncProcess$BatchErrors.access$500(AsyncProcess.java:187)
	at org.apache.hadoop.hbase.client.AsyncProcess.getErrors(AsyncProcess.java:922)
	at org.apache.hadoop.hbase.client.HTable.backgroundFlushCommits(HTable.java:1017)
	at org.apache.hadoop.hbase.client.HTable.doPut(HTable.java:980)
	at org.apache.hadoop.hbase.client.HTable.put(HTable.java:941)
	at org.apache.hadoop.hbase.mapreduce.TableOutputFormat$TableRecordWriter.write(TableOutputFormat.java:126)
	at org.apache.hadoop.hbase.mapreduce.TableOutputFormat$TableRecordWriter.write(TableOutputFormat.java:87)
	at org.apache.hadoop.mapred.MapTask$NewDirectOutputCollector.write(MapTask.java:655)
	at org.apache.hadoop.mapreduce.task.TaskInputOutputContextImpl.write(TaskInputOutputContextImpl.java:89)
	at org.apache.hadoop.mapreduce.lib.map.WrappedMapper$Context.write(WrappedMapper.java:112)
	at org.apache.hadoop.hbase.mapreduce.Import$Importer.processKV(Import.java:202)
	at org.apache.hadoop.hbase.mapreduce.Import$Importer.writeResult(Import.java:157)
	at org.apache.hadoop.hbase.mapreduce.Import$Importer.map(Import.java:142)
	at org.apache.hadoop.hbase.mapreduce.Import$Importer.map(Import.java:125)
	at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:145)
	at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:784)
	at org.apache.hadoop.mapred.MapTask.run(MapTask.java:341)
	at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:163)
	at java.security.AccessController.doPrivileged(Native Method)
	at javax.security.auth.Subject.doAs(Subject.java:422)
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1628)
	at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:158)
[... the identical RetriesExhaustedWithDetailsException stack trace repeats for each of the following failed map attempts; only the attempt id and the number of failed item_restore actions differ ...]
2015-06-15 10:36:49,180 INFO [main] mapreduce.Job: Task Id : attempt_1434117507786_0029_m_000049_0, Status : FAILED (Failed 635 actions: item_restore: 635 times)
2015-06-15 10:36:49,181 INFO [main] mapreduce.Job: Task Id : attempt_1434117507786_0029_m_000032_0, Status : FAILED (Failed 607 actions: item_restore: 607 times)
2015-06-15 10:36:49,183 INFO [main] mapreduce.Job: Task Id : attempt_1434117507786_0029_m_000017_0, Status : FAILED (Failed 778 actions: item_restore: 778 times)
2015-06-15 10:36:50,205 INFO [main] mapreduce.Job: Task Id : attempt_1434117507786_0029_m_000033_0, Status : FAILED (Failed 652 actions: item_restore: 652 times)
2015-06-15 10:36:50,208 INFO [main] mapreduce.Job: Task Id : attempt_1434117507786_0029_m_000072_0, Status : FAILED (Failed 660 actions: item_restore: 660 times)
2015-06-15 10:36:50,211 INFO [main] mapreduce.Job: Task Id : attempt_1434117507786_0029_m_000030_0, Status : FAILED (Failed 628 actions: item_restore: 628 times)
2015-06-15 10:36:50,213 INFO [main] mapreduce.Job: Task Id : attempt_1434117507786_0029_m_000020_0, Status : FAILED (Failed 666 actions: item_restore: 666 times)
2015-06-15 10:36:50,215 INFO [main] mapreduce.Job: Task Id : attempt_1434117507786_0029_m_000021_0, Status : FAILED (Failed 609 actions: item_restore: 609 times)
2015-06-15 10:36:50,219 INFO [main] mapreduce.Job: Task Id : attempt_1434117507786_0029_m_000054_0, Status : FAILED (Failed 594 actions: item_restore: 594 times)
2015-06-15 10:36:50,224 INFO [main] mapreduce.Job: Task Id : attempt_1434117507786_0029_m_000037_0, Status : FAILED (Failed 622 actions: item_restore: 622 times)
2015-06-15 10:36:51,252 INFO [main] mapreduce.Job: Task Id : attempt_1434117507786_0029_m_000050_0, Status : FAILED (Failed 748 actions: item_restore: 748 times)
2015-06-15 10:36:51,260 INFO [main] mapreduce.Job: Task Id : attempt_1434117507786_0029_m_000035_0, Status : FAILED (Failed 602 actions: item_restore: 602 times)
2015-06-15 10:36:51,262 INFO [main] mapreduce.Job: Task Id : attempt_1434117507786_0029_m_000075_0, Status : FAILED (Failed 665 actions: item_restore: 665 times)
2015-06-15 10:36:51,266 INFO [main] mapreduce.Job: Task Id : attempt_1434117507786_0029_m_000056_0, Status : FAILED (Failed 784 actions: item_restore: 784 times)
2015-06-15 10:36:51,274 INFO [main] mapreduce.Job: Task Id : attempt_1434117507786_0029_m_000007_1, Status : FAILED (Failed 450 actions: item_restore: 450 times)
2015-06-15 10:36:51,280 INFO [main] mapreduce.Job: Task Id : attempt_1434117507786_0029_m_000039_0, Status : FAILED (Failed 847 actions: item_restore: 847 times)
2015-06-15 10:36:51,285 INFO [main] mapreduce.Job: Task Id : attempt_1434117507786_0029_m_000069_0, Status : FAILED (Failed 640 actions: item_restore: 640 times)
2015-06-15 10:36:51,291 INFO [main] mapreduce.Job: Task Id : attempt_1434117507786_0029_m_000070_0, Status : FAILED (Failed 685 actions: item_restore: 685 times)
2015-06-15 10:36:52,312 INFO [main] mapreduce.Job: Task Id : attempt_1434117507786_0029_m_000081_0, Status : FAILED (Failed 747 actions: item_restore: 747 times)
2015-06-15 10:36:52,314 INFO [main] mapreduce.Job: Task Id : attempt_1434117507786_0029_m_000016_1, Status : FAILED (Failed 612 actions: item_restore: 612 times)
2015-06-15 10:36:52,317 INFO [main] mapreduce.Job: Task Id : attempt_1434117507786_0029_m_000076_0, Status : FAILED (Failed 628 actions: item_restore: 628 times)
2015-06-15 10:36:52,319 INFO [main] mapreduce.Job: Task Id : attempt_1434117507786_0029_m_000042_0, Status : FAILED (Failed 598 actions: item_restore: 598 times)
2015-06-15 10:36:52,321 INFO [main] mapreduce.Job: Task Id : attempt_1434117507786_0029_m_000026_1, Status : FAILED (Failed 589 actions: item_restore: 589 times)
2015-06-15 10:36:52,322 INFO [main] mapreduce.Job: Task Id : attempt_1434117507786_0029_m_000001_1, Status : FAILED (Failed 717 actions: item_restore: 717 times)
2015-06-15 10:36:52,324 INFO [main] mapreduce.Job: Task Id : attempt_1434117507786_0029_m_000023_0, Status : FAILED (Failed 765 actions: item_restore: 765 times)
2015-06-15 10:36:53,334 INFO [main] mapreduce.Job: Task Id : attempt_1434117507786_0029_m_000024_0, Status : FAILED (Failed 636 actions: item_restore: 636 times)
2015-06-15 10:36:53,337 INFO [main] mapreduce.Job: Task Id : attempt_1434117507786_0029_m_000047_0, Status : FAILED (Failed 724 actions: item_restore: 724 times)
2015-06-15 10:36:53,339 INFO [main] mapreduce.Job: Task Id : attempt_1434117507786_0029_m_000027_1, Status : FAILED (Failed 617 actions: item_restore: 617 times)
2015-06-15 10:36:54,350 INFO [main] mapreduce.Job: Task Id : attempt_1434117507786_0029_m_000005_1, Status : FAILED (Failed 632 actions: item_restore: 632 times)
2015-06-15 10:36:54,352 INFO [main] mapreduce.Job: Task Id : attempt_1434117507786_0029_m_000004_1, Status : FAILED (Failed 884 actions: item_restore: 884 times)
2015-06-15 10:36:55,361 INFO [main] mapreduce.Job: Task Id : attempt_1434117507786_0029_m_000028_0, Status : FAILED (Failed 717 actions: item_restore: 717 times)
2015-06-15 10:36:55,364 INFO [main] mapreduce.Job: Task Id : attempt_1434117507786_0029_m_000045_0, Status : FAILED (Failed 773 actions: item_restore: 773 times)
2015-06-15 10:36:55,366 INFO [main] mapreduce.Job: Task Id : attempt_1434117507786_0029_m_000060_1, Status : FAILED (Failed 619 actions: item_restore: 619 times)
2015-06-15 10:36:55,370 INFO [main] mapreduce.Job: Task Id : attempt_1434117507786_0029_m_000043_0, Status : FAILED (Failed 574 actions: item_restore: 574 times)
2015-06-15 10:36:55,373 INFO [main] mapreduce.Job: Task Id : attempt_1434117507786_0029_m_000048_0, Status : FAILED (Failed 618 actions: item_restore: 618 times)
2015-06-15 10:36:55,376 INFO [main] mapreduce.Job: Task Id :
attempt_1434117507786_0029_m_000022_1, Status : FAILED Error: org.apache.hadoop.hbase.client.RetriesExhaustedWithDetailsException: Failed 695 actions: item_restore: 695 times, at org.apache.hadoop.hbase.client.AsyncProcess$BatchErrors.makeException(AsyncProcess.java:203) at org.apache.hadoop.hbase.client.AsyncProcess$BatchErrors.access$500(AsyncProcess.java:187) at org.apache.hadoop.hbase.client.AsyncProcess.getErrors(AsyncProcess.java:922) at org.apache.hadoop.hbase.client.HTable.backgroundFlushCommits(HTable.java:1017) at org.apache.hadoop.hbase.client.HTable.doPut(HTable.java:980) at org.apache.hadoop.hbase.client.HTable.put(HTable.java:941) at org.apache.hadoop.hbase.mapreduce.TableOutputFormat$TableRecordWriter.write(TableOutputFormat.java:126) at org.apache.hadoop.hbase.mapreduce.TableOutputFormat$TableRecordWriter.write(TableOutputFormat.java:87) at org.apache.hadoop.mapred.MapTask$NewDirectOutputCollector.write(MapTask.java:655) at org.apache.hadoop.mapreduce.task.TaskInputOutputContextImpl.write(TaskInputOutputContextImpl.java:89) at org.apache.hadoop.mapreduce.lib.map.WrappedMapper$Context.write(WrappedMapper.java:112) at org.apache.hadoop.hbase.mapreduce.Import$Importer.processKV(Import.java:202) at org.apache.hadoop.hbase.mapreduce.Import$Importer.writeResult(Import.java:157) at org.apache.hadoop.hbase.mapreduce.Import$Importer.map(Import.java:142) at org.apache.hadoop.hbase.mapreduce.Import$Importer.map(Import.java:125) at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:145) at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:784) at org.apache.hadoop.mapred.MapTask.run(MapTask.java:341) at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:163) at java.security.AccessController.doPrivileged(Native Method) at javax.security.auth.Subject.doAs(Subject.java:422) at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1628) at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:158) 2015-06-15 10:36:55,380 INFO [main] mapreduce.Job: Task Id : attempt_1434117507786_0029_m_000010_1, Status : FAILED Error: org.apache.hadoop.hbase.client.RetriesExhaustedWithDetailsException: Failed 604 actions: item_restore: 604 times, at org.apache.hadoop.hbase.client.AsyncProcess$BatchErrors.makeException(AsyncProcess.java:203) at org.apache.hadoop.hbase.client.AsyncProcess$BatchErrors.access$500(AsyncProcess.java:187) at org.apache.hadoop.hbase.client.AsyncProcess.getErrors(AsyncProcess.java:922) at org.apache.hadoop.hbase.client.HTable.backgroundFlushCommits(HTable.java:1017) at org.apache.hadoop.hbase.client.HTable.doPut(HTable.java:980) at org.apache.hadoop.hbase.client.HTable.put(HTable.java:941) at org.apache.hadoop.hbase.mapreduce.TableOutputFormat$TableRecordWriter.write(TableOutputFormat.java:126) at org.apache.hadoop.hbase.mapreduce.TableOutputFormat$TableRecordWriter.write(TableOutputFormat.java:87) at org.apache.hadoop.mapred.MapTask$NewDirectOutputCollector.write(MapTask.java:655) at org.apache.hadoop.mapreduce.task.TaskInputOutputContextImpl.write(TaskInputOutputContextImpl.java:89) at org.apache.hadoop.mapreduce.lib.map.WrappedMapper$Context.write(WrappedMapper.java:112) at org.apache.hadoop.hbase.mapreduce.Import$Importer.processKV(Import.java:202) at org.apache.hadoop.hbase.mapreduce.Import$Importer.writeResult(Import.java:157) at org.apache.hadoop.hbase.mapreduce.Import$Importer.map(Import.java:142) at org.apache.hadoop.hbase.mapreduce.Import$Importer.map(Import.java:125) at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:145) at 
org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:784) at org.apache.hadoop.mapred.MapTask.run(MapTask.java:341) at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:163) at java.security.AccessController.doPrivileged(Native Method) at javax.security.auth.Subject.doAs(Subject.java:422) at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1628) at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:158) 2015-06-15 10:36:55,383 INFO [main] mapreduce.Job: Task Id : attempt_1434117507786_0029_m_000071_0, Status : FAILED Error: org.apache.hadoop.hbase.client.RetriesExhaustedWithDetailsException: Failed 692 actions: item_restore: 692 times, at org.apache.hadoop.hbase.client.AsyncProcess$BatchErrors.makeException(AsyncProcess.java:203) at org.apache.hadoop.hbase.client.AsyncProcess$BatchErrors.access$500(AsyncProcess.java:187) at org.apache.hadoop.hbase.client.AsyncProcess.getErrors(AsyncProcess.java:922) at org.apache.hadoop.hbase.client.HTable.backgroundFlushCommits(HTable.java:1017) at org.apache.hadoop.hbase.client.HTable.doPut(HTable.java:980) at org.apache.hadoop.hbase.client.HTable.put(HTable.java:941) at org.apache.hadoop.hbase.mapreduce.TableOutputFormat$TableRecordWriter.write(TableOutputFormat.java:126) at org.apache.hadoop.hbase.mapreduce.TableOutputFormat$TableRecordWriter.write(TableOutputFormat.java:87) at org.apache.hadoop.mapred.MapTask$NewDirectOutputCollector.write(MapTask.java:655) at org.apache.hadoop.mapreduce.task.TaskInputOutputContextImpl.write(TaskInputOutputContextImpl.java:89) at org.apache.hadoop.mapreduce.lib.map.WrappedMapper$Context.write(WrappedMapper.java:112) at org.apache.hadoop.hbase.mapreduce.Import$Importer.processKV(Import.java:202) at org.apache.hadoop.hbase.mapreduce.Import$Importer.writeResult(Import.java:157) at org.apache.hadoop.hbase.mapreduce.Import$Importer.map(Import.java:142) at org.apache.hadoop.hbase.mapreduce.Import$Importer.map(Import.java:125) at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:145) at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:784) at org.apache.hadoop.mapred.MapTask.run(MapTask.java:341) at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:163) at java.security.AccessController.doPrivileged(Native Method) at javax.security.auth.Subject.doAs(Subject.java:422) at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1628) at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:158) 2015-06-15 10:36:55,387 INFO [main] mapreduce.Job: Task Id : attempt_1434117507786_0029_m_000013_1, Status : FAILED Error: org.apache.hadoop.hbase.client.RetriesExhaustedWithDetailsException: Failed 624 actions: item_restore: 624 times, at org.apache.hadoop.hbase.client.AsyncProcess$BatchErrors.makeException(AsyncProcess.java:203) at org.apache.hadoop.hbase.client.AsyncProcess$BatchErrors.access$500(AsyncProcess.java:187) at org.apache.hadoop.hbase.client.AsyncProcess.getErrors(AsyncProcess.java:922) at org.apache.hadoop.hbase.client.HTable.backgroundFlushCommits(HTable.java:1017) at org.apache.hadoop.hbase.client.HTable.doPut(HTable.java:980) at org.apache.hadoop.hbase.client.HTable.put(HTable.java:941) at org.apache.hadoop.hbase.mapreduce.TableOutputFormat$TableRecordWriter.write(TableOutputFormat.java:126) at org.apache.hadoop.hbase.mapreduce.TableOutputFormat$TableRecordWriter.write(TableOutputFormat.java:87) at org.apache.hadoop.mapred.MapTask$NewDirectOutputCollector.write(MapTask.java:655) at 
org.apache.hadoop.mapreduce.task.TaskInputOutputContextImpl.write(TaskInputOutputContextImpl.java:89) at org.apache.hadoop.mapreduce.lib.map.WrappedMapper$Context.write(WrappedMapper.java:112) at org.apache.hadoop.hbase.mapreduce.Import$Importer.processKV(Import.java:202) at org.apache.hadoop.hbase.mapreduce.Import$Importer.writeResult(Import.java:157) at org.apache.hadoop.hbase.mapreduce.Import$Importer.map(Import.java:142) at org.apache.hadoop.hbase.mapreduce.Import$Importer.map(Import.java:125) at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:145) at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:784) at org.apache.hadoop.mapred.MapTask.run(MapTask.java:341) at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:163) at java.security.AccessController.doPrivileged(Native Method) at javax.security.auth.Subject.doAs(Subject.java:422) at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1628) at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:158) 2015-06-15 10:36:56,401 INFO [main] mapreduce.Job: Task Id : attempt_1434117507786_0029_m_000025_0, Status : FAILED Error: org.apache.hadoop.hbase.client.RetriesExhaustedWithDetailsException: Failed 809 actions: item_restore: 809 times, at org.apache.hadoop.hbase.client.AsyncProcess$BatchErrors.makeException(AsyncProcess.java:203) at org.apache.hadoop.hbase.client.AsyncProcess$BatchErrors.access$500(AsyncProcess.java:187) at org.apache.hadoop.hbase.client.AsyncProcess.getErrors(AsyncProcess.java:922) at org.apache.hadoop.hbase.client.HTable.backgroundFlushCommits(HTable.java:1017) at org.apache.hadoop.hbase.client.HTable.doPut(HTable.java:980) at org.apache.hadoop.hbase.client.HTable.put(HTable.java:941) at org.apache.hadoop.hbase.mapreduce.TableOutputFormat$TableRecordWriter.write(TableOutputFormat.java:126) at org.apache.hadoop.hbase.mapreduce.TableOutputFormat$TableRecordWriter.write(TableOutputFormat.java:87) at org.apache.hadoop.mapred.MapTask$NewDirectOutputCollector.write(MapTask.java:655) at org.apache.hadoop.mapreduce.task.TaskInputOutputContextImpl.write(TaskInputOutputContextImpl.java:89) at org.apache.hadoop.mapreduce.lib.map.WrappedMapper$Context.write(WrappedMapper.java:112) at org.apache.hadoop.hbase.mapreduce.Import$Importer.processKV(Import.java:202) at org.apache.hadoop.hbase.mapreduce.Import$Importer.writeResult(Import.java:157) at org.apache.hadoop.hbase.mapreduce.Import$Importer.map(Import.java:142) at org.apache.hadoop.hbase.mapreduce.Import$Importer.map(Import.java:125) at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:145) at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:784) at org.apache.hadoop.mapred.MapTask.run(MapTask.java:341) at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:163) at java.security.AccessController.doPrivileged(Native Method) at javax.security.auth.Subject.doAs(Subject.java:422) at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1628) at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:158) 2015-06-15 10:36:56,406 INFO [main] mapreduce.Job: Task Id : attempt_1434117507786_0029_m_000014_1, Status : FAILED Error: org.apache.hadoop.hbase.client.RetriesExhaustedWithDetailsException: Failed 758 actions: item_restore: 758 times, at org.apache.hadoop.hbase.client.AsyncProcess$BatchErrors.makeException(AsyncProcess.java:203) at org.apache.hadoop.hbase.client.AsyncProcess$BatchErrors.access$500(AsyncProcess.java:187) at 
org.apache.hadoop.hbase.client.AsyncProcess.getErrors(AsyncProcess.java:922) at org.apache.hadoop.hbase.client.HTable.backgroundFlushCommits(HTable.java:1017) at org.apache.hadoop.hbase.client.HTable.doPut(HTable.java:980) at org.apache.hadoop.hbase.client.HTable.put(HTable.java:941) at org.apache.hadoop.hbase.mapreduce.TableOutputFormat$TableRecordWriter.write(TableOutputFormat.java:126) at org.apache.hadoop.hbase.mapreduce.TableOutputFormat$TableRecordWriter.write(TableOutputFormat.java:87) at org.apache.hadoop.mapred.MapTask$NewDirectOutputCollector.write(MapTask.java:655) at org.apache.hadoop.mapreduce.task.TaskInputOutputContextImpl.write(TaskInputOutputContextImpl.java:89) at org.apache.hadoop.mapreduce.lib.map.WrappedMapper$Context.write(WrappedMapper.java:112) at org.apache.hadoop.hbase.mapreduce.Import$Importer.processKV(Import.java:202) at org.apache.hadoop.hbase.mapreduce.Import$Importer.writeResult(Import.java:157) at org.apache.hadoop.hbase.mapreduce.Import$Importer.map(Import.java:142) at org.apache.hadoop.hbase.mapreduce.Import$Importer.map(Import.java:125) at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:145) at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:784) at org.apache.hadoop.mapred.MapTask.run(MapTask.java:341) at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:163) at java.security.AccessController.doPrivileged(Native Method) at javax.security.auth.Subject.doAs(Subject.java:422) at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1628) at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:158) 2015-06-15 10:36:56,407 INFO [main] mapreduce.Job: Task Id : attempt_1434117507786_0029_m_000003_1, Status : FAILED Error: org.apache.hadoop.hbase.client.RetriesExhaustedWithDetailsException: Failed 605 actions: item_restore: 605 times, at org.apache.hadoop.hbase.client.AsyncProcess$BatchErrors.makeException(AsyncProcess.java:203) at org.apache.hadoop.hbase.client.AsyncProcess$BatchErrors.access$500(AsyncProcess.java:187) at org.apache.hadoop.hbase.client.AsyncProcess.getErrors(AsyncProcess.java:922) at org.apache.hadoop.hbase.client.HTable.backgroundFlushCommits(HTable.java:1017) at org.apache.hadoop.hbase.client.HTable.doPut(HTable.java:980) at org.apache.hadoop.hbase.client.HTable.put(HTable.java:941) at org.apache.hadoop.hbase.mapreduce.TableOutputFormat$TableRecordWriter.write(TableOutputFormat.java:126) at org.apache.hadoop.hbase.mapreduce.TableOutputFormat$TableRecordWriter.write(TableOutputFormat.java:87) at org.apache.hadoop.mapred.MapTask$NewDirectOutputCollector.write(MapTask.java:655) at org.apache.hadoop.mapreduce.task.TaskInputOutputContextImpl.write(TaskInputOutputContextImpl.java:89) at org.apache.hadoop.mapreduce.lib.map.WrappedMapper$Context.write(WrappedMapper.java:112) at org.apache.hadoop.hbase.mapreduce.Import$Importer.processKV(Import.java:202) at org.apache.hadoop.hbase.mapreduce.Import$Importer.writeResult(Import.java:157) at org.apache.hadoop.hbase.mapreduce.Import$Importer.map(Import.java:142) at org.apache.hadoop.hbase.mapreduce.Import$Importer.map(Import.java:125) at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:145) at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:784) at org.apache.hadoop.mapred.MapTask.run(MapTask.java:341) at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:163) at java.security.AccessController.doPrivileged(Native Method) at javax.security.auth.Subject.doAs(Subject.java:422) at 
org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1628) at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:158) 2015-06-15 10:36:56,414 INFO [main] mapreduce.Job: Task Id : attempt_1434117507786_0029_m_000079_0, Status : FAILED Error: org.apache.hadoop.hbase.client.RetriesExhaustedWithDetailsException: Failed 890 actions: item_restore: 890 times, at org.apache.hadoop.hbase.client.AsyncProcess$BatchErrors.makeException(AsyncProcess.java:203) at org.apache.hadoop.hbase.client.AsyncProcess$BatchErrors.access$500(AsyncProcess.java:187) at org.apache.hadoop.hbase.client.AsyncProcess.getErrors(AsyncProcess.java:922) at org.apache.hadoop.hbase.client.HTable.backgroundFlushCommits(HTable.java:1017) at org.apache.hadoop.hbase.client.HTable.doPut(HTable.java:980) at org.apache.hadoop.hbase.client.HTable.put(HTable.java:941) at org.apache.hadoop.hbase.mapreduce.TableOutputFormat$TableRecordWriter.write(TableOutputFormat.java:126) at org.apache.hadoop.hbase.mapreduce.TableOutputFormat$TableRecordWriter.write(TableOutputFormat.java:87) at org.apache.hadoop.mapred.MapTask$NewDirectOutputCollector.write(MapTask.java:655) at org.apache.hadoop.mapreduce.task.TaskInputOutputContextImpl.write(TaskInputOutputContextImpl.java:89) at org.apache.hadoop.mapreduce.lib.map.WrappedMapper$Context.write(WrappedMapper.java:112) at org.apache.hadoop.hbase.mapreduce.Import$Importer.processKV(Import.java:202) at org.apache.hadoop.hbase.mapreduce.Import$Importer.writeResult(Import.java:157) at org.apache.hadoop.hbase.mapreduce.Import$Importer.map(Import.java:142) at org.apache.hadoop.hbase.mapreduce.Import$Importer.map(Import.java:125) at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:145) at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:784) at org.apache.hadoop.mapred.MapTask.run(MapTask.java:341) at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:163) at java.security.AccessController.doPrivileged(Native Method) at javax.security.auth.Subject.doAs(Subject.java:422) at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1628) at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:158) 2015-06-15 10:36:56,420 INFO [main] mapreduce.Job: Task Id : attempt_1434117507786_0029_m_000051_0, Status : FAILED Error: org.apache.hadoop.hbase.client.RetriesExhaustedWithDetailsException: Failed 516 actions: item_restore: 516 times, at org.apache.hadoop.hbase.client.AsyncProcess$BatchErrors.makeException(AsyncProcess.java:203) at org.apache.hadoop.hbase.client.AsyncProcess$BatchErrors.access$500(AsyncProcess.java:187) at org.apache.hadoop.hbase.client.AsyncProcess.getErrors(AsyncProcess.java:922) at org.apache.hadoop.hbase.client.HTable.backgroundFlushCommits(HTable.java:1017) at org.apache.hadoop.hbase.client.HTable.doPut(HTable.java:980) at org.apache.hadoop.hbase.client.HTable.put(HTable.java:941) at org.apache.hadoop.hbase.mapreduce.TableOutputFormat$TableRecordWriter.write(TableOutputFormat.java:126) at org.apache.hadoop.hbase.mapreduce.TableOutputFormat$TableRecordWriter.write(TableOutputFormat.java:87) at org.apache.hadoop.mapred.MapTask$NewDirectOutputCollector.write(MapTask.java:655) at org.apache.hadoop.mapreduce.task.TaskInputOutputContextImpl.write(TaskInputOutputContextImpl.java:89) at org.apache.hadoop.mapreduce.lib.map.WrappedMapper$Context.write(WrappedMapper.java:112) at org.apache.hadoop.hbase.mapreduce.Import$Importer.processKV(Import.java:202) at org.apache.hadoop.hbase.mapreduce.Import$Importer.writeResult(Import.java:157) at 
org.apache.hadoop.hbase.mapreduce.Import$Importer.map(Import.java:142) at org.apache.hadoop.hbase.mapreduce.Import$Importer.map(Import.java:125) at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:145) at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:784) at org.apache.hadoop.mapred.MapTask.run(MapTask.java:341) at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:163) at java.security.AccessController.doPrivileged(Native Method) at javax.security.auth.Subject.doAs(Subject.java:422) at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1628) at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:158) 2015-06-15 10:36:57,436 INFO [main] mapreduce.Job: Task Id : attempt_1434117507786_0029_m_000068_1, Status : FAILED Error: org.apache.hadoop.hbase.client.RetriesExhaustedWithDetailsException: Failed 677 actions: item_restore: 677 times, at org.apache.hadoop.hbase.client.AsyncProcess$BatchErrors.makeException(AsyncProcess.java:203) at org.apache.hadoop.hbase.client.AsyncProcess$BatchErrors.access$500(AsyncProcess.java:187) at org.apache.hadoop.hbase.client.AsyncProcess.getErrors(AsyncProcess.java:922) at org.apache.hadoop.hbase.client.HTable.backgroundFlushCommits(HTable.java:1017) at org.apache.hadoop.hbase.client.HTable.doPut(HTable.java:980) at org.apache.hadoop.hbase.client.HTable.put(HTable.java:941) at org.apache.hadoop.hbase.mapreduce.TableOutputFormat$TableRecordWriter.write(TableOutputFormat.java:126) at org.apache.hadoop.hbase.mapreduce.TableOutputFormat$TableRecordWriter.write(TableOutputFormat.java:87) at org.apache.hadoop.mapred.MapTask$NewDirectOutputCollector.write(MapTask.java:655) at org.apache.hadoop.mapreduce.task.TaskInputOutputContextImpl.write(TaskInputOutputContextImpl.java:89) at org.apache.hadoop.mapreduce.lib.map.WrappedMapper$Context.write(WrappedMapper.java:112) at org.apache.hadoop.hbase.mapreduce.Import$Importer.processKV(Import.java:202) at org.apache.hadoop.hbase.mapreduce.Import$Importer.writeResult(Import.java:157) at org.apache.hadoop.hbase.mapreduce.Import$Importer.map(Import.java:142) at org.apache.hadoop.hbase.mapreduce.Import$Importer.map(Import.java:125) at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:145) at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:784) at org.apache.hadoop.mapred.MapTask.run(MapTask.java:341) at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:163) at java.security.AccessController.doPrivileged(Native Method) at javax.security.auth.Subject.doAs(Subject.java:422) at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1628) at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:158) 2015-06-15 10:36:57,438 INFO [main] mapreduce.Job: Task Id : attempt_1434117507786_0029_m_000041_0, Status : FAILED Error: org.apache.hadoop.hbase.client.RetriesExhaustedWithDetailsException: Failed 585 actions: item_restore: 585 times, at org.apache.hadoop.hbase.client.AsyncProcess$BatchErrors.makeException(AsyncProcess.java:203) at org.apache.hadoop.hbase.client.AsyncProcess$BatchErrors.access$500(AsyncProcess.java:187) at org.apache.hadoop.hbase.client.AsyncProcess.getErrors(AsyncProcess.java:922) at org.apache.hadoop.hbase.client.HTable.backgroundFlushCommits(HTable.java:1017) at org.apache.hadoop.hbase.client.HTable.doPut(HTable.java:980) at org.apache.hadoop.hbase.client.HTable.put(HTable.java:941) at org.apache.hadoop.hbase.mapreduce.TableOutputFormat$TableRecordWriter.write(TableOutputFormat.java:126) at 
org.apache.hadoop.hbase.mapreduce.TableOutputFormat$TableRecordWriter.write(TableOutputFormat.java:87) at org.apache.hadoop.mapred.MapTask$NewDirectOutputCollector.write(MapTask.java:655) at org.apache.hadoop.mapreduce.task.TaskInputOutputContextImpl.write(TaskInputOutputContextImpl.java:89) at org.apache.hadoop.mapreduce.lib.map.WrappedMapper$Context.write(WrappedMapper.java:112) at org.apache.hadoop.hbase.mapreduce.Import$Importer.processKV(Import.java:202) at org.apache.hadoop.hbase.mapreduce.Import$Importer.writeResult(Import.java:157) at org.apache.hadoop.hbase.mapreduce.Import$Importer.map(Import.java:142) at org.apache.hadoop.hbase.mapreduce.Import$Importer.map(Import.java:125) at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:145) at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:784) at org.apache.hadoop.mapred.MapTask.run(MapTask.java:341) at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:163) at java.security.AccessController.doPrivileged(Native Method) at javax.security.auth.Subject.doAs(Subject.java:422) at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1628) at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:158) 2015-06-15 10:36:57,442 INFO [main] mapreduce.Job: Task Id : attempt_1434117507786_0029_m_000052_0, Status : FAILED Error: org.apache.hadoop.hbase.client.RetriesExhaustedWithDetailsException: Failed 580 actions: item_restore: 580 times, at org.apache.hadoop.hbase.client.AsyncProcess$BatchErrors.makeException(AsyncProcess.java:203) at org.apache.hadoop.hbase.client.AsyncProcess$BatchErrors.access$500(AsyncProcess.java:187) at org.apache.hadoop.hbase.client.AsyncProcess.getErrors(AsyncProcess.java:922) at org.apache.hadoop.hbase.client.HTable.backgroundFlushCommits(HTable.java:1017) at org.apache.hadoop.hbase.client.HTable.doPut(HTable.java:980) at org.apache.hadoop.hbase.client.HTable.put(HTable.java:941) at org.apache.hadoop.hbase.mapreduce.TableOutputFormat$TableRecordWriter.write(TableOutputFormat.java:126) at org.apache.hadoop.hbase.mapreduce.TableOutputFormat$TableRecordWriter.write(TableOutputFormat.java:87) at org.apache.hadoop.mapred.MapTask$NewDirectOutputCollector.write(MapTask.java:655) at org.apache.hadoop.mapreduce.task.TaskInputOutputContextImpl.write(TaskInputOutputContextImpl.java:89) at org.apache.hadoop.mapreduce.lib.map.WrappedMapper$Context.write(WrappedMapper.java:112) at org.apache.hadoop.hbase.mapreduce.Import$Importer.processKV(Import.java:202) at org.apache.hadoop.hbase.mapreduce.Import$Importer.writeResult(Import.java:157) at org.apache.hadoop.hbase.mapreduce.Import$Importer.map(Import.java:142) at org.apache.hadoop.hbase.mapreduce.Import$Importer.map(Import.java:125) at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:145) at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:784) at org.apache.hadoop.mapred.MapTask.run(MapTask.java:341) at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:163) at java.security.AccessController.doPrivileged(Native Method) at javax.security.auth.Subject.doAs(Subject.java:422) at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1628) at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:158) 2015-06-15 10:36:57,445 INFO [main] mapreduce.Job: Task Id : attempt_1434117507786_0029_m_000031_1, Status : FAILED Error: org.apache.hadoop.hbase.client.RetriesExhaustedWithDetailsException: Failed 651 actions: item_restore: 651 times, at 
org.apache.hadoop.hbase.client.AsyncProcess$BatchErrors.makeException(AsyncProcess.java:203) at org.apache.hadoop.hbase.client.AsyncProcess$BatchErrors.access$500(AsyncProcess.java:187) at org.apache.hadoop.hbase.client.AsyncProcess.getErrors(AsyncProcess.java:922) at org.apache.hadoop.hbase.client.HTable.backgroundFlushCommits(HTable.java:1017) at org.apache.hadoop.hbase.client.HTable.doPut(HTable.java:980) at org.apache.hadoop.hbase.client.HTable.put(HTable.java:941) at org.apache.hadoop.hbase.mapreduce.TableOutputFormat$TableRecordWriter.write(TableOutputFormat.java:126) at org.apache.hadoop.hbase.mapreduce.TableOutputFormat$TableRecordWriter.write(TableOutputFormat.java:87) at org.apache.hadoop.mapred.MapTask$NewDirectOutputCollector.write(MapTask.java:655) at org.apache.hadoop.mapreduce.task.TaskInputOutputContextImpl.write(TaskInputOutputContextImpl.java:89) at org.apache.hadoop.mapreduce.lib.map.WrappedMapper$Context.write(WrappedMapper.java:112) at org.apache.hadoop.hbase.mapreduce.Import$Importer.processKV(Import.java:202) at org.apache.hadoop.hbase.mapreduce.Import$Importer.writeResult(Import.java:157) at org.apache.hadoop.hbase.mapreduce.Import$Importer.map(Import.java:142) at org.apache.hadoop.hbase.mapreduce.Import$Importer.map(Import.java:125) at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:145) at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:784) at org.apache.hadoop.mapred.MapTask.run(MapTask.java:341) at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:163) at java.security.AccessController.doPrivileged(Native Method) at javax.security.auth.Subject.doAs(Subject.java:422) at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1628) at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:158) 2015-06-15 10:36:57,451 INFO [main] mapreduce.Job: Task Id : attempt_1434117507786_0029_m_000044_0, Status : FAILED Error: org.apache.hadoop.hbase.client.RetriesExhaustedWithDetailsException: Failed 686 actions: item_restore: 686 times, at org.apache.hadoop.hbase.client.AsyncProcess$BatchErrors.makeException(AsyncProcess.java:203) at org.apache.hadoop.hbase.client.AsyncProcess$BatchErrors.access$500(AsyncProcess.java:187) at org.apache.hadoop.hbase.client.AsyncProcess.getErrors(AsyncProcess.java:922) at org.apache.hadoop.hbase.client.HTable.backgroundFlushCommits(HTable.java:1017) at org.apache.hadoop.hbase.client.HTable.doPut(HTable.java:980) at org.apache.hadoop.hbase.client.HTable.put(HTable.java:941) at org.apache.hadoop.hbase.mapreduce.TableOutputFormat$TableRecordWriter.write(TableOutputFormat.java:126) at org.apache.hadoop.hbase.mapreduce.TableOutputFormat$TableRecordWriter.write(TableOutputFormat.java:87) at org.apache.hadoop.mapred.MapTask$NewDirectOutputCollector.write(MapTask.java:655) at org.apache.hadoop.mapreduce.task.TaskInputOutputContextImpl.write(TaskInputOutputContextImpl.java:89) at org.apache.hadoop.mapreduce.lib.map.WrappedMapper$Context.write(WrappedMapper.java:112) at org.apache.hadoop.hbase.mapreduce.Import$Importer.processKV(Import.java:202) at org.apache.hadoop.hbase.mapreduce.Import$Importer.writeResult(Import.java:157) at org.apache.hadoop.hbase.mapreduce.Import$Importer.map(Import.java:142) at org.apache.hadoop.hbase.mapreduce.Import$Importer.map(Import.java:125) at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:145) at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:784) at org.apache.hadoop.mapred.MapTask.run(MapTask.java:341) at 
org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:163) at java.security.AccessController.doPrivileged(Native Method) at javax.security.auth.Subject.doAs(Subject.java:422) at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1628) at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:158) 2015-06-15 10:36:57,455 INFO [main] mapreduce.Job: Task Id : attempt_1434117507786_0029_m_000038_0, Status : FAILED Error: org.apache.hadoop.hbase.client.RetriesExhaustedWithDetailsException: Failed 655 actions: item_restore: 655 times, at org.apache.hadoop.hbase.client.AsyncProcess$BatchErrors.makeException(AsyncProcess.java:203) at org.apache.hadoop.hbase.client.AsyncProcess$BatchErrors.access$500(AsyncProcess.java:187) at org.apache.hadoop.hbase.client.AsyncProcess.getErrors(AsyncProcess.java:922) at org.apache.hadoop.hbase.client.HTable.backgroundFlushCommits(HTable.java:1017) at org.apache.hadoop.hbase.client.HTable.doPut(HTable.java:980) at org.apache.hadoop.hbase.client.HTable.put(HTable.java:941) at org.apache.hadoop.hbase.mapreduce.TableOutputFormat$TableRecordWriter.write(TableOutputFormat.java:126) at org.apache.hadoop.hbase.mapreduce.TableOutputFormat$TableRecordWriter.write(TableOutputFormat.java:87) at org.apache.hadoop.mapred.MapTask$NewDirectOutputCollector.write(MapTask.java:655) at org.apache.hadoop.mapreduce.task.TaskInputOutputContextImpl.write(TaskInputOutputContextImpl.java:89) at org.apache.hadoop.mapreduce.lib.map.WrappedMapper$Context.write(WrappedMapper.java:112) at org.apache.hadoop.hbase.mapreduce.Import$Importer.processKV(Import.java:202) at org.apache.hadoop.hbase.mapreduce.Import$Importer.writeResult(Import.java:157) at org.apache.hadoop.hbase.mapreduce.Import$Importer.map(Import.java:142) at org.apache.hadoop.hbase.mapreduce.Import$Importer.map(Import.java:125) at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:145) at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:784) at org.apache.hadoop.mapred.MapTask.run(MapTask.java:341) at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:163) at java.security.AccessController.doPrivileged(Native Method) at javax.security.auth.Subject.doAs(Subject.java:422) at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1628) at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:158) 2015-06-15 10:36:57,458 INFO [main] mapreduce.Job: Task Id : attempt_1434117507786_0029_m_000032_1, Status : FAILED Error: org.apache.hadoop.hbase.client.RetriesExhaustedWithDetailsException: Failed 607 actions: item_restore: 607 times, at org.apache.hadoop.hbase.client.AsyncProcess$BatchErrors.makeException(AsyncProcess.java:203) at org.apache.hadoop.hbase.client.AsyncProcess$BatchErrors.access$500(AsyncProcess.java:187) at org.apache.hadoop.hbase.client.AsyncProcess.getErrors(AsyncProcess.java:922) at org.apache.hadoop.hbase.client.HTable.backgroundFlushCommits(HTable.java:1017) at org.apache.hadoop.hbase.client.HTable.doPut(HTable.java:980) at org.apache.hadoop.hbase.client.HTable.put(HTable.java:941) at org.apache.hadoop.hbase.mapreduce.TableOutputFormat$TableRecordWriter.write(TableOutputFormat.java:126) at org.apache.hadoop.hbase.mapreduce.TableOutputFormat$TableRecordWriter.write(TableOutputFormat.java:87) at org.apache.hadoop.mapred.MapTask$NewDirectOutputCollector.write(MapTask.java:655) at org.apache.hadoop.mapreduce.task.TaskInputOutputContextImpl.write(TaskInputOutputContextImpl.java:89) at 
org.apache.hadoop.mapreduce.lib.map.WrappedMapper$Context.write(WrappedMapper.java:112) at org.apache.hadoop.hbase.mapreduce.Import$Importer.processKV(Import.java:202) at org.apache.hadoop.hbase.mapreduce.Import$Importer.writeResult(Import.java:157) at org.apache.hadoop.hbase.mapreduce.Import$Importer.map(Import.java:142) at org.apache.hadoop.hbase.mapreduce.Import$Importer.map(Import.java:125) at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:145) at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:784) at org.apache.hadoop.mapred.MapTask.run(MapTask.java:341) at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:163) at java.security.AccessController.doPrivileged(Native Method) at javax.security.auth.Subject.doAs(Subject.java:422) at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1628) at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:158) 2015-06-15 10:36:58,474 INFO [main] mapreduce.Job: Task Id : attempt_1434117507786_0029_m_000053_0, Status : FAILED Error: org.apache.hadoop.hbase.client.RetriesExhaustedWithDetailsException: Failed 479 actions: item_restore: 479 times, at org.apache.hadoop.hbase.client.AsyncProcess$BatchErrors.makeException(AsyncProcess.java:203) at org.apache.hadoop.hbase.client.AsyncProcess$BatchErrors.access$500(AsyncProcess.java:187) at org.apache.hadoop.hbase.client.AsyncProcess.getErrors(AsyncProcess.java:922) at org.apache.hadoop.hbase.client.HTable.backgroundFlushCommits(HTable.java:1017) at org.apache.hadoop.hbase.client.HTable.doPut(HTable.java:980) at org.apache.hadoop.hbase.client.HTable.put(HTable.java:941) at org.apache.hadoop.hbase.mapreduce.TableOutputFormat$TableRecordWriter.write(TableOutputFormat.java:126) at org.apache.hadoop.hbase.mapreduce.TableOutputFormat$TableRecordWriter.write(TableOutputFormat.java:87) at org.apache.hadoop.mapred.MapTask$NewDirectOutputCollector.write(MapTask.java:655) at org.apache.hadoop.mapreduce.task.TaskInputOutputContextImpl.write(TaskInputOutputContextImpl.java:89) at org.apache.hadoop.mapreduce.lib.map.WrappedMapper$Context.write(WrappedMapper.java:112) at org.apache.hadoop.hbase.mapreduce.Import$Importer.processKV(Import.java:202) at org.apache.hadoop.hbase.mapreduce.Import$Importer.writeResult(Import.java:157) at org.apache.hadoop.hbase.mapreduce.Import$Importer.map(Import.java:142) at org.apache.hadoop.hbase.mapreduce.Import$Importer.map(Import.java:125) at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:145) at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:784) at org.apache.hadoop.mapred.MapTask.run(MapTask.java:341) at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:163) at java.security.AccessController.doPrivileged(Native Method) at javax.security.auth.Subject.doAs(Subject.java:422) at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1628) at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:158) 2015-06-15 10:36:58,476 INFO [main] mapreduce.Job: Task Id : attempt_1434117507786_0029_m_000058_0, Status : FAILED Error: org.apache.hadoop.hbase.client.RetriesExhaustedWithDetailsException: Failed 859 actions: item_restore: 859 times, at org.apache.hadoop.hbase.client.AsyncProcess$BatchErrors.makeException(AsyncProcess.java:203) at org.apache.hadoop.hbase.client.AsyncProcess$BatchErrors.access$500(AsyncProcess.java:187) at org.apache.hadoop.hbase.client.AsyncProcess.getErrors(AsyncProcess.java:922) at org.apache.hadoop.hbase.client.HTable.backgroundFlushCommits(HTable.java:1017) at 
org.apache.hadoop.hbase.client.HTable.doPut(HTable.java:980) at org.apache.hadoop.hbase.client.HTable.put(HTable.java:941) at org.apache.hadoop.hbase.mapreduce.TableOutputFormat$TableRecordWriter.write(TableOutputFormat.java:126) at org.apache.hadoop.hbase.mapreduce.TableOutputFormat$TableRecordWriter.write(TableOutputFormat.java:87) at org.apache.hadoop.mapred.MapTask$NewDirectOutputCollector.write(MapTask.java:655) at org.apache.hadoop.mapreduce.task.TaskInputOutputContextImpl.write(TaskInputOutputContextImpl.java:89) at org.apache.hadoop.mapreduce.lib.map.WrappedMapper$Context.write(WrappedMapper.java:112) at org.apache.hadoop.hbase.mapreduce.Import$Importer.processKV(Import.java:202) at org.apache.hadoop.hbase.mapreduce.Import$Importer.writeResult(Import.java:157) at org.apache.hadoop.hbase.mapreduce.Import$Importer.map(Import.java:142) at org.apache.hadoop.hbase.mapreduce.Import$Importer.map(Import.java:125) at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:145) at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:784) at org.apache.hadoop.mapred.MapTask.run(MapTask.java:341) at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:163) at java.security.AccessController.doPrivileged(Native Method) at javax.security.auth.Subject.doAs(Subject.java:422) at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1628) at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:158) 2015-06-15 10:36:58,477 INFO [main] mapreduce.Job: Task Id : attempt_1434117507786_0029_m_000049_1, Status : FAILED Error: org.apache.hadoop.hbase.client.RetriesExhaustedWithDetailsException: Failed 635 actions: item_restore: 635 times, at org.apache.hadoop.hbase.client.AsyncProcess$BatchErrors.makeException(AsyncProcess.java:203) at org.apache.hadoop.hbase.client.AsyncProcess$BatchErrors.access$500(AsyncProcess.java:187) at org.apache.hadoop.hbase.client.AsyncProcess.getErrors(AsyncProcess.java:922) at org.apache.hadoop.hbase.client.HTable.backgroundFlushCommits(HTable.java:1017) at org.apache.hadoop.hbase.client.HTable.doPut(HTable.java:980) at org.apache.hadoop.hbase.client.HTable.put(HTable.java:941) at org.apache.hadoop.hbase.mapreduce.TableOutputFormat$TableRecordWriter.write(TableOutputFormat.java:126) at org.apache.hadoop.hbase.mapreduce.TableOutputFormat$TableRecordWriter.write(TableOutputFormat.java:87) at org.apache.hadoop.mapred.MapTask$NewDirectOutputCollector.write(MapTask.java:655) at org.apache.hadoop.mapreduce.task.TaskInputOutputContextImpl.write(TaskInputOutputContextImpl.java:89) at org.apache.hadoop.mapreduce.lib.map.WrappedMapper$Context.write(WrappedMapper.java:112) at org.apache.hadoop.hbase.mapreduce.Import$Importer.processKV(Import.java:202) at org.apache.hadoop.hbase.mapreduce.Import$Importer.writeResult(Import.java:157) at org.apache.hadoop.hbase.mapreduce.Import$Importer.map(Import.java:142) at org.apache.hadoop.hbase.mapreduce.Import$Importer.map(Import.java:125) at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:145) at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:784) at org.apache.hadoop.mapred.MapTask.run(MapTask.java:341) at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:163) at java.security.AccessController.doPrivileged(Native Method) at javax.security.auth.Subject.doAs(Subject.java:422) at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1628) at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:158) 2015-06-15 10:36:58,483 INFO [main] mapreduce.Job: Task Id : 
attempt_1434117507786_0029_m_000082_0, Status : FAILED Error: org.apache.hadoop.hbase.client.RetriesExhaustedWithDetailsException: Failed 651 actions: item_restore: 651 times, at org.apache.hadoop.hbase.client.AsyncProcess$BatchErrors.makeException(AsyncProcess.java:203) at org.apache.hadoop.hbase.client.AsyncProcess$BatchErrors.access$500(AsyncProcess.java:187) at org.apache.hadoop.hbase.client.AsyncProcess.getErrors(AsyncProcess.java:922) at org.apache.hadoop.hbase.client.HTable.backgroundFlushCommits(HTable.java:1017) at org.apache.hadoop.hbase.client.HTable.doPut(HTable.java:980) at org.apache.hadoop.hbase.client.HTable.put(HTable.java:941) at org.apache.hadoop.hbase.mapreduce.TableOutputFormat$TableRecordWriter.write(TableOutputFormat.java:126) at org.apache.hadoop.hbase.mapreduce.TableOutputFormat$TableRecordWriter.write(TableOutputFormat.java:87) at org.apache.hadoop.mapred.MapTask$NewDirectOutputCollector.write(MapTask.java:655) at org.apache.hadoop.mapreduce.task.TaskInputOutputContextImpl.write(TaskInputOutputContextImpl.java:89) at org.apache.hadoop.mapreduce.lib.map.WrappedMapper$Context.write(WrappedMapper.java:112) at org.apache.hadoop.hbase.mapreduce.Import$Importer.processKV(Import.java:202) at org.apache.hadoop.hbase.mapreduce.Import$Importer.writeResult(Import.java:157) at org.apache.hadoop.hbase.mapreduce.Import$Importer.map(Import.java:142) at org.apache.hadoop.hbase.mapreduce.Import$Importer.map(Import.java:125) at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:145) at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:784) at org.apache.hadoop.mapred.MapTask.run(MapTask.java:341) at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:163) at java.security.AccessController.doPrivileged(Native Method) at javax.security.auth.Subject.doAs(Subject.java:422) at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1628) at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:158) 2015-06-15 10:36:58,485 INFO [main] mapreduce.Job: Task Id : attempt_1434117507786_0029_m_000078_0, Status : FAILED Error: org.apache.hadoop.hbase.client.RetriesExhaustedWithDetailsException: Failed 670 actions: item_restore: 670 times, at org.apache.hadoop.hbase.client.AsyncProcess$BatchErrors.makeException(AsyncProcess.java:203) at org.apache.hadoop.hbase.client.AsyncProcess$BatchErrors.access$500(AsyncProcess.java:187) at org.apache.hadoop.hbase.client.AsyncProcess.getErrors(AsyncProcess.java:922) at org.apache.hadoop.hbase.client.HTable.backgroundFlushCommits(HTable.java:1017) at org.apache.hadoop.hbase.client.HTable.doPut(HTable.java:980) at org.apache.hadoop.hbase.client.HTable.put(HTable.java:941) at org.apache.hadoop.hbase.mapreduce.TableOutputFormat$TableRecordWriter.write(TableOutputFormat.java:126) at org.apache.hadoop.hbase.mapreduce.TableOutputFormat$TableRecordWriter.write(TableOutputFormat.java:87) at org.apache.hadoop.mapred.MapTask$NewDirectOutputCollector.write(MapTask.java:655) at org.apache.hadoop.mapreduce.task.TaskInputOutputContextImpl.write(TaskInputOutputContextImpl.java:89) at org.apache.hadoop.mapreduce.lib.map.WrappedMapper$Context.write(WrappedMapper.java:112) at org.apache.hadoop.hbase.mapreduce.Import$Importer.processKV(Import.java:202) at org.apache.hadoop.hbase.mapreduce.Import$Importer.writeResult(Import.java:157) at org.apache.hadoop.hbase.mapreduce.Import$Importer.map(Import.java:142) at org.apache.hadoop.hbase.mapreduce.Import$Importer.map(Import.java:125) at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:145) at 
org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:784) at org.apache.hadoop.mapred.MapTask.run(MapTask.java:341) at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:163) at java.security.AccessController.doPrivileged(Native Method) at javax.security.auth.Subject.doAs(Subject.java:422) at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1628) at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:158) 2015-06-15 10:36:59,496 INFO [main] mapreduce.Job: Task Id : attempt_1434117507786_0029_m_000084_0, Status : FAILED Error: org.apache.hadoop.hbase.client.RetriesExhaustedWithDetailsException: Failed 653 actions: item_restore: 653 times, at org.apache.hadoop.hbase.client.AsyncProcess$BatchErrors.makeException(AsyncProcess.java:203) at org.apache.hadoop.hbase.client.AsyncProcess$BatchErrors.access$500(AsyncProcess.java:187) at org.apache.hadoop.hbase.client.AsyncProcess.getErrors(AsyncProcess.java:922) at org.apache.hadoop.hbase.client.HTable.backgroundFlushCommits(HTable.java:1017) at org.apache.hadoop.hbase.client.HTable.doPut(HTable.java:980) at org.apache.hadoop.hbase.client.HTable.put(HTable.java:941) at org.apache.hadoop.hbase.mapreduce.TableOutputFormat$TableRecordWriter.write(TableOutputFormat.java:126) at org.apache.hadoop.hbase.mapreduce.TableOutputFormat$TableRecordWriter.write(TableOutputFormat.java:87) at org.apache.hadoop.mapred.MapTask$NewDirectOutputCollector.write(MapTask.java:655) at org.apache.hadoop.mapreduce.task.TaskInputOutputContextImpl.write(TaskInputOutputContextImpl.java:89) at org.apache.hadoop.mapreduce.lib.map.WrappedMapper$Context.write(WrappedMapper.java:112) at org.apache.hadoop.hbase.mapreduce.Import$Importer.processKV(Import.java:202) at org.apache.hadoop.hbase.mapreduce.Import$Importer.writeResult(Import.java:157) at org.apache.hadoop.hbase.mapreduce.Import$Importer.map(Import.java:142) at org.apache.hadoop.hbase.mapreduce.Import$Importer.map(Import.java:125) at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:145) at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:784) at org.apache.hadoop.mapred.MapTask.run(MapTask.java:341) at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:163) at java.security.AccessController.doPrivileged(Native Method) at javax.security.auth.Subject.doAs(Subject.java:422) at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1628) at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:158) 2015-06-15 10:36:59,497 INFO [main] mapreduce.Job: Task Id : attempt_1434117507786_0029_m_000054_1, Status : FAILED Error: org.apache.hadoop.hbase.client.RetriesExhaustedWithDetailsException: Failed 594 actions: item_restore: 594 times, at org.apache.hadoop.hbase.client.AsyncProcess$BatchErrors.makeException(AsyncProcess.java:203) at org.apache.hadoop.hbase.client.AsyncProcess$BatchErrors.access$500(AsyncProcess.java:187) at org.apache.hadoop.hbase.client.AsyncProcess.getErrors(AsyncProcess.java:922) at org.apache.hadoop.hbase.client.HTable.backgroundFlushCommits(HTable.java:1017) at org.apache.hadoop.hbase.client.HTable.doPut(HTable.java:980) at org.apache.hadoop.hbase.client.HTable.put(HTable.java:941) at org.apache.hadoop.hbase.mapreduce.TableOutputFormat$TableRecordWriter.write(TableOutputFormat.java:126) at org.apache.hadoop.hbase.mapreduce.TableOutputFormat$TableRecordWriter.write(TableOutputFormat.java:87) at org.apache.hadoop.mapred.MapTask$NewDirectOutputCollector.write(MapTask.java:655) at 
org.apache.hadoop.mapreduce.task.TaskInputOutputContextImpl.write(TaskInputOutputContextImpl.java:89) at org.apache.hadoop.mapreduce.lib.map.WrappedMapper$Context.write(WrappedMapper.java:112) at org.apache.hadoop.hbase.mapreduce.Import$Importer.processKV(Import.java:202) at org.apache.hadoop.hbase.mapreduce.Import$Importer.writeResult(Import.java:157) at org.apache.hadoop.hbase.mapreduce.Import$Importer.map(Import.java:142) at org.apache.hadoop.hbase.mapreduce.Import$Importer.map(Import.java:125) at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:145) at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:784) at org.apache.hadoop.mapred.MapTask.run(MapTask.java:341) at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:163) at java.security.AccessController.doPrivileged(Native Method) at javax.security.auth.Subject.doAs(Subject.java:422) at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1628) at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:158)

2015-06-15 10:36:59,499 INFO [main] mapreduce.Job: Task Id : attempt_1434117507786_0029_m_000002_1, Status : FAILED
Error: org.apache.hadoop.hbase.client.RetriesExhaustedWithDetailsException: Failed 634 actions: item_restore: 634 times,
    at org.apache.hadoop.hbase.client.AsyncProcess$BatchErrors.makeException(AsyncProcess.java:203)
    at org.apache.hadoop.hbase.client.AsyncProcess$BatchErrors.access$500(AsyncProcess.java:187)
    at org.apache.hadoop.hbase.client.AsyncProcess.getErrors(AsyncProcess.java:922)
    at org.apache.hadoop.hbase.client.HTable.backgroundFlushCommits(HTable.java:1017)
    at org.apache.hadoop.hbase.client.HTable.doPut(HTable.java:980)
    at org.apache.hadoop.hbase.client.HTable.put(HTable.java:941)
    at org.apache.hadoop.hbase.mapreduce.TableOutputFormat$TableRecordWriter.write(TableOutputFormat.java:126)
    at org.apache.hadoop.hbase.mapreduce.TableOutputFormat$TableRecordWriter.write(TableOutputFormat.java:87)
    at org.apache.hadoop.mapred.MapTask$NewDirectOutputCollector.write(MapTask.java:655)
    at org.apache.hadoop.mapreduce.task.TaskInputOutputContextImpl.write(TaskInputOutputContextImpl.java:89)
    at org.apache.hadoop.mapreduce.lib.map.WrappedMapper$Context.write(WrappedMapper.java:112)
    at org.apache.hadoop.hbase.mapreduce.Import$Importer.processKV(Import.java:202)
    at org.apache.hadoop.hbase.mapreduce.Import$Importer.writeResult(Import.java:157)
    at org.apache.hadoop.hbase.mapreduce.Import$Importer.map(Import.java:142)
    at org.apache.hadoop.hbase.mapreduce.Import$Importer.map(Import.java:125)
    at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:145)
    at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:784)
    at org.apache.hadoop.mapred.MapTask.run(MapTask.java:341)
    at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:163)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:422)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1628)
    at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:158)

The remaining failed map attempts (all on 2015-06-15) reported the same RetriesExhaustedWithDetailsException against item_restore, with a stack trace identical to the one above; only the number of failed actions differs per attempt:

10:36:59,500  attempt_1434117507786_0029_m_000083_0  FAILED  674 failed actions
10:36:59,501  attempt_1434117507786_0029_m_000072_1  FAILED  660 failed actions
10:36:59,502  attempt_1434117507786_0029_m_000087_0  FAILED  655 failed actions
10:36:59,503  attempt_1434117507786_0029_m_000080_0  FAILED  699 failed actions
10:36:59,504  attempt_1434117507786_0029_m_000037_1  FAILED  622 failed actions
10:37:00,519  attempt_1434117507786_0029_m_000085_0  FAILED  648 failed actions
10:37:00,522  attempt_1434117507786_0029_m_000035_1  FAILED  602 failed actions
10:37:00,526  attempt_1434117507786_0029_m_000066_0  FAILED  692 failed actions
10:37:00,528  attempt_1434117507786_0029_m_000046_1  FAILED  669 failed actions
10:37:00,530  attempt_1434117507786_0029_m_000000_1  FAILED  765 failed actions
10:37:00,534  attempt_1434117507786_0029_m_000069_1  FAILED  640 failed actions
10:37:00,538  attempt_1434117507786_0029_m_000016_2  FAILED  612 failed actions
10:37:00,544  attempt_1434117507786_0029_m_000009_1  FAILED  556 failed actions
10:37:00,547  attempt_1434117507786_0029_m_000008_1  FAILED  561 failed actions
10:37:00,552  attempt_1434117507786_0029_m_000089_0  FAILED  628 failed actions
10:37:01,569  attempt_1434117507786_0029_m_000029_1  FAILED  509 failed actions
10:37:01,571  attempt_1434117507786_0029_m_000039_1  FAILED  847 failed actions
10:37:01,573  attempt_1434117507786_0029_m_000086_0  FAILED  706 failed actions
10:37:01,574  attempt_1434117507786_0029_m_000026_2  FAILED  589 failed actions
10:37:01,579  attempt_1434117507786_0029_m_000081_1  FAILED  747 failed actions
10:37:02,588  attempt_1434117507786_0029_m_000094_0  FAILED  681 failed actions
10:37:02,590  attempt_1434117507786_0029_m_000076_1  FAILED  628 failed actions
10:37:02,593  attempt_1434117507786_0029_m_000019_1  FAILED  615 failed actions
10:37:02,595  attempt_1434117507786_0029_m_000001_2  FAILED  717 failed actions
10:37:03,621  attempt_1434117507786_0029_m_000012_1  FAILED  635 failed actions
10:37:03,623  attempt_1434117507786_0029_m_000057_1  FAILED  659 failed actions
10:37:03,624  attempt_1434117507786_0029_m_000006_1  FAILED  767 failed actions
10:37:03,626  attempt_1434117507786_0029_m_000097_0  FAILED  744 failed actions
10:37:03,632  attempt_1434117507786_0029_m_000092_0  FAILED  779 failed actions
10:37:03,636  attempt_1434117507786_0029_m_000096_0  FAILED  660 failed actions
10:37:03,640  attempt_1434117507786_0029_m_000122_0  FAILED  505 failed actions
10:37:04,660  attempt_1434117507786_0029_m_000090_0  FAILED  768 failed actions  (Container killed by the ApplicationMaster.)
10:37:04,665  attempt_1434117507786_0029_m_000099_0  FAILED  674 failed actions

2015-06-15 10:37:04,671 INFO [main]
mapreduce.Job: Task Id : attempt_1434117507786_0029_m_000103_0, Status : FAILED Error: org.apache.hadoop.hbase.client.RetriesExhaustedWithDetailsException: Failed 694 actions: item_restore: 694 times, at org.apache.hadoop.hbase.client.AsyncProcess$BatchErrors.makeException(AsyncProcess.java:203) at org.apache.hadoop.hbase.client.AsyncProcess$BatchErrors.access$500(AsyncProcess.java:187) at org.apache.hadoop.hbase.client.AsyncProcess.getErrors(AsyncProcess.java:922) at org.apache.hadoop.hbase.client.HTable.backgroundFlushCommits(HTable.java:1017) at org.apache.hadoop.hbase.client.HTable.doPut(HTable.java:980) at org.apache.hadoop.hbase.client.HTable.put(HTable.java:941) at org.apache.hadoop.hbase.mapreduce.TableOutputFormat$TableRecordWriter.write(TableOutputFormat.java:126) at org.apache.hadoop.hbase.mapreduce.TableOutputFormat$TableRecordWriter.write(TableOutputFormat.java:87) at org.apache.hadoop.mapred.MapTask$NewDirectOutputCollector.write(MapTask.java:655) at org.apache.hadoop.mapreduce.task.TaskInputOutputContextImpl.write(TaskInputOutputContextImpl.java:89) at org.apache.hadoop.mapreduce.lib.map.WrappedMapper$Context.write(WrappedMapper.java:112) at org.apache.hadoop.hbase.mapreduce.Import$Importer.processKV(Import.java:202) at org.apache.hadoop.hbase.mapreduce.Import$Importer.writeResult(Import.java:157) at org.apache.hadoop.hbase.mapreduce.Import$Importer.map(Import.java:142) at org.apache.hadoop.hbase.mapreduce.Import$Importer.map(Import.java:125) at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:145) at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:784) at org.apache.hadoop.mapred.MapTask.run(MapTask.java:341) at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:163) at java.security.AccessController.doPrivileged(Native Method) at javax.security.auth.Subject.doAs(Subject.java:422) at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1628) at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:158) 2015-06-15 10:37:04,676 INFO [main] mapreduce.Job: Task Id : attempt_1434117507786_0029_m_000059_1, Status : FAILED Error: org.apache.hadoop.hbase.client.RetriesExhaustedWithDetailsException: Failed 604 actions: item_restore: 604 times, at org.apache.hadoop.hbase.client.AsyncProcess$BatchErrors.makeException(AsyncProcess.java:203) at org.apache.hadoop.hbase.client.AsyncProcess$BatchErrors.access$500(AsyncProcess.java:187) at org.apache.hadoop.hbase.client.AsyncProcess.getErrors(AsyncProcess.java:922) at org.apache.hadoop.hbase.client.HTable.backgroundFlushCommits(HTable.java:1017) at org.apache.hadoop.hbase.client.HTable.doPut(HTable.java:980) at org.apache.hadoop.hbase.client.HTable.put(HTable.java:941) at org.apache.hadoop.hbase.mapreduce.TableOutputFormat$TableRecordWriter.write(TableOutputFormat.java:126) at org.apache.hadoop.hbase.mapreduce.TableOutputFormat$TableRecordWriter.write(TableOutputFormat.java:87) at org.apache.hadoop.mapred.MapTask$NewDirectOutputCollector.write(MapTask.java:655) at org.apache.hadoop.mapreduce.task.TaskInputOutputContextImpl.write(TaskInputOutputContextImpl.java:89) at org.apache.hadoop.mapreduce.lib.map.WrappedMapper$Context.write(WrappedMapper.java:112) at org.apache.hadoop.hbase.mapreduce.Import$Importer.processKV(Import.java:202) at org.apache.hadoop.hbase.mapreduce.Import$Importer.writeResult(Import.java:157) at org.apache.hadoop.hbase.mapreduce.Import$Importer.map(Import.java:142) at org.apache.hadoop.hbase.mapreduce.Import$Importer.map(Import.java:125) at 
org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:145) at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:784) at org.apache.hadoop.mapred.MapTask.run(MapTask.java:341) at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:163) at java.security.AccessController.doPrivileged(Native Method) at javax.security.auth.Subject.doAs(Subject.java:422) at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1628) at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:158) 2015-06-15 10:37:05,689 INFO [main] mapreduce.Job: Task Id : attempt_1434117507786_0029_m_000043_1, Status : FAILED Error: org.apache.hadoop.hbase.client.RetriesExhaustedWithDetailsException: Failed 574 actions: item_restore: 574 times, at org.apache.hadoop.hbase.client.AsyncProcess$BatchErrors.makeException(AsyncProcess.java:203) at org.apache.hadoop.hbase.client.AsyncProcess$BatchErrors.access$500(AsyncProcess.java:187) at org.apache.hadoop.hbase.client.AsyncProcess.getErrors(AsyncProcess.java:922) at org.apache.hadoop.hbase.client.HTable.backgroundFlushCommits(HTable.java:1017) at org.apache.hadoop.hbase.client.HTable.doPut(HTable.java:980) at org.apache.hadoop.hbase.client.HTable.put(HTable.java:941) at org.apache.hadoop.hbase.mapreduce.TableOutputFormat$TableRecordWriter.write(TableOutputFormat.java:126) at org.apache.hadoop.hbase.mapreduce.TableOutputFormat$TableRecordWriter.write(TableOutputFormat.java:87) at org.apache.hadoop.mapred.MapTask$NewDirectOutputCollector.write(MapTask.java:655) at org.apache.hadoop.mapreduce.task.TaskInputOutputContextImpl.write(TaskInputOutputContextImpl.java:89) at org.apache.hadoop.mapreduce.lib.map.WrappedMapper$Context.write(WrappedMapper.java:112) at org.apache.hadoop.hbase.mapreduce.Import$Importer.processKV(Import.java:202) at org.apache.hadoop.hbase.mapreduce.Import$Importer.writeResult(Import.java:157) at org.apache.hadoop.hbase.mapreduce.Import$Importer.map(Import.java:142) at org.apache.hadoop.hbase.mapreduce.Import$Importer.map(Import.java:125) at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:145) at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:784) at org.apache.hadoop.mapred.MapTask.run(MapTask.java:341) at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:163) at java.security.AccessController.doPrivileged(Native Method) at javax.security.auth.Subject.doAs(Subject.java:422) at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1628) at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:158) 2015-06-15 10:37:05,692 INFO [main] mapreduce.Job: Task Id : attempt_1434117507786_0029_m_000124_0, Status : FAILED Error: org.apache.hadoop.hbase.client.RetriesExhaustedWithDetailsException: Failed 541 actions: item_restore: 541 times, at org.apache.hadoop.hbase.client.AsyncProcess$BatchErrors.makeException(AsyncProcess.java:203) at org.apache.hadoop.hbase.client.AsyncProcess$BatchErrors.access$500(AsyncProcess.java:187) at org.apache.hadoop.hbase.client.AsyncProcess.getErrors(AsyncProcess.java:922) at org.apache.hadoop.hbase.client.HTable.backgroundFlushCommits(HTable.java:1017) at org.apache.hadoop.hbase.client.HTable.doPut(HTable.java:980) at org.apache.hadoop.hbase.client.HTable.put(HTable.java:941) at org.apache.hadoop.hbase.mapreduce.TableOutputFormat$TableRecordWriter.write(TableOutputFormat.java:126) at org.apache.hadoop.hbase.mapreduce.TableOutputFormat$TableRecordWriter.write(TableOutputFormat.java:87) at 
org.apache.hadoop.mapred.MapTask$NewDirectOutputCollector.write(MapTask.java:655) at org.apache.hadoop.mapreduce.task.TaskInputOutputContextImpl.write(TaskInputOutputContextImpl.java:89) at org.apache.hadoop.mapreduce.lib.map.WrappedMapper$Context.write(WrappedMapper.java:112) at org.apache.hadoop.hbase.mapreduce.Import$Importer.processKV(Import.java:202) at org.apache.hadoop.hbase.mapreduce.Import$Importer.writeResult(Import.java:157) at org.apache.hadoop.hbase.mapreduce.Import$Importer.map(Import.java:142) at org.apache.hadoop.hbase.mapreduce.Import$Importer.map(Import.java:125) at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:145) at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:784) at org.apache.hadoop.mapred.MapTask.run(MapTask.java:341) at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:163) at java.security.AccessController.doPrivileged(Native Method) at javax.security.auth.Subject.doAs(Subject.java:422) at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1628) at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:158) Container killed by the ApplicationMaster. 2015-06-15 10:37:05,695 INFO [main] mapreduce.Job: Task Id : attempt_1434117507786_0029_m_000040_1, Status : FAILED Error: org.apache.hadoop.hbase.client.RetriesExhaustedWithDetailsException: Failed 490 actions: item_restore: 490 times, at org.apache.hadoop.hbase.client.AsyncProcess$BatchErrors.makeException(AsyncProcess.java:203) at org.apache.hadoop.hbase.client.AsyncProcess$BatchErrors.access$500(AsyncProcess.java:187) at org.apache.hadoop.hbase.client.AsyncProcess.getErrors(AsyncProcess.java:922) at org.apache.hadoop.hbase.client.HTable.backgroundFlushCommits(HTable.java:1017) at org.apache.hadoop.hbase.client.HTable.doPut(HTable.java:980) at org.apache.hadoop.hbase.client.HTable.put(HTable.java:941) at org.apache.hadoop.hbase.mapreduce.TableOutputFormat$TableRecordWriter.write(TableOutputFormat.java:126) at org.apache.hadoop.hbase.mapreduce.TableOutputFormat$TableRecordWriter.write(TableOutputFormat.java:87) at org.apache.hadoop.mapred.MapTask$NewDirectOutputCollector.write(MapTask.java:655) at org.apache.hadoop.mapreduce.task.TaskInputOutputContextImpl.write(TaskInputOutputContextImpl.java:89) at org.apache.hadoop.mapreduce.lib.map.WrappedMapper$Context.write(WrappedMapper.java:112) at org.apache.hadoop.hbase.mapreduce.Import$Importer.processKV(Import.java:202) at org.apache.hadoop.hbase.mapreduce.Import$Importer.writeResult(Import.java:157) at org.apache.hadoop.hbase.mapreduce.Import$Importer.map(Import.java:142) at org.apache.hadoop.hbase.mapreduce.Import$Importer.map(Import.java:125) at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:145) at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:784) at org.apache.hadoop.mapred.MapTask.run(MapTask.java:341) at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:163) at java.security.AccessController.doPrivileged(Native Method) at javax.security.auth.Subject.doAs(Subject.java:422) at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1628) at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:158) 2015-06-15 10:37:05,696 INFO [main] mapreduce.Job: Task Id : attempt_1434117507786_0029_m_000104_0, Status : FAILED Error: org.apache.hadoop.hbase.client.RetriesExhaustedWithDetailsException: Failed 640 actions: item_restore: 640 times, at org.apache.hadoop.hbase.client.AsyncProcess$BatchErrors.makeException(AsyncProcess.java:203) at 
org.apache.hadoop.hbase.client.AsyncProcess$BatchErrors.access$500(AsyncProcess.java:187) at org.apache.hadoop.hbase.client.AsyncProcess.getErrors(AsyncProcess.java:922) at org.apache.hadoop.hbase.client.HTable.backgroundFlushCommits(HTable.java:1017) at org.apache.hadoop.hbase.client.HTable.doPut(HTable.java:980) at org.apache.hadoop.hbase.client.HTable.put(HTable.java:941) at org.apache.hadoop.hbase.mapreduce.TableOutputFormat$TableRecordWriter.write(TableOutputFormat.java:126) at org.apache.hadoop.hbase.mapreduce.TableOutputFormat$TableRecordWriter.write(TableOutputFormat.java:87) at org.apache.hadoop.mapred.MapTask$NewDirectOutputCollector.write(MapTask.java:655) at org.apache.hadoop.mapreduce.task.TaskInputOutputContextImpl.write(TaskInputOutputContextImpl.java:89) at org.apache.hadoop.mapreduce.lib.map.WrappedMapper$Context.write(WrappedMapper.java:112) at org.apache.hadoop.hbase.mapreduce.Import$Importer.processKV(Import.java:202) at org.apache.hadoop.hbase.mapreduce.Import$Importer.writeResult(Import.java:157) at org.apache.hadoop.hbase.mapreduce.Import$Importer.map(Import.java:142) at org.apache.hadoop.hbase.mapreduce.Import$Importer.map(Import.java:125) at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:145) at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:784) at org.apache.hadoop.mapred.MapTask.run(MapTask.java:341) at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:163) at java.security.AccessController.doPrivileged(Native Method) at javax.security.auth.Subject.doAs(Subject.java:422) at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1628) at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:158) 2015-06-15 10:37:05,700 INFO [main] mapreduce.Job: Task Id : attempt_1434117507786_0029_m_000028_1, Status : FAILED Error: org.apache.hadoop.hbase.client.RetriesExhaustedWithDetailsException: Failed 717 actions: item_restore: 717 times, at org.apache.hadoop.hbase.client.AsyncProcess$BatchErrors.makeException(AsyncProcess.java:203) at org.apache.hadoop.hbase.client.AsyncProcess$BatchErrors.access$500(AsyncProcess.java:187) at org.apache.hadoop.hbase.client.AsyncProcess.getErrors(AsyncProcess.java:922) at org.apache.hadoop.hbase.client.HTable.backgroundFlushCommits(HTable.java:1017) at org.apache.hadoop.hbase.client.HTable.doPut(HTable.java:980) at org.apache.hadoop.hbase.client.HTable.put(HTable.java:941) at org.apache.hadoop.hbase.mapreduce.TableOutputFormat$TableRecordWriter.write(TableOutputFormat.java:126) at org.apache.hadoop.hbase.mapreduce.TableOutputFormat$TableRecordWriter.write(TableOutputFormat.java:87) at org.apache.hadoop.mapred.MapTask$NewDirectOutputCollector.write(MapTask.java:655) at org.apache.hadoop.mapreduce.task.TaskInputOutputContextImpl.write(TaskInputOutputContextImpl.java:89) at org.apache.hadoop.mapreduce.lib.map.WrappedMapper$Context.write(WrappedMapper.java:112) at org.apache.hadoop.hbase.mapreduce.Import$Importer.processKV(Import.java:202) at org.apache.hadoop.hbase.mapreduce.Import$Importer.writeResult(Import.java:157) at org.apache.hadoop.hbase.mapreduce.Import$Importer.map(Import.java:142) at org.apache.hadoop.hbase.mapreduce.Import$Importer.map(Import.java:125) at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:145) at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:784) at org.apache.hadoop.mapred.MapTask.run(MapTask.java:341) at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:163) at java.security.AccessController.doPrivileged(Native Method) at 
javax.security.auth.Subject.doAs(Subject.java:422) at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1628) at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:158) 2015-06-15 10:37:05,710 INFO [main] mapreduce.Job: Task Id : attempt_1434117507786_0029_m_000015_1, Status : FAILED Error: org.apache.hadoop.hbase.client.RetriesExhaustedWithDetailsException: Failed 758 actions: item_restore: 758 times, at org.apache.hadoop.hbase.client.AsyncProcess$BatchErrors.makeException(AsyncProcess.java:203) at org.apache.hadoop.hbase.client.AsyncProcess$BatchErrors.access$500(AsyncProcess.java:187) at org.apache.hadoop.hbase.client.AsyncProcess.getErrors(AsyncProcess.java:922) at org.apache.hadoop.hbase.client.HTable.backgroundFlushCommits(HTable.java:1017) at org.apache.hadoop.hbase.client.HTable.doPut(HTable.java:980) at org.apache.hadoop.hbase.client.HTable.put(HTable.java:941) at org.apache.hadoop.hbase.mapreduce.TableOutputFormat$TableRecordWriter.write(TableOutputFormat.java:126) at org.apache.hadoop.hbase.mapreduce.TableOutputFormat$TableRecordWriter.write(TableOutputFormat.java:87) at org.apache.hadoop.mapred.MapTask$NewDirectOutputCollector.write(MapTask.java:655) at org.apache.hadoop.mapreduce.task.TaskInputOutputContextImpl.write(TaskInputOutputContextImpl.java:89) at org.apache.hadoop.mapreduce.lib.map.WrappedMapper$Context.write(WrappedMapper.java:112) at org.apache.hadoop.hbase.mapreduce.Import$Importer.processKV(Import.java:202) at org.apache.hadoop.hbase.mapreduce.Import$Importer.writeResult(Import.java:157) at org.apache.hadoop.hbase.mapreduce.Import$Importer.map(Import.java:142) at org.apache.hadoop.hbase.mapreduce.Import$Importer.map(Import.java:125) at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:145) at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:784) at org.apache.hadoop.mapred.MapTask.run(MapTask.java:341) at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:163) at java.security.AccessController.doPrivileged(Native Method) at javax.security.auth.Subject.doAs(Subject.java:422) at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1628) at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:158) 2015-06-15 10:37:05,718 INFO [main] mapreduce.Job: Task Id : attempt_1434117507786_0029_m_000034_1, Status : FAILED Error: org.apache.hadoop.hbase.client.RetriesExhaustedWithDetailsException: Failed 769 actions: item_restore: 769 times, at org.apache.hadoop.hbase.client.AsyncProcess$BatchErrors.makeException(AsyncProcess.java:203) at org.apache.hadoop.hbase.client.AsyncProcess$BatchErrors.access$500(AsyncProcess.java:187) at org.apache.hadoop.hbase.client.AsyncProcess.getErrors(AsyncProcess.java:922) at org.apache.hadoop.hbase.client.HTable.backgroundFlushCommits(HTable.java:1017) at org.apache.hadoop.hbase.client.HTable.doPut(HTable.java:980) at org.apache.hadoop.hbase.client.HTable.put(HTable.java:941) at org.apache.hadoop.hbase.mapreduce.TableOutputFormat$TableRecordWriter.write(TableOutputFormat.java:126) at org.apache.hadoop.hbase.mapreduce.TableOutputFormat$TableRecordWriter.write(TableOutputFormat.java:87) at org.apache.hadoop.mapred.MapTask$NewDirectOutputCollector.write(MapTask.java:655) at org.apache.hadoop.mapreduce.task.TaskInputOutputContextImpl.write(TaskInputOutputContextImpl.java:89) at org.apache.hadoop.mapreduce.lib.map.WrappedMapper$Context.write(WrappedMapper.java:112) at org.apache.hadoop.hbase.mapreduce.Import$Importer.processKV(Import.java:202) at 
org.apache.hadoop.hbase.mapreduce.Import$Importer.writeResult(Import.java:157) at org.apache.hadoop.hbase.mapreduce.Import$Importer.map(Import.java:142) at org.apache.hadoop.hbase.mapreduce.Import$Importer.map(Import.java:125) at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:145) at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:784) at org.apache.hadoop.mapred.MapTask.run(MapTask.java:341) at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:163) at java.security.AccessController.doPrivileged(Native Method) at javax.security.auth.Subject.doAs(Subject.java:422) at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1628) at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:158) 2015-06-15 10:37:05,729 INFO [main] mapreduce.Job: Task Id : attempt_1434117507786_0029_m_000110_0, Status : FAILED Error: org.apache.hadoop.hbase.client.RetriesExhaustedWithDetailsException: Failed 786 actions: item_restore: 786 times, at org.apache.hadoop.hbase.client.AsyncProcess$BatchErrors.makeException(AsyncProcess.java:203) at org.apache.hadoop.hbase.client.AsyncProcess$BatchErrors.access$500(AsyncProcess.java:187) at org.apache.hadoop.hbase.client.AsyncProcess.getErrors(AsyncProcess.java:922) at org.apache.hadoop.hbase.client.HTable.backgroundFlushCommits(HTable.java:1017) at org.apache.hadoop.hbase.client.HTable.doPut(HTable.java:980) at org.apache.hadoop.hbase.client.HTable.put(HTable.java:941) at org.apache.hadoop.hbase.mapreduce.TableOutputFormat$TableRecordWriter.write(TableOutputFormat.java:126) at org.apache.hadoop.hbase.mapreduce.TableOutputFormat$TableRecordWriter.write(TableOutputFormat.java:87) at org.apache.hadoop.mapred.MapTask$NewDirectOutputCollector.write(MapTask.java:655) at org.apache.hadoop.mapreduce.task.TaskInputOutputContextImpl.write(TaskInputOutputContextImpl.java:89) at org.apache.hadoop.mapreduce.lib.map.WrappedMapper$Context.write(WrappedMapper.java:112) at org.apache.hadoop.hbase.mapreduce.Import$Importer.processKV(Import.java:202) at org.apache.hadoop.hbase.mapreduce.Import$Importer.writeResult(Import.java:157) at org.apache.hadoop.hbase.mapreduce.Import$Importer.map(Import.java:142) at org.apache.hadoop.hbase.mapreduce.Import$Importer.map(Import.java:125) at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:145) at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:784) at org.apache.hadoop.mapred.MapTask.run(MapTask.java:341) at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:163) at java.security.AccessController.doPrivileged(Native Method) at javax.security.auth.Subject.doAs(Subject.java:422) at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1628) at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:158) 2015-06-15 10:37:05,741 INFO [main] mapreduce.Job: Task Id : attempt_1434117507786_0029_m_000107_0, Status : FAILED Error: org.apache.hadoop.hbase.client.RetriesExhaustedWithDetailsException: Failed 697 actions: item_restore: 697 times, at org.apache.hadoop.hbase.client.AsyncProcess$BatchErrors.makeException(AsyncProcess.java:203) at org.apache.hadoop.hbase.client.AsyncProcess$BatchErrors.access$500(AsyncProcess.java:187) at org.apache.hadoop.hbase.client.AsyncProcess.getErrors(AsyncProcess.java:922) at org.apache.hadoop.hbase.client.HTable.backgroundFlushCommits(HTable.java:1017) at org.apache.hadoop.hbase.client.HTable.doPut(HTable.java:980) at org.apache.hadoop.hbase.client.HTable.put(HTable.java:941) at 
org.apache.hadoop.hbase.mapreduce.TableOutputFormat$TableRecordWriter.write(TableOutputFormat.java:126) at org.apache.hadoop.hbase.mapreduce.TableOutputFormat$TableRecordWriter.write(TableOutputFormat.java:87) at org.apache.hadoop.mapred.MapTask$NewDirectOutputCollector.write(MapTask.java:655) at org.apache.hadoop.mapreduce.task.TaskInputOutputContextImpl.write(TaskInputOutputContextImpl.java:89) at org.apache.hadoop.mapreduce.lib.map.WrappedMapper$Context.write(WrappedMapper.java:112) at org.apache.hadoop.hbase.mapreduce.Import$Importer.processKV(Import.java:202) at org.apache.hadoop.hbase.mapreduce.Import$Importer.writeResult(Import.java:157) at org.apache.hadoop.hbase.mapreduce.Import$Importer.map(Import.java:142) at org.apache.hadoop.hbase.mapreduce.Import$Importer.map(Import.java:125) at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:145) at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:784) at org.apache.hadoop.mapred.MapTask.run(MapTask.java:341) at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:163) at java.security.AccessController.doPrivileged(Native Method) at javax.security.auth.Subject.doAs(Subject.java:422) at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1628) at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:158) 2015-06-15 10:37:05,755 INFO [main] mapreduce.Job: Task Id : attempt_1434117507786_0029_m_000018_1, Status : FAILED Error: org.apache.hadoop.hbase.client.RetriesExhaustedWithDetailsException: Failed 531 actions: item_restore: 531 times, at org.apache.hadoop.hbase.client.AsyncProcess$BatchErrors.makeException(AsyncProcess.java:203) at org.apache.hadoop.hbase.client.AsyncProcess$BatchErrors.access$500(AsyncProcess.java:187) at org.apache.hadoop.hbase.client.AsyncProcess.getErrors(AsyncProcess.java:922) at org.apache.hadoop.hbase.client.HTable.backgroundFlushCommits(HTable.java:1017) at org.apache.hadoop.hbase.client.HTable.doPut(HTable.java:980) at org.apache.hadoop.hbase.client.HTable.put(HTable.java:941) at org.apache.hadoop.hbase.mapreduce.TableOutputFormat$TableRecordWriter.write(TableOutputFormat.java:126) at org.apache.hadoop.hbase.mapreduce.TableOutputFormat$TableRecordWriter.write(TableOutputFormat.java:87) at org.apache.hadoop.mapred.MapTask$NewDirectOutputCollector.write(MapTask.java:655) at org.apache.hadoop.mapreduce.task.TaskInputOutputContextImpl.write(TaskInputOutputContextImpl.java:89) at org.apache.hadoop.mapreduce.lib.map.WrappedMapper$Context.write(WrappedMapper.java:112) at org.apache.hadoop.hbase.mapreduce.Import$Importer.processKV(Import.java:202) at org.apache.hadoop.hbase.mapreduce.Import$Importer.writeResult(Import.java:157) at org.apache.hadoop.hbase.mapreduce.Import$Importer.map(Import.java:142) at org.apache.hadoop.hbase.mapreduce.Import$Importer.map(Import.java:125) at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:145) at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:784) at org.apache.hadoop.mapred.MapTask.run(MapTask.java:341) at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:163) at java.security.AccessController.doPrivileged(Native Method) at javax.security.auth.Subject.doAs(Subject.java:422) at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1628) at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:158) 2015-06-15 10:37:06,784 INFO [main] mapreduce.Job: Task Id : attempt_1434117507786_0029_m_000108_0, Status : FAILED Error: org.apache.hadoop.hbase.client.RetriesExhaustedWithDetailsException: Failed 852 
actions: item_restore: 852 times, at org.apache.hadoop.hbase.client.AsyncProcess$BatchErrors.makeException(AsyncProcess.java:203) at org.apache.hadoop.hbase.client.AsyncProcess$BatchErrors.access$500(AsyncProcess.java:187) at org.apache.hadoop.hbase.client.AsyncProcess.getErrors(AsyncProcess.java:922) at org.apache.hadoop.hbase.client.HTable.backgroundFlushCommits(HTable.java:1017) at org.apache.hadoop.hbase.client.HTable.doPut(HTable.java:980) at org.apache.hadoop.hbase.client.HTable.put(HTable.java:941) at org.apache.hadoop.hbase.mapreduce.TableOutputFormat$TableRecordWriter.write(TableOutputFormat.java:126) at org.apache.hadoop.hbase.mapreduce.TableOutputFormat$TableRecordWriter.write(TableOutputFormat.java:87) at org.apache.hadoop.mapred.MapTask$NewDirectOutputCollector.write(MapTask.java:655) at org.apache.hadoop.mapreduce.task.TaskInputOutputContextImpl.write(TaskInputOutputContextImpl.java:89) at org.apache.hadoop.mapreduce.lib.map.WrappedMapper$Context.write(WrappedMapper.java:112) at org.apache.hadoop.hbase.mapreduce.Import$Importer.processKV(Import.java:202) at org.apache.hadoop.hbase.mapreduce.Import$Importer.writeResult(Import.java:157) at org.apache.hadoop.hbase.mapreduce.Import$Importer.map(Import.java:142) at org.apache.hadoop.hbase.mapreduce.Import$Importer.map(Import.java:125) at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:145) at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:784) at org.apache.hadoop.mapred.MapTask.run(MapTask.java:341) at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:163) at java.security.AccessController.doPrivileged(Native Method) at javax.security.auth.Subject.doAs(Subject.java:422) at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1628) at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:158) 2015-06-15 10:37:06,786 INFO [main] mapreduce.Job: Task Id : attempt_1434117507786_0029_m_000071_1, Status : FAILED Error: org.apache.hadoop.hbase.client.RetriesExhaustedWithDetailsException: Failed 692 actions: item_restore: 692 times, at org.apache.hadoop.hbase.client.AsyncProcess$BatchErrors.makeException(AsyncProcess.java:203) at org.apache.hadoop.hbase.client.AsyncProcess$BatchErrors.access$500(AsyncProcess.java:187) at org.apache.hadoop.hbase.client.AsyncProcess.getErrors(AsyncProcess.java:922) at org.apache.hadoop.hbase.client.HTable.backgroundFlushCommits(HTable.java:1017) at org.apache.hadoop.hbase.client.HTable.doPut(HTable.java:980) at org.apache.hadoop.hbase.client.HTable.put(HTable.java:941) at org.apache.hadoop.hbase.mapreduce.TableOutputFormat$TableRecordWriter.write(TableOutputFormat.java:126) at org.apache.hadoop.hbase.mapreduce.TableOutputFormat$TableRecordWriter.write(TableOutputFormat.java:87) at org.apache.hadoop.mapred.MapTask$NewDirectOutputCollector.write(MapTask.java:655) at org.apache.hadoop.mapreduce.task.TaskInputOutputContextImpl.write(TaskInputOutputContextImpl.java:89) at org.apache.hadoop.mapreduce.lib.map.WrappedMapper$Context.write(WrappedMapper.java:112) at org.apache.hadoop.hbase.mapreduce.Import$Importer.processKV(Import.java:202) at org.apache.hadoop.hbase.mapreduce.Import$Importer.writeResult(Import.java:157) at org.apache.hadoop.hbase.mapreduce.Import$Importer.map(Import.java:142) at org.apache.hadoop.hbase.mapreduce.Import$Importer.map(Import.java:125) at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:145) at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:784) at org.apache.hadoop.mapred.MapTask.run(MapTask.java:341) at 
org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:163) at java.security.AccessController.doPrivileged(Native Method) at javax.security.auth.Subject.doAs(Subject.java:422) at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1628) at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:158) 2015-06-15 10:37:06,787 INFO [main] mapreduce.Job: Task Id : attempt_1434117507786_0029_m_000010_2, Status : FAILED Error: org.apache.hadoop.hbase.client.RetriesExhaustedWithDetailsException: Failed 604 actions: item_restore: 604 times, at org.apache.hadoop.hbase.client.AsyncProcess$BatchErrors.makeException(AsyncProcess.java:203) at org.apache.hadoop.hbase.client.AsyncProcess$BatchErrors.access$500(AsyncProcess.java:187) at org.apache.hadoop.hbase.client.AsyncProcess.getErrors(AsyncProcess.java:922) at org.apache.hadoop.hbase.client.HTable.backgroundFlushCommits(HTable.java:1017) at org.apache.hadoop.hbase.client.HTable.doPut(HTable.java:980) at org.apache.hadoop.hbase.client.HTable.put(HTable.java:941) at org.apache.hadoop.hbase.mapreduce.TableOutputFormat$TableRecordWriter.write(TableOutputFormat.java:126) at org.apache.hadoop.hbase.mapreduce.TableOutputFormat$TableRecordWriter.write(TableOutputFormat.java:87) at org.apache.hadoop.mapred.MapTask$NewDirectOutputCollector.write(MapTask.java:655) at org.apache.hadoop.mapreduce.task.TaskInputOutputContextImpl.write(TaskInputOutputContextImpl.java:89) at org.apache.hadoop.mapreduce.lib.map.WrappedMapper$Context.write(WrappedMapper.java:112) at org.apache.hadoop.hbase.mapreduce.Import$Importer.processKV(Import.java:202) at org.apache.hadoop.hbase.mapreduce.Import$Importer.writeResult(Import.java:157) at org.apache.hadoop.hbase.mapreduce.Import$Importer.map(Import.java:142) at org.apache.hadoop.hbase.mapreduce.Import$Importer.map(Import.java:125) at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:145) at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:784) at org.apache.hadoop.mapred.MapTask.run(MapTask.java:341) at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:163) at java.security.AccessController.doPrivileged(Native Method) at javax.security.auth.Subject.doAs(Subject.java:422) at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1628) at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:158) 2015-06-15 10:37:06,788 INFO [main] mapreduce.Job: Task Id : attempt_1434117507786_0029_m_000062_0, Status : FAILED Error: org.apache.hadoop.hbase.client.RetriesExhaustedWithDetailsException: Failed 534 actions: item_restore: 534 times, at org.apache.hadoop.hbase.client.AsyncProcess$BatchErrors.makeException(AsyncProcess.java:203) at org.apache.hadoop.hbase.client.AsyncProcess$BatchErrors.access$500(AsyncProcess.java:187) at org.apache.hadoop.hbase.client.AsyncProcess.getErrors(AsyncProcess.java:922) at org.apache.hadoop.hbase.client.HTable.backgroundFlushCommits(HTable.java:1017) at org.apache.hadoop.hbase.client.HTable.doPut(HTable.java:980) at org.apache.hadoop.hbase.client.HTable.put(HTable.java:941) at org.apache.hadoop.hbase.mapreduce.TableOutputFormat$TableRecordWriter.write(TableOutputFormat.java:126) at org.apache.hadoop.hbase.mapreduce.TableOutputFormat$TableRecordWriter.write(TableOutputFormat.java:87) at org.apache.hadoop.mapred.MapTask$NewDirectOutputCollector.write(MapTask.java:655) at org.apache.hadoop.mapreduce.task.TaskInputOutputContextImpl.write(TaskInputOutputContextImpl.java:89) at 
org.apache.hadoop.mapreduce.lib.map.WrappedMapper$Context.write(WrappedMapper.java:112) at org.apache.hadoop.hbase.mapreduce.Import$Importer.processKV(Import.java:202) at org.apache.hadoop.hbase.mapreduce.Import$Importer.writeResult(Import.java:157) at org.apache.hadoop.hbase.mapreduce.Import$Importer.map(Import.java:142) at org.apache.hadoop.hbase.mapreduce.Import$Importer.map(Import.java:125) at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:145) at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:784) at org.apache.hadoop.mapred.MapTask.run(MapTask.java:341) at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:163) at java.security.AccessController.doPrivileged(Native Method) at javax.security.auth.Subject.doAs(Subject.java:422) at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1628) at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:158) 2015-06-15 10:37:06,789 INFO [main] mapreduce.Job: Task Id : attempt_1434117507786_0029_m_000036_1, Status : FAILED Error: org.apache.hadoop.hbase.client.RetriesExhaustedWithDetailsException: Failed 864 actions: item_restore: 864 times, at org.apache.hadoop.hbase.client.AsyncProcess$BatchErrors.makeException(AsyncProcess.java:203) at org.apache.hadoop.hbase.client.AsyncProcess$BatchErrors.access$500(AsyncProcess.java:187) at org.apache.hadoop.hbase.client.AsyncProcess.getErrors(AsyncProcess.java:922) at org.apache.hadoop.hbase.client.HTable.backgroundFlushCommits(HTable.java:1017) at org.apache.hadoop.hbase.client.HTable.doPut(HTable.java:980) at org.apache.hadoop.hbase.client.HTable.put(HTable.java:941) at org.apache.hadoop.hbase.mapreduce.TableOutputFormat$TableRecordWriter.write(TableOutputFormat.java:126) at org.apache.hadoop.hbase.mapreduce.TableOutputFormat$TableRecordWriter.write(TableOutputFormat.java:87) at org.apache.hadoop.mapred.MapTask$NewDirectOutputCollector.write(MapTask.java:655) at org.apache.hadoop.mapreduce.task.TaskInputOutputContextImpl.write(TaskInputOutputContextImpl.java:89) at org.apache.hadoop.mapreduce.lib.map.WrappedMapper$Context.write(WrappedMapper.java:112) at org.apache.hadoop.hbase.mapreduce.Import$Importer.processKV(Import.java:202) at org.apache.hadoop.hbase.mapreduce.Import$Importer.writeResult(Import.java:157) at org.apache.hadoop.hbase.mapreduce.Import$Importer.map(Import.java:142) at org.apache.hadoop.hbase.mapreduce.Import$Importer.map(Import.java:125) at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:145) at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:784) at org.apache.hadoop.mapred.MapTask.run(MapTask.java:341) at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:163) at java.security.AccessController.doPrivileged(Native Method) at javax.security.auth.Subject.doAs(Subject.java:422) at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1628) at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:158) 2015-06-15 10:37:06,791 INFO [main] mapreduce.Job: Task Id : attempt_1434117507786_0029_m_000067_0, Status : FAILED Error: org.apache.hadoop.hbase.client.RetriesExhaustedWithDetailsException: Failed 702 actions: item_restore: 702 times, at org.apache.hadoop.hbase.client.AsyncProcess$BatchErrors.makeException(AsyncProcess.java:203) at org.apache.hadoop.hbase.client.AsyncProcess$BatchErrors.access$500(AsyncProcess.java:187) at org.apache.hadoop.hbase.client.AsyncProcess.getErrors(AsyncProcess.java:922) at org.apache.hadoop.hbase.client.HTable.backgroundFlushCommits(HTable.java:1017) at 
org.apache.hadoop.hbase.client.HTable.doPut(HTable.java:980) at org.apache.hadoop.hbase.client.HTable.put(HTable.java:941) at org.apache.hadoop.hbase.mapreduce.TableOutputFormat$TableRecordWriter.write(TableOutputFormat.java:126) at org.apache.hadoop.hbase.mapreduce.TableOutputFormat$TableRecordWriter.write(TableOutputFormat.java:87) at org.apache.hadoop.mapred.MapTask$NewDirectOutputCollector.write(MapTask.java:655) at org.apache.hadoop.mapreduce.task.TaskInputOutputContextImpl.write(TaskInputOutputContextImpl.java:89) at org.apache.hadoop.mapreduce.lib.map.WrappedMapper$Context.write(WrappedMapper.java:112) at org.apache.hadoop.hbase.mapreduce.Import$Importer.processKV(Import.java:202) at org.apache.hadoop.hbase.mapreduce.Import$Importer.writeResult(Import.java:157) at org.apache.hadoop.hbase.mapreduce.Import$Importer.map(Import.java:142) at org.apache.hadoop.hbase.mapreduce.Import$Importer.map(Import.java:125) at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:145) at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:784) at org.apache.hadoop.mapred.MapTask.run(MapTask.java:341) at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:163) at java.security.AccessController.doPrivileged(Native Method) at javax.security.auth.Subject.doAs(Subject.java:422) at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1628) at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:158) 2015-06-15 10:37:06,792 INFO [main] mapreduce.Job: Task Id : attempt_1434117507786_0029_m_000025_1, Status : FAILED Error: org.apache.hadoop.hbase.client.RetriesExhaustedWithDetailsException: Failed 809 actions: item_restore: 809 times, at org.apache.hadoop.hbase.client.AsyncProcess$BatchErrors.makeException(AsyncProcess.java:203) at org.apache.hadoop.hbase.client.AsyncProcess$BatchErrors.access$500(AsyncProcess.java:187) at org.apache.hadoop.hbase.client.AsyncProcess.getErrors(AsyncProcess.java:922) at org.apache.hadoop.hbase.client.HTable.backgroundFlushCommits(HTable.java:1017) at org.apache.hadoop.hbase.client.HTable.doPut(HTable.java:980) at org.apache.hadoop.hbase.client.HTable.put(HTable.java:941) at org.apache.hadoop.hbase.mapreduce.TableOutputFormat$TableRecordWriter.write(TableOutputFormat.java:126) at org.apache.hadoop.hbase.mapreduce.TableOutputFormat$TableRecordWriter.write(TableOutputFormat.java:87) at org.apache.hadoop.mapred.MapTask$NewDirectOutputCollector.write(MapTask.java:655) at org.apache.hadoop.mapreduce.task.TaskInputOutputContextImpl.write(TaskInputOutputContextImpl.java:89) at org.apache.hadoop.mapreduce.lib.map.WrappedMapper$Context.write(WrappedMapper.java:112) at org.apache.hadoop.hbase.mapreduce.Import$Importer.processKV(Import.java:202) at org.apache.hadoop.hbase.mapreduce.Import$Importer.writeResult(Import.java:157) at org.apache.hadoop.hbase.mapreduce.Import$Importer.map(Import.java:142) at org.apache.hadoop.hbase.mapreduce.Import$Importer.map(Import.java:125) at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:145) at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:784) at org.apache.hadoop.mapred.MapTask.run(MapTask.java:341) at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:163) at java.security.AccessController.doPrivileged(Native Method) at javax.security.auth.Subject.doAs(Subject.java:422) at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1628) at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:158) 2015-06-15 10:37:06,794 INFO [main] mapreduce.Job: Task Id : 
attempt_1434117507786_0029_m_000041_1, Status : FAILED Error: org.apache.hadoop.hbase.client.RetriesExhaustedWithDetailsException: Failed 585 actions: item_restore: 585 times, at org.apache.hadoop.hbase.client.AsyncProcess$BatchErrors.makeException(AsyncProcess.java:203) at org.apache.hadoop.hbase.client.AsyncProcess$BatchErrors.access$500(AsyncProcess.java:187) at org.apache.hadoop.hbase.client.AsyncProcess.getErrors(AsyncProcess.java:922) at org.apache.hadoop.hbase.client.HTable.backgroundFlushCommits(HTable.java:1017) at org.apache.hadoop.hbase.client.HTable.doPut(HTable.java:980) at org.apache.hadoop.hbase.client.HTable.put(HTable.java:941) at org.apache.hadoop.hbase.mapreduce.TableOutputFormat$TableRecordWriter.write(TableOutputFormat.java:126) at org.apache.hadoop.hbase.mapreduce.TableOutputFormat$TableRecordWriter.write(TableOutputFormat.java:87) at org.apache.hadoop.mapred.MapTask$NewDirectOutputCollector.write(MapTask.java:655) at org.apache.hadoop.mapreduce.task.TaskInputOutputContextImpl.write(TaskInputOutputContextImpl.java:89) at org.apache.hadoop.mapreduce.lib.map.WrappedMapper$Context.write(WrappedMapper.java:112) at org.apache.hadoop.hbase.mapreduce.Import$Importer.processKV(Import.java:202) at org.apache.hadoop.hbase.mapreduce.Import$Importer.writeResult(Import.java:157) at org.apache.hadoop.hbase.mapreduce.Import$Importer.map(Import.java:142) at org.apache.hadoop.hbase.mapreduce.Import$Importer.map(Import.java:125) at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:145) at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:784) at org.apache.hadoop.mapred.MapTask.run(MapTask.java:341) at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:163) at java.security.AccessController.doPrivileged(Native Method) at javax.security.auth.Subject.doAs(Subject.java:422) at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1628) at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:158) 2015-06-15 10:37:06,797 INFO [main] mapreduce.Job: Task Id : attempt_1434117507786_0029_m_000061_0, Status : FAILED Error: org.apache.hadoop.hbase.client.RetriesExhaustedWithDetailsException: Failed 800 actions: item_restore: 800 times, at org.apache.hadoop.hbase.client.AsyncProcess$BatchErrors.makeException(AsyncProcess.java:203) at org.apache.hadoop.hbase.client.AsyncProcess$BatchErrors.access$500(AsyncProcess.java:187) at org.apache.hadoop.hbase.client.AsyncProcess.getErrors(AsyncProcess.java:922) at org.apache.hadoop.hbase.client.HTable.backgroundFlushCommits(HTable.java:1017) at org.apache.hadoop.hbase.client.HTable.doPut(HTable.java:980) at org.apache.hadoop.hbase.client.HTable.put(HTable.java:941) at org.apache.hadoop.hbase.mapreduce.TableOutputFormat$TableRecordWriter.write(TableOutputFormat.java:126) at org.apache.hadoop.hbase.mapreduce.TableOutputFormat$TableRecordWriter.write(TableOutputFormat.java:87) at org.apache.hadoop.mapred.MapTask$NewDirectOutputCollector.write(MapTask.java:655) at org.apache.hadoop.mapreduce.task.TaskInputOutputContextImpl.write(TaskInputOutputContextImpl.java:89) at org.apache.hadoop.mapreduce.lib.map.WrappedMapper$Context.write(WrappedMapper.java:112) at org.apache.hadoop.hbase.mapreduce.Import$Importer.processKV(Import.java:202) at org.apache.hadoop.hbase.mapreduce.Import$Importer.writeResult(Import.java:157) at org.apache.hadoop.hbase.mapreduce.Import$Importer.map(Import.java:142) at org.apache.hadoop.hbase.mapreduce.Import$Importer.map(Import.java:125) at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:145) at 
org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:784) at org.apache.hadoop.mapred.MapTask.run(MapTask.java:341) at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:163) at java.security.AccessController.doPrivileged(Native Method) at javax.security.auth.Subject.doAs(Subject.java:422) at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1628) at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:158) 2015-06-15 10:37:06,799 INFO [main] mapreduce.Job: Task Id : attempt_1434117507786_0029_m_000079_1, Status : FAILED Error: org.apache.hadoop.hbase.client.RetriesExhaustedWithDetailsException: Failed 890 actions: item_restore: 890 times, at org.apache.hadoop.hbase.client.AsyncProcess$BatchErrors.makeException(AsyncProcess.java:203) at org.apache.hadoop.hbase.client.AsyncProcess$BatchErrors.access$500(AsyncProcess.java:187) at org.apache.hadoop.hbase.client.AsyncProcess.getErrors(AsyncProcess.java:922) at org.apache.hadoop.hbase.client.HTable.backgroundFlushCommits(HTable.java:1017) at org.apache.hadoop.hbase.client.HTable.doPut(HTable.java:980) at org.apache.hadoop.hbase.client.HTable.put(HTable.java:941) at org.apache.hadoop.hbase.mapreduce.TableOutputFormat$TableRecordWriter.write(TableOutputFormat.java:126) at org.apache.hadoop.hbase.mapreduce.TableOutputFormat$TableRecordWriter.write(TableOutputFormat.java:87) at org.apache.hadoop.mapred.MapTask$NewDirectOutputCollector.write(MapTask.java:655) at org.apache.hadoop.mapreduce.task.TaskInputOutputContextImpl.write(TaskInputOutputContextImpl.java:89) at org.apache.hadoop.mapreduce.lib.map.WrappedMapper$Context.write(WrappedMapper.java:112) at org.apache.hadoop.hbase.mapreduce.Import$Importer.processKV(Import.java:202) at org.apache.hadoop.hbase.mapreduce.Import$Importer.writeResult(Import.java:157) at org.apache.hadoop.hbase.mapreduce.Import$Importer.map(Import.java:142) at org.apache.hadoop.hbase.mapreduce.Import$Importer.map(Import.java:125) at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:145) at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:784) at org.apache.hadoop.mapred.MapTask.run(MapTask.java:341) at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:163) at java.security.AccessController.doPrivileged(Native Method) at javax.security.auth.Subject.doAs(Subject.java:422) at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1628) at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:158) 2015-06-15 10:37:07,815 INFO [main] mapreduce.Job: Task Id : attempt_1434117507786_0029_m_000032_2, Status : FAILED Error: org.apache.hadoop.hbase.client.RetriesExhaustedWithDetailsException: Failed 607 actions: item_restore: 607 times, at org.apache.hadoop.hbase.client.AsyncProcess$BatchErrors.makeException(AsyncProcess.java:203) at org.apache.hadoop.hbase.client.AsyncProcess$BatchErrors.access$500(AsyncProcess.java:187) at org.apache.hadoop.hbase.client.AsyncProcess.getErrors(AsyncProcess.java:922) at org.apache.hadoop.hbase.client.HTable.backgroundFlushCommits(HTable.java:1017) at org.apache.hadoop.hbase.client.HTable.doPut(HTable.java:980) at org.apache.hadoop.hbase.client.HTable.put(HTable.java:941) at org.apache.hadoop.hbase.mapreduce.TableOutputFormat$TableRecordWriter.write(TableOutputFormat.java:126) at org.apache.hadoop.hbase.mapreduce.TableOutputFormat$TableRecordWriter.write(TableOutputFormat.java:87) at org.apache.hadoop.mapred.MapTask$NewDirectOutputCollector.write(MapTask.java:655) at 
org.apache.hadoop.mapreduce.task.TaskInputOutputContextImpl.write(TaskInputOutputContextImpl.java:89) at org.apache.hadoop.mapreduce.lib.map.WrappedMapper$Context.write(WrappedMapper.java:112) at org.apache.hadoop.hbase.mapreduce.Import$Importer.processKV(Import.java:202) at org.apache.hadoop.hbase.mapreduce.Import$Importer.writeResult(Import.java:157) at org.apache.hadoop.hbase.mapreduce.Import$Importer.map(Import.java:142) at org.apache.hadoop.hbase.mapreduce.Import$Importer.map(Import.java:125) at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:145) at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:784) at org.apache.hadoop.mapred.MapTask.run(MapTask.java:341) at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:163) at java.security.AccessController.doPrivileged(Native Method) at javax.security.auth.Subject.doAs(Subject.java:422) at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1628) at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:158) 2015-06-15 10:37:07,817 INFO [main] mapreduce.Job: Task Id : attempt_1434117507786_0029_m_000011_1, Status : FAILED Error: org.apache.hadoop.hbase.client.RetriesExhaustedWithDetailsException: Failed 681 actions: item_restore: 681 times, at org.apache.hadoop.hbase.client.AsyncProcess$BatchErrors.makeException(AsyncProcess.java:203) at org.apache.hadoop.hbase.client.AsyncProcess$BatchErrors.access$500(AsyncProcess.java:187) at org.apache.hadoop.hbase.client.AsyncProcess.getErrors(AsyncProcess.java:922) at org.apache.hadoop.hbase.client.HTable.backgroundFlushCommits(HTable.java:1017) at org.apache.hadoop.hbase.client.HTable.doPut(HTable.java:980) at org.apache.hadoop.hbase.client.HTable.put(HTable.java:941) at org.apache.hadoop.hbase.mapreduce.TableOutputFormat$TableRecordWriter.write(TableOutputFormat.java:126) at org.apache.hadoop.hbase.mapreduce.TableOutputFormat$TableRecordWriter.write(TableOutputFormat.java:87) at org.apache.hadoop.mapred.MapTask$NewDirectOutputCollector.write(MapTask.java:655) at org.apache.hadoop.mapreduce.task.TaskInputOutputContextImpl.write(TaskInputOutputContextImpl.java:89) at org.apache.hadoop.mapreduce.lib.map.WrappedMapper$Context.write(WrappedMapper.java:112) at org.apache.hadoop.hbase.mapreduce.Import$Importer.processKV(Import.java:202) at org.apache.hadoop.hbase.mapreduce.Import$Importer.writeResult(Import.java:157) at org.apache.hadoop.hbase.mapreduce.Import$Importer.map(Import.java:142) at org.apache.hadoop.hbase.mapreduce.Import$Importer.map(Import.java:125) at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:145) at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:784) at org.apache.hadoop.mapred.MapTask.run(MapTask.java:341) at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:163) at java.security.AccessController.doPrivileged(Native Method) at javax.security.auth.Subject.doAs(Subject.java:422) at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1628) at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:158) 2015-06-15 10:37:07,820 INFO [main] mapreduce.Job: Task Id : attempt_1434117507786_0029_m_000044_1, Status : FAILED Error: org.apache.hadoop.hbase.client.RetriesExhaustedWithDetailsException: Failed 686 actions: item_restore: 686 times, at org.apache.hadoop.hbase.client.AsyncProcess$BatchErrors.makeException(AsyncProcess.java:203) at org.apache.hadoop.hbase.client.AsyncProcess$BatchErrors.access$500(AsyncProcess.java:187) at 
org.apache.hadoop.hbase.client.AsyncProcess.getErrors(AsyncProcess.java:922) at org.apache.hadoop.hbase.client.HTable.backgroundFlushCommits(HTable.java:1017) at org.apache.hadoop.hbase.client.HTable.doPut(HTable.java:980) at org.apache.hadoop.hbase.client.HTable.put(HTable.java:941) at org.apache.hadoop.hbase.mapreduce.TableOutputFormat$TableRecordWriter.write(TableOutputFormat.java:126) at org.apache.hadoop.hbase.mapreduce.TableOutputFormat$TableRecordWriter.write(TableOutputFormat.java:87) at org.apache.hadoop.mapred.MapTask$NewDirectOutputCollector.write(MapTask.java:655) at org.apache.hadoop.mapreduce.task.TaskInputOutputContextImpl.write(TaskInputOutputContextImpl.java:89) at org.apache.hadoop.mapreduce.lib.map.WrappedMapper$Context.write(WrappedMapper.java:112) at org.apache.hadoop.hbase.mapreduce.Import$Importer.processKV(Import.java:202) at org.apache.hadoop.hbase.mapreduce.Import$Importer.writeResult(Import.java:157) at org.apache.hadoop.hbase.mapreduce.Import$Importer.map(Import.java:142) at org.apache.hadoop.hbase.mapreduce.Import$Importer.map(Import.java:125) at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:145) at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:784) at org.apache.hadoop.mapred.MapTask.run(MapTask.java:341) at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:163) at java.security.AccessController.doPrivileged(Native Method) at javax.security.auth.Subject.doAs(Subject.java:422) at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1628) at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:158) 2015-06-15 10:37:07,829 INFO [main] mapreduce.Job: Task Id : attempt_1434117507786_0029_m_000117_0, Status : FAILED Error: org.apache.hadoop.hbase.client.RetriesExhaustedWithDetailsException: Failed 610 actions: item_restore: 610 times, at org.apache.hadoop.hbase.client.AsyncProcess$BatchErrors.makeException(AsyncProcess.java:203) at org.apache.hadoop.hbase.client.AsyncProcess$BatchErrors.access$500(AsyncProcess.java:187) at org.apache.hadoop.hbase.client.AsyncProcess.getErrors(AsyncProcess.java:922) at org.apache.hadoop.hbase.client.HTable.backgroundFlushCommits(HTable.java:1017) at org.apache.hadoop.hbase.client.HTable.doPut(HTable.java:980) at org.apache.hadoop.hbase.client.HTable.put(HTable.java:941) at org.apache.hadoop.hbase.mapreduce.TableOutputFormat$TableRecordWriter.write(TableOutputFormat.java:126) at org.apache.hadoop.hbase.mapreduce.TableOutputFormat$TableRecordWriter.write(TableOutputFormat.java:87) at org.apache.hadoop.mapred.MapTask$NewDirectOutputCollector.write(MapTask.java:655) at org.apache.hadoop.mapreduce.task.TaskInputOutputContextImpl.write(TaskInputOutputContextImpl.java:89) at org.apache.hadoop.mapreduce.lib.map.WrappedMapper$Context.write(WrappedMapper.java:112) at org.apache.hadoop.hbase.mapreduce.Import$Importer.processKV(Import.java:202) at org.apache.hadoop.hbase.mapreduce.Import$Importer.writeResult(Import.java:157) at org.apache.hadoop.hbase.mapreduce.Import$Importer.map(Import.java:142) at org.apache.hadoop.hbase.mapreduce.Import$Importer.map(Import.java:125) at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:145) at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:784) at org.apache.hadoop.mapred.MapTask.run(MapTask.java:341) at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:163) at java.security.AccessController.doPrivileged(Native Method) at javax.security.auth.Subject.doAs(Subject.java:422) at 
org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1628)
	at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:158)
2015-06-15 10:37:08,854 INFO [main] mapreduce.Job: Task Id : attempt_1434117507786_0029_m_000054_2, Status : FAILED
Error: org.apache.hadoop.hbase.client.RetriesExhaustedWithDetailsException: Failed 594 actions: item_restore: 594 times,
	at org.apache.hadoop.hbase.client.AsyncProcess$BatchErrors.makeException(AsyncProcess.java:203)
	at org.apache.hadoop.hbase.client.AsyncProcess$BatchErrors.access$500(AsyncProcess.java:187)
	at org.apache.hadoop.hbase.client.AsyncProcess.getErrors(AsyncProcess.java:922)
	at org.apache.hadoop.hbase.client.HTable.backgroundFlushCommits(HTable.java:1017)
	at org.apache.hadoop.hbase.client.HTable.doPut(HTable.java:980)
	at org.apache.hadoop.hbase.client.HTable.put(HTable.java:941)
	at org.apache.hadoop.hbase.mapreduce.TableOutputFormat$TableRecordWriter.write(TableOutputFormat.java:126)
	at org.apache.hadoop.hbase.mapreduce.TableOutputFormat$TableRecordWriter.write(TableOutputFormat.java:87)
	at org.apache.hadoop.mapred.MapTask$NewDirectOutputCollector.write(MapTask.java:655)
	at org.apache.hadoop.mapreduce.task.TaskInputOutputContextImpl.write(TaskInputOutputContextImpl.java:89)
	at org.apache.hadoop.mapreduce.lib.map.WrappedMapper$Context.write(WrappedMapper.java:112)
	at org.apache.hadoop.hbase.mapreduce.Import$Importer.processKV(Import.java:202)
	at org.apache.hadoop.hbase.mapreduce.Import$Importer.writeResult(Import.java:157)
	at org.apache.hadoop.hbase.mapreduce.Import$Importer.map(Import.java:142)
	at org.apache.hadoop.hbase.mapreduce.Import$Importer.map(Import.java:125)
	at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:145)
	at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:784)
	at org.apache.hadoop.mapred.MapTask.run(MapTask.java:341)
	at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:163)
	at java.security.AccessController.doPrivileged(Native Method)
	at javax.security.auth.Subject.doAs(Subject.java:422)
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1628)
	at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:158)

The same RetriesExhaustedWithDetailsException, with an identical stack trace, is then logged for each of the following failed map attempts:

2015-06-15 10:37:08,856  attempt_1434117507786_0029_m_000138_0  Failed 635 actions
2015-06-15 10:37:08,870  attempt_1434117507786_0029_m_000129_0  Failed 723 actions
2015-06-15 10:37:08,872  attempt_1434117507786_0029_m_000084_1  Failed 653 actions
2015-06-15 10:37:08,874  attempt_1434117507786_0029_m_000064_0  Failed 556 actions
2015-06-15 10:37:08,876  attempt_1434117507786_0029_m_000082_1  Failed 651 actions
2015-06-15 10:37:08,879  attempt_1434117507786_0029_m_000073_0  Failed 709 actions
2015-06-15 10:37:08,882  attempt_1434117507786_0029_m_000049_2  Failed 635 actions
2015-06-15 10:37:08,884  attempt_1434117507786_0029_m_000065_0  Failed 576 actions
2015-06-15 10:37:09,900  attempt_1434117507786_0029_m_000140_0  Failed 529 actions
2015-06-15 10:37:09,901  attempt_1434117507786_0029_m_000134_0  Failed 513 actions
2015-06-15 10:37:09,903  attempt_1434117507786_0029_m_000080_1  Failed 699 actions
2015-06-15 10:37:09,904  attempt_1434117507786_0029_m_000087_1  Failed 655 actions
2015-06-15 10:37:09,906  attempt_1434117507786_0029_m_000055_0  Failed 678 actions
2015-06-15 10:37:09,908  attempt_1434117507786_0029_m_000033_1  Failed 652 actions
2015-06-15 10:37:09,909  attempt_1434117507786_0029_m_000139_0  Failed 645 actions
2015-06-15 10:37:10,919  attempt_1434117507786_0029_m_000072_2  Failed 660 actions
2015-06-15 10:37:10,921  attempt_1434117507786_0029_m_000116_0  Failed 774 actions
2015-06-15 10:37:10,923  attempt_1434117507786_0029_m_000030_1  Failed 628 actions
2015-06-15 10:37:10,925  attempt_1434117507786_0029_m_000095_0  Failed 729 actions
2015-06-15 10:37:10,927  attempt_1434117507786_0029_m_000008_2  Failed 561 actions
2015-06-15 10:37:10,929  attempt_1434117507786_0029_m_000149_0  Failed 591 actions
2015-06-15 10:37:10,933  attempt_1434117507786_0029_m_000020_1  Failed 666 actions
2015-06-15 10:37:11,950  attempt_1434117507786_0029_m_000009_2  Failed 556 actions
2015-06-15 10:37:11,952  attempt_1434117507786_0029_m_000075_1  Failed 665 actions
2015-06-15 10:37:11,954  attempt_1434117507786_0029_m_000021_1  Failed 609 actions
2015-06-15 10:37:11,958  attempt_1434117507786_0029_m_000066_1  Failed 692 actions
2015-06-15 10:37:11,964  attempt_1434117507786_0029_m_000035_2  Failed 602 actions
2015-06-15 10:37:11,967  attempt_1434117507786_0029_m_000007_2  Failed 450 actions
2015-06-15 10:37:11,968  attempt_1434117507786_0029_m_000050_1  Failed 748 actions
2015-06-15 10:37:11,970  attempt_1434117507786_0029_m_000069_2  Failed 640 actions
2015-06-15 10:37:12,980  attempt_1434117507786_0029_m_000063_0  Failed 592 actions
2015-06-15 10:37:12,982  attempt_1434117507786_0029_m_000019_2  Failed 615 actions
2015-06-15 10:37:12,983  attempt_1434117507786_0029_m_000088_0  Failed 641 actions
2015-06-15 10:37:12,985  attempt_1434117507786_0029_m_000029_2  Failed 509 actions
org.apache.hadoop.mapreduce.task.TaskInputOutputContextImpl.write(TaskInputOutputContextImpl.java:89) at org.apache.hadoop.mapreduce.lib.map.WrappedMapper$Context.write(WrappedMapper.java:112) at org.apache.hadoop.hbase.mapreduce.Import$Importer.processKV(Import.java:202) at org.apache.hadoop.hbase.mapreduce.Import$Importer.writeResult(Import.java:157) at org.apache.hadoop.hbase.mapreduce.Import$Importer.map(Import.java:142) at org.apache.hadoop.hbase.mapreduce.Import$Importer.map(Import.java:125) at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:145) at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:784) at org.apache.hadoop.mapred.MapTask.run(MapTask.java:341) at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:163) at java.security.AccessController.doPrivileged(Native Method) at javax.security.auth.Subject.doAs(Subject.java:422) at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1628) at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:158) 2015-06-15 10:37:13,990 INFO [main] mapreduce.Job: map 100% reduce 0% 2015-06-15 10:37:16,013 INFO [main] mapreduce.Job: Job job_1434117507786_0029 failed with state FAILED due to: Task failed task_1434117507786_0029_m_000026 Job failed as tasks failed. failedMaps:1 failedReduces:0 Exception in thread "main" java.lang.IllegalArgumentException: No enum constant org.apache.hadoop.mapreduce.JobCounter.MB_MILLIS_MAPS at java.lang.Enum.valueOf(Enum.java:238) at org.apache.hadoop.mapreduce.counters.FrameworkCounterGroup.valueOf(FrameworkCounterGroup.java:148) at org.apache.hadoop.mapreduce.counters.FrameworkCounterGroup.findCounter(FrameworkCounterGroup.java:182) at org.apache.hadoop.mapreduce.counters.AbstractCounters.findCounter(AbstractCounters.java:154) at org.apache.hadoop.mapreduce.TypeConverter.fromYarn(TypeConverter.java:240) at org.apache.hadoop.mapred.ClientServiceDelegate.getJobCounters(ClientServiceDelegate.java:370) at org.apache.hadoop.mapred.YARNRunner.getJobCounters(YARNRunner.java:511) at org.apache.hadoop.mapreduce.Job$7.run(Job.java:756) at org.apache.hadoop.mapreduce.Job$7.run(Job.java:753) at java.security.AccessController.doPrivileged(Native Method) at javax.security.auth.Subject.doAs(Subject.java:422) at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1491) at org.apache.hadoop.mapreduce.Job.getCounters(Job.java:753) at org.apache.hadoop.mapreduce.Job.monitorAndPrintJob(Job.java:1361) at org.apache.hadoop.mapreduce.Job.waitForCompletion(Job.java:1289) at org.apache.hadoop.hbase.mapreduce.Import.main(Import.java:535) visualmeta@test-232:~/hbase_export$ s
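
Two separate problems are visible in this output. The first, and the one that actually failed the job, is the RetriesExhaustedWithDetailsException raised by every map attempt: each mapper's batched puts against item_restore were retried until the HBase client gave up. This most often indicates that the destination table does not exist on the cluster being written to, or that its regions are not online, since the Import tool writes into an existing table rather than creating one. Below is a minimal check-and-fix sketch from the HBase shell; it assumes the table is simply missing, and 'f1' is a placeholder column family that must be replaced with the families of the table that was originally exported.

    # Hypothetical pre-check, assuming the puts fail because item_restore is missing.
    # 'f1' is an assumed column family name; match it to the exported table's schema.
    /var/vm/apps/hbase-0.98.9-hadoop2/bin/hbase shell <<'EOF'
    exists 'item_restore'
    create 'item_restore', 'f1'
    EOF

If the table does exist, the next step would be to check region assignment (for example with status 'detailed' in the shell) and the region server logs, because the client-side messages above only report retry counts, not the underlying server-side errors.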
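
The second problem is the IllegalArgumentException thrown at the very end: No enum constant org.apache.hadoop.mapreduce.JobCounter.MB_MILLIS_MAPS. It occurs client-side while the tool fetches the final job counters, after the job has already failed, so it does not explain the put failures; it only hides the final job report. This error commonly points to a mismatch between the MapReduce client jars used by the submitting JVM (for example, older jars shipped under the HBase lib directory) and the Hadoop 2.6.0 cluster, which reports counters that the older client-side enum does not contain. One possible workaround, sketched below under several assumptions (the install paths seen in this transcript, a standard hbase-server jar name, and the jar's driver accepting "import" as a program name, as it does in the 0.98 releases as far as I know), is to submit the job through the cluster's own hadoop launcher and pull the HBase dependencies in via hbase classpath.

    # Hypothetical sketch: run the Import job with the cluster's Hadoop 2.6.0 client so
    # the MapReduce client libraries (including the JobCounter enum) match the cluster,
    # adding HBase dependencies through `hbase classpath`.
    # Paths, the jar name, and the input directory are assumptions, not verified values.
    export HADOOP_CLASSPATH=$(/var/vm/apps/hbase-0.98.9-hadoop2/bin/hbase classpath)
    /var/vm/apps/hadoop-2.6.0/bin/hadoop jar \
        /var/vm/apps/hbase-0.98.9-hadoop2/lib/hbase-server-0.98.9-hadoop2.jar \
        import item_restore /path/to/the/exported/backup

Clearing up the counter mismatch only restores the final job report; the put failures against item_restore still have to be resolved before the restore can succeed.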