Testsuite: org.apache.hcatalog.hbase.TestHBaseBulkOutputStorageDriver
Tests run: 5, Failures: 1, Errors: 0, Time elapsed: 111.583 sec
------------- Standard Output ---------------
Cluster work directory: /homes/hortonas/hcat/hcat-trunk/storage-drivers/hbase/test_default_3510113083437113091
Generating rack names for tasktrackers
Generating host names for tasktrackers
Trying to cleanup: /homes/hortonas/hcat/hcat-trunk/storage-drivers/hbase/test_default_3510113083437113091
------------- ---------------- ---------------
------------- Standard Error -----------------
11/11/22 21:37:33 INFO delegation.AbstractDelegationTokenSecretManager: Updating the current master key for generating delegation tokens
11/11/22 21:37:33 INFO mapred.JobTracker: Scheduler configured with (memSizeForMapSlotOnJT, memSizeForReduceSlotOnJT, limitMaxMemForMapTasks, limitMaxMemForReduceTasks) (-1, -1, -1, -1)
11/11/22 21:37:33 INFO util.HostsFileReader: Refreshing hosts (include/exclude) list
11/11/22 21:37:33 INFO delegation.AbstractDelegationTokenSecretManager: Starting expired delegation token remover thread, tokenRemoverScanInterval=60 min(s)
11/11/22 21:37:33 INFO delegation.AbstractDelegationTokenSecretManager: Updating the current master key for generating delegation tokens
11/11/22 21:37:33 INFO mapred.JobTracker: Starting jobtracker with owner as hortonas
11/11/22 21:37:33 INFO metrics.RpcMetrics: Initializing RPC Metrics with hostName=JobTracker, port=43643
11/11/22 21:37:33 INFO metrics.RpcDetailedMetrics: Initializing RPC Metrics with hostName=JobTracker, port=43643
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/homes/hortonas/hcat/hcat-trunk/hive/external/build/ivy/lib/default/slf4j-log4j12-1.6.1.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/homes/hortonas/hcat/hcat-trunk/hive/external/build/hadoopcore/hadoop-0.20.3-CDH3-SNAPSHOT/lib/slf4j-log4j12-1.4.3.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
11/11/22 21:37:33 INFO mortbay.log: Logging to org.slf4j.impl.Log4jLoggerAdapter(org.mortbay.log) via org.mortbay.log.Slf4jLog
11/11/22 21:37:33 INFO http.HttpServer: Added global filtersafety (class=org.apache.hadoop.http.HttpServer$QuotingInputFilter)
11/11/22 21:37:33 INFO http.HttpServer: Port returned by webServer.getConnectors()[0].getLocalPort() before open() is -1. Opening the listener on 0
11/11/22 21:37:33 INFO http.HttpServer: listener.getLocalPort() returned 37481 webServer.getConnectors()[0].getLocalPort() returned 37481
11/11/22 21:37:33 INFO http.HttpServer: Jetty bound to port 37481
11/11/22 21:37:33 INFO mortbay.log: jetty-6.1.14
11/11/22 21:37:33 INFO mortbay.log: Extract jar:file:/homes/hortonas/hcat/hcat-trunk/hive/external/build/hadoopcore/hadoop-0.20.3-CDH3-SNAPSHOT/hadoop-core-0.20.3-CDH3-SNAPSHOT.jar!/webapps/job to /tmp/Jetty_localhost_37481_job____.9bmokb/webapp
11/11/22 21:37:34 INFO mortbay.log: Started SelectChannelConnector@localhost:37481
11/11/22 21:37:34 INFO jvm.JvmMetrics: Initializing JVM Metrics with processName=JobTracker, sessionId=
11/11/22 21:37:34 INFO mapred.JobTracker: JobTracker up at: 43643
11/11/22 21:37:34 INFO mapred.JobTracker: JobTracker webserver: 37481
11/11/22 21:37:34 INFO mapred.JobTracker: Cleaning up the system directory
11/11/22 21:37:34 INFO mapred.JobHistory: Creating DONE folder at file:/homes/hortonas/hcat/hcat-trunk/storage-drivers/hbase/test_default_3510113083437113091/logs/history/done
11/11/22 21:37:34 INFO mapred.CompletedJobStatusStore: Completed job store is inactive
11/11/22 21:37:34 INFO mapred.JobTracker: Refreshing hosts information
11/11/22 21:37:34 INFO util.HostsFileReader: Setting the includes file to
11/11/22 21:37:34 INFO util.HostsFileReader: Setting the excludes file to
11/11/22 21:37:34 INFO util.HostsFileReader: Refreshing hosts (include/exclude) list
11/11/22 21:37:34 INFO mapred.JobTracker: Decommissioning 0 nodes
11/11/22 21:37:34 INFO ipc.Server: IPC Server Responder: starting
11/11/22 21:37:34 INFO ipc.Server: IPC Server listener on 43643: starting
11/11/22 21:37:34 INFO ipc.Server: IPC Server handler 0 on 43643: starting
11/11/22 21:37:34 INFO ipc.Server: IPC Server handler 1 on 43643: starting
11/11/22 21:37:34 INFO ipc.Server: IPC Server handler 2 on 43643: starting
11/11/22 21:37:34 INFO ipc.Server: IPC Server handler 3 on 43643: starting
11/11/22 21:37:34 INFO ipc.Server: IPC Server handler 4 on 43643: starting
11/11/22 21:37:34 INFO ipc.Server: IPC Server handler 5 on 43643: starting
11/11/22 21:37:34 INFO ipc.Server: IPC Server handler 6 on 43643: starting
11/11/22 21:37:34 INFO ipc.Server: IPC Server handler 7 on 43643: starting
11/11/22 21:37:34 INFO ipc.Server: IPC Server handler 8 on 43643: starting
11/11/22 21:37:34 INFO mapred.JobTracker: Starting RUNNING
11/11/22 21:37:34 INFO ipc.Server: IPC Server handler 9 on 43643: starting
11/11/22 21:37:35 INFO mapred.MiniMRCluster: mapred.local.dir is /tmp/hadoop-hortonas/mapred/local/0_0
11/11/22 21:37:35 INFO http.HttpServer: Added global filtersafety (class=org.apache.hadoop.http.HttpServer$QuotingInputFilter)
11/11/22 21:37:35 INFO http.HttpServer: Port returned by webServer.getConnectors()[0].getLocalPort() before open() is -1. Opening the listener on 0
11/11/22 21:37:35 INFO http.HttpServer: listener.getLocalPort() returned 48812 webServer.getConnectors()[0].getLocalPort() returned 48812
11/11/22 21:37:35 INFO http.HttpServer: Jetty bound to port 48812
11/11/22 21:37:35 INFO mortbay.log: jetty-6.1.14
11/11/22 21:37:35 INFO mortbay.log: Extract jar:file:/homes/hortonas/hcat/hcat-trunk/hive/external/build/hadoopcore/hadoop-0.20.3-CDH3-SNAPSHOT/hadoop-core-0.20.3-CDH3-SNAPSHOT.jar!/webapps/task to /tmp/Jetty_0_0_0_0_48812_task____lwic8j/webapp
11/11/22 21:37:35 INFO mortbay.log: Started SelectChannelConnector@0.0.0.0:48812
11/11/22 21:37:35 INFO mapred.TaskLogsTruncater: Initializing logs' truncater with mapRetainSize=-1 and reduceRetainSize=-1
11/11/22 21:37:35 INFO mapred.TaskTracker: Starting tasktracker with owner as hortonas
11/11/22 21:37:35 INFO jvm.JvmMetrics: Cannot initialize JVM Metrics with processName=TaskTracker, sessionId= - already initialized
11/11/22 21:37:35 INFO metrics.RpcMetrics: Initializing RPC Metrics with hostName=TaskTracker, port=51131
11/11/22 21:37:35 INFO metrics.RpcDetailedMetrics: Initializing RPC Metrics with hostName=TaskTracker, port=51131
11/11/22 21:37:35 INFO ipc.Server: IPC Server Responder: starting
11/11/22 21:37:35 INFO ipc.Server: IPC Server listener on 51131: starting
11/11/22 21:37:35 INFO ipc.Server: IPC Server handler 0 on 51131: starting
11/11/22 21:37:35 INFO ipc.Server: IPC Server handler 1 on 51131: starting
11/11/22 21:37:35 INFO ipc.Server: IPC Server handler 2 on 51131: starting
11/11/22 21:37:35 INFO mapred.TaskTracker: TaskTracker up at: localhost/127.0.0.1:51131
11/11/22 21:37:35 INFO mapred.TaskTracker: Starting tracker tracker_host0.foo.com:localhost/127.0.0.1:51131
11/11/22 21:37:35 INFO ipc.Server: IPC Server handler 3 on 51131: starting
11/11/22 21:37:35 INFO mapred.TaskTracker: Starting thread: Map-events fetcher for all reduce tasks on tracker_host0.foo.com:localhost/127.0.0.1:51131
11/11/22 21:37:35 INFO mapred.TaskTracker: Using MemoryCalculatorPlugin : org.apache.hadoop.util.LinuxMemoryCalculatorPlugin@a68fd8
11/11/22 21:37:35 INFO util.ProcessTree: setsid exited with exit code 0
11/11/22 21:37:35 WARN mapred.TaskTracker: TaskTracker's totalMemoryAllottedForTasks is -1. TaskMemoryManager is disabled.
11/11/22 21:37:35 INFO mapred.IndexCache: IndexCache created with max memory = 10485760
11/11/22 21:37:35 INFO net.NetworkTopology: Adding a new node: /default-rack/host0.foo.com
11/11/22 21:37:35 INFO mapred.JobTracker: Adding tracker tracker_host0.foo.com:localhost/127.0.0.1:51131 to host host0.foo.com
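Up to this point the harness has brought up an in-process JobTracker and one TaskTracker. A minimal sketch, assuming hadoop-test's MiniMRCluster rather than the project's own ManyMiniCluster code, of how such a MapReduce minicluster is typically started (the tracker count and local-filesystem URI are assumptions):

    // Sketch: start a 1-TaskTracker MapReduce minicluster against the local
    // filesystem, matching the JobTracker/TaskTracker startup logged above.
    import org.apache.hadoop.mapred.JobConf;
    import org.apache.hadoop.mapred.MiniMRCluster;

    public class MiniMRFixture {
        public static void main(String[] args) throws Exception {
            MiniMRCluster mr = new MiniMRCluster(1, "file:///", 1);
            JobConf conf = mr.createJobConf(); // clients submit via the mini JobTracker
            System.out.println("jobtracker at " + conf.get("mapred.job.tracker"));
            mr.shutdown();
        }
    }

Note also the SLF4J warning above: the ivy/lib/default directory (slf4j-log4j12-1.6.1) and the hadoopcore lib directory (slf4j-log4j12-1.4.3) each contribute a binding; excluding one of the two jars from the test classpath would silence it.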
11/11/22 21:37:36 INFO server.ZooKeeperServer: Server environment:zookeeper.version=3.3.1-942149, built on 05/07/2010 17:14 GMT
11/11/22 21:37:36 INFO server.ZooKeeperServer: Server environment:host.name=hrt7n35.cc1.ygridcore.net
11/11/22 21:37:36 INFO server.ZooKeeperServer: Server environment:java.version=1.6.0_05
11/11/22 21:37:36 INFO server.ZooKeeperServer: Server environment:java.vendor=Sun Microsystems Inc.
11/11/22 21:37:36 INFO server.ZooKeeperServer: Server environment:java.home=/usr/java/jdk1.6.0_05/jre
11/11/22 21:37:36 INFO server.ZooKeeperServer: Server environment:java.class.path=/homes/hortonas/hcat/hcat-trunk/storage-drivers/hbase/build/test/classes:/homes/hortonas/hcat/hcat-trunk/storage-drivers/hbase:/homes/hortonas/hcat/hcat-trunk/storage-drivers/hbase/build/classes:/homes/hortonas/hcat/hcat-trunk/hive/external/conf:/homes/hortonas/hcat/hcat-trunk/hive/external/build/ivy/lib/default/antlr-2.7.7.jar:/homes/hortonas/hcat/hcat-trunk/hive/external/build/ivy/lib/default/antlr-3.0.1.jar:/homes/hortonas/hcat/hcat-trunk/hive/external/build/ivy/lib/default/antlr-runtime-3.0.1.jar:/homes/hortonas/hcat/hcat-trunk/hive/external/build/ivy/lib/default/asm-3.1.jar:/homes/hortonas/hcat/hcat-trunk/hive/external/build/ivy/lib/default/commons-cli-1.2.jar:/homes/hortonas/hcat/hcat-trunk/hive/external/build/ivy/lib/default/commons-codec-1.3.jar:/homes/hortonas/hcat/hcat-trunk/hive/external/build/ivy/lib/default/commons-collections-3.2.1.jar:/homes/hortonas/hcat/hcat-trunk/hive/external/build/ivy/lib/default/commons-dbcp-1.4.jar:/homes/hortonas/hcat/hcat-trunk/hive/external/build/ivy/lib/default/commons-lang-2.4.jar:/homes/hortonas/hcat/hcat-trunk/hive/external/build/ivy/lib/default/commons-logging-1.0.4.jar:/homes/hortonas/hcat/hcat-trunk/hive/external/build/ivy/lib/default/commons-logging-api-1.0.4.jar:/homes/hortonas/hcat/hcat-trunk/hive/external/build/ivy/lib/default/commons-pool-1.5.4.jar:/homes/hortonas/hcat/hcat-trunk/hive/external/build/ivy/lib/default/datanucleus-connectionpool-2.0.3.jar:/homes/hortonas/hcat/hcat-trunk/hive/external/build/ivy/lib/default/datanucleus-core-2.0.3.jar:/homes/hortonas/hcat/hcat-trunk/hive/external/build/ivy/lib/default/datanucleus-enhancer-2.0.3.jar:/homes/hortonas/hcat/hcat-trunk/hive/external/build/ivy/lib/default/datanucleus-rdbms-2.0.3.jar:/homes/hortonas/hcat/hcat-trunk/hive/external/build/ivy/lib/default/derby-10.4.2.0.jar:/homes/hortonas/hcat/hcat-trunk/hive/external/build/ivy/lib/default/guava-r06.jar:/homes/hortonas/hcat/hcat-trunk/hive/external/build/ivy/lib/default/jdo2-api-2.3-ec.jar:/homes/hortonas/hcat/hcat-trunk/hive/external/build/ivy/lib/default/jline-0.9.94.jar:/homes/hortonas/hcat/hcat-trunk/hive/external/build/ivy/lib/default/json-20090211.jar:/homes/hortonas/hcat/hcat-trunk/hive/external/build/ivy/lib/default/libfb303-0.7.0.jar:/homes/hortonas/hcat/hcat-trunk/hive/external/build/ivy/lib/default/libthrift-0.7.0.jar:/homes/hortonas/hcat/hcat-trunk/hive/external/build/ivy/lib/default/log4j-1.2.16.jar:/homes/hortonas/hcat/hcat-trunk/hive/external/build/ivy/lib/default/mockito-all-1.8.2.jar:/homes/hortonas/hcat/hcat-trunk/hive/external/build/ivy/lib/default/slf4j-api-1.6.1.jar:/homes/hortonas/hcat/hcat-trunk/hive/external/build/ivy/lib/default/slf4j-log4j12-1.6.1.jar:/homes/hortonas/hcat/hcat-trunk/hive/external/build/ivy/lib/default/stringtemplate-3.1-b1.jar:/homes/hortonas/hcat/hcat-trunk/hive/external/build/ivy/lib/default/velocity-1.5.jar:/homes/hortonas/hcat/hcat-trunk/hive/external/build/ivy/lib/default/zookeeper-3.3.1.jar:/homes/hortonas/hcat/hcat-trunk/hive/external/build/hadoopcore/hadoop-0.20.3-CDH3-SNAPSHOT/lib/aspectjrt-1.6.5.jar:/homes/hortonas/hcat/hcat-trunk/hive/external/build/hadoopcore/hadoop-0.20.3-CDH3-SNAPSHOT/lib/aspectjtools-1.6.5.jar:/homes/hortonas/hcat/hcat-trunk/hive/external/build/hadoopcore/hadoop-0.20.3-CDH3-SNAPSHOT/lib/commons-cli-1.2.jar:/homes/hortonas/hcat/hcat-trunk/hive/external/build/hadoopcore/hadoop-0.20.3-CDH3-SNAPSHOT/lib/commons-codec-1.4.jar:/homes/hortonas/hcat/hcat-trunk/hive/external/build/hadoopcore/hadoop-0.20.3-CDH3-SNAPSHOT/lib/commons-daemon-1.0.1.jar:/homes/hortonas/hcat/hcat-trunk/hive/external/build/hadoopcore/hadoop-0.20.3-CDH3-SNAPSHOT/lib/commons-el-1.0.jar:/homes/hortonas/hcat/hcat-trunk/hive/external/build/hadoopcore/hadoop-0.20.3-CDH3-SNAPSHOT/lib/commons-httpclient-3.0.1.jar:/homes/hortonas/hcat/hcat-trunk/hive/external/build/hadoopcore/hadoop-0.20.3-CDH3-SNAPSHOT/lib/commons-logging-1.0.4.jar:/homes/hortonas/hcat/hcat-trunk/hive/external/build/hadoopcore/hadoop-0.20.3-CDH3-SNAPSHOT/lib/commons-logging-api-1.0.4.jar:/homes/hortonas/hcat/hcat-trunk/hive/external/build/hadoopcore/hadoop-0.20.3-CDH3-SNAPSHOT/lib/commons-net-1.4.1.jar:/homes/hortonas/hcat/hcat-trunk/hive/external/build/hadoopcore/hadoop-0.20.3-CDH3-SNAPSHOT/lib/core-3.1.1.jar:/homes/hortonas/hcat/hcat-trunk/hive/external/build/hadoopcore/hadoop-0.20.3-CDH3-SNAPSHOT/lib/hadoop-fairscheduler-0.20.3-CDH3-SNAPSHOT.jar:/homes/hortonas/hcat/hcat-trunk/hive/external/build/hadoopcore/hadoop-0.20.3-CDH3-SNAPSHOT/lib/hsqldb-1.8.0.10.jar:/homes/hortonas/hcat/hcat-trunk/hive/external/build/hadoopcore/hadoop-0.20.3-CDH3-SNAPSHOT/lib/jackson-core-asl-1.0.1.jar:/homes/hortonas/hcat/hcat-trunk/hive/external/build/hadoopcore/hadoop-0.20.3-CDH3-SNAPSHOT/lib/jackson-mapper-asl-1.0.1.jar:/homes/hortonas/hcat/hcat-trunk/hive/external/build/hadoopcore/hadoop-0.20.3-CDH3-SNAPSHOT/lib/jasper-compiler-5.5.12.jar:/homes/hortonas/hcat/hcat-trunk/hive/external/build/hadoopcore/hadoop-0.20.3-CDH3-SNAPSHOT/lib/jasper-runtime-5.5.12.jar:/homes/hortonas/hcat/hcat-trunk/hive/external/build/hadoopcore/hadoop-0.20.3-CDH3-SNAPSHOT/lib/jets3t-0.6.1.jar:/homes/hortonas/hcat/hcat-trunk/hive/external/build/hadoopcore/hadoop-0.20.3-CDH3-SNAPSHOT/lib/jetty-6.1.14.jar:/homes/hortonas/hcat/hcat-trunk/hive/external/build/hadoopcore/hadoop-0.20.3-CDH3-SNAPSHOT/lib/jetty-util-6.1.14.jar:/homes/hortonas/hcat/hcat-trunk/hive/external/build/hadoopcore/hadoop-0.20.3-CDH3-SNAPSHOT/lib/jsp-2.1/jsp-2.1.jar:/homes/hortonas/hcat/hcat-trunk/hive/external/build/hadoopcore/hadoop-0.20.3-CDH3-SNAPSHOT/lib/jsp-2.1/jsp-api-2.1.jar:/homes/hortonas/hcat/hcat-trunk/hive/external/build/hadoopcore/hadoop-0.20.3-CDH3-SNAPSHOT/lib/junit-4.5.jar:/homes/hortonas/hcat/hcat-trunk/hive/external/build/hadoopcore/hadoop-0.20.3-CDH3-SNAPSHOT/lib/kfs-0.2.2.jar:/homes/hortonas/hcat/hcat-trunk/hive/external/build/hadoopcore/hadoop-0.20.3-CDH3-SNAPSHOT/lib/log4j-1.2.15.jar:/homes/hortonas/hcat/hcat-trunk/hive/external/build/hadoopcore/hadoop-0.20.3-CDH3-SNAPSHOT/lib/mockito-all-1.8.2.jar:/homes/hortonas/hcat/hcat-trunk/hive/external/build/hadoopcore/hadoop-0.20.3-CDH3-SNAPSHOT/lib/mysql-connector-java-5.0.8-bin.jar:/homes/hortonas/hcat/hcat-trunk/hive/external/build/hadoopcore/hadoop-0.20.3-CDH3-SNAPSHOT/lib/oro-2.0.8.jar:/homes/hortonas/hcat/hcat-trunk/hive/external/build/hadoopcore/hadoop-0.20.3-CDH3-SNAPSHOT/lib/servlet-api-2.5-6.1.14.jar:/homes/hortonas/hcat/hcat-trunk/hive/external/build/hadoopcore/hadoop-0.20.3-CDH3-SNAPSHOT/lib/slf4j-api-1.4.3.jar:/homes/hortonas/hcat/hcat-trunk/hive/external/build/hadoopcore/hadoop-0.20.3-CDH3-SNAPSHOT/lib/slf4j-log4j12-1.4.3.jar:/homes/hortonas/hcat/hcat-trunk/hive/external/build/hadoopcore/hadoop-0.20.3-CDH3-SNAPSHOT/lib/xmlenc-0.52.jar:/homes/hortonas/hcat/hcat-trunk/storage-drivers/hbase/build/ivy/lib/hbase-storage-driver/hbase-0.90.3-tests.jar:/homes/hortonas/hcat/hcat-trunk/storage-drivers/hbase/build/ivy/lib/hbase-storage-driver/hbase-0.90.3.jar:/homes/hortonas/hcat/hcat-trunk/storage-drivers/hbase/build/ivy/lib/hbase-storage-driver/zookeeper-3.3.1.jar:/homes/hortonas/hcat/hcat-trunk/build/hcatalog/hcatalog-0.3.0-dev.jar:/homes/hortonas/hcat/hcat-trunk/build/hcatalog/hcatalog-server-extensions-0.3.0-dev.jar:/homes/hortonas/hcat/hcat-trunk/hive/external/testlibs/ant-contrib-1.0b3.jar:/homes/hortonas/hcat/hcat-trunk/hive/external/testlibs/junit-4.10.jar:/homes/hortonas/hcat/hcat-trunk/hive/external/lib/javaewah-0.3.jar:/homes/hortonas/hcat/hcat-trunk/hive/external/lib/log4j-1.2.15.jar:/homes/hortonas/hcat/hcat-trunk/hive/external/build/cli/hive-cli-0.9.0-SNAPSHOT.jar:/homes/hortonas/hcat/hcat-trunk/hive/external/build/common/hive-common-0.9.0-SNAPSHOT.jar:/homes/hortonas/hcat/hcat-trunk/hive/external/build/serde/hive-serde-0.9.0-SNAPSHOT.jar:/homes/hortonas/hcat/hcat-trunk/hive/external/build/metastore/hive-metastore-0.9.0-SNAPSHOT.jar:/homes/hortonas/hcat/hcat-trunk/hive/external/build/ql/hive-exec-0.9.0-SNAPSHOT.jar:/homes/hortonas/hcat/hcat-trunk/hive/external/build/hadoopcore/hadoop-0.20.3-CDH3-SNAPSHOT/hadoop-core-0.20.3-CDH3-SNAPSHOT.jar:/homes/hortonas/hcat/hcat-trunk/build/ivy/lib/hcatalog/activemq-all-5.5.0.jar:/homes/hortonas/hcat/hcat-trunk/build/ivy/lib/hcatalog/commons-cli-1.0.jar:/homes/hortonas/hcat/hcat-trunk/build/ivy/lib/hcatalog/hadoop-tools-0.20.205.0.jar:/homes/hortonas/hcat/hcat-trunk/build/ivy/lib/hcatalog/jdeb-0.8.jar:/homes/hortonas/hcat/hcat-trunk/build/ivy/lib/hcatalog/jms-1.1.jar:/homes/hortonas/hcat/hcat-trunk/build/ivy/lib/hcatalog/management-api-1.1-rev-1.jar:/homes/hortonas/hcat/hcat-trunk/build/ivy/lib/hcatalog/pig-0.8.0.jar:/homes/hortonas/hcat/hcat-trunk/hive/external/build/hbase-handler/hive-hbase-handler-0.9.0-SNAPSHOT.jar:/homes/hortonas/hcat/hcat-trunk/hive/external/build/hadoopcore/hadoop-0.20.3-CDH3-SNAPSHOT/hadoop-test-0.20.3-CDH3-SNAPSHOT.jar:/homes/hortonas/softwares/apache-ant-1.8.2/lib/junit-4.8.1.jar:/homes/hortonas/softwares/apache-ant-1.8.2/lib/ant-launcher.jar:/homes/hortonas/softwares/apache-ant-1.8.2/lib/ant.jar:/homes/hortonas/softwares/apache-ant-1.8.2/lib/ant-junit.jar:/homes/hortonas/softwares/apache-ant-1.8.2/lib/ant-junit4.jar
11/11/22 21:37:36 INFO server.ZooKeeperServer: Server environment:java.library.path=/usr/java/jdk1.6.0_05/jre/lib/i386/server:/usr/java/jdk1.6.0_05/jre/lib/i386:/usr/java/jdk1.6.0_05/jre/../lib/i386:/usr/java/packages/lib/i386:/lib:/usr/lib
11/11/22 21:37:36 INFO server.ZooKeeperServer: Server environment:java.io.tmpdir=/tmp
11/11/22 21:37:36 INFO server.ZooKeeperServer: Server environment:java.compiler=
11/11/22 21:37:36 INFO server.ZooKeeperServer: Server environment:os.name=Linux
11/11/22 21:37:36 INFO server.ZooKeeperServer: Server environment:os.arch=i386
11/11/22 21:37:36 INFO server.ZooKeeperServer: Server environment:os.version=2.6.18-238.1.1.el5.YAHOO.20110221
11/11/22 21:37:36 INFO server.ZooKeeperServer: Server environment:user.name=hortonas
11/11/22 21:37:36 INFO server.ZooKeeperServer: Server environment:user.home=/homes/hortonas
11/11/22 21:37:36 INFO server.ZooKeeperServer: Server environment:user.dir=/homes/hortonas/hcat/hcat-trunk/storage-drivers/hbase
11/11/22 21:37:36 INFO server.ZooKeeperServer: Created server with tickTime 2000 minSessionTimeout 4000 maxSessionTimeout 40000 datadir /homes/hortonas/hcat/hcat-trunk/storage-drivers/hbase/test_default_3510113083437113091/zk/zookeeper/version-2 snapdir /homes/hortonas/hcat/hcat-trunk/storage-drivers/hbase/test_default_3510113083437113091/zk/zookeeper/version-2
11/11/22 21:37:36 INFO server.NIOServerCnxn: binding to port 0.0.0.0/0.0.0.0:44589
11/11/22 21:37:36 INFO persistence.FileTxnSnapLog: Snapshotting: 0
11/11/22 21:37:36 INFO server.NIOServerCnxn: Accepted socket connection from /127.0.0.1:48537
11/11/22 21:37:36 INFO server.NIOServerCnxn: Processing stat command from /127.0.0.1:48537
11/11/22 21:37:36 INFO server.NIOServerCnxn: Stat command output
11/11/22 21:37:36 INFO server.NIOServerCnxn: Closed socket connection for client /127.0.0.1:48537 (no session established for client)
11/11/22 21:37:36 INFO zookeeper.MiniZooKeeperCluster: Started MiniZK Server on client port: 44589
11/11/22 21:37:36 INFO ipc.HBaseRpcMetrics: Initializing RPC Metrics with hostName=MiniHBaseCluster$MiniHBaseClusterMaster, port=51911
11/11/22 21:37:36 INFO ipc.HBaseServer: IPC Server Responder: starting
11/11/22 21:37:36 INFO ipc.HBaseServer: IPC Server listener on 51911: starting
11/11/22 21:37:36 INFO ipc.HBaseServer: IPC Server handler 0 on 51911: starting
11/11/22 21:37:36 INFO ipc.HBaseServer: IPC Server handler 1 on 51911: starting
11/11/22 21:37:36 INFO ipc.HBaseServer: IPC Server handler 2 on 51911: starting
11/11/22 21:37:36 INFO ipc.HBaseServer: IPC Server handler 3 on 51911: starting
11/11/22 21:37:36 INFO ipc.HBaseServer: IPC Server handler 4 on 51911: starting
11/11/22 21:37:36 INFO ipc.HBaseServer: IPC Server handler 5 on 51911: starting
11/11/22 21:37:36 INFO ipc.HBaseServer: IPC Server handler 6 on 51911: starting
11/11/22 21:37:36 INFO ipc.HBaseServer: IPC Server handler 7 on 51911: starting
11/11/22 21:37:36 INFO ipc.HBaseServer: IPC Server handler 8 on 51911: starting
11/11/22 21:37:36 INFO ipc.HBaseServer: IPC Server handler 9 on 51911: starting
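The embedded ZooKeeper quorum is now up on client port 44589 and the HBase master's RPC server is listening. A minimal sketch, assuming HBase 0.90's MiniZooKeeperCluster API, of starting such an embedded quorum (the base directory is a hypothetical stand-in for the cluster work directory):

    import java.io.File;
    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.hbase.HBaseConfiguration;
    import org.apache.hadoop.hbase.zookeeper.MiniZooKeeperCluster;

    public class MiniZkFixture {
        public static void main(String[] args) throws Exception {
            MiniZooKeeperCluster zk = new MiniZooKeeperCluster();
            int clientPort = zk.startup(new File("/tmp/zk-test")); // hypothetical dir
            Configuration conf = HBaseConfiguration.create();
            conf.set("hbase.zookeeper.property.clientPort", String.valueOf(clientPort));
            // master, region server, and test clients all connect through this quorum
            zk.shutdown();
        }
    }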
11/11/22 21:37:36 INFO zookeeper.ZooKeeper: Client environment:zookeeper.version=3.3.1-942149, built on 05/07/2010 17:14 GMT
11/11/22 21:37:36 INFO zookeeper.ZooKeeper: Client environment:host.name=hrt7n35.cc1.ygridcore.net
11/11/22 21:37:36 INFO zookeeper.ZooKeeper: Client environment:java.version=1.6.0_05
11/11/22 21:37:36 INFO zookeeper.ZooKeeper: Client environment:java.vendor=Sun Microsystems Inc.
11/11/22 21:37:36 INFO zookeeper.ZooKeeper: Client environment:java.home=/usr/java/jdk1.6.0_05/jre
11/11/22 21:37:36 INFO zookeeper.ZooKeeper: Client environment:java.class.path=/homes/hortonas/hcat/hcat-trunk/storage-drivers/hbase/build/test/classes:/homes/hortonas/hcat/hcat-trunk/storage-drivers/hbase:/homes/hortonas/hcat/hcat-trunk/storage-drivers/hbase/build/classes:/homes/hortonas/hcat/hcat-trunk/hive/external/conf:/homes/hortonas/hcat/hcat-trunk/hive/external/build/ivy/lib/default/antlr-2.7.7.jar:/homes/hortonas/hcat/hcat-trunk/hive/external/build/ivy/lib/default/antlr-3.0.1.jar:/homes/hortonas/hcat/hcat-trunk/hive/external/build/ivy/lib/default/antlr-runtime-3.0.1.jar:/homes/hortonas/hcat/hcat-trunk/hive/external/build/ivy/lib/default/asm-3.1.jar:/homes/hortonas/hcat/hcat-trunk/hive/external/build/ivy/lib/default/commons-cli-1.2.jar:/homes/hortonas/hcat/hcat-trunk/hive/external/build/ivy/lib/default/commons-codec-1.3.jar:/homes/hortonas/hcat/hcat-trunk/hive/external/build/ivy/lib/default/commons-collections-3.2.1.jar:/homes/hortonas/hcat/hcat-trunk/hive/external/build/ivy/lib/default/commons-dbcp-1.4.jar:/homes/hortonas/hcat/hcat-trunk/hive/external/build/ivy/lib/default/commons-lang-2.4.jar:/homes/hortonas/hcat/hcat-trunk/hive/external/build/ivy/lib/default/commons-logging-1.0.4.jar:/homes/hortonas/hcat/hcat-trunk/hive/external/build/ivy/lib/default/commons-logging-api-1.0.4.jar:/homes/hortonas/hcat/hcat-trunk/hive/external/build/ivy/lib/default/commons-pool-1.5.4.jar:/homes/hortonas/hcat/hcat-trunk/hive/external/build/ivy/lib/default/datanucleus-connectionpool-2.0.3.jar:/homes/hortonas/hcat/hcat-trunk/hive/external/build/ivy/lib/default/datanucleus-core-2.0.3.jar:/homes/hortonas/hcat/hcat-trunk/hive/external/build/ivy/lib/default/datanucleus-enhancer-2.0.3.jar:/homes/hortonas/hcat/hcat-trunk/hive/external/build/ivy/lib/default/datanucleus-rdbms-2.0.3.jar:/homes/hortonas/hcat/hcat-trunk/hive/external/build/ivy/lib/default/derby-10.4.2.0.jar:/homes/hortonas/hcat/hcat-trunk/hive/external/build/ivy/lib/default/guava-r06.jar:/homes/hortonas/hcat/hcat-trunk/hive/external/build/ivy/lib/default/jdo2-api-2.3-ec.jar:/homes/hortonas/hcat/hcat-trunk/hive/external/build/ivy/lib/default/jline-0.9.94.jar:/homes/hortonas/hcat/hcat-trunk/hive/external/build/ivy/lib/default/json-20090211.jar:/homes/hortonas/hcat/hcat-trunk/hive/external/build/ivy/lib/default/libfb303-0.7.0.jar:/homes/hortonas/hcat/hcat-trunk/hive/external/build/ivy/lib/default/libthrift-0.7.0.jar:/homes/hortonas/hcat/hcat-trunk/hive/external/build/ivy/lib/default/log4j-1.2.16.jar:/homes/hortonas/hcat/hcat-trunk/hive/external/build/ivy/lib/default/mockito-all-1.8.2.jar:/homes/hortonas/hcat/hcat-trunk/hive/external/build/ivy/lib/default/slf4j-api-1.6.1.jar:/homes/hortonas/hcat/hcat-trunk/hive/external/build/ivy/lib/default/slf4j-log4j12-1.6.1.jar:/homes/hortonas/hcat/hcat-trunk/hive/external/build/ivy/lib/default/stringtemplate-3.1-b1.jar:/homes/hortonas/hcat/hcat-trunk/hive/external/build/ivy/lib/default/velocity-1.5.jar:/homes/hortonas/hcat/hcat-trunk/hive/external/build/ivy/lib/default/zookeeper-3.3.1.jar:/homes/hortonas/hcat/hcat-trunk/hive/external/build/hadoopcore/hadoop-0.20.3-CDH3-SNAPSHOT/lib/aspectjrt-1.6.5.jar:/homes/hortonas/hcat/hcat-trunk/hive/external/build/hadoopcore/hadoop-0.20.3-CDH3-SNAPSHOT/lib/aspectjtools-1.6.5.jar:/homes/hortonas/hcat/hcat-trunk/hive/external/build/hadoopcore/hadoop-0.20.3-CDH3-SNAPSHOT/lib/commons-cli-1.2.jar:/homes/hortonas/hcat/hcat-trunk/hive/external/build/hadoopcore/hadoop-0.20.3-CDH3-SNAPSHOT/lib/commons-codec-1.4.jar:/homes/hortonas/hcat/hcat-trunk/hive/external/build/hadoopcore/hadoop-0.20.3-CDH3-SNAPSHOT/lib/commons-daemon-1.0.1.jar:/homes/hortonas/hcat/hcat-trunk/hive/external/build/hadoopcore/hadoop-0.20.3-CDH3-SNAPSHOT/lib/commons-el-1.0.jar:/homes/hortonas/hcat/hcat-trunk/hive/external/build/hadoopcore/hadoop-0.20.3-CDH3-SNAPSHOT/lib/commons-httpclient-3.0.1.jar:/homes/hortonas/hcat/hcat-trunk/hive/external/build/hadoopcore/hadoop-0.20.3-CDH3-SNAPSHOT/lib/commons-logging-1.0.4.jar:/homes/hortonas/hcat/hcat-trunk/hive/external/build/hadoopcore/hadoop-0.20.3-CDH3-SNAPSHOT/lib/commons-logging-api-1.0.4.jar:/homes/hortonas/hcat/hcat-trunk/hive/external/build/hadoopcore/hadoop-0.20.3-CDH3-SNAPSHOT/lib/commons-net-1.4.1.jar:/homes/hortonas/hcat/hcat-trunk/hive/external/build/hadoopcore/hadoop-0.20.3-CDH3-SNAPSHOT/lib/core-3.1.1.jar:/homes/hortonas/hcat/hcat-trunk/hive/external/build/hadoopcore/hadoop-0.20.3-CDH3-SNAPSHOT/lib/hadoop-fairscheduler-0.20.3-CDH3-SNAPSHOT.jar:/homes/hortonas/hcat/hcat-trunk/hive/external/build/hadoopcore/hadoop-0.20.3-CDH3-SNAPSHOT/lib/hsqldb-1.8.0.10.jar:/homes/hortonas/hcat/hcat-trunk/hive/external/build/hadoopcore/hadoop-0.20.3-CDH3-SNAPSHOT/lib/jackson-core-asl-1.0.1.jar:/homes/hortonas/hcat/hcat-trunk/hive/external/build/hadoopcore/hadoop-0.20.3-CDH3-SNAPSHOT/lib/jackson-mapper-asl-1.0.1.jar:/homes/hortonas/hcat/hcat-trunk/hive/external/build/hadoopcore/hadoop-0.20.3-CDH3-SNAPSHOT/lib/jasper-compiler-5.5.12.jar:/homes/hortonas/hcat/hcat-trunk/hive/external/build/hadoopcore/hadoop-0.20.3-CDH3-SNAPSHOT/lib/jasper-runtime-5.5.12.jar:/homes/hortonas/hcat/hcat-trunk/hive/external/build/hadoopcore/hadoop-0.20.3-CDH3-SNAPSHOT/lib/jets3t-0.6.1.jar:/homes/hortonas/hcat/hcat-trunk/hive/external/build/hadoopcore/hadoop-0.20.3-CDH3-SNAPSHOT/lib/jetty-6.1.14.jar:/homes/hortonas/hcat/hcat-trunk/hive/external/build/hadoopcore/hadoop-0.20.3-CDH3-SNAPSHOT/lib/jetty-util-6.1.14.jar:/homes/hortonas/hcat/hcat-trunk/hive/external/build/hadoopcore/hadoop-0.20.3-CDH3-SNAPSHOT/lib/jsp-2.1/jsp-2.1.jar:/homes/hortonas/hcat/hcat-trunk/hive/external/build/hadoopcore/hadoop-0.20.3-CDH3-SNAPSHOT/lib/jsp-2.1/jsp-api-2.1.jar:/homes/hortonas/hcat/hcat-trunk/hive/external/build/hadoopcore/hadoop-0.20.3-CDH3-SNAPSHOT/lib/junit-4.5.jar:/homes/hortonas/hcat/hcat-trunk/hive/external/build/hadoopcore/hadoop-0.20.3-CDH3-SNAPSHOT/lib/kfs-0.2.2.jar:/homes/hortonas/hcat/hcat-trunk/hive/external/build/hadoopcore/hadoop-0.20.3-CDH3-SNAPSHOT/lib/log4j-1.2.15.jar:/homes/hortonas/hcat/hcat-trunk/hive/external/build/hadoopcore/hadoop-0.20.3-CDH3-SNAPSHOT/lib/mockito-all-1.8.2.jar:/homes/hortonas/hcat/hcat-trunk/hive/external/build/hadoopcore/hadoop-0.20.3-CDH3-SNAPSHOT/lib/mysql-connector-java-5.0.8-bin.jar:/homes/hortonas/hcat/hcat-trunk/hive/external/build/hadoopcore/hadoop-0.20.3-CDH3-SNAPSHOT/lib/oro-2.0.8.jar:/homes/hortonas/hcat/hcat-trunk/hive/external/build/hadoopcore/hadoop-0.20.3-CDH3-SNAPSHOT/lib/servlet-api-2.5-6.1.14.jar:/homes/hortonas/hcat/hcat-trunk/hive/external/build/hadoopcore/hadoop-0.20.3-CDH3-SNAPSHOT/lib/slf4j-api-1.4.3.jar:/homes/hortonas/hcat/hcat-trunk/hive/external/build/hadoopcore/hadoop-0.20.3-CDH3-SNAPSHOT/lib/slf4j-log4j12-1.4.3.jar:/homes/hortonas/hcat/hcat-trunk/hive/external/build/hadoopcore/hadoop-0.20.3-CDH3-SNAPSHOT/lib/xmlenc-0.52.jar:/homes/hortonas/hcat/hcat-trunk/storage-drivers/hbase/build/ivy/lib/hbase-storage-driver/hbase-0.90.3-tests.jar:/homes/hortonas/hcat/hcat-trunk/storage-drivers/hbase/build/ivy/lib/hbase-storage-driver/hbase-0.90.3.jar:/homes/hortonas/hcat/hcat-trunk/storage-drivers/hbase/build/ivy/lib/hbase-storage-driver/zookeeper-3.3.1.jar:/homes/hortonas/hcat/hcat-trunk/build/hcatalog/hcatalog-0.3.0-dev.jar:/homes/hortonas/hcat/hcat-trunk/build/hcatalog/hcatalog-server-extensions-0.3.0-dev.jar:/homes/hortonas/hcat/hcat-trunk/hive/external/testlibs/ant-contrib-1.0b3.jar:/homes/hortonas/hcat/hcat-trunk/hive/external/testlibs/junit-4.10.jar:/homes/hortonas/hcat/hcat-trunk/hive/external/lib/javaewah-0.3.jar:/homes/hortonas/hcat/hcat-trunk/hive/external/lib/log4j-1.2.15.jar:/homes/hortonas/hcat/hcat-trunk/hive/external/build/cli/hive-cli-0.9.0-SNAPSHOT.jar:/homes/hortonas/hcat/hcat-trunk/hive/external/build/common/hive-common-0.9.0-SNAPSHOT.jar:/homes/hortonas/hcat/hcat-trunk/hive/external/build/serde/hive-serde-0.9.0-SNAPSHOT.jar:/homes/hortonas/hcat/hcat-trunk/hive/external/build/metastore/hive-metastore-0.9.0-SNAPSHOT.jar:/homes/hortonas/hcat/hcat-trunk/hive/external/build/ql/hive-exec-0.9.0-SNAPSHOT.jar:/homes/hortonas/hcat/hcat-trunk/hive/external/build/hadoopcore/hadoop-0.20.3-CDH3-SNAPSHOT/hadoop-core-0.20.3-CDH3-SNAPSHOT.jar:/homes/hortonas/hcat/hcat-trunk/build/ivy/lib/hcatalog/activemq-all-5.5.0.jar:/homes/hortonas/hcat/hcat-trunk/build/ivy/lib/hcatalog/commons-cli-1.0.jar:/homes/hortonas/hcat/hcat-trunk/build/ivy/lib/hcatalog/hadoop-tools-0.20.205.0.jar:/homes/hortonas/hcat/hcat-trunk/build/ivy/lib/hcatalog/jdeb-0.8.jar:/homes/hortonas/hcat/hcat-trunk/build/ivy/lib/hcatalog/jms-1.1.jar:/homes/hortonas/hcat/hcat-trunk/build/ivy/lib/hcatalog/management-api-1.1-rev-1.jar:/homes/hortonas/hcat/hcat-trunk/build/ivy/lib/hcatalog/pig-0.8.0.jar:/homes/hortonas/hcat/hcat-trunk/hive/external/build/hbase-handler/hive-hbase-handler-0.9.0-SNAPSHOT.jar:/homes/hortonas/hcat/hcat-trunk/hive/external/build/hadoopcore/hadoop-0.20.3-CDH3-SNAPSHOT/hadoop-test-0.20.3-CDH3-SNAPSHOT.jar:/homes/hortonas/softwares/apache-ant-1.8.2/lib/junit-4.8.1.jar:/homes/hortonas/softwares/apache-ant-1.8.2/lib/ant-launcher.jar:/homes/hortonas/softwares/apache-ant-1.8.2/lib/ant.jar:/homes/hortonas/softwares/apache-ant-1.8.2/lib/ant-junit.jar:/homes/hortonas/softwares/apache-ant-1.8.2/lib/ant-junit4.jar
11/11/22 21:37:36 INFO zookeeper.ZooKeeper: Client environment:java.library.path=/usr/java/jdk1.6.0_05/jre/lib/i386/server:/usr/java/jdk1.6.0_05/jre/lib/i386:/usr/java/jdk1.6.0_05/jre/../lib/i386:/usr/java/packages/lib/i386:/lib:/usr/lib
11/11/22 21:37:36 INFO zookeeper.ZooKeeper: Client environment:java.io.tmpdir=/tmp
11/11/22 21:37:36 INFO zookeeper.ZooKeeper: Client environment:java.compiler=
11/11/22 21:37:36 INFO zookeeper.ZooKeeper: Client environment:os.name=Linux
11/11/22 21:37:36 INFO zookeeper.ZooKeeper: Client environment:os.arch=i386
11/11/22 21:37:36 INFO zookeeper.ZooKeeper: Client environment:os.version=2.6.18-238.1.1.el5.YAHOO.20110221
11/11/22 21:37:36 INFO zookeeper.ZooKeeper: Client environment:user.name=hortonas
11/11/22 21:37:36 INFO zookeeper.ZooKeeper: Client environment:user.home=/homes/hortonas
11/11/22 21:37:36 INFO zookeeper.ZooKeeper: Client environment:user.dir=/homes/hortonas/hcat/hcat-trunk/storage-drivers/hbase
11/11/22 21:37:36 INFO zookeeper.ZooKeeper: Initiating client connection, connectString=127.0.0.1:44589 sessionTimeout=180000 watcher=master:51911
11/11/22 21:37:37 INFO zookeeper.ClientCnxn: Opening socket connection to server /127.0.0.1:44589
11/11/22 21:37:37 INFO server.NIOServerCnxn: Accepted socket connection from /127.0.0.1:48538
11/11/22 21:37:37 INFO zookeeper.ClientCnxn: Socket connection established to localhost/127.0.0.1:44589, initiating session
11/11/22 21:37:37 INFO server.NIOServerCnxn: Client attempting to establish new session at /127.0.0.1:48538
11/11/22 21:37:37 INFO server.NIOServerCnxn: Established session 0x133cd36705c0000 with negotiated timeout 40000 for client /127.0.0.1:48538
11/11/22 21:37:37 INFO zookeeper.ClientCnxn: Session establishment complete on server localhost/127.0.0.1:44589, sessionid = 0x133cd36705c0000, negotiated timeout = 40000
11/11/22 21:37:37 INFO jvm.JvmMetrics: Cannot initialize JVM Metrics with processName=Master, sessionId=hrt7n35.cc1.ygridcore.net:51911 - already initialized
11/11/22 21:37:37 INFO hbase.metrics: MetricsString added: revision
11/11/22 21:37:37 INFO hbase.metrics: MetricsString added: hdfsUser
11/11/22 21:37:37 INFO hbase.metrics: MetricsString added: hdfsDate
11/11/22 21:37:37 INFO hbase.metrics: MetricsString added: hdfsUrl
11/11/22 21:37:37 INFO hbase.metrics: MetricsString added: date
11/11/22 21:37:37 INFO hbase.metrics: MetricsString added: hdfsRevision
11/11/22 21:37:37 INFO hbase.metrics: MetricsString added: user
11/11/22 21:37:37 INFO hbase.metrics: MetricsString added: hdfsVersion
11/11/22 21:37:37 INFO hbase.metrics: MetricsString added: url
11/11/22 21:37:37 INFO hbase.metrics: MetricsString added: version
11/11/22 21:37:37 INFO hbase.metrics: new MBeanInfo
11/11/22 21:37:37 INFO hbase.metrics: new MBeanInfo
11/11/22 21:37:37 INFO metrics.MasterMetrics: Initialized
11/11/22 21:37:37 INFO zookeeper.ZooKeeper: Initiating client connection, connectString=127.0.0.1:44589 sessionTimeout=180000 watcher=hconnection
11/11/22 21:37:37 INFO zookeeper.ClientCnxn: Opening socket connection to server /127.0.0.1:44589
11/11/22 21:37:37 INFO server.NIOServerCnxn: Accepted socket connection from /127.0.0.1:48539
11/11/22 21:37:37 INFO zookeeper.ClientCnxn: Socket connection established to localhost/127.0.0.1:44589, initiating session
11/11/22 21:37:37 INFO server.NIOServerCnxn: Client attempting to establish new session at /127.0.0.1:48539
11/11/22 21:37:37 INFO server.NIOServerCnxn: Established session 0x133cd36705c0001 with negotiated timeout 40000 for client /127.0.0.1:48539
11/11/22 21:37:37 INFO zookeeper.ClientCnxn: Session establishment complete on server localhost/127.0.0.1:44589, sessionid = 0x133cd36705c0001, negotiated timeout = 40000
11/11/22 21:37:37 INFO ipc.HBaseRpcMetrics: Initializing RPC Metrics with hostName=MiniHBaseCluster$MiniHBaseClusterRegionServer, port=54808
11/11/22 21:37:37 INFO zookeeper.ZooKeeper: Initiating client connection, connectString=127.0.0.1:44589 sessionTimeout=180000 watcher=regionserver:54808
11/11/22 21:37:37 INFO zookeeper.ClientCnxn: Opening socket connection to server /127.0.0.1:44589
11/11/22 21:37:37 INFO server.NIOServerCnxn: Accepted socket connection from /127.0.0.1:48540
11/11/22 21:37:37 INFO zookeeper.ClientCnxn: Socket connection established to localhost/127.0.0.1:44589, initiating session
11/11/22 21:37:37 INFO server.NIOServerCnxn: Client attempting to establish new session at /127.0.0.1:48540
11/11/22 21:37:37 INFO master.ActiveMasterManager: Master=hrt7n35.cc1.ygridcore.net:51911
11/11/22 21:37:37 INFO server.NIOServerCnxn: Established session 0x133cd36705c0002 with negotiated timeout 40000 for client /127.0.0.1:48540
11/11/22 21:37:37 INFO zookeeper.ClientCnxn: Session establishment complete on server localhost/127.0.0.1:44589, sessionid = 0x133cd36705c0002, negotiated timeout = 40000
11/11/22 21:37:37 INFO master.MasterFileSystem: BOOTSTRAP: creating ROOT and first META regions
11/11/22 21:37:37 INFO wal.HLog: HLog configuration: blocksize=32 MB, rollsize=30.4 MB, enabled=true, flushlogentries=1, optionallogflushinternal=1000ms
11/11/22 21:37:37 INFO wal.SequenceFileLogWriter: syncFs -- HDFS-200 -- not available, dfs.support.append=false
11/11/22 21:37:37 INFO wal.HLog: New hlog /homes/hortonas/hcat/hcat-trunk/storage-drivers/hbase/test_default_3510113083437113091/hbase/-ROOT-/70236052/.logs/hlog.1321997857440
11/11/22 21:37:37 INFO wal.HLog: getNumCurrentReplicas--HDFS-826 not available; hdfs_out=org.apache.hadoop.fs.ChecksumFileSystem$ChecksumFSOutputSummer@3bc1a1, exception=org.apache.hadoop.fs.ChecksumFileSystem$ChecksumFSOutputSummer.getNumCurrentReplicas()
11/11/22 21:37:37 INFO regionserver.HRegion: Onlined -ROOT-,,0.70236052; next sequenceid=1
11/11/22 21:37:37 INFO wal.HLog: HLog configuration: blocksize=32 MB, rollsize=30.4 MB, enabled=true, flushlogentries=1, optionallogflushinternal=1000ms
11/11/22 21:37:37 INFO wal.SequenceFileLogWriter: syncFs -- HDFS-200 -- not available, dfs.support.append=false
11/11/22 21:37:37 INFO wal.HLog: New hlog /homes/hortonas/hcat/hcat-trunk/storage-drivers/hbase/test_default_3510113083437113091/hbase/.META./1028785192/.logs/hlog.1321997857548
11/11/22 21:37:37 INFO wal.HLog: getNumCurrentReplicas--HDFS-826 not available; hdfs_out=org.apache.hadoop.fs.ChecksumFileSystem$ChecksumFSOutputSummer@1b2b131, exception=org.apache.hadoop.fs.ChecksumFileSystem$ChecksumFSOutputSummer.getNumCurrentReplicas()
11/11/22 21:37:37 INFO regionserver.HRegion: Onlined .META.,,1.1028785192; next sequenceid=1
11/11/22 21:37:37 INFO regionserver.Store: Renaming flushed file at file:/homes/hortonas/hcat/hcat-trunk/storage-drivers/hbase/test_default_3510113083437113091/hbase/-ROOT-/70236052/.tmp/7430200150957887379 to file:/homes/hortonas/hcat/hcat-trunk/storage-drivers/hbase/test_default_3510113083437113091/hbase/-ROOT-/70236052/info/8627051078980553821
11/11/22 21:37:37 INFO regionserver.Store: Added file:/homes/hortonas/hcat/hcat-trunk/storage-drivers/hbase/test_default_3510113083437113091/hbase/-ROOT-/70236052/info/8627051078980553821, entries=1, sequenceid=2, memsize=440.0, filesize=706.0
11/11/22 21:37:37 INFO regionserver.HRegion: Closed -ROOT-,,0.70236052
11/11/22 21:37:37 INFO wal.HLog: Master:0;hrt7n35.cc1.ygridcore.net:51911.logSyncer exiting
11/11/22 21:37:37 INFO regionserver.HRegion: Closed .META.,,1.1028785192
11/11/22 21:37:37 INFO wal.HLog: Master:0;hrt7n35.cc1.ygridcore.net:51911.logSyncer exiting
11/11/22 21:37:37 INFO zookeeper.ZooKeeper: Initiating client connection, connectString=127.0.0.1:44589 sessionTimeout=180000 watcher=hconnection
11/11/22 21:37:37 INFO zookeeper.ClientCnxn: Opening socket connection to server /127.0.0.1:44589
11/11/22 21:37:37 INFO server.NIOServerCnxn: Accepted socket connection from /127.0.0.1:48541
11/11/22 21:37:37 INFO zookeeper.ClientCnxn: Socket connection established to localhost/127.0.0.1:44589, initiating session
11/11/22 21:37:37 INFO server.NIOServerCnxn: Client attempting to establish new session at /127.0.0.1:48541
11/11/22 21:37:37 INFO server.NIOServerCnxn: Established session 0x133cd36705c0003 with negotiated timeout 40000 for client /127.0.0.1:48541
11/11/22 21:37:37 INFO zookeeper.ClientCnxn: Session establishment complete on server localhost/127.0.0.1:44589, sessionid = 0x133cd36705c0003, negotiated timeout = 40000
11/11/22 21:37:37 INFO master.HMaster: Server active/primary master; hrt7n35.cc1.ygridcore.net:51911, sessionid=0x133cd36705c0000, cluster-up flag was=false
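At this point the master has bootstrapped the -ROOT- and .META. regions and declared itself active. The test's own ManyMiniCluster wires ZooKeeper, the master, and the region server together by hand; for comparison, a sketch of the one-call setup HBase 0.90 ships in HBaseTestingUtility (not what this test uses; table and family names are hypothetical):

    import org.apache.hadoop.hbase.HBaseTestingUtility;
    import org.apache.hadoop.hbase.client.HTable;
    import org.apache.hadoop.hbase.util.Bytes;

    public class MiniHBaseFixture {
        public static void main(String[] args) throws Exception {
            HBaseTestingUtility util = new HBaseTestingUtility();
            util.startMiniCluster(1); // ZooKeeper + master + 1 region server
            HTable table = util.createTable(Bytes.toBytes("t"), Bytes.toBytes("cf"));
            // ... exercise the table ...
            util.shutdownMiniCluster();
        }
    }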
11/11/22 21:37:37 INFO regionserver.MemStoreFlusher: globalMemStoreLimit=197.2m, globalMemStoreLimitLowMark=172.6m, maxHeap=493.1m
11/11/22 21:37:37 INFO regionserver.HRegionServer: Runs every 10000000ms
11/11/22 21:37:37 INFO regionserver.HRegionServer: Attempting connect to Master server at hrt7n35.cc1.ygridcore.net:51911
11/11/22 21:37:37 INFO regionserver.HRegionServer: Connected to master at hrt7n35.cc1.ygridcore.net:51911
11/11/22 21:37:37 INFO regionserver.HRegionServer: Telling master at hrt7n35.cc1.ygridcore.net:51911 that we are up
11/11/22 21:37:37 INFO master.ServerManager: Registering server=hrt7n35.cc1.ygridcore.net,54808,1321997857252, regionCount=0, userLoad=false
11/11/22 21:37:37 INFO regionserver.HRegionServer: Master passed us address to use. Was=hrt7n35.cc1.ygridcore.net:54808, Now=hrt7n35.cc1.ygridcore.net:54808
11/11/22 21:37:37 INFO wal.HLog: HLog configuration: blocksize=32 MB, rollsize=30.4 MB, enabled=true, flushlogentries=1, optionallogflushinternal=1000ms
11/11/22 21:37:37 INFO wal.SequenceFileLogWriter: syncFs -- HDFS-200 -- not available, dfs.support.append=false
11/11/22 21:37:37 INFO wal.HLog: New hlog /homes/hortonas/hcat/hcat-trunk/storage-drivers/hbase/test_default_3510113083437113091/hbase/.logs/hrt7n35.cc1.ygridcore.net,54808,1321997857252/hrt7n35.cc1.ygridcore.net%3A54808.1321997857848
11/11/22 21:37:37 INFO wal.HLog: getNumCurrentReplicas--HDFS-826 not available; hdfs_out=org.apache.hadoop.fs.ChecksumFileSystem$ChecksumFSOutputSummer@9d5793, exception=org.apache.hadoop.fs.ChecksumFileSystem$ChecksumFSOutputSummer.getNumCurrentReplicas()
11/11/22 21:37:37 INFO jvm.JvmMetrics: Cannot initialize JVM Metrics with processName=RegionServer, sessionId=RegionServer:0;hrt7n35.cc1.ygridcore.net,54808,1321997857252 - already initialized
11/11/22 21:37:37 INFO hbase.metrics: new MBeanInfo
11/11/22 21:37:37 INFO metrics.RegionServerMetrics: Initialized
11/11/22 21:37:37 INFO ipc.HBaseServer: IPC Server Responder: starting
11/11/22 21:37:37 INFO ipc.HBaseServer: IPC Server listener on 54808: starting
11/11/22 21:37:37 INFO ipc.HBaseServer: IPC Server handler 0 on 54808: starting
11/11/22 21:37:37 INFO ipc.HBaseServer: IPC Server handler 1 on 54808: starting
11/11/22 21:37:37 INFO ipc.HBaseServer: IPC Server handler 2 on 54808: starting
11/11/22 21:37:37 INFO ipc.HBaseServer: IPC Server handler 3 on 54808: starting
11/11/22 21:37:37 INFO ipc.HBaseServer: IPC Server handler 4 on 54808: starting
11/11/22 21:37:37 INFO ipc.HBaseServer: IPC Server handler 5 on 54808: starting
11/11/22 21:37:37 INFO ipc.HBaseServer: IPC Server handler 6 on 54808: starting
11/11/22 21:37:37 INFO ipc.HBaseServer: IPC Server handler 7 on 54808: starting
11/11/22 21:37:37 INFO ipc.HBaseServer: IPC Server handler 8 on 54808: starting
11/11/22 21:37:37 INFO ipc.HBaseServer: IPC Server handler 9 on 54808: starting
11/11/22 21:37:37 INFO ipc.HBaseServer: PRI IPC Server handler 0 on 54808: starting
11/11/22 21:37:37 INFO ipc.HBaseServer: PRI IPC Server handler 1 on 54808: starting
11/11/22 21:37:37 INFO ipc.HBaseServer: PRI IPC Server handler 2 on 54808: starting
11/11/22 21:37:37 INFO ipc.HBaseServer: PRI IPC Server handler 3 on 54808: starting
11/11/22 21:37:37 INFO ipc.HBaseServer: PRI IPC Server handler 4 on 54808: starting
11/11/22 21:37:37 INFO ipc.HBaseServer: PRI IPC Server handler 5 on 54808: starting
11/11/22 21:37:37 INFO ipc.HBaseServer: PRI IPC Server handler 6 on 54808: starting
11/11/22 21:37:37 INFO ipc.HBaseServer: PRI IPC Server handler 7 on 54808: starting
11/11/22 21:37:37 INFO ipc.HBaseServer: PRI IPC Server handler 8 on 54808: starting
11/11/22 21:37:37 INFO regionserver.HRegionServer: Serving as hrt7n35.cc1.ygridcore.net,54808,1321997857252, RPC listening on /98.137.233.155:54808, sessionid=0x133cd36705c0002
11/11/22 21:37:37 INFO regionserver.StoreFile: Allocating LruBlockCache with maximum size 98.6m
11/11/22 21:37:37 INFO ipc.HBaseServer: PRI IPC Server handler 9 on 54808: starting
11/11/22 21:37:38 INFO zookeeper.ZooKeeper: Initiating client connection, connectString=127.0.0.1:44589 sessionTimeout=180000 watcher=hconnection
11/11/22 21:37:38 INFO zookeeper.ClientCnxn: Opening socket connection to server /127.0.0.1:44589
11/11/22 21:37:38 INFO server.NIOServerCnxn: Accepted socket connection from /127.0.0.1:48543
11/11/22 21:37:38 INFO zookeeper.ClientCnxn: Socket connection established to localhost/127.0.0.1:44589, initiating session
11/11/22 21:37:38 INFO server.NIOServerCnxn: Client attempting to establish new session at /127.0.0.1:48543
11/11/22 21:37:38 INFO server.NIOServerCnxn: Established session 0x133cd36705c0004 with negotiated timeout 40000 for client /127.0.0.1:48543
11/11/22 21:37:38 INFO zookeeper.ClientCnxn: Session establishment complete on server localhost/127.0.0.1:44589, sessionid = 0x133cd36705c0004, negotiated timeout = 40000
11/11/22 21:37:39 INFO master.ServerManager: Waiting on regionserver(s) count to settle; currently=1
11/11/22 21:37:40 INFO master.ServerManager: Waiting on regionserver(s) count to settle; currently=1
11/11/22 21:37:42 INFO master.ServerManager: Finished waiting for regionserver count to settle; count=1, sleptFor=4500
11/11/22 21:37:42 INFO master.ServerManager: Exiting wait on regionserver(s) to checkin; count=1, stopped=false, count of regions out on cluster=0
11/11/22 21:37:42 INFO master.MasterFileSystem: Log folder file:/homes/hortonas/hcat/hcat-trunk/storage-drivers/hbase/test_default_3510113083437113091/hbase/.logs/hrt7n35.cc1.ygridcore.net,54808,1321997857252 belongs to an existing region server
11/11/22 21:37:43 INFO catalog.RootLocationEditor: Unsetting ROOT region location in ZooKeeper
11/11/22 21:37:43 INFO server.PrepRequestProcessor: Got user-level KeeperException when processing sessionid:0x133cd36705c0000 type:delete cxid:0x15 zxid:0xfffffffffffffffe txntype:unknown reqpath:n/a Error Path:/hbase/root-region-server Error:KeeperErrorCode = NoNode for /hbase/root-region-server
11/11/22 21:37:43 INFO regionserver.HRegionServer: Received request to open region: -ROOT-,,0.70236052
11/11/22 21:37:43 INFO regionserver.HRegion: Onlined -ROOT-,,0.70236052; next sequenceid=3
11/11/22 21:37:43 INFO catalog.RootLocationEditor: Setting ROOT region location in ZooKeeper as hrt7n35.cc1.ygridcore.net:54808
11/11/22 21:37:43 INFO master.HMaster: -ROOT- assigned=1, rit=false, location=hrt7n35.cc1.ygridcore.net:54808
11/11/22 21:37:43 INFO regionserver.HRegionServer: Received request to open region: .META.,,1.1028785192
11/11/22 21:37:43 INFO regionserver.HRegion: Onlined .META.,,1.1028785192; next sequenceid=1
11/11/22 21:37:43 INFO catalog.MetaEditor: Updated row .META.,,1.1028785192 in region -ROOT-,,0 with server=hrt7n35.cc1.ygridcore.net:54808, startcode=1321997857252
11/11/22 21:37:43 INFO zookeeper.MetaNodeTracker: Detected completed assignment of META, notifying catalog tracker
11/11/22 21:37:43 INFO zookeeper.MetaNodeTracker: Detected completed assignment of META, notifying catalog tracker
11/11/22 21:37:43 INFO master.HMaster: .META. assigned=2, rit=false, location=hrt7n35.cc1.ygridcore.net:54808
11/11/22 21:37:43 INFO master.HMaster: Master startup proceeding: cluster startup
11/11/22 21:37:43 INFO master.HMaster: Master has completed initialization
11/11/22 21:37:44 INFO metastore.HiveMetaStore: 0: Opening raw store with implemenation class:org.apache.hadoop.hive.metastore.ObjectStore
11/11/22 21:37:44 INFO metastore.ObjectStore: ObjectStore, initialize called
11/11/22 21:37:44 ERROR DataNucleus.Plugin: Bundle "org.eclipse.jdt.core" requires "org.eclipse.core.resources" but it cannot be resolved.
11/11/22 21:37:44 ERROR DataNucleus.Plugin: Bundle "org.eclipse.jdt.core" requires "org.eclipse.core.runtime" but it cannot be resolved.
11/11/22 21:37:44 ERROR DataNucleus.Plugin: Bundle "org.eclipse.jdt.core" requires "org.eclipse.text" but it cannot be resolved.
11/11/22 21:37:44 INFO DataNucleus.Persistence: Property datanucleus.cache.level2 unknown - will be ignored
11/11/22 21:37:44 INFO DataNucleus.Persistence: Property javax.jdo.option.NonTransactionalRead unknown - will be ignored
11/11/22 21:37:44 INFO DataNucleus.Persistence: ================= Persistence Configuration ===============
11/11/22 21:37:44 INFO DataNucleus.Persistence: DataNucleus Persistence Factory - Vendor: "DataNucleus" Version: "2.0.3"
11/11/22 21:37:44 INFO DataNucleus.Persistence: DataNucleus Persistence Factory initialised for datastore URL="jdbc:derby:/homes/hortonas/hcat/hcat-trunk/storage-drivers/hbase/test_default_3510113083437113091/metastore_db;create=true" driver="org.apache.derby.jdbc.EmbeddedDriver" userName="APP"
11/11/22 21:37:44 INFO DataNucleus.Persistence: ===========================================================
11/11/22 21:37:46 INFO Datastore.Schema: Initialising Catalog "", Schema "APP" using "None" auto-start option
11/11/22 21:37:46 INFO Datastore.Schema: Catalog "", Schema "APP" initialised - managing 0 classes
11/11/22 21:37:46 INFO metastore.ObjectStore: Setting MetaStore object pin classes with hive.metastore.cache.pinobjtypes="Table,StorageDescriptor,SerDeInfo,Partition,Database,Type,FieldSchema,Order"
11/11/22 21:37:46 INFO DataNucleus.MetaData: Registering listener for metadata initialisation
11/11/22 21:37:46 INFO metastore.ObjectStore: Initialized ObjectStore
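With HBase up, the harness turns to the Hive metastore: an ObjectStore backed by an embedded Derby database under the cluster work directory. A minimal sketch, assuming the Hive 0.9-era API, of the in-process client construction that the stack trace further down attributes to ManyMiniCluster.setUpMetastore (the workDir value is hypothetical):

    import org.apache.hadoop.hive.conf.HiveConf;
    import org.apache.hadoop.hive.metastore.HiveMetaStoreClient;

    public class MetastoreFixture {
        public static void main(String[] args) throws Exception {
            HiveConf conf = new HiveConf(MetastoreFixture.class);
            String workDir = "/tmp/metastore-test"; // hypothetical work directory
            // Point JDO at a local embedded Derby database, created on first use
            conf.set("javax.jdo.option.ConnectionURL",
                     "jdbc:derby:" + workDir + "/metastore_db;create=true");
            conf.set("javax.jdo.option.ConnectionDriverName",
                     "org.apache.derby.jdbc.EmbeddedDriver");
            HiveMetaStoreClient client = new HiveMetaStoreClient(conf);
            System.out.println(client.getAllDatabases());
            client.close();
        }
    }

The DataNucleus warnings that follow (unresolved OSGi bundles, package.jdo DTD complaints, the duplicate-index SQLWarning) are emitted during this embedded-Derby schema creation and are commonly benign in Hive test runs.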
11/11/22 21:37:47 WARN DataNucleus.MetaData: MetaData Parser encountered an error in file "jar:file:/homes/hortonas/hcat/hcat-trunk/hive/external/build/metastore/hive-metastore-0.9.0-SNAPSHOT.jar!/package.jdo" at line 11, column 6 : cvc-elt.1: Cannot find the declaration of element 'jdo'. - Please check your specification of DTD and the validity of the MetaData XML that you have specified.
11/11/22 21:37:47 WARN DataNucleus.MetaData: MetaData Parser encountered an error in file "jar:file:/homes/hortonas/hcat/hcat-trunk/hive/external/build/metastore/hive-metastore-0.9.0-SNAPSHOT.jar!/package.jdo" at line 321, column 13 : The content of element type "class" must match "(extension*,implements*,datastore-identity?,primary-key?,inheritance?,version?,join*,foreign-key*,index*,unique*,column*,field*,property*,query*,fetch-group*,extension*)". - Please check your specification of DTD and the validity of the MetaData XML that you have specified.
11/11/22 21:37:47 WARN DataNucleus.MetaData: MetaData Parser encountered an error in file "jar:file:/homes/hortonas/hcat/hcat-trunk/hive/external/build/metastore/hive-metastore-0.9.0-SNAPSHOT.jar!/package.jdo" at line 368, column 13 : The content of element type "class" must match "(extension*,implements*,datastore-identity?,primary-key?,inheritance?,version?,join*,foreign-key*,index*,unique*,column*,field*,property*,query*,fetch-group*,extension*)". - Please check your specification of DTD and the validity of the MetaData XML that you have specified.
11/11/22 21:37:47 WARN DataNucleus.MetaData: MetaData Parser encountered an error in file "jar:file:/homes/hortonas/hcat/hcat-trunk/hive/external/build/metastore/hive-metastore-0.9.0-SNAPSHOT.jar!/package.jdo" at line 390, column 13 : The content of element type "class" must match "(extension*,implements*,datastore-identity?,primary-key?,inheritance?,version?,join*,foreign-key*,index*,unique*,column*,field*,property*,query*,fetch-group*,extension*)". - Please check your specification of DTD and the validity of the MetaData XML that you have specified.
11/11/22 21:37:47 WARN DataNucleus.MetaData: MetaData Parser encountered an error in file "jar:file:/homes/hortonas/hcat/hcat-trunk/hive/external/build/metastore/hive-metastore-0.9.0-SNAPSHOT.jar!/package.jdo" at line 425, column 13 : The content of element type "class" must match "(extension*,implements*,datastore-identity?,primary-key?,inheritance?,version?,join*,foreign-key*,index*,unique*,column*,field*,property*,query*,fetch-group*,extension*)". - Please check your specification of DTD and the validity of the MetaData XML that you have specified.
11/11/22 21:37:47 WARN DataNucleus.MetaData: MetaData Parser encountered an error in file "jar:file:/homes/hortonas/hcat/hcat-trunk/hive/external/build/metastore/hive-metastore-0.9.0-SNAPSHOT.jar!/package.jdo" at line 462, column 13 : The content of element type "class" must match "(extension*,implements*,datastore-identity?,primary-key?,inheritance?,version?,join*,foreign-key*,index*,unique*,column*,field*,property*,query*,fetch-group*,extension*)". - Please check your specification of DTD and the validity of the MetaData XML that you have specified.
11/11/22 21:37:47 WARN DataNucleus.MetaData: MetaData Parser encountered an error in file "jar:file:/homes/hortonas/hcat/hcat-trunk/hive/external/build/metastore/hive-metastore-0.9.0-SNAPSHOT.jar!/package.jdo" at line 503, column 13 : The content of element type "class" must match "(extension*,implements*,datastore-identity?,primary-key?,inheritance?,version?,join*,foreign-key*,index*,unique*,column*,field*,property*,query*,fetch-group*,extension*)". - Please check your specification of DTD and the validity of the MetaData XML that you have specified.
11/11/22 21:37:47 WARN DataNucleus.MetaData: MetaData Parser encountered an error in file "jar:file:/homes/hortonas/hcat/hcat-trunk/hive/external/build/metastore/hive-metastore-0.9.0-SNAPSHOT.jar!/package.jdo" at line 544, column 13 : The content of element type "class" must match "(extension*,implements*,datastore-identity?,primary-key?,inheritance?,version?,join*,foreign-key*,index*,unique*,column*,field*,property*,query*,fetch-group*,extension*)". - Please check your specification of DTD and the validity of the MetaData XML that you have specified.
11/11/22 21:37:47 WARN DataNucleus.MetaData: MetaData Parser encountered an error in file "jar:file:/homes/hortonas/hcat/hcat-trunk/hive/external/build/metastore/hive-metastore-0.9.0-SNAPSHOT.jar!/package.jdo" at line 585, column 13 : The content of element type "class" must match "(extension*,implements*,datastore-identity?,primary-key?,inheritance?,version?,join*,foreign-key*,index*,unique*,column*,field*,property*,query*,fetch-group*,extension*)". - Please check your specification of DTD and the validity of the MetaData XML that you have specified.
11/11/22 21:37:47 WARN DataNucleus.MetaData: MetaData Parser encountered an error in file "jar:file:/homes/hortonas/hcat/hcat-trunk/hive/external/build/metastore/hive-metastore-0.9.0-SNAPSHOT.jar!/package.jdo" at line 630, column 13 : The content of element type "class" must match "(extension*,implements*,datastore-identity?,primary-key?,inheritance?,version?,join*,foreign-key*,index*,unique*,column*,field*,property*,query*,fetch-group*,extension*)". - Please check your specification of DTD and the validity of the MetaData XML that you have specified.
11/11/22 21:37:47 WARN DataNucleus.MetaData: MetaData Parser encountered an error in file "jar:file:/homes/hortonas/hcat/hcat-trunk/hive/external/build/metastore/hive-metastore-0.9.0-SNAPSHOT.jar!/package.jdo" at line 675, column 13 : The content of element type "class" must match "(extension*,implements*,datastore-identity?,primary-key?,inheritance?,version?,join*,foreign-key*,index*,unique*,column*,field*,property*,query*,fetch-group*,extension*)". - Please check your specification of DTD and the validity of the MetaData XML that you have specified.
11/11/22 21:37:47 WARN DataNucleus.MetaData: MetaData Parser encountered an error in file "jar:file:/homes/hortonas/hcat/hcat-trunk/hive/external/build/metastore/hive-metastore-0.9.0-SNAPSHOT.jar!/package.jdo" at line 703, column 13 : The content of element type "class" must match "(extension*,implements*,datastore-identity?,primary-key?,inheritance?,version?,join*,foreign-key*,index*,unique*,column*,field*,property*,query*,fetch-group*,extension*)". - Please check your specification of DTD and the validity of the MetaData XML that you have specified.
11/11/22 21:37:47 INFO DataNucleus.Persistence: Managing Persistence of Class : org.apache.hadoop.hive.metastore.model.MDatabase [Table : DBS, InheritanceStrategy : new-table]
11/11/22 21:37:47 INFO DataNucleus.Persistence: Managing Persistence of Field : org.apache.hadoop.hive.metastore.model.MDatabase.parameters [Table : DATABASE_PARAMS]
11/11/22 21:37:47 INFO Datastore.Schema: Creating table DBS
11/11/22 21:37:47 INFO Datastore.Schema: Creating table DATABASE_PARAMS
11/11/22 21:37:47 INFO Datastore.Schema: Creating index "UNIQUE_DATABASE" in catalog "" schema ""
11/11/22 21:37:47 INFO Datastore.Schema: Creating foreign key constraint : "DATABASE_PARAMS_FK1" in catalog "" schema ""
11/11/22 21:37:47 INFO Datastore.Schema: Creating index "DATABASE_PARAMS_N49" in catalog "" schema ""
11/11/22 21:37:47 WARN DataNucleus.Datastore: SQL Warning : The new index is a duplicate of an existing index: SQL111122213747550.
java.sql.SQLWarning: The new index is a duplicate of an existing index: SQL111122213747550.
	at org.apache.derby.iapi.error.StandardException.newWarningCommon(Unknown Source)
	at org.apache.derby.iapi.error.StandardException.newWarning(Unknown Source)
	at org.apache.derby.impl.sql.execute.CreateIndexConstantAction.executeConstantAction(Unknown Source)
	at org.apache.derby.impl.sql.execute.MiscResultSet.open(Unknown Source)
	at org.apache.derby.impl.sql.GenericPreparedStatement.execute(Unknown Source)
	at org.apache.derby.impl.jdbc.EmbedStatement.executeStatement(Unknown Source)
	at org.apache.derby.impl.jdbc.EmbedStatement.execute(Unknown Source)
	at org.apache.derby.impl.jdbc.EmbedStatement.execute(Unknown Source)
	at org.apache.commons.dbcp.DelegatingStatement.execute(DelegatingStatement.java:264)
	at org.apache.commons.dbcp.DelegatingStatement.execute(DelegatingStatement.java:264)
	at org.datanucleus.store.rdbms.table.AbstractTable.executeDdlStatement(AbstractTable.java:730)
	at org.datanucleus.store.rdbms.table.TableImpl.createIndices(TableImpl.java:652)
	at org.datanucleus.store.rdbms.table.TableImpl.createConstraints(TableImpl.java:434)
	at org.datanucleus.store.rdbms.RDBMSStoreManager$ClassAdder.performTablesValidation(RDBMSStoreManager.java:2768)
	at org.datanucleus.store.rdbms.RDBMSStoreManager$ClassAdder.addClassTablesAndValidate(RDBMSStoreManager.java:2503)
	at org.datanucleus.store.rdbms.RDBMSStoreManager$ClassAdder.run(RDBMSStoreManager.java:2148)
	at org.datanucleus.store.rdbms.AbstractSchemaTransaction.execute(AbstractSchemaTransaction.java:113)
	at org.datanucleus.store.rdbms.RDBMSStoreManager.addClasses(RDBMSStoreManager.java:986)
	at org.datanucleus.store.rdbms.RDBMSStoreManager.addClasses(RDBMSStoreManager.java:952)
	at org.datanucleus.store.AbstractStoreManager.addClass(AbstractStoreManager.java:919)
	at org.datanucleus.store.mapped.MappedStoreManager.getDatastoreClass(MappedStoreManager.java:356)
	at org.datanucleus.store.rdbms.query.legacy.ExtentHelper.getExtent(ExtentHelper.java:48)
	at org.datanucleus.store.rdbms.RDBMSStoreManager.getExtent(RDBMSStoreManager.java:1332)
	at org.datanucleus.ObjectManagerImpl.getExtent(ObjectManagerImpl.java:4149)
	at org.datanucleus.store.rdbms.query.legacy.JDOQLQueryCompiler.compileCandidates(JDOQLQueryCompiler.java:411)
	at org.datanucleus.store.rdbms.query.legacy.QueryCompiler.executionCompile(QueryCompiler.java:312)
	at org.datanucleus.store.rdbms.query.legacy.JDOQLQueryCompiler.compile(JDOQLQueryCompiler.java:225)
	at org.datanucleus.store.rdbms.query.legacy.JDOQLQuery.compileInternal(JDOQLQuery.java:175)
	at org.datanucleus.store.query.Query.executeQuery(Query.java:1628)
	at org.datanucleus.store.rdbms.query.legacy.JDOQLQuery.executeQuery(JDOQLQuery.java:245)
	at org.datanucleus.store.query.Query.executeWithArray(Query.java:1499)
	at org.datanucleus.jdo.JDOQuery.execute(JDOQuery.java:243)
	at org.apache.hadoop.hive.metastore.ObjectStore.getMDatabase(ObjectStore.java:389)
	at org.apache.hadoop.hive.metastore.ObjectStore.getDatabase(ObjectStore.java:408)
	at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.createDefaultDB_core(HiveMetaStore.java:473)
	at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.access$300(HiveMetaStore.java:136)
	at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler$5.run(HiveMetaStore.java:495)
	at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler$5.run(HiveMetaStore.java:492)
	at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.executeWithRetry(HiveMetaStore.java:348)
	at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.createDefaultDB(HiveMetaStore.java:492)
	at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.init(HiveMetaStore.java:260)
	at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.<init>(HiveMetaStore.java:223)
	at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:109)
	at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:93)
	at org.apache.hcatalog.hbase.ManyMiniCluster.setUpMetastore(ManyMiniCluster.java:295)
	at org.apache.hcatalog.hbase.ManyMiniCluster.start(ManyMiniCluster.java:115)
	at org.apache.hcatalog.hbase.SkeletonHBaseTest$Context.start(SkeletonHBaseTest.java:177)
	at org.apache.hcatalog.hbase.SkeletonHBaseTest.setup(SkeletonHBaseTest.java:87)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
	at java.lang.reflect.Method.invoke(Method.java:597)
	at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:44)
	at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:15)
	at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:41)
	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:27)
	at org.junit.internal.runners.statements.RunAfters.evaluate(RunAfters.java:31)
	at org.junit.runners.ParentRunner.run(ParentRunner.java:220)
	at junit.framework.JUnit4TestAdapter.run(JUnit4TestAdapter.java:39)
	at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:518)
	at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:1052)
	at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:906)
11/11/22 21:37:47 INFO DataNucleus.MetaData: Listener found initialisation for persistable class org.apache.hadoop.hive.metastore.model.MDatabase
11/11/22 21:37:47 INFO Datastore.Schema: Creating table SEQUENCE_TABLE
Hive history file=/tmp/hortonas/hive_job_log_hortonas_201111222137_540522016.txt
11/11/22 21:37:47 INFO exec.HiveHistory: Hive history file=/tmp/hortonas/hive_job_log_hortonas_201111222137_540522016.txt
11/11/22 21:37:48 INFO zookeeper.ZooKeeper: Initiating client connection, connectString=127.0.0.1:44589 sessionTimeout=180000 watcher=hconnection
11/11/22 21:37:48 INFO zookeeper.ClientCnxn: Opening socket connection to server /127.0.0.1:44589
11/11/22 21:37:48 INFO zookeeper.ClientCnxn: Socket connection established to localhost/127.0.0.1:44589, initiating session
11/11/22 21:37:48 INFO server.NIOServerCnxn: Accepted socket connection from /127.0.0.1:48546
11/11/22 21:37:48 INFO server.NIOServerCnxn: Client attempting to establish new session at /127.0.0.1:48546
11/11/22 21:37:48 INFO server.NIOServerCnxn: Established session 0x133cd36705c0005 with negotiated timeout 40000 for client /127.0.0.1:48546
11/11/22 21:37:48 INFO zookeeper.ClientCnxn: Session establishment complete on server localhost/127.0.0.1:44589, sessionid = 0x133cd36705c0005, negotiated timeout = 40000
11/11/22 21:37:48 WARN zookeeper.ZKTable: Moving table hbasebulkoutputformattest_74574805078408209 state to enabled but was already enabled
11/11/22 21:37:48 INFO server.PrepRequestProcessor: Got user-level KeeperException when processing sessionid:0x133cd36705c0000 type:delete cxid:0x34 zxid:0xfffffffffffffffe txntype:unknown reqpath:n/a Error Path:/hbase/table/hbasebulkoutputformattest_74574805078408209 Error:KeeperErrorCode = NoNode for /hbase/table/hbasebulkoutputformattest_74574805078408209
11/11/22 21:37:48 INFO wal.HLog: HLog configuration: blocksize=32 MB, rollsize=30.4 MB, enabled=true, flushlogentries=1, optionallogflushinternal=1000ms
11/11/22 21:37:48 INFO wal.SequenceFileLogWriter: syncFs -- HDFS-200 -- not available, dfs.support.append=false
11/11/22 21:37:48 INFO wal.HLog: New hlog /homes/hortonas/hcat/hcat-trunk/storage-drivers/hbase/test_default_3510113083437113091/hbase/hbasebulkoutputformattest_74574805078408209/18c77e39c9e413bede9dd808dcb69bae/.logs/hlog.1321997868131
11/11/22 21:37:48 INFO wal.HLog: getNumCurrentReplicas--HDFS-826 not available; hdfs_out=org.apache.hadoop.fs.ChecksumFileSystem$ChecksumFSOutputSummer@8c1852, exception=org.apache.hadoop.fs.ChecksumFileSystem$ChecksumFSOutputSummer.getNumCurrentReplicas()
11/11/22 21:37:48 INFO regionserver.HRegion: Onlined hbasebulkoutputformattest_74574805078408209,,1321997868042.18c77e39c9e413bede9dd808dcb69bae.; next sequenceid=1
11/11/22 21:37:48 INFO catalog.MetaEditor: Added region hbasebulkoutputformattest_74574805078408209,,1321997868042.18c77e39c9e413bede9dd808dcb69bae. to META
11/11/22 21:37:48 INFO regionserver.HRegion: Closed hbasebulkoutputformattest_74574805078408209,,1321997868042.18c77e39c9e413bede9dd808dcb69bae.
11/11/22 21:37:48 INFO wal.HLog: IPC Server handler 9 on 51911.logSyncer exiting
11/11/22 21:37:48 INFO master.AssignmentManager: Bulk assigning 1 region(s) round-robin across 1 server(s)
11/11/22 21:37:48 INFO master.AssignmentManager: Bulk assigning done
11/11/22 21:37:48 INFO master.AssignmentManager: hrt7n35.cc1.ygridcore.net,54808,1321997857252 unassigned znodes=1 of total=1
11/11/22 21:37:48 INFO regionserver.HRegionServer: Received request to open 1 region(s)
11/11/22 21:37:48 INFO regionserver.HRegionServer: Received request to open region: hbasebulkoutputformattest_74574805078408209,,1321997868042.18c77e39c9e413bede9dd808dcb69bae.
11/11/22 21:37:48 INFO regionserver.HRegion: Onlined hbasebulkoutputformattest_74574805078408209,,1321997868042.18c77e39c9e413bede9dd808dcb69bae.; next sequenceid=1
11/11/22 21:37:48 INFO catalog.MetaEditor: Updated row hbasebulkoutputformattest_74574805078408209,,1321997868042.18c77e39c9e413bede9dd808dcb69bae. in region .META.,,1 with server=hrt7n35.cc1.ygridcore.net:54808, startcode=1321997857252
11/11/22 21:37:49 WARN mapred.JobClient: Use GenericOptionsParser for parsing the arguments. Applications should implement Tool for the same.
11/11/22 21:37:49 WARN mapred.JobClient: No job jar file set. User classes may not be found. See JobConf(Class) or JobConf#setJar(String).
11/11/22 21:37:49 INFO input.FileInputFormat: Total input paths to process : 1
11/11/22 21:37:49 INFO mapred.JobTracker: Job job_20111122213733305_0001 added successfully for user 'hortonas' to queue 'default'
11/11/22 21:37:49 INFO mapred.JobTracker: Initializing job_20111122213733305_0001
11/11/22 21:37:49 INFO mapred.JobInProgress: Initializing job_20111122213733305_0001
11/11/22 21:37:49 INFO mapred.AuditLogger: USER=hortonas IP=127.0.0.1 OPERATION=SUBMIT_JOB TARGET=job_20111122213733305_0001 RESULT=SUCCESS
11/11/22 21:37:49 INFO mapred.JobClient: Running job: job_20111122213733305_0001
11/11/22 21:37:50 INFO mapred.JobClient: map 0% reduce 0%
11/11/22 21:37:50 INFO mapred.JobInProgress: jobToken generated and stored with users keys in /tmp/hadoop-hortonas/mapred/system/job_20111122213733305_0001/jobToken
11/11/22 21:37:50 INFO mapred.JobInProgress: Input size for job job_20111122213733305_0001 = 81.
Number of splits = 1 11/11/22 21:37:50 INFO net.NetworkTopology: Adding a new node: /default-rack/localhost 11/11/22 21:37:50 INFO mapred.JobInProgress: tip:task_20111122213733305_0001_m_000000 has split on node:/default-rack/localhost 11/11/22 21:37:50 INFO mapred.JobInProgress: Job job_20111122213733305_0001 initialized successfully with 1 map tasks and 0 reduce tasks. 11/11/22 21:37:50 INFO mapred.JobTracker: Adding task (JOB_SETUP) 'attempt_20111122213733305_0001_m_000002_0' to tip task_20111122213733305_0001_m_000002, for tracker 'tracker_host0.foo.com:localhost/127.0.0.1:51131' 11/11/22 21:37:50 INFO mapred.TaskTracker: LaunchTaskAction (registerTask): attempt_20111122213733305_0001_m_000002_0 task's state:UNASSIGNED 11/11/22 21:37:50 INFO mapred.TaskTracker: Trying to launch : attempt_20111122213733305_0001_m_000002_0 which needs 1 slots 11/11/22 21:37:50 INFO mapred.TaskTracker: In TaskLauncher, current free slots : 2 and trying to launch attempt_20111122213733305_0001_m_000002_0 which needs 1 slots 11/11/22 21:37:50 INFO tasktracker.Localizer: Initializing user hortonas on this TT. 11/11/22 21:37:50 INFO mapred.JvmManager: In JvmRunner constructed JVM ID: jvm_20111122213733305_0001_m_-1278118196 11/11/22 21:37:50 INFO mapred.JvmManager: JVM Runner jvm_20111122213733305_0001_m_-1278118196 spawned. 11/11/22 21:37:51 INFO mapred.TaskTracker: JVM with ID: jvm_20111122213733305_0001_m_-1278118196 given task: attempt_20111122213733305_0001_m_000002_0 11/11/22 21:37:51 INFO mapred.TaskTracker: attempt_20111122213733305_0001_m_000002_0 0.0% setup 11/11/22 21:37:51 INFO mapred.TaskTracker: Task attempt_20111122213733305_0001_m_000002_0 is done. 11/11/22 21:37:51 INFO mapred.TaskTracker: reported output size for attempt_20111122213733305_0001_m_000002_0 was -1 11/11/22 21:37:51 INFO mapred.TaskTracker: addFreeSlot : current free slots : 2 11/11/22 21:37:51 WARN util.ProcessTree: Error executing shell command org.apache.hadoop.util.Shell$ExitCodeException: kill -11367: No such process 11/11/22 21:37:51 INFO util.ProcessTree: Killing all processes in the process group 11367 with SIGTERM. Exit code 1 11/11/22 21:37:53 INFO mapred.JobInProgress: Task 'attempt_20111122213733305_0001_m_000002_0' has completed task_20111122213733305_0001_m_000002 successfully. 11/11/22 21:37:53 INFO mapred.JobTracker: Adding task (MAP) 'attempt_20111122213733305_0001_m_000000_0' to tip task_20111122213733305_0001_m_000000, for tracker 'tracker_host0.foo.com:localhost/127.0.0.1:51131' 11/11/22 21:37:53 INFO mapred.JobInProgress: Choosing rack-local task task_20111122213733305_0001_m_000000 11/11/22 21:37:53 INFO mapred.TaskTracker: LaunchTaskAction (registerTask): attempt_20111122213733305_0001_m_000000_0 task's state:UNASSIGNED 11/11/22 21:37:53 INFO mapred.TaskTracker: Trying to launch : attempt_20111122213733305_0001_m_000000_0 which needs 1 slots 11/11/22 21:37:53 INFO mapred.TaskTracker: In TaskLauncher, current free slots : 2 and trying to launch attempt_20111122213733305_0001_m_000000_0 which needs 1 slots 11/11/22 21:37:53 INFO mapred.TaskTracker: Received KillTaskAction for task: attempt_20111122213733305_0001_m_000002_0 11/11/22 21:37:53 INFO tasktracker.Localizer: User-directories for the user hortonas are already initialized on this TT. Not doing anything. 11/11/22 21:37:53 INFO mapred.TaskTracker: About to purge task: attempt_20111122213733305_0001_m_000002_0 11/11/22 21:37:53 INFO mapred.TaskRunner: attempt_20111122213733305_0001_m_000002_0 done; removing files. 
11/11/22 21:37:53 INFO mapred.IndexCache: Map ID attempt_20111122213733305_0001_m_000002_0 not found in cache
11/11/22 21:37:53 INFO mapred.JvmManager: In JvmRunner constructed JVM ID: jvm_20111122213733305_0001_m_1196299132
11/11/22 21:37:53 INFO mapred.JvmManager: JVM Runner jvm_20111122213733305_0001_m_1196299132 spawned.
11/11/22 21:37:54 INFO mapred.TaskTracker: JVM with ID: jvm_20111122213733305_0001_m_1196299132 given task: attempt_20111122213733305_0001_m_000000_0
11/11/22 21:37:54 INFO mapred.JobClient: Task Id : attempt_20111122213733305_0001_m_000002_0, Status : SUCCEEDED
11/11/22 21:37:54 INFO mapred.TaskTracker: Task attempt_20111122213733305_0001_m_000000_0 is in commit-pending, task state:COMMIT_PENDING
11/11/22 21:37:54 INFO mapred.TaskTracker: attempt_20111122213733305_0001_m_000000_0 0.0%
11/11/22 21:37:56 INFO mapred.JvmManager: JVM : jvm_20111122213733305_0001_m_-1278118196 exited with exit code 0. Number of tasks it ran: 1
11/11/22 21:37:56 INFO mapred.TaskTracker: Received commit task action for attempt_20111122213733305_0001_m_000000_0
11/11/22 21:37:57 INFO mapred.TaskTracker: attempt_20111122213733305_0001_m_000000_0 1.0%
11/11/22 21:37:57 INFO mapred.TaskTracker: Task attempt_20111122213733305_0001_m_000000_0 is done.
11/11/22 21:37:57 INFO mapred.TaskTracker: reported output size for attempt_20111122213733305_0001_m_000000_0 was -1
11/11/22 21:37:57 INFO mapred.TaskTracker: addFreeSlot : current free slots : 2
11/11/22 21:37:57 WARN util.ProcessTree: Error executing shell command org.apache.hadoop.util.Shell$ExitCodeException: kill -11407: No such process
11/11/22 21:37:57 INFO util.ProcessTree: Killing all processes in the process group 11407 with SIGTERM. Exit code 1
11/11/22 21:37:59 INFO mapred.JobInProgress: Task 'attempt_20111122213733305_0001_m_000000_0' has completed task_20111122213733305_0001_m_000000 successfully.
11/11/22 21:37:59 INFO mapred.JobTracker: Adding task (JOB_CLEANUP) 'attempt_20111122213733305_0001_m_000001_0' to tip task_20111122213733305_0001_m_000001, for tracker 'tracker_host0.foo.com:localhost/127.0.0.1:51131'
11/11/22 21:37:59 INFO mapred.TaskTracker: LaunchTaskAction (registerTask): attempt_20111122213733305_0001_m_000001_0 task's state:UNASSIGNED
11/11/22 21:37:59 INFO mapred.TaskTracker: Trying to launch : attempt_20111122213733305_0001_m_000001_0 which needs 1 slots
11/11/22 21:37:59 INFO mapred.TaskTracker: In TaskLauncher, current free slots : 2 and trying to launch attempt_20111122213733305_0001_m_000001_0 which needs 1 slots
11/11/22 21:37:59 INFO tasktracker.Localizer: User-directories for the user hortonas are already initialized on this TT. Not doing anything.
11/11/22 21:37:59 INFO mapred.JvmManager: In JvmRunner constructed JVM ID: jvm_20111122213733305_0001_m_-1456876460
11/11/22 21:37:59 INFO mapred.JvmManager: JVM Runner jvm_20111122213733305_0001_m_-1456876460 spawned.
11/11/22 21:38:00 INFO mapred.TaskTracker: JVM with ID: jvm_20111122213733305_0001_m_-1456876460 given task: attempt_20111122213733305_0001_m_000001_0
11/11/22 21:38:00 INFO mapred.TaskTracker: attempt_20111122213733305_0001_m_000001_0 0.0%
11/11/22 21:38:00 INFO mapred.JobClient: Task Id : attempt_20111122213733305_0001_m_000000_0, Status : SUCCEEDED
11/11/22 21:38:00 INFO server.NIOServerCnxn: Accepted socket connection from /127.0.0.1:48556
11/11/22 21:38:00 INFO server.NIOServerCnxn: Client attempting to establish new session at /127.0.0.1:48556
11/11/22 21:38:00 INFO server.NIOServerCnxn: Established session 0x133cd36705c0006 with negotiated timeout 40000 for client /127.0.0.1:48556
11/11/22 21:38:01 INFO mapred.JobClient: map 100% reduce 0%
11/11/22 21:38:02 INFO mapred.JvmManager: JVM : jvm_20111122213733305_0001_m_1196299132 exited with exit code 0. Number of tasks it ran: 1
11/11/22 21:38:02 INFO server.NIOServerCnxn: Accepted socket connection from /127.0.0.1:48558
11/11/22 21:38:02 INFO server.NIOServerCnxn: Client attempting to establish new session at /127.0.0.1:48558
11/11/22 21:38:02 INFO server.NIOServerCnxn: Established session 0x133cd36705c0007 with negotiated timeout 40000 for client /127.0.0.1:48558
11/11/22 21:38:02 INFO regionserver.Store: Validating hfile at file:/homes/hortonas/hcat/hcat-trunk/storage-drivers/hbase/test_default_3510113083437113091/hbaseBulkOutputFormatTest/inter_hfiles/my_family/4288652363622761699 for inclusion in store my_family region hbasebulkoutputformattest_74574805078408209,,1321997868042.18c77e39c9e413bede9dd808dcb69bae.
11/11/22 21:38:03 INFO regionserver.Store: Renaming bulk load file file:/homes/hortonas/hcat/hcat-trunk/storage-drivers/hbase/test_default_3510113083437113091/hbaseBulkOutputFormatTest/inter_hfiles/my_family/4288652363622761699 to file:/homes/hortonas/hcat/hcat-trunk/storage-drivers/hbase/test_default_3510113083437113091/hbase/hbasebulkoutputformattest_74574805078408209/18c77e39c9e413bede9dd808dcb69bae/my_family/6586561085113950642
11/11/22 21:38:03 INFO regionserver.Store: Moved hfile file:/homes/hortonas/hcat/hcat-trunk/storage-drivers/hbase/test_default_3510113083437113091/hbaseBulkOutputFormatTest/inter_hfiles/my_family/4288652363622761699 into store directory file:/homes/hortonas/hcat/hcat-trunk/storage-drivers/hbase/test_default_3510113083437113091/hbase/hbasebulkoutputformattest_74574805078408209/18c77e39c9e413bede9dd808dcb69bae/my_family - updating store file list.
11/11/22 21:38:03 INFO regionserver.Store: Successfully loaded store file file:/homes/hortonas/hcat/hcat-trunk/storage-drivers/hbase/test_default_3510113083437113091/hbaseBulkOutputFormatTest/inter_hfiles/my_family/4288652363622761699 into store my_family (new location: file:/homes/hortonas/hcat/hcat-trunk/storage-drivers/hbase/test_default_3510113083437113091/hbase/hbasebulkoutputformattest_74574805078408209/18c77e39c9e413bede9dd808dcb69bae/my_family/6586561085113950642)
11/11/22 21:38:03 INFO mapred.TaskTracker: attempt_20111122213733305_0001_m_000001_0 0.0% cleanup
11/11/22 21:38:03 INFO mapred.TaskTracker: Task attempt_20111122213733305_0001_m_000001_0 is done.
11/11/22 21:38:03 INFO mapred.TaskTracker: reported output size for attempt_20111122213733305_0001_m_000001_0 was -1
11/11/22 21:38:03 INFO mapred.TaskTracker: addFreeSlot : current free slots : 2
11/11/22 21:38:03 WARN server.NIOServerCnxn: EndOfStreamException: Unable to read additional data from client sessionid 0x133cd36705c0006, likely client has closed socket
11/11/22 21:38:03 INFO server.NIOServerCnxn: Closed socket connection for client /127.0.0.1:48556 which had sessionid 0x133cd36705c0006
11/11/22 21:38:03 WARN server.NIOServerCnxn: EndOfStreamException: Unable to read additional data from client sessionid 0x133cd36705c0007, likely client has closed socket
11/11/22 21:38:03 INFO server.NIOServerCnxn: Closed socket connection for client /127.0.0.1:48558 which had sessionid 0x133cd36705c0007
11/11/22 21:38:03 WARN util.ProcessTree: Error executing shell command org.apache.hadoop.util.Shell$ExitCodeException: kill -11449: No such process
11/11/22 21:38:03 INFO util.ProcessTree: Killing all processes in the process group 11449 with SIGTERM. Exit code 1
11/11/22 21:38:05 INFO mapred.JobInProgress: Task 'attempt_20111122213733305_0001_m_000001_0' has completed task_20111122213733305_0001_m_000001 successfully.
11/11/22 21:38:05 INFO mapred.JobInProgress: Job job_20111122213733305_0001 has completed successfully.
11/11/22 21:38:05 INFO mapred.JobInProgress$JobSummary: jobId=job_20111122213733305_0001,submitTime=1321997869444,launchTime=1321997870572,finishTime=1321997885829,numMaps=1,numSlotsPerMap=1,numReduces=0,numSlotsPerReduce=1,user=hortonas,queue=default,status=SUCCEEDED,mapSlotSeconds=7,reduceSlotsSeconds=0,clusterMapCapacity=2,clusterReduceCapacity=2
11/11/22 21:38:05 INFO mapred.JobHistory: Moving file:/homes/hortonas/hcat/hcat-trunk/storage-drivers/hbase/test_default_3510113083437113091/logs/history/localhost_1321997853671_job_20111122213733305_0001_hortonas_hbaseBulkOutputFormatTest to file:/homes/hortonas/hcat/hcat-trunk/storage-drivers/hbase/test_default_3510113083437113091/logs/history/done
11/11/22 21:38:05 INFO mapred.JobTracker: Removing task 'attempt_20111122213733305_0001_m_000000_0'
11/11/22 21:38:05 INFO mapred.JobTracker: Removing task 'attempt_20111122213733305_0001_m_000001_0'
11/11/22 21:38:05 INFO mapred.JobTracker: Removing task 'attempt_20111122213733305_0001_m_000002_0'
11/11/22 21:38:05 INFO mapred.TaskTracker: Received 'KillJobAction' for job: job_20111122213733305_0001
11/11/22 21:38:05 INFO mapred.TaskRunner: attempt_20111122213733305_0001_m_000001_0 done; removing files.
11/11/22 21:38:05 INFO mapred.IndexCache: Map ID attempt_20111122213733305_0001_m_000001_0 not found in cache
11/11/22 21:38:05 INFO mapred.TaskRunner: attempt_20111122213733305_0001_m_000000_0 done; removing files.
11/11/22 21:38:05 INFO mapred.IndexCache: Map ID attempt_20111122213733305_0001_m_000000_0 not found in cache
11/11/22 21:38:05 INFO mapred.UserLogCleaner: Adding job_20111122213733305_0001 for user-log deletion with retainTimeStamp:1322084285884
11/11/22 21:38:05 INFO mapred.JobHistory: Moving file:/homes/hortonas/hcat/hcat-trunk/storage-drivers/hbase/test_default_3510113083437113091/logs/history/localhost_1321997853671_job_20111122213733305_0001_conf.xml to file:/homes/hortonas/hcat/hcat-trunk/storage-drivers/hbase/test_default_3510113083437113091/logs/history/done
11/11/22 21:38:06 INFO mapred.JobClient: Task Id : attempt_20111122213733305_0001_m_000001_0, Status : SUCCEEDED
11/11/22 21:38:06 INFO mapred.JobClient: Job complete: job_20111122213733305_0001
11/11/22 21:38:06 INFO mapred.JobClient: Counters: 12
11/11/22 21:38:06 INFO mapred.JobClient: Job Counters
11/11/22 21:38:06 INFO mapred.JobClient: SLOTS_MILLIS_MAPS=7973
11/11/22 21:38:06 INFO mapred.JobClient: Total time spent by all reduces waiting after reserving slots (ms)=0
11/11/22 21:38:06 INFO mapred.JobClient: Total time spent by all maps waiting after reserving slots (ms)=0
11/11/22 21:38:06 INFO mapred.JobClient: Rack-local map tasks=1
11/11/22 21:38:06 INFO mapred.JobClient: Launched map tasks=1
11/11/22 21:38:06 INFO mapred.JobClient: SLOTS_MILLIS_REDUCES=0
11/11/22 21:38:06 INFO mapred.JobClient: FileSystemCounters
11/11/22 21:38:06 INFO mapred.JobClient: FILE_BYTES_READ=318
11/11/22 21:38:06 INFO mapred.JobClient: FILE_BYTES_WRITTEN=559
11/11/22 21:38:06 INFO mapred.JobClient: Map-Reduce Framework
11/11/22 21:38:06 INFO mapred.JobClient: Map input records=3
11/11/22 21:38:06 INFO mapred.JobClient: Spilled Records=0
11/11/22 21:38:06 INFO mapred.JobClient: Map output records=3
11/11/22 21:38:06 INFO mapred.JobClient: SPLIT_RAW_BYTES=206
11/11/22 21:38:06 INFO zookeeper.ZooKeeper: Initiating client connection, connectString=127.0.0.1:44589 sessionTimeout=180000 watcher=hconnection
11/11/22 21:38:06 INFO zookeeper.ClientCnxn: Opening socket connection to server /127.0.0.1:44589
11/11/22 21:38:06 INFO server.NIOServerCnxn: Accepted socket connection from /127.0.0.1:48560
11/11/22 21:38:06 INFO zookeeper.ClientCnxn: Socket connection established to localhost/127.0.0.1:44589, initiating session
11/11/22 21:38:06 INFO server.NIOServerCnxn: Client attempting to establish new session at /127.0.0.1:48560
11/11/22 21:38:06 INFO server.NIOServerCnxn: Established session 0x133cd36705c0008 with negotiated timeout 40000 for client /127.0.0.1:48560
11/11/22 21:38:06 INFO zookeeper.ClientCnxn: Session establishment complete on server localhost/127.0.0.1:44589, sessionid = 0x133cd36705c0008, negotiated timeout = 40000
Hive history file=/tmp/hortonas/hive_job_log_hortonas_201111222137_1145133096.txt
11/11/22 21:38:06 INFO exec.HiveHistory: Hive history file=/tmp/hortonas/hive_job_log_hortonas_201111222137_1145133096.txt
11/11/22 21:38:06 INFO zookeeper.ZooKeeper: Initiating client connection, connectString=127.0.0.1:44589 sessionTimeout=180000 watcher=hconnection
11/11/22 21:38:06 INFO zookeeper.ClientCnxn: Opening socket connection to server /127.0.0.1:44589
11/11/22 21:38:06 INFO server.NIOServerCnxn: Accepted socket connection from /127.0.0.1:48562
11/11/22 21:38:06 INFO zookeeper.ClientCnxn: Socket connection established to localhost/127.0.0.1:44589, initiating session
11/11/22 21:38:06 INFO server.NIOServerCnxn: Client attempting to establish new session at /127.0.0.1:48562
11/11/22 21:38:06 INFO server.NIOServerCnxn: Established session 0x133cd36705c0009 with negotiated timeout 40000 for client /127.0.0.1:48562
11/11/22 21:38:06 INFO zookeeper.ClientCnxn: Session establishment complete on server localhost/127.0.0.1:44589, sessionid = 0x133cd36705c0009, negotiated timeout = 40000
11/11/22 21:38:06 WARN zookeeper.ZKTable: Moving table importsequencefiletest_6277872447333260599 state to enabled but was already enabled
11/11/22 21:38:06 INFO server.PrepRequestProcessor: Got user-level KeeperException when processing sessionid:0x133cd36705c0000 type:delete cxid:0x3f zxid:0xfffffffffffffffe txntype:unknown reqpath:n/a Error Path:/hbase/table/importsequencefiletest_6277872447333260599 Error:KeeperErrorCode = NoNode for /hbase/table/importsequencefiletest_6277872447333260599
11/11/22 21:38:06 INFO wal.HLog: HLog configuration: blocksize=32 MB, rollsize=30.4 MB, enabled=true, flushlogentries=1, optionallogflushinternal=1000ms
11/11/22 21:38:06 INFO wal.SequenceFileLogWriter: syncFs -- HDFS-200 -- not available, dfs.support.append=false
11/11/22 21:38:06 INFO wal.HLog: New hlog /homes/hortonas/hcat/hcat-trunk/storage-drivers/hbase/test_default_3510113083437113091/hbase/importsequencefiletest_6277872447333260599/870cba3a31103028ca4d6c05066d8928/.logs/hlog.1321997886918
11/11/22 21:38:06 INFO wal.HLog: getNumCurrentReplicas--HDFS-826 not available; hdfs_out=org.apache.hadoop.fs.ChecksumFileSystem$ChecksumFSOutputSummer@1fef1a0, exception=org.apache.hadoop.fs.ChecksumFileSystem$ChecksumFSOutputSummer.getNumCurrentReplicas()
11/11/22 21:38:07 INFO regionserver.HRegion: Onlined importsequencefiletest_6277872447333260599,,1321997886795.870cba3a31103028ca4d6c05066d8928.; next sequenceid=1
11/11/22 21:38:07 INFO catalog.MetaEditor: Added region importsequencefiletest_6277872447333260599,,1321997886795.870cba3a31103028ca4d6c05066d8928. to META
11/11/22 21:38:07 INFO regionserver.HRegion: Closed importsequencefiletest_6277872447333260599,,1321997886795.870cba3a31103028ca4d6c05066d8928.
11/11/22 21:38:07 INFO wal.HLog: IPC Server handler 9 on 51911.logSyncer exiting
11/11/22 21:38:07 INFO master.AssignmentManager: Bulk assigning 1 region(s) round-robin across 1 server(s)
11/11/22 21:38:07 INFO master.AssignmentManager: Bulk assigning done
11/11/22 21:38:07 INFO master.AssignmentManager: hrt7n35.cc1.ygridcore.net,54808,1321997857252 unassigned znodes=1 of total=1
11/11/22 21:38:07 INFO regionserver.HRegionServer: Received request to open 1 region(s)
11/11/22 21:38:07 INFO regionserver.HRegionServer: Received request to open region: importsequencefiletest_6277872447333260599,,1321997886795.870cba3a31103028ca4d6c05066d8928.
11/11/22 21:38:07 INFO regionserver.HRegion: Onlined importsequencefiletest_6277872447333260599,,1321997886795.870cba3a31103028ca4d6c05066d8928.; next sequenceid=1
11/11/22 21:38:07 INFO catalog.MetaEditor: Updated row importsequencefiletest_6277872447333260599,,1321997886795.870cba3a31103028ca4d6c05066d8928. in region .META.,,1 with server=hrt7n35.cc1.ygridcore.net:54808, startcode=1321997857252
11/11/22 21:38:08 WARN mapred.JobClient: Use GenericOptionsParser for parsing the arguments. Applications should implement Tool for the same.
11/11/22 21:38:08 WARN mapred.JobClient: No job jar file set. User classes may not be found. See JobConf(Class) or JobConf#setJar(String).
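[Editorial note: the "No job jar file set" warning above recurs at each job submission in this run. It is expected here, since the MiniMRCluster executes tasks with the test's own classpath, but in a real deployment the job jar must be set explicitly. A minimal sketch of the usual fix, using a hypothetical driver class (MyDriver is not part of this test):

    // Hypothetical driver; shows the standard cure for the
    // "No job jar file set" JobClient warning logged above.
    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.mapreduce.Job;

    public class MyDriver {
        public static void main(String[] args) throws Exception {
            Job job = new Job(new Configuration(), "example-job");
            // Tell the framework which jar to ship to the tasktrackers;
            // the old-API equivalents are new JobConf(MyDriver.class)
            // or JobConf#setJar(String), as the warning suggests.
            job.setJarByClass(MyDriver.class);
            System.exit(job.waitForCompletion(true) ? 0 : 1);
        }
    }
]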
11/11/22 21:38:08 INFO input.FileInputFormat: Total input paths to process : 1
11/11/22 21:38:08 INFO mapred.JobTracker: Job job_20111122213733305_0002 added successfully for user 'hortonas' to queue 'default'
11/11/22 21:38:08 INFO mapred.AuditLogger: USER=hortonas IP=127.0.0.1 OPERATION=SUBMIT_JOB TARGET=job_20111122213733305_0002 RESULT=SUCCESS
11/11/22 21:38:08 INFO mapred.JobTracker: Initializing job_20111122213733305_0002
11/11/22 21:38:08 INFO mapred.JobInProgress: Initializing job_20111122213733305_0002
11/11/22 21:38:08 INFO mapred.JobClient: Running job: job_20111122213733305_0002
11/11/22 21:38:08 INFO mapred.JobInProgress: jobToken generated and stored with users keys in /tmp/hadoop-hortonas/mapred/system/job_20111122213733305_0002/jobToken
11/11/22 21:38:08 INFO mapred.JobInProgress: Input size for job job_20111122213733305_0002 = 81. Number of splits = 1
11/11/22 21:38:08 INFO mapred.JobInProgress: tip:task_20111122213733305_0002_m_000000 has split on node:/default-rack/localhost
11/11/22 21:38:08 INFO mapred.JobInProgress: Job job_20111122213733305_0002 initialized successfully with 1 map tasks and 0 reduce tasks.
11/11/22 21:38:08 INFO mapred.JvmManager: JVM : jvm_20111122213733305_0001_m_-1456876460 exited with exit code 0. Number of tasks it ran: 1
11/11/22 21:38:08 INFO mapred.JobTracker: Adding task (JOB_SETUP) 'attempt_20111122213733305_0002_m_000002_0' to tip task_20111122213733305_0002_m_000002, for tracker 'tracker_host0.foo.com:localhost/127.0.0.1:51131'
11/11/22 21:38:08 INFO mapred.TaskTracker: LaunchTaskAction (registerTask): attempt_20111122213733305_0002_m_000002_0 task's state:UNASSIGNED
11/11/22 21:38:08 INFO mapred.TaskTracker: Trying to launch : attempt_20111122213733305_0002_m_000002_0 which needs 1 slots
11/11/22 21:38:08 INFO mapred.TaskTracker: In TaskLauncher, current free slots : 2 and trying to launch attempt_20111122213733305_0002_m_000002_0 which needs 1 slots
11/11/22 21:38:08 INFO tasktracker.Localizer: User-directories for the user hortonas are already initialized on this TT. Not doing anything.
11/11/22 21:38:08 INFO mapred.JvmManager: In JvmRunner constructed JVM ID: jvm_20111122213733305_0002_m_-367802381
11/11/22 21:38:08 INFO mapred.JvmManager: JVM Runner jvm_20111122213733305_0002_m_-367802381 spawned.
11/11/22 21:38:09 INFO mapred.JobClient: map 0% reduce 0%
11/11/22 21:38:09 INFO mapred.TaskTracker: JVM with ID: jvm_20111122213733305_0002_m_-367802381 given task: attempt_20111122213733305_0002_m_000002_0
11/11/22 21:38:09 INFO mapred.TaskTracker: attempt_20111122213733305_0002_m_000002_0 0.0% setup
11/11/22 21:38:09 INFO mapred.TaskTracker: Task attempt_20111122213733305_0002_m_000002_0 is done.
11/11/22 21:38:09 INFO mapred.TaskTracker: reported output size for attempt_20111122213733305_0002_m_000002_0 was -1
11/11/22 21:38:09 INFO mapred.TaskTracker: addFreeSlot : current free slots : 2
11/11/22 21:38:09 WARN util.ProcessTree: Error executing shell command org.apache.hadoop.util.Shell$ExitCodeException: kill -11680: No such process
11/11/22 21:38:09 INFO util.ProcessTree: Killing all processes in the process group 11680 with SIGTERM. Exit code 1
11/11/22 21:38:11 INFO mapred.JobInProgress: Task 'attempt_20111122213733305_0002_m_000002_0' has completed task_20111122213733305_0002_m_000002 successfully.
11/11/22 21:38:11 INFO mapred.JobTracker: Adding task (MAP) 'attempt_20111122213733305_0002_m_000000_0' to tip task_20111122213733305_0002_m_000000, for tracker 'tracker_host0.foo.com:localhost/127.0.0.1:51131'
11/11/22 21:38:11 INFO mapred.JobInProgress: Choosing rack-local task task_20111122213733305_0002_m_000000
11/11/22 21:38:11 INFO mapred.TaskTracker: LaunchTaskAction (registerTask): attempt_20111122213733305_0002_m_000000_0 task's state:UNASSIGNED
11/11/22 21:38:11 INFO mapred.TaskTracker: Trying to launch : attempt_20111122213733305_0002_m_000000_0 which needs 1 slots
11/11/22 21:38:11 INFO mapred.TaskTracker: Received KillTaskAction for task: attempt_20111122213733305_0002_m_000002_0
11/11/22 21:38:11 INFO mapred.TaskTracker: In TaskLauncher, current free slots : 2 and trying to launch attempt_20111122213733305_0002_m_000000_0 which needs 1 slots
11/11/22 21:38:11 INFO mapred.TaskTracker: About to purge task: attempt_20111122213733305_0002_m_000002_0
11/11/22 21:38:11 INFO tasktracker.Localizer: User-directories for the user hortonas are already initialized on this TT. Not doing anything.
11/11/22 21:38:11 INFO mapred.TaskRunner: attempt_20111122213733305_0002_m_000002_0 done; removing files.
11/11/22 21:38:11 INFO mapred.IndexCache: Map ID attempt_20111122213733305_0002_m_000002_0 not found in cache
11/11/22 21:38:11 INFO mapred.JvmManager: In JvmRunner constructed JVM ID: jvm_20111122213733305_0002_m_861711861
11/11/22 21:38:11 INFO mapred.JvmManager: JVM Runner jvm_20111122213733305_0002_m_861711861 spawned.
11/11/22 21:38:12 INFO mapred.JobClient: Task Id : attempt_20111122213733305_0002_m_000002_0, Status : SUCCEEDED
11/11/22 21:38:12 INFO mapred.TaskTracker: JVM with ID: jvm_20111122213733305_0002_m_861711861 given task: attempt_20111122213733305_0002_m_000000_0
11/11/22 21:38:12 INFO mapred.TaskTracker: Task attempt_20111122213733305_0002_m_000000_0 is in commit-pending, task state:COMMIT_PENDING
11/11/22 21:38:12 INFO mapred.TaskTracker: attempt_20111122213733305_0002_m_000000_0 0.0%
11/11/22 21:38:14 INFO mapred.JvmManager: JVM : jvm_20111122213733305_0002_m_-367802381 exited with exit code 0. Number of tasks it ran: 1
11/11/22 21:38:14 INFO mapred.TaskTracker: Received commit task action for attempt_20111122213733305_0002_m_000000_0
11/11/22 21:38:15 INFO mapred.TaskTracker: attempt_20111122213733305_0002_m_000000_0 1.0%
11/11/22 21:38:15 INFO mapred.TaskTracker: Task attempt_20111122213733305_0002_m_000000_0 is done.
11/11/22 21:38:15 INFO mapred.TaskTracker: reported output size for attempt_20111122213733305_0002_m_000000_0 was -1
11/11/22 21:38:15 INFO mapred.TaskTracker: addFreeSlot : current free slots : 2
11/11/22 21:38:15 WARN util.ProcessTree: Error executing shell command org.apache.hadoop.util.Shell$ExitCodeException: kill -11720: No such process
11/11/22 21:38:15 INFO util.ProcessTree: Killing all processes in the process group 11720 with SIGTERM. Exit code 1
11/11/22 21:38:17 INFO mapred.JobInProgress: Task 'attempt_20111122213733305_0002_m_000000_0' has completed task_20111122213733305_0002_m_000000 successfully.
11/11/22 21:38:17 INFO mapred.JobTracker: Adding task (JOB_CLEANUP) 'attempt_20111122213733305_0002_m_000001_0' to tip task_20111122213733305_0002_m_000001, for tracker 'tracker_host0.foo.com:localhost/127.0.0.1:51131'
11/11/22 21:38:17 INFO mapred.TaskTracker: LaunchTaskAction (registerTask): attempt_20111122213733305_0002_m_000001_0 task's state:UNASSIGNED
11/11/22 21:38:17 INFO mapred.TaskTracker: Trying to launch : attempt_20111122213733305_0002_m_000001_0 which needs 1 slots
11/11/22 21:38:17 INFO mapred.TaskTracker: In TaskLauncher, current free slots : 2 and trying to launch attempt_20111122213733305_0002_m_000001_0 which needs 1 slots
11/11/22 21:38:17 INFO tasktracker.Localizer: User-directories for the user hortonas are already initialized on this TT. Not doing anything.
11/11/22 21:38:17 INFO mapred.JvmManager: In JvmRunner constructed JVM ID: jvm_20111122213733305_0002_m_-2107535802
11/11/22 21:38:17 INFO mapred.JvmManager: JVM Runner jvm_20111122213733305_0002_m_-2107535802 spawned.
11/11/22 21:38:18 INFO mapred.JobClient: Task Id : attempt_20111122213733305_0002_m_000000_0, Status : SUCCEEDED
11/11/22 21:38:18 INFO mapred.TaskTracker: JVM with ID: jvm_20111122213733305_0002_m_-2107535802 given task: attempt_20111122213733305_0002_m_000001_0
11/11/22 21:38:18 INFO mapred.TaskTracker: attempt_20111122213733305_0002_m_000001_0 0.0%
11/11/22 21:38:18 INFO mapred.TaskTracker: attempt_20111122213733305_0002_m_000001_0 0.0% cleanup
11/11/22 21:38:18 INFO mapred.TaskTracker: Task attempt_20111122213733305_0002_m_000001_0 is done.
11/11/22 21:38:18 INFO mapred.TaskTracker: reported output size for attempt_20111122213733305_0002_m_000001_0 was -1
11/11/22 21:38:18 INFO mapred.TaskTracker: addFreeSlot : current free slots : 2
11/11/22 21:38:18 WARN util.ProcessTree: Error executing shell command org.apache.hadoop.util.Shell$ExitCodeException: kill -11761: No such process
11/11/22 21:38:18 INFO util.ProcessTree: Killing all processes in the process group 11761 with SIGTERM. Exit code 1
11/11/22 21:38:19 INFO mapred.JobClient: map 100% reduce 0%
11/11/22 21:38:20 INFO mapred.JobInProgress: Task 'attempt_20111122213733305_0002_m_000001_0' has completed task_20111122213733305_0002_m_000001 successfully.
11/11/22 21:38:20 INFO mapred.JobInProgress: Job job_20111122213733305_0002 has completed successfully.
11/11/22 21:38:20 INFO mapred.JobInProgress$JobSummary: jobId=job_20111122213733305_0002,submitTime=1321997888144,launchTime=1321997888319,finishTime=1321997900898,numMaps=1,numSlotsPerMap=1,numReduces=0,numSlotsPerReduce=1,user=hortonas,queue=default,status=SUCCEEDED,mapSlotSeconds=5,reduceSlotsSeconds=0,clusterMapCapacity=2,clusterReduceCapacity=2
11/11/22 21:38:20 INFO mapred.JobHistory: Moving file:/homes/hortonas/hcat/hcat-trunk/storage-drivers/hbase/test_default_3510113083437113091/logs/history/localhost_1321997853671_job_20111122213733305_0002_hortonas_importSequenceFileTest to file:/homes/hortonas/hcat/hcat-trunk/storage-drivers/hbase/test_default_3510113083437113091/logs/history/done
11/11/22 21:38:20 INFO mapred.JobTracker: Removing task 'attempt_20111122213733305_0002_m_000000_0'
11/11/22 21:38:20 INFO mapred.JobTracker: Removing task 'attempt_20111122213733305_0002_m_000001_0'
11/11/22 21:38:20 INFO mapred.JobTracker: Removing task 'attempt_20111122213733305_0002_m_000002_0'
11/11/22 21:38:20 INFO mapred.TaskTracker: Received 'KillJobAction' for job: job_20111122213733305_0002
11/11/22 21:38:20 INFO mapred.TaskRunner: attempt_20111122213733305_0002_m_000001_0 done; removing files.
11/11/22 21:38:20 INFO mapred.IndexCache: Map ID attempt_20111122213733305_0002_m_000001_0 not found in cache
11/11/22 21:38:20 INFO mapred.TaskRunner: attempt_20111122213733305_0002_m_000000_0 done; removing files.
11/11/22 21:38:20 INFO mapred.IndexCache: Map ID attempt_20111122213733305_0002_m_000000_0 not found in cache
11/11/22 21:38:20 INFO mapred.UserLogCleaner: Adding job_20111122213733305_0002 for user-log deletion with retainTimeStamp:1322084300926
11/11/22 21:38:20 INFO mapred.JobHistory: Moving file:/homes/hortonas/hcat/hcat-trunk/storage-drivers/hbase/test_default_3510113083437113091/logs/history/localhost_1321997853671_job_20111122213733305_0002_conf.xml to file:/homes/hortonas/hcat/hcat-trunk/storage-drivers/hbase/test_default_3510113083437113091/logs/history/done
11/11/22 21:38:21 INFO mapred.JvmManager: JVM : jvm_20111122213733305_0002_m_861711861 exited with exit code 0. Number of tasks it ran: 1
11/11/22 21:38:21 INFO mapred.JobClient: Task Id : attempt_20111122213733305_0002_m_000001_0, Status : SUCCEEDED
11/11/22 21:38:21 INFO mapred.JobClient: Job complete: job_20111122213733305_0002
11/11/22 21:38:21 INFO mapred.JobClient: Counters: 12
11/11/22 21:38:21 INFO mapred.JobClient: Job Counters
11/11/22 21:38:21 INFO mapred.JobClient: SLOTS_MILLIS_MAPS=5558
11/11/22 21:38:21 INFO mapred.JobClient: Total time spent by all reduces waiting after reserving slots (ms)=0
11/11/22 21:38:21 INFO mapred.JobClient: Total time spent by all maps waiting after reserving slots (ms)=0
11/11/22 21:38:21 INFO mapred.JobClient: Rack-local map tasks=1
11/11/22 21:38:21 INFO mapred.JobClient: Launched map tasks=1
11/11/22 21:38:21 INFO mapred.JobClient: SLOTS_MILLIS_REDUCES=0
11/11/22 21:38:21 INFO mapred.JobClient: FileSystemCounters
11/11/22 21:38:21 INFO mapred.JobClient: FILE_BYTES_READ=315
11/11/22 21:38:21 INFO mapred.JobClient: FILE_BYTES_WRITTEN=559
11/11/22 21:38:21 INFO mapred.JobClient: Map-Reduce Framework
11/11/22 21:38:21 INFO mapred.JobClient: Map input records=3
11/11/22 21:38:21 INFO mapred.JobClient: Spilled Records=0
11/11/22 21:38:21 INFO mapred.JobClient: Map output records=3
11/11/22 21:38:21 INFO mapred.JobClient: SPLIT_RAW_BYTES=203
11/11/22 21:38:21 INFO zookeeper.ZooKeeper: Initiating client connection, connectString=127.0.0.1:44589 sessionTimeout=180000 watcher=hconnection
11/11/22 21:38:21 INFO zookeeper.ClientCnxn: Opening socket connection to server /127.0.0.1:44589
11/11/22 21:38:21 INFO zookeeper.ClientCnxn: Socket connection established to localhost/127.0.0.1:44589, initiating session
11/11/22 21:38:21 INFO server.NIOServerCnxn: Accepted socket connection from /127.0.0.1:48571
11/11/22 21:38:21 INFO server.NIOServerCnxn: Client attempting to establish new session at /127.0.0.1:48571
11/11/22 21:38:21 INFO server.NIOServerCnxn: Established session 0x133cd36705c000a with negotiated timeout 40000 for client /127.0.0.1:48571
11/11/22 21:38:21 INFO zookeeper.ClientCnxn: Session establishment complete on server localhost/127.0.0.1:44589, sessionid = 0x133cd36705c000a, negotiated timeout = 40000
11/11/22 21:38:21 INFO mapreduce.HFileOutputFormat: Looking up current regions for table org.apache.hadoop.hbase.client.HTable@8543aa
11/11/22 21:38:21 INFO mapreduce.HFileOutputFormat: Configuring 1 reduce partitions to match current region count
11/11/22 21:38:21 INFO mapreduce.HFileOutputFormat: Writing partition information to file:/homes/hortonas/hcat/hcat-trunk/storage-drivers/hbase/partitions_1321997901311
11/11/22 21:38:21 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
11/11/22 21:38:21 INFO compress.CodecPool: Got brand-new compressor
11/11/22 21:38:21 INFO mapreduce.HFileOutputFormat: Incremental table output configured.
11/11/22 21:38:21 WARN mapreduce.TableMapReduceUtil: Could not find jar for class class org.apache.hcatalog.hbase.ImportSequenceFile$ImporterOutputFormat in order to ship it to the cluster.
11/11/22 21:38:21 INFO jvm.JvmMetrics: Cannot initialize JVM Metrics with processName=JobTracker, sessionId= - already initialized
11/11/22 21:38:21 WARN mapred.JobClient: Use GenericOptionsParser for parsing the arguments. Applications should implement Tool for the same.
11/11/22 21:38:22 WARN mapred.JobClient: No job jar file set. User classes may not be found. See JobConf(Class) or JobConf#setJar(String).
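[Editorial note: the HFileOutputFormat entries above ("Looking up current regions", "Configuring 1 reduce partitions to match current region count", "Writing partition information to ...", "Incremental table output configured.") are the steps HFileOutputFormat.configureIncrementalLoad performs when a job is set up to write HFiles: it sizes the reduce phase to the table's region count and ships a TotalOrderPartitioner partition file through the distributed cache (the "Cached file:...partitions_...#_partition.lst" entry further down). A minimal sketch of that setup, with a hypothetical table name:

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.hbase.HBaseConfiguration;
    import org.apache.hadoop.hbase.client.HTable;
    import org.apache.hadoop.hbase.mapreduce.HFileOutputFormat;
    import org.apache.hadoop.mapreduce.Job;

    public class HFilePrep {
        public static void main(String[] args) throws Exception {
            Configuration conf = HBaseConfiguration.create();
            Job job = new Job(conf, "hfile-prep");
            HTable table = new HTable(conf, "my_table"); // hypothetical table name
            // Sets the partitioner and reducer, matches the reduce count to
            // the current region count, and writes the partition file --
            // the actions logged above.
            HFileOutputFormat.configureIncrementalLoad(job, table);
        }
    }
]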
11/11/22 21:38:22 INFO input.FileInputFormat: Total input paths to process : 1
11/11/22 21:38:22 INFO filecache.TrackerDistributedCacheManager: Doing chmod on localdir :/tmp/hadoop-hortonas/mapred/local/archive/-3307082728720190433_-551969957_1295458095
11/11/22 21:38:22 INFO filecache.TrackerDistributedCacheManager: Cached file:/homes/hortonas/hcat/hcat-trunk/storage-drivers/hbase/partitions_1321997901311#_partition.lst as /tmp/hadoop-hortonas/mapred/local/archive/-3307082728720190433_-551969957_1295458095/file/homes/hortonas/hcat/hcat-trunk/storage-drivers/hbase/partitions_1321997901311
11/11/22 21:38:22 INFO filecache.TrackerDistributedCacheManager: Doing chmod on localdir :/tmp/hadoop-hortonas/mapred/local/archive/1309060678556344954_-651284666_1942215976
11/11/22 21:38:22 INFO filecache.TrackerDistributedCacheManager: Cached file:///homes/hortonas/hcat/hcat-trunk/hive/external/build/ivy/lib/default/zookeeper-3.3.1.jar as /tmp/hadoop-hortonas/mapred/local/archive/1309060678556344954_-651284666_1942215976/file/homes/hortonas/hcat/hcat-trunk/hive/external/build/ivy/lib/default/zookeeper-3.3.1.jar
11/11/22 21:38:22 INFO filecache.TrackerDistributedCacheManager: Doing chmod on localdir :/tmp/hadoop-hortonas/mapred/local/archive/-5202648438079587957_-1252483710_185263624
11/11/22 21:38:22 INFO filecache.TrackerDistributedCacheManager: Cached file:///homes/hortonas/hcat/hcat-trunk/storage-drivers/hbase/build/ivy/lib/hbase-storage-driver/hbase-0.90.3.jar as /tmp/hadoop-hortonas/mapred/local/archive/-5202648438079587957_-1252483710_185263624/file/homes/hortonas/hcat/hcat-trunk/storage-drivers/hbase/build/ivy/lib/hbase-storage-driver/hbase-0.90.3.jar
11/11/22 21:38:22 INFO filecache.TrackerDistributedCacheManager: Doing chmod on localdir :/tmp/hadoop-hortonas/mapred/local/archive/6599704062199483260_-1671625761_1315065741
11/11/22 21:38:22 INFO filecache.TrackerDistributedCacheManager: Cached file:///homes/hortonas/hcat/hcat-trunk/hive/external/build/hadoopcore/hadoop-0.20.3-CDH3-SNAPSHOT/hadoop-core-0.20.3-CDH3-SNAPSHOT.jar as /tmp/hadoop-hortonas/mapred/local/archive/6599704062199483260_-1671625761_1315065741/file/homes/hortonas/hcat/hcat-trunk/hive/external/build/hadoopcore/hadoop-0.20.3-CDH3-SNAPSHOT/hadoop-core-0.20.3-CDH3-SNAPSHOT.jar
11/11/22 21:38:22 WARN mapred.LocalJobRunner: LocalJobRunner does not support symlinking into current working dir.
11/11/22 21:38:22 INFO mapred.TaskRunner: Creating symlink: /tmp/hadoop-hortonas/mapred/local/archive/-3307082728720190433_-551969957_1295458095/file/homes/hortonas/hcat/hcat-trunk/storage-drivers/hbase/partitions_1321997901311 <- /tmp/hadoop-hortonas/mapred/local/localRunner/_partition.lst
11/11/22 21:38:22 INFO mapred.JobClient: Running job: job_local_0001
11/11/22 21:38:22 INFO mapred.MapTask: io.sort.mb = 100
11/11/22 21:38:22 INFO mapred.MapTask: data buffer = 79691776/99614720
11/11/22 21:38:22 INFO mapred.MapTask: record buffer = 262144/327680
11/11/22 21:38:22 INFO compress.CodecPool: Got brand-new decompressor
11/11/22 21:38:22 INFO mapred.MapTask: Starting flush of map output
11/11/22 21:38:22 INFO mapred.MapTask: Finished spill 0
11/11/22 21:38:22 INFO mapred.Task: Task:attempt_local_0001_m_000000_0 is done. And is in the process of commiting
11/11/22 21:38:22 INFO mapred.LocalJobRunner:
11/11/22 21:38:22 INFO mapred.Task: Task 'attempt_local_0001_m_000000_0' done.
11/11/22 21:38:22 INFO mapred.LocalJobRunner:
11/11/22 21:38:22 INFO mapred.Merger: Merging 1 sorted segments
11/11/22 21:38:22 INFO mapred.Merger: Down to the last merge-pass, with 1 segments left of total size: 419 bytes
11/11/22 21:38:22 INFO mapred.LocalJobRunner:
11/11/22 21:38:22 INFO mapreduce.HFileOutputFormat: Writer=file:/homes/hortonas/hcat/hcat-trunk/storage-drivers/hbase/test_default_3510113083437113091/importSequenceFileTest/scratch/_temporary/_attempt_local_0001_r_000000_0/my_family/4591864366852889157
11/11/22 21:38:22 INFO mapred.Task: Task:attempt_local_0001_r_000000_0 is done. And is in the process of commiting
11/11/22 21:38:22 INFO mapred.LocalJobRunner:
11/11/22 21:38:22 INFO mapred.Task: Task attempt_local_0001_r_000000_0 is allowed to commit now
11/11/22 21:38:22 INFO output.FileOutputCommitter: Saved output of task 'attempt_local_0001_r_000000_0' to /homes/hortonas/hcat/hcat-trunk/storage-drivers/hbase/test_default_3510113083437113091/importSequenceFileTest/scratch
11/11/22 21:38:22 INFO mapred.LocalJobRunner: Read class java.util.TreeSet > reduce
11/11/22 21:38:22 INFO mapred.Task: Task 'attempt_local_0001_r_000000_0' done.
11/11/22 21:38:22 INFO filecache.TrackerDistributedCacheManager: Deleted path /tmp/hadoop-hortonas/mapred/local/archive/6599704062199483260_-1671625761_1315065741/file/homes/hortonas/hcat/hcat-trunk/hive/external/build/hadoopcore/hadoop-0.20.3-CDH3-SNAPSHOT/hadoop-core-0.20.3-CDH3-SNAPSHOT.jar
11/11/22 21:38:22 INFO filecache.TrackerDistributedCacheManager: Deleted path /tmp/hadoop-hortonas/mapred/local/archive/1309060678556344954_-651284666_1942215976/file/homes/hortonas/hcat/hcat-trunk/hive/external/build/ivy/lib/default/zookeeper-3.3.1.jar
11/11/22 21:38:22 INFO filecache.TrackerDistributedCacheManager: Deleted path /tmp/hadoop-hortonas/mapred/local/archive/-5202648438079587957_-1252483710_185263624/file/homes/hortonas/hcat/hcat-trunk/storage-drivers/hbase/build/ivy/lib/hbase-storage-driver/hbase-0.90.3.jar
11/11/22 21:38:22 INFO filecache.TrackerDistributedCacheManager: Deleted path /tmp/hadoop-hortonas/mapred/local/archive/-3307082728720190433_-551969957_1295458095/file/homes/hortonas/hcat/hcat-trunk/storage-drivers/hbase/partitions_1321997901311
11/11/22 21:38:23 INFO mapred.JobClient: map 100% reduce 100%
11/11/22 21:38:23 INFO mapred.JobClient: Job complete: job_local_0001
11/11/22 21:38:23 INFO mapred.JobClient: Counters: 13
11/11/22 21:38:23 INFO mapred.JobClient: FileSystemCounters
11/11/22 21:38:23 INFO mapred.JobClient: FILE_BYTES_READ=13033865
11/11/22 21:38:23 INFO mapred.JobClient: FILE_BYTES_WRITTEN=14169340
11/11/22 21:38:23 INFO mapred.JobClient: Map-Reduce Framework
11/11/22 21:38:23 INFO mapred.JobClient: Reduce input groups=3
11/11/22 21:38:23 INFO mapred.JobClient: Combine output records=0
11/11/22 21:38:23 INFO mapred.JobClient: Map input records=3
11/11/22 21:38:23 INFO mapred.JobClient: Reduce shuffle bytes=0
11/11/22 21:38:23 INFO mapred.JobClient: Reduce output records=6
11/11/22 21:38:23 INFO mapred.JobClient: Spilled Records=6
11/11/22 21:38:23 INFO mapred.JobClient: Map output bytes=408
11/11/22 21:38:23 INFO mapred.JobClient: Combine input records=0
11/11/22 21:38:23 INFO mapred.JobClient: Map output records=3
11/11/22 21:38:23 INFO mapred.JobClient: SPLIT_RAW_BYTES=199
11/11/22 21:38:23 INFO mapred.JobClient: Reduce input records=3
11/11/22 21:38:23 WARN output.FileOutputCommitter: Output path is null in cleanup
11/11/22 21:38:23 INFO zookeeper.ZooKeeper: Initiating client connection, connectString=127.0.0.1:44589 sessionTimeout=180000 watcher=hconnection
11/11/22 21:38:23 INFO zookeeper.ClientCnxn: Opening socket connection to server /127.0.0.1:44589
11/11/22 21:38:23 INFO zookeeper.ClientCnxn: Socket connection established to localhost/127.0.0.1:44589, initiating session
11/11/22 21:38:23 INFO server.NIOServerCnxn: Accepted socket connection from /127.0.0.1:48573
11/11/22 21:38:23 INFO server.NIOServerCnxn: Client attempting to establish new session at /127.0.0.1:48573
11/11/22 21:38:23 INFO server.NIOServerCnxn: Established session 0x133cd36705c000b with negotiated timeout 40000 for client /127.0.0.1:48573
11/11/22 21:38:23 INFO zookeeper.ClientCnxn: Session establishment complete on server localhost/127.0.0.1:44589, sessionid = 0x133cd36705c000b, negotiated timeout = 40000
11/11/22 21:38:23 WARN mapreduce.LoadIncrementalHFiles: Skipping non-directory file:/homes/hortonas/hcat/hcat-trunk/storage-drivers/hbase/test_default_3510113083437113091/importSequenceFileTest/scratch/_SUCCESS
11/11/22 21:38:23 INFO mapreduce.LoadIncrementalHFiles: Trying to load hfile=file:/homes/hortonas/hcat/hcat-trunk/storage-drivers/hbase/test_default_3510113083437113091/importSequenceFileTest/scratch/my_family/4591864366852889157 first=1 last=3
11/11/22 21:38:23 INFO regionserver.Store: Validating hfile at file:/homes/hortonas/hcat/hcat-trunk/storage-drivers/hbase/test_default_3510113083437113091/importSequenceFileTest/scratch/my_family/4591864366852889157 for inclusion in store my_family region importsequencefiletest_6277872447333260599,,1321997886795.870cba3a31103028ca4d6c05066d8928.
11/11/22 21:38:23 INFO regionserver.Store: Renaming bulk load file file:/homes/hortonas/hcat/hcat-trunk/storage-drivers/hbase/test_default_3510113083437113091/importSequenceFileTest/scratch/my_family/4591864366852889157 to file:/homes/hortonas/hcat/hcat-trunk/storage-drivers/hbase/test_default_3510113083437113091/hbase/importsequencefiletest_6277872447333260599/870cba3a31103028ca4d6c05066d8928/my_family/276837173053007967
11/11/22 21:38:23 INFO regionserver.Store: Moved hfile file:/homes/hortonas/hcat/hcat-trunk/storage-drivers/hbase/test_default_3510113083437113091/importSequenceFileTest/scratch/my_family/4591864366852889157 into store directory file:/homes/hortonas/hcat/hcat-trunk/storage-drivers/hbase/test_default_3510113083437113091/hbase/importsequencefiletest_6277872447333260599/870cba3a31103028ca4d6c05066d8928/my_family - updating store file list.
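[Editorial note: the LoadIncrementalHFiles and regionserver.Store entries here trace the bulk-load handoff: each HFile under the job's output directory is validated against the target region, renamed into the region's store directory, and adopted into the store file list (the "Successfully loaded store file" confirmation follows just below). Driven from client code, this step amounts to a single doBulkLoad call; a minimal sketch with hypothetical path and table name:

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.Path;
    import org.apache.hadoop.hbase.HBaseConfiguration;
    import org.apache.hadoop.hbase.client.HTable;
    import org.apache.hadoop.hbase.mapreduce.LoadIncrementalHFiles;

    public class BulkLoad {
        public static void main(String[] args) throws Exception {
            Configuration conf = HBaseConfiguration.create();
            // Output directory of the HFile-writing job, containing one
            // subdirectory per column family (e.g. my_family/).
            Path hfofDir = new Path("/path/to/scratch");  // hypothetical path
            HTable table = new HTable(conf, "my_table");  // hypothetical name
            // Validates each HFile and asks the region server to move it into
            // the store -- the validate/rename/move sequence logged here.
            new LoadIncrementalHFiles(conf).doBulkLoad(hfofDir, table);
        }
    }
]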
11/11/22 21:38:23 INFO regionserver.Store: Successfully loaded store file file:/homes/hortonas/hcat/hcat-trunk/storage-drivers/hbase/test_default_3510113083437113091/importSequenceFileTest/scratch/my_family/4591864366852889157 into store my_family (new location: file:/homes/hortonas/hcat/hcat-trunk/storage-drivers/hbase/test_default_3510113083437113091/hbase/importsequencefiletest_6277872447333260599/870cba3a31103028ca4d6c05066d8928/my_family/276837173053007967) 11/11/22 21:38:23 INFO zookeeper.ZooKeeper: Initiating client connection, connectString=127.0.0.1:44589 sessionTimeout=180000 watcher=hconnection 11/11/22 21:38:23 INFO zookeeper.ClientCnxn: Opening socket connection to server /127.0.0.1:44589 11/11/22 21:38:23 INFO zookeeper.ClientCnxn: Socket connection established to localhost/127.0.0.1:44589, initiating session 11/11/22 21:38:23 INFO server.NIOServerCnxn: Accepted socket connection from /127.0.0.1:48574 11/11/22 21:38:23 INFO server.NIOServerCnxn: Client attempting to establish new session at /127.0.0.1:48574 11/11/22 21:38:23 INFO server.NIOServerCnxn: Established session 0x133cd36705c000c with negotiated timeout 40000 for client /127.0.0.1:48574 11/11/22 21:38:23 INFO zookeeper.ClientCnxn: Session establishment complete on server localhost/127.0.0.1:44589, sessionid = 0x133cd36705c000c, negotiated timeout = 40000 Hive history file=/tmp/hortonas/hive_job_log_hortonas_201111222137_220368465.txt 11/11/22 21:38:23 INFO exec.HiveHistory: Hive history file=/tmp/hortonas/hive_job_log_hortonas_201111222137_220368465.txt 11/11/22 21:38:23 INFO ql.Driver: 11/11/22 21:38:23 INFO parse.ParseDriver: Parsing command: CREATE DATABASE IF NOT EXISTS hbasebulkoutputstoragedrivertest LOCATION '/homes/hortonas/hcat/hcat-trunk/storage-drivers/hbase/test_default_3510113083437113091/hbaseBulkOutputStorageDriverTest/DB_hbaseBulkOutputStorageDriverTest' 11/11/22 21:38:23 INFO parse.ParseDriver: Parse Completed 11/11/22 21:38:23 INFO metastore.HiveMetaStore: 0: get_databases: hbasebulkoutputstoragedrivertest 11/11/22 21:38:23 INFO metastore.HiveMetaStore: 0: Opening raw store with implemenation class:org.apache.hadoop.hive.metastore.ObjectStore 11/11/22 21:38:23 INFO metastore.ObjectStore: ObjectStore, initialize called 11/11/22 21:38:23 INFO metastore.ObjectStore: Initialized ObjectStore 11/11/22 21:38:23 INFO ql.Driver: Semantic Analysis Completed 11/11/22 21:38:23 INFO ql.Driver: Returning Hive schema: Schema(fieldSchemas:null, properties:null) 11/11/22 21:38:23 INFO ql.Driver: 11/11/22 21:38:23 INFO ql.Driver: 11/11/22 21:38:23 INFO ql.Driver: Starting command: CREATE DATABASE IF NOT EXISTS hbasebulkoutputstoragedrivertest LOCATION '/homes/hortonas/hcat/hcat-trunk/storage-drivers/hbase/test_default_3510113083437113091/hbaseBulkOutputStorageDriverTest/DB_hbaseBulkOutputStorageDriverTest' 11/11/22 21:38:23 INFO metastore.HiveMetaStore: 0: create_database: hbasebulkoutputstoragedrivertest /homes/hortonas/hcat/hcat-trunk/storage-drivers/hbase/test_default_3510113083437113091/hbaseBulkOutputStorageDriverTest/DB_hbaseBulkOutputStorageDriverTest null 11/11/22 21:38:23 INFO metastore.HiveMetaStore: 0: get_database: hbasebulkoutputstoragedrivertest 11/11/22 21:38:23 INFO ql.Driver: OK 11/11/22 21:38:23 INFO ql.Driver: OK 11/11/22 21:38:23 INFO ql.Driver: 11/11/22 21:38:23 INFO ql.Driver: 11/11/22 21:38:23 INFO ql.Driver: 11/11/22 21:38:23 INFO parse.ParseDriver: Parsing command: CREATE TABLE hbasebulkoutputstoragedrivertest.hbasebulkoutputstoragedrivertest_885980685671171921(key int, english string, 
spanish string) STORED BY 'org.apache.hcatalog.hbase.HBaseHCatStorageHandler'TBLPROPERTIES ('hcat.isd'='org.apache.hcatalog.hbase.HBaseInputStorageDriver', 'hcat.osd'='org.apache.hcatalog.hbase.HBaseOutputStorageDriver','hbase.columns.mapping'=':key,my_family:english,my_family:spanish') 11/11/22 21:38:23 INFO mapred.JvmManager: JVM : jvm_20111122213733305_0002_m_-2107535802 exited with exit code 0. Number of tasks it ran: 1 11/11/22 21:38:23 INFO parse.ParseDriver: Parse Completed 11/11/22 21:38:23 INFO parse.SemanticAnalyzer: Starting Semantic Analysis 11/11/22 21:38:23 INFO parse.SemanticAnalyzer: Creating table hbasebulkoutputstoragedrivertest.hbasebulkoutputstoragedrivertest_885980685671171921 position=13 11/11/22 21:38:23 INFO metastore.HiveMetaStore: 0: get_table : db=hbasebulkoutputstoragedrivertest tbl=hbasebulkoutputstoragedrivertest_885980685671171921 11/11/22 21:38:23 INFO DataNucleus.Datastore: The class "org.apache.hadoop.hive.metastore.model.MFieldSchema" is tagged as "embedded-only" so does not have its own datastore table. 11/11/22 21:38:23 INFO DataNucleus.Persistence: Managing Persistence of Class : org.apache.hadoop.hive.metastore.model.MColumnDescriptor [Table : CDS, InheritanceStrategy : new-table] 11/11/22 21:38:23 INFO DataNucleus.Persistence: Managing Persistence of Class : org.apache.hadoop.hive.metastore.model.MSerDeInfo [Table : SERDES, InheritanceStrategy : new-table] 11/11/22 21:38:23 INFO DataNucleus.Datastore: The class "org.apache.hadoop.hive.metastore.model.MOrder" is tagged as "embedded-only" so does not have its own datastore table. 11/11/22 21:38:23 INFO DataNucleus.Persistence: Managing Persistence of Class : org.apache.hadoop.hive.metastore.model.MStorageDescriptor [Table : SDS, InheritanceStrategy : new-table] 11/11/22 21:38:23 INFO DataNucleus.Persistence: Managing Persistence of Class : org.apache.hadoop.hive.metastore.model.MTable [Table : TBLS, InheritanceStrategy : new-table] 11/11/22 21:38:23 INFO DataNucleus.Persistence: Managing Persistence of Field : org.apache.hadoop.hive.metastore.model.MSerDeInfo.parameters [Table : SERDE_PARAMS] 11/11/22 21:38:23 INFO DataNucleus.Persistence: Managing Persistence of Field : org.apache.hadoop.hive.metastore.model.MTable.parameters [Table : TABLE_PARAMS] 11/11/22 21:38:23 INFO DataNucleus.Persistence: Managing Persistence of Field : org.apache.hadoop.hive.metastore.model.MTable.partitionKeys [Table : PARTITION_KEYS] 11/11/22 21:38:23 INFO DataNucleus.Persistence: Managing Persistence of Field : org.apache.hadoop.hive.metastore.model.MStorageDescriptor.bucketCols [Table : BUCKETING_COLS] 11/11/22 21:38:23 INFO DataNucleus.Persistence: Managing Persistence of Field : org.apache.hadoop.hive.metastore.model.MStorageDescriptor.parameters [Table : SD_PARAMS] 11/11/22 21:38:23 INFO DataNucleus.Persistence: Managing Persistence of Field : org.apache.hadoop.hive.metastore.model.MStorageDescriptor.sortCols [Table : SORT_COLS] 11/11/22 21:38:23 INFO DataNucleus.Persistence: Managing Persistence of Field : org.apache.hadoop.hive.metastore.model.MColumnDescriptor.cols [Table : COLUMNS_V2] 11/11/22 21:38:24 INFO Datastore.Schema: Creating table SERDES 11/11/22 21:38:24 INFO Datastore.Schema: Creating table TBLS 11/11/22 21:38:24 INFO Datastore.Schema: Creating table SDS 11/11/22 21:38:24 INFO Datastore.Schema: Creating table CDS 11/11/22 21:38:24 INFO Datastore.Schema: Creating table COLUMNS_V2 11/11/22 21:38:24 INFO Datastore.Schema: Creating table PARTITION_KEYS 11/11/22 21:38:24 INFO Datastore.Schema: Creating table 
11/11/22 21:38:24 INFO Datastore.Schema: Creating table SERDE_PARAMS
11/11/22 21:38:24 INFO Datastore.Schema: Creating table SD_PARAMS
11/11/22 21:38:24 INFO Datastore.Schema: Creating table TABLE_PARAMS
11/11/22 21:38:24 INFO Datastore.Schema: Creating table BUCKETING_COLS
11/11/22 21:38:24 INFO Datastore.Schema: Creating foreign key constraint : "TBLS_FK2" in catalog "" schema ""
11/11/22 21:38:24 INFO Datastore.Schema: Creating foreign key constraint : "TBLS_FK1" in catalog "" schema ""
11/11/22 21:38:24 INFO Datastore.Schema: Creating index "TBLS_N50" in catalog "" schema ""
11/11/22 21:38:24 WARN DataNucleus.Datastore: SQL Warning : The new index is a duplicate of an existing index: SQL111122213824630.
java.sql.SQLWarning: The new index is a duplicate of an existing index: SQL111122213824630.
    at org.apache.derby.iapi.error.StandardException.newWarningCommon(Unknown Source)
    at org.apache.derby.iapi.error.StandardException.newWarning(Unknown Source)
    at org.apache.derby.impl.sql.execute.CreateIndexConstantAction.executeConstantAction(Unknown Source)
    at org.apache.derby.impl.sql.execute.MiscResultSet.open(Unknown Source)
    at org.apache.derby.impl.sql.GenericPreparedStatement.execute(Unknown Source)
    at org.apache.derby.impl.jdbc.EmbedStatement.executeStatement(Unknown Source)
    at org.apache.derby.impl.jdbc.EmbedStatement.execute(Unknown Source)
    at org.apache.derby.impl.jdbc.EmbedStatement.execute(Unknown Source)
    at org.apache.commons.dbcp.DelegatingStatement.execute(DelegatingStatement.java:264)
    at org.apache.commons.dbcp.DelegatingStatement.execute(DelegatingStatement.java:264)
    at org.datanucleus.store.rdbms.table.AbstractTable.executeDdlStatement(AbstractTable.java:730)
    at org.datanucleus.store.rdbms.table.TableImpl.createIndices(TableImpl.java:652)
    at org.datanucleus.store.rdbms.table.TableImpl.createConstraints(TableImpl.java:434)
    at org.datanucleus.store.rdbms.RDBMSStoreManager$ClassAdder.performTablesValidation(RDBMSStoreManager.java:2768)
    at org.datanucleus.store.rdbms.RDBMSStoreManager$ClassAdder.addClassTablesAndValidate(RDBMSStoreManager.java:2503)
    at org.datanucleus.store.rdbms.RDBMSStoreManager$ClassAdder.run(RDBMSStoreManager.java:2148)
    at org.datanucleus.store.rdbms.AbstractSchemaTransaction.execute(AbstractSchemaTransaction.java:113)
    at org.datanucleus.store.rdbms.RDBMSStoreManager.addClasses(RDBMSStoreManager.java:986)
    at org.datanucleus.store.rdbms.RDBMSStoreManager.addClasses(RDBMSStoreManager.java:952)
    at org.datanucleus.store.AbstractStoreManager.addClass(AbstractStoreManager.java:919)
    at org.datanucleus.store.mapped.MappedStoreManager.getDatastoreClass(MappedStoreManager.java:356)
    at org.datanucleus.store.rdbms.query.legacy.ExtentHelper.getExtent(ExtentHelper.java:48)
    at org.datanucleus.store.rdbms.RDBMSStoreManager.getExtent(RDBMSStoreManager.java:1332)
    at org.datanucleus.ObjectManagerImpl.getExtent(ObjectManagerImpl.java:4149)
    at org.datanucleus.store.rdbms.query.legacy.JDOQLQueryCompiler.compileCandidates(JDOQLQueryCompiler.java:411)
    at org.datanucleus.store.rdbms.query.legacy.QueryCompiler.executionCompile(QueryCompiler.java:312)
    at org.datanucleus.store.rdbms.query.legacy.JDOQLQueryCompiler.compile(JDOQLQueryCompiler.java:225)
    at org.datanucleus.store.rdbms.query.legacy.JDOQLQuery.compileInternal(JDOQLQuery.java:175)
    at org.datanucleus.store.query.Query.executeQuery(Query.java:1628)
    at org.datanucleus.store.rdbms.query.legacy.JDOQLQuery.executeQuery(JDOQLQuery.java:245)
    at org.datanucleus.store.query.Query.executeWithArray(Query.java:1499)
    at org.datanucleus.jdo.JDOQuery.execute(JDOQuery.java:266)
    at org.apache.hadoop.hive.metastore.ObjectStore.getMTable(ObjectStore.java:799)
    at org.apache.hadoop.hive.metastore.ObjectStore.getTable(ObjectStore.java:733)
    at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler$17.run(HiveMetaStore.java:1181)
    at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler$17.run(HiveMetaStore.java:1178)
    at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.executeWithRetry(HiveMetaStore.java:348)
    at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.get_table(HiveMetaStore.java:1178)
    at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.getTable(HiveMetaStoreClient.java:713)
    at org.apache.hadoop.hive.ql.metadata.Hive.getTable(Hive.java:901)
    at org.apache.hadoop.hive.ql.metadata.Hive.getTable(Hive.java:843)
    at org.apache.hcatalog.cli.SemanticAnalysis.CreateTableHook.postAnalyze(CreateTableHook.java:252)
    at org.apache.hcatalog.cli.SemanticAnalysis.HCatSemanticAnalyzer.postAnalyze(HCatSemanticAnalyzer.java:186)
    at org.apache.hadoop.hive.ql.Driver.compile(Driver.java:432)
    at org.apache.hadoop.hive.ql.Driver.compile(Driver.java:337)
    at org.apache.hadoop.hive.ql.Driver.run(Driver.java:889)
    at org.apache.hcatalog.cli.HCatDriver.run(HCatDriver.java:42)
    at org.apache.hcatalog.hbase.TestHBaseBulkOutputStorageDriver.hbaseBulkOutputStorageDriverTest(TestHBaseBulkOutputStorageDriver.java:281)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    at java.lang.reflect.Method.invoke(Method.java:597)
    at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:44)
    at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:15)
    at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:41)
    at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:20)
    at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:28)
    at org.junit.internal.runners.statements.RunAfters.evaluate(RunAfters.java:31)
    at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:73)
    at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:46)
    at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:180)
    at org.junit.runners.ParentRunner.access$000(ParentRunner.java:41)
    at org.junit.runners.ParentRunner$1.evaluate(ParentRunner.java:173)
    at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:28)
    at org.junit.internal.runners.statements.RunAfters.evaluate(RunAfters.java:31)
    at org.junit.runners.ParentRunner.run(ParentRunner.java:220)
    at junit.framework.JUnit4TestAdapter.run(JUnit4TestAdapter.java:39)
    at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:518)
    at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:1052)
    at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:906)
11/11/22 21:38:24 INFO Datastore.Schema: Creating index "UniqueTable" in catalog "" schema ""
11/11/22 21:38:24 INFO Datastore.Schema: Creating index "TBLS_N49" in catalog "" schema ""
11/11/22 21:38:24 WARN DataNucleus.Datastore: SQL Warning : The new index is a duplicate of an existing index: SQL111122213824660.
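Every one of these WARN DataNucleus.Datastore entries has the same cause and the same stack trace as the one above: DataNucleus first creates a foreign-key constraint (for example "TBLS_FK1"), Derby backs that constraint with an implicit index, and DataNucleus then issues an explicit CREATE INDEX (for example "TBLS_N50") over the same column, so Derby answers with SQLWarning 01504 and shares the existing index instead of building a second one. The statement still succeeds, which is why the run continues past each warning. A minimal sketch that reproduces the warning against a standalone Derby database, using hypothetical table and index names:

    -- hypothetical schema: Derby backs the REFERENCES constraint
    -- with an implicit index on CHILD_T(PARENT_ID)
    CREATE TABLE parent_t (id INT NOT NULL PRIMARY KEY);
    CREATE TABLE child_t (parent_id INT REFERENCES parent_t (id));
    -- this explicit index duplicates the constraint's backing index,
    -- so Derby raises SQLWarning 01504 ("The new index is a duplicate
    -- of an existing index") and reuses the existing conglomerate
    CREATE INDEX child_t_n49 ON child_t (parent_id);

The same pattern repeats below for each metastore table, where every *_FK1 constraint is followed immediately by a *_N49 index on the same column.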
11/11/22 21:38:24 INFO Datastore.Schema: Creating foreign key constraint : "SDS_FK1" in catalog "" schema ""
11/11/22 21:38:24 INFO Datastore.Schema: Creating foreign key constraint : "SDS_FK2" in catalog "" schema ""
11/11/22 21:38:24 INFO Datastore.Schema: Creating index "SDS_N50" in catalog "" schema ""
11/11/22 21:38:24 WARN DataNucleus.Datastore: SQL Warning : The new index is a duplicate of an existing index: SQL111122213824910.
11/11/22 21:38:24 INFO Datastore.Schema: Creating index "SDS_N49" in catalog "" schema ""
11/11/22 21:38:24 WARN DataNucleus.Datastore: SQL Warning : The new index is a duplicate of an existing index: SQL111122213824880.
11/11/22 21:38:24 INFO Datastore.Schema: Creating foreign key constraint : "COLUMNS_V2_FK1" in catalog "" schema ""
11/11/22 21:38:25 INFO Datastore.Schema: Creating index "COLUMNS_V2_N49" in catalog "" schema ""
11/11/22 21:38:25 WARN DataNucleus.Datastore: SQL Warning : The new index is a duplicate of an existing index: SQL111122213824970.
11/11/22 21:38:25 INFO Datastore.Schema: Creating foreign key constraint : "PARTITION_KEYS_FK1" in catalog "" schema ""
11/11/22 21:38:25 INFO Datastore.Schema: Creating index "PARTITION_KEYS_N49" in catalog "" schema ""
11/11/22 21:38:25 WARN DataNucleus.Datastore: SQL Warning : The new index is a duplicate of an existing index: SQL111122213825010.
11/11/22 21:38:25 INFO Datastore.Schema: Creating foreign key constraint : "SORT_COLS_FK1" in catalog "" schema ""
11/11/22 21:38:25 INFO Datastore.Schema: Creating index "SORT_COLS_N49" in catalog "" schema ""
11/11/22 21:38:25 WARN DataNucleus.Datastore: SQL Warning : The new index is a duplicate of an existing index: SQL111122213825220.
11/11/22 21:38:25 INFO Datastore.Schema: Creating foreign key constraint : "SERDE_PARAMS_FK1" in catalog "" schema ""
11/11/22 21:38:25 INFO Datastore.Schema: Creating index "SERDE_PARAMS_N49" in catalog "" schema ""
11/11/22 21:38:25 WARN DataNucleus.Datastore: SQL Warning : The new index is a duplicate of an existing index: SQL111122213825260.
11/11/22 21:38:25 INFO Datastore.Schema: Creating foreign key constraint : "SD_PARAMS_FK1" in catalog "" schema ""
11/11/22 21:38:25 INFO Datastore.Schema: Creating index "SD_PARAMS_N49" in catalog "" schema ""
11/11/22 21:38:25 WARN DataNucleus.Datastore: SQL Warning : The new index is a duplicate of an existing index: SQL111122213825300.
11/11/22 21:38:25 INFO Datastore.Schema: Creating foreign key constraint : "TABLE_PARAMS_FK1" in catalog "" schema ""
11/11/22 21:38:25 INFO Datastore.Schema: Creating index "TABLE_PARAMS_N49" in catalog "" schema ""
11/11/22 21:38:25 WARN DataNucleus.Datastore: SQL Warning : The new index is a duplicate of an existing index: SQL111122213825340.
11/11/22 21:38:25 INFO Datastore.Schema: Creating foreign key constraint : "BUCKETING_COLS_FK1" in catalog "" schema "" 11/11/22 21:38:25 INFO Datastore.Schema: Creating index "BUCKETING_COLS_N49" in catalog "" schema "" 11/11/22 21:38:25 WARN DataNucleus.Datastore: SQL Warning : The new index is a duplicate of an existing index: SQL111122213825380. java.sql.SQLWarning: The new index is a duplicate of an existing index: SQL111122213825380.
11/11/22 21:38:25 ERROR metadata.Hive: NoSuchObjectException(message:hbasebulkoutputstoragedrivertest.hbasebulkoutputstoragedrivertest_885980685671171921 table not found) at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler$17.run(HiveMetaStore.java:1183) at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler$17.run(HiveMetaStore.java:1178) at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.executeWithRetry(HiveMetaStore.java:348) at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.get_table(HiveMetaStore.java:1178) at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.getTable(HiveMetaStoreClient.java:713) at org.apache.hadoop.hive.ql.metadata.Hive.getTable(Hive.java:901) at org.apache.hadoop.hive.ql.metadata.Hive.getTable(Hive.java:843) at org.apache.hcatalog.cli.SemanticAnalysis.CreateTableHook.postAnalyze(CreateTableHook.java:252) at org.apache.hcatalog.cli.SemanticAnalysis.HCatSemanticAnalyzer.postAnalyze(HCatSemanticAnalyzer.java:186) at org.apache.hadoop.hive.ql.Driver.compile(Driver.java:432) at org.apache.hadoop.hive.ql.Driver.compile(Driver.java:337) at org.apache.hadoop.hive.ql.Driver.run(Driver.java:889) at org.apache.hcatalog.cli.HCatDriver.run(HCatDriver.java:42)
at org.apache.hcatalog.hbase.TestHBaseBulkOutputStorageDriver.hbaseBulkOutputStorageDriverTest(TestHBaseBulkOutputStorageDriver.java:281) at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25) at java.lang.reflect.Method.invoke(Method.java:597) at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:44) at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:15) at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:41) at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:20) at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:28) at org.junit.internal.runners.statements.RunAfters.evaluate(RunAfters.java:31) at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:73) at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:46) at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:180) at org.junit.runners.ParentRunner.access$000(ParentRunner.java:41) at org.junit.runners.ParentRunner$1.evaluate(ParentRunner.java:173) at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:28) at org.junit.internal.runners.statements.RunAfters.evaluate(RunAfters.java:31) at org.junit.runners.ParentRunner.run(ParentRunner.java:220) at junit.framework.JUnit4TestAdapter.run(JUnit4TestAdapter.java:39) at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:518) at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:1052) at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:906) 11/11/22 21:38:25 INFO ql.Driver: Semantic Analysis Completed 11/11/22 21:38:25 INFO ql.Driver: Returning Hive schema: Schema(fieldSchemas:null, properties:null) 11/11/22 21:38:25 INFO ql.Driver: 11/11/22 21:38:25 INFO ql.Driver: 11/11/22 21:38:25 INFO ql.Driver: Starting command: CREATE TABLE hbasebulkoutputstoragedrivertest.hbasebulkoutputstoragedrivertest_885980685671171921(key int, english string, spanish string) STORED BY 'org.apache.hcatalog.hbase.HBaseHCatStorageHandler'TBLPROPERTIES ('hcat.isd'='org.apache.hcatalog.hbase.HBaseInputStorageDriver', 'hcat.osd'='org.apache.hcatalog.hbase.HBaseOutputStorageDriver','hbase.columns.mapping'=':key,my_family:english,my_family:spanish') 11/11/22 21:38:25 INFO exec.DDLTask: Use StorageHandler-supplied org.apache.hadoop.hive.hbase.HBaseSerDe for table hbasebulkoutputstoragedrivertest.hbasebulkoutputstoragedrivertest_885980685671171921 11/11/22 21:38:25 INFO hive.log: DDL: struct hbasebulkoutputstoragedrivertest_885980685671171921 { i32 key, string english, string spanish} 11/11/22 21:38:25 INFO hive.log: DDL: struct hbasebulkoutputstoragedrivertest_885980685671171921 { i32 key, string english, string spanish} 11/11/22 21:38:25 INFO zookeeper.ZooKeeper: Initiating client connection, connectString=127.0.0.1:44589 sessionTimeout=180000 watcher=hconnection 11/11/22 21:38:25 INFO zookeeper.ClientCnxn: Opening socket connection to server /127.0.0.1:44589 11/11/22 21:38:25 INFO zookeeper.ClientCnxn: Socket connection established to localhost/127.0.0.1:44589, initiating session 11/11/22 21:38:25 INFO server.NIOServerCnxn: Accepted socket connection from /127.0.0.1:48575 11/11/22 21:38:25 INFO 
server.NIOServerCnxn: Client attempting to establish new session at /127.0.0.1:48575 11/11/22 21:38:25 INFO server.NIOServerCnxn: Established session 0x133cd36705c000d with negotiated timeout 40000 for client /127.0.0.1:48575 11/11/22 21:38:25 INFO zookeeper.ClientCnxn: Session establishment complete on server localhost/127.0.0.1:44589, sessionid = 0x133cd36705c000d, negotiated timeout = 40000 11/11/22 21:38:25 WARN zookeeper.ZKTable: Moving table hbasebulkoutputstoragedrivertest.hbasebulkoutputstoragedrivertest_885980685671171921 state to enabled but was already enabled 11/11/22 21:38:25 INFO server.PrepRequestProcessor: Got user-level KeeperException when processing sessionid:0x133cd36705c0000 type:delete cxid:0x4a zxid:0xfffffffffffffffe txntype:unknown reqpath:n/a Error Path:/hbase/table/hbasebulkoutputstoragedrivertest.hbasebulkoutputstoragedrivertest_885980685671171921 Error:KeeperErrorCode = NoNode for /hbase/table/hbasebulkoutputstoragedrivertest.hbasebulkoutputstoragedrivertest_885980685671171921 11/11/22 21:38:25 INFO wal.HLog: HLog configuration: blocksize=32 MB, rollsize=30.4 MB, enabled=true, flushlogentries=1, optionallogflushinternal=1000ms 11/11/22 21:38:25 INFO wal.SequenceFileLogWriter: syncFs -- HDFS-200 -- not available, dfs.support.append=false 11/11/22 21:38:25 INFO wal.HLog: New hlog /homes/hortonas/hcat/hcat-trunk/storage-drivers/hbase/test_default_3510113083437113091/hbase/hbasebulkoutputstoragedrivertest.hbasebulkoutputstoragedrivertest_885980685671171921/7f155566e4f3b455c051b7d5022120be/.logs/hlog.1321997905619 11/11/22 21:38:25 INFO wal.HLog: getNumCurrentReplicas--HDFS-826 not available; hdfs_out=org.apache.hadoop.fs.ChecksumFileSystem$ChecksumFSOutputSummer@745477, exception=org.apache.hadoop.fs.ChecksumFileSystem$ChecksumFSOutputSummer.getNumCurrentReplicas() 11/11/22 21:38:25 INFO regionserver.HRegion: Onlined hbasebulkoutputstoragedrivertest.hbasebulkoutputstoragedrivertest_885980685671171921,,1321997905575.7f155566e4f3b455c051b7d5022120be.; next sequenceid=1 11/11/22 21:38:25 INFO catalog.MetaEditor: Added region hbasebulkoutputstoragedrivertest.hbasebulkoutputstoragedrivertest_885980685671171921,,1321997905575.7f155566e4f3b455c051b7d5022120be. to META 11/11/22 21:38:25 INFO regionserver.HRegion: Closed hbasebulkoutputstoragedrivertest.hbasebulkoutputstoragedrivertest_885980685671171921,,1321997905575.7f155566e4f3b455c051b7d5022120be. 11/11/22 21:38:25 INFO wal.HLog: IPC Server handler 9 on 51911.logSyncer exiting 11/11/22 21:38:25 INFO master.AssignmentManager: Bulk assigning 1 region(s) round-robin across 1 server(s) 11/11/22 21:38:25 INFO master.AssignmentManager: Bulk assigning done 11/11/22 21:38:25 INFO master.AssignmentManager: hrt7n35.cc1.ygridcore.net,54808,1321997857252 unassigned znodes=1 of total=1 11/11/22 21:38:25 INFO regionserver.HRegionServer: Received request to open 1 region(s) 11/11/22 21:38:25 INFO regionserver.HRegionServer: Received request to open region: hbasebulkoutputstoragedrivertest.hbasebulkoutputstoragedrivertest_885980685671171921,,1321997905575.7f155566e4f3b455c051b7d5022120be. 11/11/22 21:38:25 INFO regionserver.HRegion: Onlined hbasebulkoutputstoragedrivertest.hbasebulkoutputstoragedrivertest_885980685671171921,,1321997905575.7f155566e4f3b455c051b7d5022120be.; next sequenceid=1 11/11/22 21:38:25 INFO catalog.MetaEditor: Updated row hbasebulkoutputstoragedrivertest.hbasebulkoutputstoragedrivertest_885980685671171921,,1321997905575.7f155566e4f3b455c051b7d5022120be. 
in region .META.,,1 with server=hrt7n35.cc1.ygridcore.net:54808, startcode=1321997857252 11/11/22 21:38:26 INFO zookeeper.ZooKeeper: Initiating client connection, connectString=127.0.0.1:44589 sessionTimeout=180000 watcher=hconnection 11/11/22 21:38:26 INFO zookeeper.ClientCnxn: Opening socket connection to server /127.0.0.1:44589 11/11/22 21:38:26 INFO zookeeper.ClientCnxn: Socket connection established to localhost/127.0.0.1:44589, initiating session 11/11/22 21:38:26 INFO server.NIOServerCnxn: Accepted socket connection from /127.0.0.1:48576 11/11/22 21:38:26 INFO server.NIOServerCnxn: Client attempting to establish new session at /127.0.0.1:48576 11/11/22 21:38:26 INFO server.NIOServerCnxn: Established session 0x133cd36705c000e with negotiated timeout 40000 for client /127.0.0.1:48576 11/11/22 21:38:26 INFO zookeeper.ClientCnxn: Session establishment complete on server localhost/127.0.0.1:44589, sessionid = 0x133cd36705c000e, negotiated timeout = 40000 11/11/22 21:38:26 INFO metastore.HiveMetaStore: 0: create_table: db=hbasebulkoutputstoragedrivertest tbl=hbasebulkoutputstoragedrivertest_885980685671171921 11/11/22 21:38:26 INFO DataNucleus.MetaData: Listener found initialisation for persistable class org.apache.hadoop.hive.metastore.model.MColumnDescriptor 11/11/22 21:38:26 INFO DataNucleus.MetaData: Listener found initialisation for persistable class org.apache.hadoop.hive.metastore.model.MSerDeInfo 11/11/22 21:38:26 INFO DataNucleus.MetaData: Listener found initialisation for persistable class org.apache.hadoop.hive.metastore.model.MStorageDescriptor 11/11/22 21:38:26 INFO DataNucleus.MetaData: Listener found initialisation for persistable class org.apache.hadoop.hive.metastore.model.MTable 11/11/22 21:38:26 INFO DataNucleus.MetaData: Listener found initialisation for persistable class org.apache.hadoop.hive.metastore.model.MFieldSchema 11/11/22 21:38:26 INFO ql.Driver: OK 11/11/22 21:38:26 INFO ql.Driver: OK 11/11/22 21:38:26 INFO ql.Driver: 11/11/22 21:38:26 INFO ql.Driver: 11/11/22 21:38:27 INFO metastore.HiveMetaStore: 0: get_table : db=hbasebulkoutputstoragedrivertest tbl=hbasebulkoutputstoragedrivertest_885980685671171921 11/11/22 21:38:27 INFO metastore.HiveMetaStore: 0: Opening raw store with implemenation class:org.apache.hadoop.hive.metastore.ObjectStore 11/11/22 21:38:27 INFO metastore.ObjectStore: ObjectStore, initialize called 11/11/22 21:38:27 INFO metastore.ObjectStore: Initialized ObjectStore 11/11/22 21:38:27 INFO metastore.HiveMetaStore: 0: Shutting down the object store... 11/11/22 21:38:27 INFO metastore.HiveMetaStore: 0: Metastore shutdown complete. 11/11/22 21:38:27 WARN mapred.JobClient: Use GenericOptionsParser for parsing the arguments. Applications should implement Tool for the same. 11/11/22 21:38:27 WARN mapred.JobClient: No job jar file set. User classes may not be found. See JobConf(Class) or JobConf#setJar(String). 
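[Editor's note: the two JobClient warnings above fire because the test submits its MapReduce job directly rather than through ToolRunner. A minimal sketch of the pattern the first warning asks for; the class and its contents are hypothetical, not part of the test code:

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.conf.Configured;
import org.apache.hadoop.util.Tool;
import org.apache.hadoop.util.ToolRunner;

public class MyDriver extends Configured implements Tool {
    // ToolRunner applies GenericOptionsParser before calling run(), so
    // -D, -files and -libjars arguments are already folded into getConf()
    public int run(String[] args) throws Exception {
        Configuration conf = getConf();
        // configure and submit the job here using conf
        return 0;
    }

    public static void main(String[] args) throws Exception {
        System.exit(ToolRunner.run(new Configuration(), new MyDriver(), args));
    }
}
]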
11/11/22 21:38:27 INFO input.FileInputFormat: Total input paths to process : 1 11/11/22 21:38:27 INFO mapred.JobTracker: Job job_20111122213733305_0003 added successfully for user 'hortonas' to queue 'default' 11/11/22 21:38:27 INFO mapred.AuditLogger: USER=hortonas IP=127.0.0.1 OPERATION=SUBMIT_JOB TARGET=job_20111122213733305_0003 RESULT=SUCCESS 11/11/22 21:38:27 INFO mapred.JobTracker: Initializing job_20111122213733305_0003 11/11/22 21:38:27 INFO mapred.JobInProgress: Initializing job_20111122213733305_0003 11/11/22 21:38:27 INFO mapred.JobClient: Running job: job_20111122213733305_0003 11/11/22 21:38:27 INFO mapred.JobInProgress: jobToken generated and stored with users keys in /tmp/hadoop-hortonas/mapred/system/job_20111122213733305_0003/jobToken 11/11/22 21:38:27 INFO mapred.JobInProgress: Input size for job job_20111122213733305_0003 = 79. Number of splits = 1 11/11/22 21:38:27 INFO mapred.JobInProgress: tip:task_20111122213733305_0003_m_000000 has split on node:/default-rack/localhost 11/11/22 21:38:27 INFO mapred.JobInProgress: Job job_20111122213733305_0003 initialized successfully with 1 map tasks and 0 reduce tasks. 11/11/22 21:38:28 INFO mapred.JobClient: map 0% reduce 0% 11/11/22 21:38:29 INFO mapred.JobTracker: Adding task (JOB_SETUP) 'attempt_20111122213733305_0003_m_000002_0' to tip task_20111122213733305_0003_m_000002, for tracker 'tracker_host0.foo.com:localhost/127.0.0.1:51131' 11/11/22 21:38:29 INFO mapred.TaskTracker: LaunchTaskAction (registerTask): attempt_20111122213733305_0003_m_000002_0 task's state:UNASSIGNED 11/11/22 21:38:29 INFO mapred.TaskTracker: Trying to launch : attempt_20111122213733305_0003_m_000002_0 which needs 1 slots 11/11/22 21:38:29 INFO mapred.TaskTracker: In TaskLauncher, current free slots : 2 and trying to launch attempt_20111122213733305_0003_m_000002_0 which needs 1 slots 11/11/22 21:38:29 INFO tasktracker.Localizer: User-directories for the user hortonas are already initialized on this TT. Not doing anything. 11/11/22 21:38:29 INFO mapred.JvmManager: In JvmRunner constructed JVM ID: jvm_20111122213733305_0003_m_1729409472 11/11/22 21:38:29 INFO mapred.JvmManager: JVM Runner jvm_20111122213733305_0003_m_1729409472 spawned. 11/11/22 21:38:30 INFO mapred.TaskTracker: JVM with ID: jvm_20111122213733305_0003_m_1729409472 given task: attempt_20111122213733305_0003_m_000002_0 11/11/22 21:38:30 INFO mapred.TaskTracker: attempt_20111122213733305_0003_m_000002_0 0.0% setup 11/11/22 21:38:30 INFO mapred.TaskTracker: Task attempt_20111122213733305_0003_m_000002_0 is done. 11/11/22 21:38:30 INFO mapred.TaskTracker: reported output size for attempt_20111122213733305_0003_m_000002_0 was -1 11/11/22 21:38:30 INFO mapred.TaskTracker: addFreeSlot : current free slots : 2 11/11/22 21:38:31 WARN util.ProcessTree: Error executing shell command org.apache.hadoop.util.Shell$ExitCodeException: kill -11985: No such process 11/11/22 21:38:31 INFO util.ProcessTree: Killing all processes in the process group 11985 with SIGTERM. Exit code 1 11/11/22 21:38:32 INFO mapred.JobInProgress: Task 'attempt_20111122213733305_0003_m_000002_0' has completed task_20111122213733305_0003_m_000002 successfully. 
11/11/22 21:38:32 INFO mapred.JobTracker: Adding task (MAP) 'attempt_20111122213733305_0003_m_000000_0' to tip task_20111122213733305_0003_m_000000, for tracker 'tracker_host0.foo.com:localhost/127.0.0.1:51131' 11/11/22 21:38:32 INFO mapred.JobInProgress: Choosing rack-local task task_20111122213733305_0003_m_000000 11/11/22 21:38:32 INFO mapred.TaskTracker: LaunchTaskAction (registerTask): attempt_20111122213733305_0003_m_000000_0 task's state:UNASSIGNED 11/11/22 21:38:32 INFO mapred.TaskTracker: Trying to launch : attempt_20111122213733305_0003_m_000000_0 which needs 1 slots 11/11/22 21:38:32 INFO mapred.TaskTracker: Received KillTaskAction for task: attempt_20111122213733305_0003_m_000002_0 11/11/22 21:38:32 INFO mapred.TaskTracker: In TaskLauncher, current free slots : 2 and trying to launch attempt_20111122213733305_0003_m_000000_0 which needs 1 slots 11/11/22 21:38:32 INFO mapred.TaskTracker: About to purge task: attempt_20111122213733305_0003_m_000002_0 11/11/22 21:38:32 INFO tasktracker.Localizer: User-directories for the user hortonas are already initialized on this TT. Not doing anything. 11/11/22 21:38:32 INFO mapred.TaskRunner: attempt_20111122213733305_0003_m_000002_0 done; removing files. 11/11/22 21:38:32 INFO mapred.IndexCache: Map ID attempt_20111122213733305_0003_m_000002_0 not found in cache 11/11/22 21:38:32 INFO mapred.JvmManager: In JvmRunner constructed JVM ID: jvm_20111122213733305_0003_m_2123763894 11/11/22 21:38:32 INFO mapred.JvmManager: JVM Runner jvm_20111122213733305_0003_m_2123763894 spawned. 11/11/22 21:38:33 INFO mapred.JobClient: Task Id : attempt_20111122213733305_0003_m_000002_0, Status : SUCCEEDED attempt_20111122213733305_0003_m_000002_0: SLF4J: Class path contains multiple SLF4J bindings. attempt_20111122213733305_0003_m_000002_0: SLF4J: Found binding in [jar:file:/homes/hortonas/hcat/hcat-trunk/hive/external/build/ivy/lib/default/slf4j-log4j12-1.6.1.jar!/org/slf4j/impl/StaticLoggerBinder.class] attempt_20111122213733305_0003_m_000002_0: SLF4J: Found binding in [jar:file:/homes/hortonas/hcat/hcat-trunk/hive/external/build/hadoopcore/hadoop-0.20.3-CDH3-SNAPSHOT/lib/slf4j-log4j12-1.4.3.jar!/org/slf4j/impl/StaticLoggerBinder.class] attempt_20111122213733305_0003_m_000002_0: SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation. 11/11/22 21:38:33 INFO mapred.TaskTracker: JVM with ID: jvm_20111122213733305_0003_m_2123763894 given task: attempt_20111122213733305_0003_m_000000_0 11/11/22 21:38:34 INFO mapred.TaskTracker: Task attempt_20111122213733305_0003_m_000000_0 is in commit-pending, task state:COMMIT_PENDING 11/11/22 21:38:34 INFO mapred.TaskTracker: attempt_20111122213733305_0003_m_000000_0 0.0% 11/11/22 21:38:35 INFO mapred.TaskTracker: Received commit task action for attempt_20111122213733305_0003_m_000000_0 11/11/22 21:38:36 INFO mapred.JvmManager: JVM : jvm_20111122213733305_0003_m_1729409472 exited with exit code 0. Number of tasks it ran: 1 11/11/22 21:38:36 INFO mapred.TaskTracker: attempt_20111122213733305_0003_m_000000_0 1.0% 11/11/22 21:38:36 INFO mapred.TaskTracker: Task attempt_20111122213733305_0003_m_000000_0 is done. 
11/11/22 21:38:36 INFO mapred.TaskTracker: reported output size for attempt_20111122213733305_0003_m_000000_0 was -1 11/11/22 21:38:36 INFO mapred.TaskTracker: addFreeSlot : current free slots : 2 11/11/22 21:38:36 WARN util.ProcessTree: Error executing shell command org.apache.hadoop.util.Shell$ExitCodeException: kill -12025: No such process 11/11/22 21:38:36 INFO util.ProcessTree: Killing all processes in the process group 12025 with SIGTERM. Exit code 1 11/11/22 21:38:38 INFO mapred.JobInProgress: Task 'attempt_20111122213733305_0003_m_000000_0' has completed task_20111122213733305_0003_m_000000 successfully. 11/11/22 21:38:38 INFO mapred.JobTracker: Adding task (JOB_CLEANUP) 'attempt_20111122213733305_0003_m_000001_0' to tip task_20111122213733305_0003_m_000001, for tracker 'tracker_host0.foo.com:localhost/127.0.0.1:51131' 11/11/22 21:38:38 INFO mapred.TaskTracker: LaunchTaskAction (registerTask): attempt_20111122213733305_0003_m_000001_0 task's state:UNASSIGNED 11/11/22 21:38:38 INFO mapred.TaskTracker: Trying to launch : attempt_20111122213733305_0003_m_000001_0 which needs 1 slots 11/11/22 21:38:38 INFO mapred.TaskTracker: In TaskLauncher, current free slots : 2 and trying to launch attempt_20111122213733305_0003_m_000001_0 which needs 1 slots 11/11/22 21:38:38 INFO tasktracker.Localizer: User-directories for the user hortonas are already initialized on this TT. Not doing anything. 11/11/22 21:38:39 INFO mapred.JvmManager: In JvmRunner constructed JVM ID: jvm_20111122213733305_0003_m_1069216 11/11/22 21:38:39 INFO mapred.JvmManager: JVM Runner jvm_20111122213733305_0003_m_1069216 spawned. 11/11/22 21:38:39 INFO mapred.JobClient: Task Id : attempt_20111122213733305_0003_m_000000_0, Status : SUCCEEDED attempt_20111122213733305_0003_m_000000_0: SLF4J: Class path contains multiple SLF4J bindings. attempt_20111122213733305_0003_m_000000_0: SLF4J: Found binding in [jar:file:/homes/hortonas/hcat/hcat-trunk/hive/external/build/ivy/lib/default/slf4j-log4j12-1.6.1.jar!/org/slf4j/impl/StaticLoggerBinder.class] attempt_20111122213733305_0003_m_000000_0: SLF4J: Found binding in [jar:file:/homes/hortonas/hcat/hcat-trunk/hive/external/build/hadoopcore/hadoop-0.20.3-CDH3-SNAPSHOT/lib/slf4j-log4j12-1.4.3.jar!/org/slf4j/impl/StaticLoggerBinder.class] attempt_20111122213733305_0003_m_000000_0: SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation. 11/11/22 21:38:39 INFO mapred.TaskTracker: JVM with ID: jvm_20111122213733305_0003_m_1069216 given task: attempt_20111122213733305_0003_m_000001_0 11/11/22 21:38:40 INFO mapred.TaskTracker: attempt_20111122213733305_0003_m_000001_0 0.0% 11/11/22 21:38:40 INFO server.NIOServerCnxn: Accepted socket connection from /127.0.0.1:48586 11/11/22 21:38:40 INFO server.NIOServerCnxn: Client attempting to establish new session at /127.0.0.1:48586 11/11/22 21:38:40 INFO server.NIOServerCnxn: Established session 0x133cd36705c000f with negotiated timeout 40000 for client /127.0.0.1:48586 11/11/22 21:38:40 INFO mapred.JobClient: map 100% reduce 0% 11/11/22 21:38:41 INFO mapred.JvmManager: JVM : jvm_20111122213733305_0003_m_2123763894 exited with exit code 0. 
Number of tasks it ran: 1 11/11/22 21:38:42 INFO server.ZooKeeperServer: Expiring session 0x133cd36705c0006, timeout of 40000ms exceeded 11/11/22 21:38:42 INFO server.PrepRequestProcessor: Processed session termination for sessionid: 0x133cd36705c0006 11/11/22 21:38:42 INFO server.NIOServerCnxn: Accepted socket connection from /127.0.0.1:48588 11/11/22 21:38:42 INFO server.NIOServerCnxn: Client attempting to establish new session at /127.0.0.1:48588 11/11/22 21:38:42 INFO server.NIOServerCnxn: Established session 0x133cd36705c0010 with negotiated timeout 40000 for client /127.0.0.1:48588 11/11/22 21:38:42 INFO regionserver.Store: Validating hfile at file:/homes/hortonas/hcat/hcat-trunk/storage-drivers/hbase/test_default_3510113083437113091/hbaseBulkOutputStorageDriverTest/DB_hbaseBulkOutputStorageDriverTest/hbasebulkoutputstoragedrivertest_885980685671171921/REVISION_1321997907404_hfiles/my_family/5469558269049133027 for inclusion in store my_family region hbasebulkoutputstoragedrivertest.hbasebulkoutputstoragedrivertest_885980685671171921,,1321997905575.7f155566e4f3b455c051b7d5022120be. 11/11/22 21:38:42 INFO regionserver.Store: Renaming bulk load file file:/homes/hortonas/hcat/hcat-trunk/storage-drivers/hbase/test_default_3510113083437113091/hbaseBulkOutputStorageDriverTest/DB_hbaseBulkOutputStorageDriverTest/hbasebulkoutputstoragedrivertest_885980685671171921/REVISION_1321997907404_hfiles/my_family/5469558269049133027 to file:/homes/hortonas/hcat/hcat-trunk/storage-drivers/hbase/test_default_3510113083437113091/hbase/hbasebulkoutputstoragedrivertest.hbasebulkoutputstoragedrivertest_885980685671171921/7f155566e4f3b455c051b7d5022120be/my_family/7777111962026986008 11/11/22 21:38:42 INFO regionserver.Store: Moved hfile file:/homes/hortonas/hcat/hcat-trunk/storage-drivers/hbase/test_default_3510113083437113091/hbaseBulkOutputStorageDriverTest/DB_hbaseBulkOutputStorageDriverTest/hbasebulkoutputstoragedrivertest_885980685671171921/REVISION_1321997907404_hfiles/my_family/5469558269049133027 into store directory file:/homes/hortonas/hcat/hcat-trunk/storage-drivers/hbase/test_default_3510113083437113091/hbase/hbasebulkoutputstoragedrivertest.hbasebulkoutputstoragedrivertest_885980685671171921/7f155566e4f3b455c051b7d5022120be/my_family - updating store file list. 11/11/22 21:38:42 INFO regionserver.Store: Successfully loaded store file file:/homes/hortonas/hcat/hcat-trunk/storage-drivers/hbase/test_default_3510113083437113091/hbaseBulkOutputStorageDriverTest/DB_hbaseBulkOutputStorageDriverTest/hbasebulkoutputstoragedrivertest_885980685671171921/REVISION_1321997907404_hfiles/my_family/5469558269049133027 into store my_family (new location: file:/homes/hortonas/hcat/hcat-trunk/storage-drivers/hbase/test_default_3510113083437113091/hbase/hbasebulkoutputstoragedrivertest.hbasebulkoutputstoragedrivertest_885980685671171921/7f155566e4f3b455c051b7d5022120be/my_family/7777111962026986008) 11/11/22 21:38:42 INFO mapred.TaskTracker: attempt_20111122213733305_0003_m_000001_0 0.0% cleanup 11/11/22 21:38:42 INFO mapred.TaskTracker: Task attempt_20111122213733305_0003_m_000001_0 is done. 
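[Editor's note: the regionserver.Store sequence above (validate the HFile, rename it into the region's store directory, update the store file list) is HBase's bulk-load path adopting the HFiles the job just wrote. A rough standalone sketch, assuming the client-side bulk loader of this HBase generation; the table name and HFile directory are hypothetical placeholders, not the test's values:

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.client.HTable;
import org.apache.hadoop.hbase.mapreduce.LoadIncrementalHFiles;

public class BulkLoadSketch {
    public static void main(String[] args) throws Exception {
        Configuration conf = HBaseConfiguration.create();
        // directory containing one subdirectory per column family, each
        // holding HFiles (like the REVISION_..._hfiles/my_family layout
        // in the log above)
        Path hfileDir = new Path("/tmp/hfiles");    // hypothetical
        HTable table = new HTable(conf, "mytable"); // hypothetical
        // asks the region servers to validate and adopt the HFiles,
        // producing Store messages like the ones logged above
        new LoadIncrementalHFiles(conf).doBulkLoad(hfileDir, table);
        table.close();
    }
}
]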
11/11/22 21:38:42 INFO mapred.TaskTracker: reported output size for attempt_20111122213733305_0003_m_000001_0 was -1 11/11/22 21:38:42 INFO mapred.TaskTracker: addFreeSlot : current free slots : 2 11/11/22 21:38:42 WARN server.NIOServerCnxn: EndOfStreamException: Unable to read additional data from client sessionid 0x133cd36705c000f, likely client has closed socket 11/11/22 21:38:42 INFO server.NIOServerCnxn: Closed socket connection for client /127.0.0.1:48586 which had sessionid 0x133cd36705c000f 11/11/22 21:38:42 WARN server.NIOServerCnxn: EndOfStreamException: Unable to read additional data from client sessionid 0x133cd36705c0010, likely client has closed socket 11/11/22 21:38:42 INFO server.NIOServerCnxn: Closed socket connection for client /127.0.0.1:48588 which had sessionid 0x133cd36705c0010 11/11/22 21:38:42 WARN util.ProcessTree: Error executing shell command org.apache.hadoop.util.Shell$ExitCodeException: kill -12066: No such process 11/11/22 21:38:42 INFO util.ProcessTree: Killing all processes in the process group 12066 with SIGTERM. Exit code 1 11/11/22 21:38:44 INFO server.ZooKeeperServer: Expiring session 0x133cd36705c0007, timeout of 40000ms exceeded 11/11/22 21:38:44 INFO server.PrepRequestProcessor: Processed session termination for sessionid: 0x133cd36705c0007 11/11/22 21:38:44 INFO mapred.JobInProgress: Task 'attempt_20111122213733305_0003_m_000001_0' has completed task_20111122213733305_0003_m_000001 successfully. 11/11/22 21:38:44 INFO mapred.JobInProgress: Job job_20111122213733305_0003 has completed successfully. 11/11/22 21:38:44 INFO mapred.JobInProgress$JobSummary: jobId=job_20111122213733305_0003,submitTime=1321997907446,launchTime=1321997907710,finishTime=1321997924957,numMaps=1,numSlotsPerMap=1,numReduces=0,numSlotsPerReduce=1,user=hortonas,queue=default,status=SUCCEEDED,mapSlotSeconds=7,reduceSlotsSeconds=0,clusterMapCapacity=2,clusterReduceCapacity=2 11/11/22 21:38:44 INFO mapred.JobHistory: Moving file:/homes/hortonas/hcat/hcat-trunk/storage-drivers/hbase/test_default_3510113083437113091/logs/history/localhost_1321997853671_job_20111122213733305_0003_hortonas_hbaseBulkOutputStorageDriverTest to file:/homes/hortonas/hcat/hcat-trunk/storage-drivers/hbase/test_default_3510113083437113091/logs/history/done 11/11/22 21:38:44 INFO mapred.JobTracker: Removing task 'attempt_20111122213733305_0003_m_000000_0' 11/11/22 21:38:44 INFO mapred.JobTracker: Removing task 'attempt_20111122213733305_0003_m_000001_0' 11/11/22 21:38:44 INFO mapred.JobTracker: Removing task 'attempt_20111122213733305_0003_m_000002_0' 11/11/22 21:38:44 INFO mapred.TaskTracker: Received 'KillJobAction' for job: job_20111122213733305_0003 11/11/22 21:38:44 INFO mapred.TaskRunner: attempt_20111122213733305_0003_m_000001_0 done; removing files. 11/11/22 21:38:44 INFO mapred.IndexCache: Map ID attempt_20111122213733305_0003_m_000001_0 not found in cache 11/11/22 21:38:44 INFO mapred.TaskRunner: attempt_20111122213733305_0003_m_000000_0 done; removing files. 
11/11/22 21:38:44 INFO mapred.IndexCache: Map ID attempt_20111122213733305_0003_m_000000_0 not found in cache 11/11/22 21:38:44 INFO mapred.UserLogCleaner: Adding job_20111122213733305_0003 for user-log deletion with retainTimeStamp:1322084324986 11/11/22 21:38:45 INFO mapred.JobHistory: Moving file:/homes/hortonas/hcat/hcat-trunk/storage-drivers/hbase/test_default_3510113083437113091/logs/history/localhost_1321997853671_job_20111122213733305_0003_conf.xml to file:/homes/hortonas/hcat/hcat-trunk/storage-drivers/hbase/test_default_3510113083437113091/logs/history/done 11/11/22 21:38:45 INFO mapred.JobClient: Task Id : attempt_20111122213733305_0003_m_000001_0, Status : SUCCEEDED attempt_20111122213733305_0003_m_000001_0: SLF4J: Class path contains multiple SLF4J bindings. attempt_20111122213733305_0003_m_000001_0: SLF4J: Found binding in [jar:file:/homes/hortonas/hcat/hcat-trunk/hive/external/build/ivy/lib/default/slf4j-log4j12-1.6.1.jar!/org/slf4j/impl/StaticLoggerBinder.class] attempt_20111122213733305_0003_m_000001_0: SLF4J: Found binding in [jar:file:/homes/hortonas/hcat/hcat-trunk/hive/external/build/hadoopcore/hadoop-0.20.3-CDH3-SNAPSHOT/lib/slf4j-log4j12-1.4.3.jar!/org/slf4j/impl/StaticLoggerBinder.class] attempt_20111122213733305_0003_m_000001_0: SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation. 11/11/22 21:38:45 INFO mapred.JobClient: Job complete: job_20111122213733305_0003 11/11/22 21:38:45 INFO mapred.JobClient: Counters: 12 11/11/22 21:38:45 INFO mapred.JobClient: Job Counters 11/11/22 21:38:45 INFO mapred.JobClient: SLOTS_MILLIS_MAPS=7607 11/11/22 21:38:45 INFO mapred.JobClient: Total time spent by all reduces waiting after reserving slots (ms)=0 11/11/22 21:38:45 INFO mapred.JobClient: Total time spent by all maps waiting after reserving slots (ms)=0 11/11/22 21:38:45 INFO mapred.JobClient: Rack-local map tasks=1 11/11/22 21:38:45 INFO mapred.JobClient: Launched map tasks=1 11/11/22 21:38:45 INFO mapred.JobClient: SLOTS_MILLIS_REDUCES=0 11/11/22 21:38:45 INFO mapred.JobClient: FileSystemCounters 11/11/22 21:38:45 INFO mapred.JobClient: FILE_BYTES_READ=323 11/11/22 21:38:45 INFO mapred.JobClient: FILE_BYTES_WRITTEN=554 11/11/22 21:38:45 INFO mapred.JobClient: Map-Reduce Framework 11/11/22 21:38:45 INFO mapred.JobClient: Map input records=3 11/11/22 21:38:45 INFO mapred.JobClient: Spilled Records=0 11/11/22 21:38:45 INFO mapred.JobClient: Map output records=3 11/11/22 21:38:45 INFO mapred.JobClient: SPLIT_RAW_BYTES=213 11/11/22 21:38:45 INFO zookeeper.ZooKeeper: Initiating client connection, connectString=127.0.0.1:44589 sessionTimeout=180000 watcher=hconnection 11/11/22 21:38:45 INFO zookeeper.ClientCnxn: Opening socket connection to server /127.0.0.1:44589 11/11/22 21:38:45 INFO server.NIOServerCnxn: Accepted socket connection from /127.0.0.1:48590 11/11/22 21:38:45 INFO zookeeper.ClientCnxn: Socket connection established to localhost/127.0.0.1:44589, initiating session 11/11/22 21:38:45 INFO server.NIOServerCnxn: Client attempting to establish new session at /127.0.0.1:48590 11/11/22 21:38:45 INFO server.NIOServerCnxn: Established session 0x133cd36705c0011 with negotiated timeout 40000 for client /127.0.0.1:48590 11/11/22 21:38:45 INFO zookeeper.ClientCnxn: Session establishment complete on server localhost/127.0.0.1:44589, sessionid = 0x133cd36705c0011, negotiated timeout = 40000 Hive history file=/tmp/hortonas/hive_job_log_hortonas_201111222137_963355813.txt 11/11/22 21:38:45 INFO exec.HiveHistory: Hive history 
file=/tmp/hortonas/hive_job_log_hortonas_201111222137_963355813.txt 11/11/22 21:38:45 INFO ql.Driver: 11/11/22 21:38:45 INFO parse.ParseDriver: Parsing command: CREATE DATABASE IF NOT EXISTS hbasebulkoutputstoragedrivertestwithrevision LOCATION '/homes/hortonas/hcat/hcat-trunk/storage-drivers/hbase/test_default_3510113083437113091/hbaseBulkOutputStorageDriverTestWithRevision/DB_hbaseBulkOutputStorageDriverTestWithRevision' 11/11/22 21:38:45 INFO parse.ParseDriver: Parse Completed 11/11/22 21:38:45 INFO metastore.HiveMetaStore: 0: get_databases: hbasebulkoutputstoragedrivertestwithrevision 11/11/22 21:38:45 INFO ql.Driver: Semantic Analysis Completed 11/11/22 21:38:45 INFO ql.Driver: Returning Hive schema: Schema(fieldSchemas:null, properties:null) 11/11/22 21:38:45 INFO ql.Driver: 11/11/22 21:38:45 INFO ql.Driver: 11/11/22 21:38:45 INFO ql.Driver: Starting command: CREATE DATABASE IF NOT EXISTS hbasebulkoutputstoragedrivertestwithrevision LOCATION '/homes/hortonas/hcat/hcat-trunk/storage-drivers/hbase/test_default_3510113083437113091/hbaseBulkOutputStorageDriverTestWithRevision/DB_hbaseBulkOutputStorageDriverTestWithRevision' 11/11/22 21:38:45 INFO metastore.HiveMetaStore: 0: create_database: hbasebulkoutputstoragedrivertestwithrevision /homes/hortonas/hcat/hcat-trunk/storage-drivers/hbase/test_default_3510113083437113091/hbaseBulkOutputStorageDriverTestWithRevision/DB_hbaseBulkOutputStorageDriverTestWithRevision null 11/11/22 21:38:45 INFO metastore.HiveMetaStore: 0: get_database: hbasebulkoutputstoragedrivertestwithrevision 11/11/22 21:38:45 INFO ql.Driver: OK 11/11/22 21:38:45 INFO ql.Driver: OK 11/11/22 21:38:45 INFO ql.Driver: 11/11/22 21:38:45 INFO ql.Driver: 11/11/22 21:38:45 INFO ql.Driver: 11/11/22 21:38:45 INFO parse.ParseDriver: Parsing command: CREATE TABLE hbasebulkoutputstoragedrivertestwithrevision.hbasebulkoutputstoragedrivertestwithrevision_2050865341583007314(key int, english string, spanish string) STORED BY 'org.apache.hcatalog.hbase.HBaseHCatStorageHandler'TBLPROPERTIES ('hcat.isd'='org.apache.hcatalog.hbase.HBaseInputStorageDriver', 'hcat.osd'='org.apache.hcatalog.hbase.HBaseOutputStorageDriver','hbase.columns.mapping'=':key,my_family:english,my_family:spanish') 11/11/22 21:38:45 INFO parse.ParseDriver: Parse Completed 11/11/22 21:38:45 INFO parse.SemanticAnalyzer: Starting Semantic Analysis 11/11/22 21:38:45 INFO parse.SemanticAnalyzer: Creating table hbasebulkoutputstoragedrivertestwithrevision.hbasebulkoutputstoragedrivertestwithrevision_2050865341583007314 position=13 11/11/22 21:38:45 INFO metastore.HiveMetaStore: 0: get_table : db=hbasebulkoutputstoragedrivertestwithrevision tbl=hbasebulkoutputstoragedrivertestwithrevision_2050865341583007314 11/11/22 21:38:45 ERROR metadata.Hive: NoSuchObjectException(message:hbasebulkoutputstoragedrivertestwithrevision.hbasebulkoutputstoragedrivertestwithrevision_2050865341583007314 table not found) at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler$17.run(HiveMetaStore.java:1183) at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler$17.run(HiveMetaStore.java:1178) at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.executeWithRetry(HiveMetaStore.java:348) at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.get_table(HiveMetaStore.java:1178) at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.getTable(HiveMetaStoreClient.java:713) at org.apache.hadoop.hive.ql.metadata.Hive.getTable(Hive.java:901) at org.apache.hadoop.hive.ql.metadata.Hive.getTable(Hive.java:843) at 
org.apache.hcatalog.cli.SemanticAnalysis.CreateTableHook.postAnalyze(CreateTableHook.java:252) at org.apache.hcatalog.cli.SemanticAnalysis.HCatSemanticAnalyzer.postAnalyze(HCatSemanticAnalyzer.java:186) at org.apache.hadoop.hive.ql.Driver.compile(Driver.java:432) at org.apache.hadoop.hive.ql.Driver.compile(Driver.java:337) at org.apache.hadoop.hive.ql.Driver.run(Driver.java:889) at org.apache.hcatalog.cli.HCatDriver.run(HCatDriver.java:42) at org.apache.hcatalog.hbase.TestHBaseBulkOutputStorageDriver.hbaseBulkOutputStorageDriverTestWithRevision(TestHBaseBulkOutputStorageDriver.java:384) at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25) at java.lang.reflect.Method.invoke(Method.java:597) at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:44) at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:15) at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:41) at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:20) at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:28) at org.junit.internal.runners.statements.RunAfters.evaluate(RunAfters.java:31) at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:73) at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:46) at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:180) at org.junit.runners.ParentRunner.access$000(ParentRunner.java:41) at org.junit.runners.ParentRunner$1.evaluate(ParentRunner.java:173) at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:28) at org.junit.internal.runners.statements.RunAfters.evaluate(RunAfters.java:31) at org.junit.runners.ParentRunner.run(ParentRunner.java:220) at junit.framework.JUnit4TestAdapter.run(JUnit4TestAdapter.java:39) at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:518) at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:1052) at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:906) 11/11/22 21:38:45 INFO ql.Driver: Semantic Analysis Completed 11/11/22 21:38:45 INFO ql.Driver: Returning Hive schema: Schema(fieldSchemas:null, properties:null) 11/11/22 21:38:45 INFO ql.Driver: 11/11/22 21:38:45 INFO ql.Driver: 11/11/22 21:38:45 INFO ql.Driver: Starting command: CREATE TABLE hbasebulkoutputstoragedrivertestwithrevision.hbasebulkoutputstoragedrivertestwithrevision_2050865341583007314(key int, english string, spanish string) STORED BY 'org.apache.hcatalog.hbase.HBaseHCatStorageHandler'TBLPROPERTIES ('hcat.isd'='org.apache.hcatalog.hbase.HBaseInputStorageDriver', 'hcat.osd'='org.apache.hcatalog.hbase.HBaseOutputStorageDriver','hbase.columns.mapping'=':key,my_family:english,my_family:spanish') 11/11/22 21:38:45 INFO exec.DDLTask: Use StorageHandler-supplied org.apache.hadoop.hive.hbase.HBaseSerDe for table hbasebulkoutputstoragedrivertestwithrevision.hbasebulkoutputstoragedrivertestwithrevision_2050865341583007314 11/11/22 21:38:45 INFO hive.log: DDL: struct hbasebulkoutputstoragedrivertestwithrevision_2050865341583007314 { i32 key, string english, string spanish} 11/11/22 21:38:45 INFO hive.log: DDL: struct hbasebulkoutputstoragedrivertestwithrevision_2050865341583007314 { i32 key, 
string english, string spanish} 11/11/22 21:38:45 INFO zookeeper.ZooKeeper: Initiating client connection, connectString=127.0.0.1:44589 sessionTimeout=180000 watcher=hconnection 11/11/22 21:38:45 INFO zookeeper.ClientCnxn: Opening socket connection to server /127.0.0.1:44589 11/11/22 21:38:45 INFO server.NIOServerCnxn: Accepted socket connection from /127.0.0.1:48592 11/11/22 21:38:45 INFO zookeeper.ClientCnxn: Socket connection established to localhost/127.0.0.1:44589, initiating session 11/11/22 21:38:45 INFO server.NIOServerCnxn: Client attempting to establish new session at /127.0.0.1:48592 11/11/22 21:38:45 INFO server.NIOServerCnxn: Established session 0x133cd36705c0012 with negotiated timeout 40000 for client /127.0.0.1:48592 11/11/22 21:38:45 INFO zookeeper.ClientCnxn: Session establishment complete on server localhost/127.0.0.1:44589, sessionid = 0x133cd36705c0012, negotiated timeout = 40000 11/11/22 21:38:45 WARN zookeeper.ZKTable: Moving table hbasebulkoutputstoragedrivertestwithrevision.hbasebulkoutputstoragedrivertestwithrevision_2050865341583007314 state to enabled but was already enabled 11/11/22 21:38:45 INFO server.PrepRequestProcessor: Got user-level KeeperException when processing sessionid:0x133cd36705c0000 type:delete cxid:0x55 zxid:0xfffffffffffffffe txntype:unknown reqpath:n/a Error Path:/hbase/table/hbasebulkoutputstoragedrivertestwithrevision.hbasebulkoutputstoragedrivertestwithrevision_2050865341583007314 Error:KeeperErrorCode = NoNode for /hbase/table/hbasebulkoutputstoragedrivertestwithrevision.hbasebulkoutputstoragedrivertestwithrevision_2050865341583007314 11/11/22 21:38:45 INFO wal.HLog: HLog configuration: blocksize=32 MB, rollsize=30.4 MB, enabled=true, flushlogentries=1, optionallogflushinternal=1000ms 11/11/22 21:38:45 INFO wal.SequenceFileLogWriter: syncFs -- HDFS-200 -- not available, dfs.support.append=false 11/11/22 21:38:45 INFO wal.HLog: New hlog /homes/hortonas/hcat/hcat-trunk/storage-drivers/hbase/test_default_3510113083437113091/hbase/hbasebulkoutputstoragedrivertestwithrevision.hbasebulkoutputstoragedrivertestwithrevision_2050865341583007314/831faae27c3129e2cc4fca71efd176db/.logs/hlog.1321997925784 11/11/22 21:38:45 INFO wal.HLog: getNumCurrentReplicas--HDFS-826 not available; hdfs_out=org.apache.hadoop.fs.ChecksumFileSystem$ChecksumFSOutputSummer@107eafc, exception=org.apache.hadoop.fs.ChecksumFileSystem$ChecksumFSOutputSummer.getNumCurrentReplicas() 11/11/22 21:38:45 INFO regionserver.HRegion: Onlined hbasebulkoutputstoragedrivertestwithrevision.hbasebulkoutputstoragedrivertestwithrevision_2050865341583007314,,1321997925728.831faae27c3129e2cc4fca71efd176db.; next sequenceid=1 11/11/22 21:38:45 INFO catalog.MetaEditor: Added region hbasebulkoutputstoragedrivertestwithrevision.hbasebulkoutputstoragedrivertestwithrevision_2050865341583007314,,1321997925728.831faae27c3129e2cc4fca71efd176db. to META 11/11/22 21:38:45 INFO regionserver.HRegion: Closed hbasebulkoutputstoragedrivertestwithrevision.hbasebulkoutputstoragedrivertestwithrevision_2050865341583007314,,1321997925728.831faae27c3129e2cc4fca71efd176db. 
11/11/22 21:38:45 INFO wal.HLog: IPC Server handler 0 on 51911.logSyncer exiting 11/11/22 21:38:45 INFO master.AssignmentManager: Bulk assigning 1 region(s) round-robin across 1 server(s) 11/11/22 21:38:45 INFO master.AssignmentManager: Bulk assigning done 11/11/22 21:38:45 INFO master.AssignmentManager: hrt7n35.cc1.ygridcore.net,54808,1321997857252 unassigned znodes=1 of total=1 11/11/22 21:38:45 INFO regionserver.HRegionServer: Received request to open 1 region(s) 11/11/22 21:38:45 INFO regionserver.HRegionServer: Received request to open region: hbasebulkoutputstoragedrivertestwithrevision.hbasebulkoutputstoragedrivertestwithrevision_2050865341583007314,,1321997925728.831faae27c3129e2cc4fca71efd176db. 11/11/22 21:38:45 INFO regionserver.HRegion: Onlined hbasebulkoutputstoragedrivertestwithrevision.hbasebulkoutputstoragedrivertestwithrevision_2050865341583007314,,1321997925728.831faae27c3129e2cc4fca71efd176db.; next sequenceid=1 11/11/22 21:38:45 INFO catalog.MetaEditor: Updated row hbasebulkoutputstoragedrivertestwithrevision.hbasebulkoutputstoragedrivertestwithrevision_2050865341583007314,,1321997925728.831faae27c3129e2cc4fca71efd176db. in region .META.,,1 with server=hrt7n35.cc1.ygridcore.net:54808, startcode=1321997857252 11/11/22 21:38:46 INFO zookeeper.ZooKeeper: Initiating client connection, connectString=127.0.0.1:44589 sessionTimeout=180000 watcher=hconnection 11/11/22 21:38:46 INFO zookeeper.ClientCnxn: Opening socket connection to server /127.0.0.1:44589 11/11/22 21:38:46 INFO server.NIOServerCnxn: Accepted socket connection from /127.0.0.1:48593 11/11/22 21:38:46 INFO zookeeper.ClientCnxn: Socket connection established to localhost/127.0.0.1:44589, initiating session 11/11/22 21:38:46 INFO server.NIOServerCnxn: Client attempting to establish new session at /127.0.0.1:48593 11/11/22 21:38:46 INFO server.NIOServerCnxn: Established session 0x133cd36705c0013 with negotiated timeout 40000 for client /127.0.0.1:48593 11/11/22 21:38:46 INFO zookeeper.ClientCnxn: Session establishment complete on server localhost/127.0.0.1:44589, sessionid = 0x133cd36705c0013, negotiated timeout = 40000 11/11/22 21:38:46 INFO metastore.HiveMetaStore: 0: create_table: db=hbasebulkoutputstoragedrivertestwithrevision tbl=hbasebulkoutputstoragedrivertestwithrevision_2050865341583007314 11/11/22 21:38:46 INFO ql.Driver: OK 11/11/22 21:38:46 INFO ql.Driver: OK 11/11/22 21:38:46 INFO ql.Driver: 11/11/22 21:38:46 INFO ql.Driver: 11/11/22 21:38:47 INFO metastore.HiveMetaStore: 0: get_table : db=hbasebulkoutputstoragedrivertestwithrevision tbl=hbasebulkoutputstoragedrivertestwithrevision_2050865341583007314 11/11/22 21:38:47 INFO metastore.HiveMetaStore: 0: Opening raw store with implemenation class:org.apache.hadoop.hive.metastore.ObjectStore 11/11/22 21:38:47 INFO metastore.ObjectStore: ObjectStore, initialize called 11/11/22 21:38:47 INFO metastore.ObjectStore: Initialized ObjectStore 11/11/22 21:38:47 INFO metastore.HiveMetaStore: 0: Shutting down the object store... 11/11/22 21:38:47 INFO metastore.HiveMetaStore: 0: Metastore shutdown complete. 11/11/22 21:38:47 WARN mapred.JobClient: Use GenericOptionsParser for parsing the arguments. Applications should implement Tool for the same. 11/11/22 21:38:47 WARN mapred.JobClient: No job jar file set. User classes may not be found. See JobConf(Class) or JobConf#setJar(String). 
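[Editor's note: the get_table/create_table metastore calls just logged, together with the NoSuchObjectException ERROR entries at 21:38:25 and 21:38:45, show Hive probing for an existing table before creating it; for a brand-new table the probe throws, the exception is logged at ERROR level, and the CREATE TABLE then proceeds and returns OK, as the surrounding log confirms. A minimal sketch of that probe, assuming hypothetical database and table names and a metastore reachable through hive-site.xml on the classpath:

import org.apache.hadoop.hive.conf.HiveConf;
import org.apache.hadoop.hive.metastore.HiveMetaStoreClient;
import org.apache.hadoop.hive.metastore.api.NoSuchObjectException;

public class TableProbe {
    public static void main(String[] args) throws Exception {
        HiveMetaStoreClient client = new HiveMetaStoreClient(new HiveConf());
        try {
            client.getTable("mydb", "mytable"); // hypothetical names
            System.out.println("table already exists");
        } catch (NoSuchObjectException e) {
            // the normal outcome for a table that does not exist yet; this is
            // the same exception the metastore logs during CREATE TABLE above
            System.out.println("table not found; safe to create");
        } finally {
            client.close();
        }
    }
}
]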
11/11/22 21:38:47 INFO input.FileInputFormat: Total input paths to process : 1 11/11/22 21:38:47 INFO mapred.JobTracker: Job job_20111122213733305_0004 added successfully for user 'hortonas' to queue 'default' 11/11/22 21:38:47 INFO mapred.AuditLogger: USER=hortonas IP=127.0.0.1 OPERATION=SUBMIT_JOB TARGET=job_20111122213733305_0004 RESULT=SUCCESS 11/11/22 21:38:47 INFO mapred.JobTracker: Initializing job_20111122213733305_0004 11/11/22 21:38:47 INFO mapred.JobInProgress: Initializing job_20111122213733305_0004 11/11/22 21:38:47 INFO mapred.JobClient: Running job: job_20111122213733305_0004 11/11/22 21:38:47 INFO mapred.JobInProgress: jobToken generated and stored with users keys in /tmp/hadoop-hortonas/mapred/system/job_20111122213733305_0004/jobToken 11/11/22 21:38:47 INFO mapred.JobInProgress: Input size for job job_20111122213733305_0004 = 79. Number of splits = 1 11/11/22 21:38:47 INFO mapred.JobInProgress: tip:task_20111122213733305_0004_m_000000 has split on node:/default-rack/localhost 11/11/22 21:38:47 INFO mapred.JobInProgress: Job job_20111122213733305_0004 initialized successfully with 1 map tasks and 0 reduce tasks. 11/11/22 21:38:47 INFO mapred.JobTracker: Adding task (JOB_SETUP) 'attempt_20111122213733305_0004_m_000002_0' to tip task_20111122213733305_0004_m_000002, for tracker 'tracker_host0.foo.com:localhost/127.0.0.1:51131' 11/11/22 21:38:47 INFO mapred.TaskTracker: LaunchTaskAction (registerTask): attempt_20111122213733305_0004_m_000002_0 task's state:UNASSIGNED 11/11/22 21:38:47 INFO mapred.TaskTracker: Trying to launch : attempt_20111122213733305_0004_m_000002_0 which needs 1 slots 11/11/22 21:38:47 INFO mapred.TaskTracker: In TaskLauncher, current free slots : 2 and trying to launch attempt_20111122213733305_0004_m_000002_0 which needs 1 slots 11/11/22 21:38:47 INFO tasktracker.Localizer: User-directories for the user hortonas are already initialized on this TT. Not doing anything. 11/11/22 21:38:47 INFO mapred.JvmManager: JVM : jvm_20111122213733305_0003_m_1069216 exited with exit code 0. Number of tasks it ran: 1 11/11/22 21:38:48 INFO mapred.JvmManager: In JvmRunner constructed JVM ID: jvm_20111122213733305_0004_m_1596677326 11/11/22 21:38:48 INFO mapred.JvmManager: JVM Runner jvm_20111122213733305_0004_m_1596677326 spawned. 11/11/22 21:38:48 INFO mapred.JobClient: map 0% reduce 0% 11/11/22 21:38:48 INFO mapred.TaskTracker: JVM with ID: jvm_20111122213733305_0004_m_1596677326 given task: attempt_20111122213733305_0004_m_000002_0 11/11/22 21:38:49 INFO mapred.TaskTracker: attempt_20111122213733305_0004_m_000002_0 0.0% setup 11/11/22 21:38:49 INFO mapred.TaskTracker: Task attempt_20111122213733305_0004_m_000002_0 is done. 11/11/22 21:38:49 INFO mapred.TaskTracker: reported output size for attempt_20111122213733305_0004_m_000002_0 was -1 11/11/22 21:38:49 INFO mapred.TaskTracker: addFreeSlot : current free slots : 2 11/11/22 21:38:49 WARN util.ProcessTree: Error executing shell command org.apache.hadoop.util.Shell$ExitCodeException: kill -12299: No such process 11/11/22 21:38:49 INFO util.ProcessTree: Killing all processes in the process group 12299 with SIGTERM. Exit code 1 11/11/22 21:38:50 INFO mapred.JobInProgress: Task 'attempt_20111122213733305_0004_m_000002_0' has completed task_20111122213733305_0004_m_000002 successfully. 
11/11/22 21:38:50 INFO mapred.JobTracker: Adding task (MAP) 'attempt_20111122213733305_0004_m_000000_0' to tip task_20111122213733305_0004_m_000000, for tracker 'tracker_host0.foo.com:localhost/127.0.0.1:51131' 11/11/22 21:38:50 INFO mapred.JobInProgress: Choosing rack-local task task_20111122213733305_0004_m_000000 11/11/22 21:38:50 INFO mapred.TaskTracker: LaunchTaskAction (registerTask): attempt_20111122213733305_0004_m_000000_0 task's state:UNASSIGNED 11/11/22 21:38:50 INFO mapred.TaskTracker: Trying to launch : attempt_20111122213733305_0004_m_000000_0 which needs 1 slots 11/11/22 21:38:50 INFO mapred.TaskTracker: Received KillTaskAction for task: attempt_20111122213733305_0004_m_000002_0 11/11/22 21:38:50 INFO mapred.TaskTracker: In TaskLauncher, current free slots : 2 and trying to launch attempt_20111122213733305_0004_m_000000_0 which needs 1 slots 11/11/22 21:38:50 INFO mapred.TaskTracker: About to purge task: attempt_20111122213733305_0004_m_000002_0 11/11/22 21:38:50 INFO tasktracker.Localizer: User-directories for the user hortonas are already initialized on this TT. Not doing anything. 11/11/22 21:38:50 INFO mapred.TaskRunner: attempt_20111122213733305_0004_m_000002_0 done; removing files. 11/11/22 21:38:50 INFO mapred.IndexCache: Map ID attempt_20111122213733305_0004_m_000002_0 not found in cache 11/11/22 21:38:51 INFO mapred.JvmManager: In JvmRunner constructed JVM ID: jvm_20111122213733305_0004_m_-1127020952 11/11/22 21:38:51 INFO mapred.JvmManager: JVM Runner jvm_20111122213733305_0004_m_-1127020952 spawned. 11/11/22 21:38:51 INFO mapred.JobClient: Task Id : attempt_20111122213733305_0004_m_000002_0, Status : SUCCEEDED attempt_20111122213733305_0004_m_000002_0: SLF4J: Class path contains multiple SLF4J bindings. attempt_20111122213733305_0004_m_000002_0: SLF4J: Found binding in [jar:file:/homes/hortonas/hcat/hcat-trunk/hive/external/build/ivy/lib/default/slf4j-log4j12-1.6.1.jar!/org/slf4j/impl/StaticLoggerBinder.class] attempt_20111122213733305_0004_m_000002_0: SLF4J: Found binding in [jar:file:/homes/hortonas/hcat/hcat-trunk/hive/external/build/hadoopcore/hadoop-0.20.3-CDH3-SNAPSHOT/lib/slf4j-log4j12-1.4.3.jar!/org/slf4j/impl/StaticLoggerBinder.class] attempt_20111122213733305_0004_m_000002_0: SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation. 11/11/22 21:38:51 INFO mapred.TaskTracker: JVM with ID: jvm_20111122213733305_0004_m_-1127020952 given task: attempt_20111122213733305_0004_m_000000_0 11/11/22 21:38:52 INFO mapred.TaskTracker: Task attempt_20111122213733305_0004_m_000000_0 is in commit-pending, task state:COMMIT_PENDING 11/11/22 21:38:52 INFO mapred.TaskTracker: attempt_20111122213733305_0004_m_000000_0 0.0% 11/11/22 21:38:53 INFO mapred.TaskTracker: Received commit task action for attempt_20111122213733305_0004_m_000000_0 11/11/22 21:38:54 INFO mapred.JvmManager: JVM : jvm_20111122213733305_0004_m_1596677326 exited with exit code 0. Number of tasks it ran: 1 11/11/22 21:38:54 INFO mapred.TaskTracker: attempt_20111122213733305_0004_m_000000_0 1.0% 11/11/22 21:38:54 INFO mapred.TaskTracker: Task attempt_20111122213733305_0004_m_000000_0 is done. 
11/11/22 21:38:54 INFO mapred.TaskTracker: reported output size for attempt_20111122213733305_0004_m_000000_0 was -1 11/11/22 21:38:54 INFO mapred.TaskTracker: addFreeSlot : current free slots : 2 11/11/22 21:38:54 WARN util.ProcessTree: Error executing shell command org.apache.hadoop.util.Shell$ExitCodeException: kill -12339: No such process 11/11/22 21:38:54 INFO util.ProcessTree: Killing all processes in the process group 12339 with SIGTERM. Exit code 1 11/11/22 21:38:56 INFO mapred.JobInProgress: Task 'attempt_20111122213733305_0004_m_000000_0' has completed task_20111122213733305_0004_m_000000 successfully. 11/11/22 21:38:56 INFO mapred.JobTracker: Adding task (JOB_CLEANUP) 'attempt_20111122213733305_0004_m_000001_0' to tip task_20111122213733305_0004_m_000001, for tracker 'tracker_host0.foo.com:localhost/127.0.0.1:51131' 11/11/22 21:38:56 INFO mapred.TaskTracker: LaunchTaskAction (registerTask): attempt_20111122213733305_0004_m_000001_0 task's state:UNASSIGNED 11/11/22 21:38:56 INFO mapred.TaskTracker: Trying to launch : attempt_20111122213733305_0004_m_000001_0 which needs 1 slots 11/11/22 21:38:56 INFO mapred.TaskTracker: In TaskLauncher, current free slots : 2 and trying to launch attempt_20111122213733305_0004_m_000001_0 which needs 1 slots 11/11/22 21:38:56 INFO tasktracker.Localizer: User-directories for the user hortonas are already initialized on this TT. Not doing anything. 11/11/22 21:38:57 INFO mapred.JvmManager: In JvmRunner constructed JVM ID: jvm_20111122213733305_0004_m_-931685464 11/11/22 21:38:57 INFO mapred.JvmManager: JVM Runner jvm_20111122213733305_0004_m_-931685464 spawned. 11/11/22 21:38:57 INFO mapred.JobClient: Task Id : attempt_20111122213733305_0004_m_000000_0, Status : SUCCEEDED attempt_20111122213733305_0004_m_000000_0: SLF4J: Class path contains multiple SLF4J bindings. attempt_20111122213733305_0004_m_000000_0: SLF4J: Found binding in [jar:file:/homes/hortonas/hcat/hcat-trunk/hive/external/build/ivy/lib/default/slf4j-log4j12-1.6.1.jar!/org/slf4j/impl/StaticLoggerBinder.class] attempt_20111122213733305_0004_m_000000_0: SLF4J: Found binding in [jar:file:/homes/hortonas/hcat/hcat-trunk/hive/external/build/hadoopcore/hadoop-0.20.3-CDH3-SNAPSHOT/lib/slf4j-log4j12-1.4.3.jar!/org/slf4j/impl/StaticLoggerBinder.class] attempt_20111122213733305_0004_m_000000_0: SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation. 11/11/22 21:38:57 INFO mapred.TaskTracker: JVM with ID: jvm_20111122213733305_0004_m_-931685464 given task: attempt_20111122213733305_0004_m_000001_0 11/11/22 21:38:57 INFO mapred.TaskTracker: attempt_20111122213733305_0004_m_000001_0 0.0% 11/11/22 21:38:58 INFO server.NIOServerCnxn: Accepted socket connection from /127.0.0.1:48602 11/11/22 21:38:58 INFO server.NIOServerCnxn: Client attempting to establish new session at /127.0.0.1:48602 11/11/22 21:38:58 INFO server.NIOServerCnxn: Established session 0x133cd36705c0014 with negotiated timeout 40000 for client /127.0.0.1:48602 11/11/22 21:38:58 INFO mapred.JobClient: map 100% reduce 0% 11/11/22 21:38:59 INFO mapred.JvmManager: JVM : jvm_20111122213733305_0004_m_-1127020952 exited with exit code 0. 
Number of tasks it ran: 1 11/11/22 21:39:00 INFO server.NIOServerCnxn: Accepted socket connection from /127.0.0.1:48604 11/11/22 21:39:00 INFO server.NIOServerCnxn: Client attempting to establish new session at /127.0.0.1:48604 11/11/22 21:39:00 INFO server.NIOServerCnxn: Established session 0x133cd36705c0015 with negotiated timeout 40000 for client /127.0.0.1:48604 11/11/22 21:39:00 INFO regionserver.Store: Validating hfile at file:/homes/hortonas/hcat/hcat-trunk/storage-drivers/hbase/test_default_3510113083437113091/hbaseBulkOutputStorageDriverTestWithRevision/DB_hbaseBulkOutputStorageDriverTestWithRevision/hbasebulkoutputstoragedrivertestwithrevision_2050865341583007314/REVISION_1_hfiles/my_family/2130252027373800299 for inclusion in store my_family region hbasebulkoutputstoragedrivertestwithrevision.hbasebulkoutputstoragedrivertestwithrevision_2050865341583007314,,1321997925728.831faae27c3129e2cc4fca71efd176db. 11/11/22 21:39:00 INFO regionserver.Store: Renaming bulk load file file:/homes/hortonas/hcat/hcat-trunk/storage-drivers/hbase/test_default_3510113083437113091/hbaseBulkOutputStorageDriverTestWithRevision/DB_hbaseBulkOutputStorageDriverTestWithRevision/hbasebulkoutputstoragedrivertestwithrevision_2050865341583007314/REVISION_1_hfiles/my_family/2130252027373800299 to file:/homes/hortonas/hcat/hcat-trunk/storage-drivers/hbase/test_default_3510113083437113091/hbase/hbasebulkoutputstoragedrivertestwithrevision.hbasebulkoutputstoragedrivertestwithrevision_2050865341583007314/831faae27c3129e2cc4fca71efd176db/my_family/8938881475959867683 11/11/22 21:39:00 INFO regionserver.Store: Moved hfile file:/homes/hortonas/hcat/hcat-trunk/storage-drivers/hbase/test_default_3510113083437113091/hbaseBulkOutputStorageDriverTestWithRevision/DB_hbaseBulkOutputStorageDriverTestWithRevision/hbasebulkoutputstoragedrivertestwithrevision_2050865341583007314/REVISION_1_hfiles/my_family/2130252027373800299 into store directory file:/homes/hortonas/hcat/hcat-trunk/storage-drivers/hbase/test_default_3510113083437113091/hbase/hbasebulkoutputstoragedrivertestwithrevision.hbasebulkoutputstoragedrivertestwithrevision_2050865341583007314/831faae27c3129e2cc4fca71efd176db/my_family - updating store file list. 11/11/22 21:39:00 INFO regionserver.Store: Successfully loaded store file file:/homes/hortonas/hcat/hcat-trunk/storage-drivers/hbase/test_default_3510113083437113091/hbaseBulkOutputStorageDriverTestWithRevision/DB_hbaseBulkOutputStorageDriverTestWithRevision/hbasebulkoutputstoragedrivertestwithrevision_2050865341583007314/REVISION_1_hfiles/my_family/2130252027373800299 into store my_family (new location: file:/homes/hortonas/hcat/hcat-trunk/storage-drivers/hbase/test_default_3510113083437113091/hbase/hbasebulkoutputstoragedrivertestwithrevision.hbasebulkoutputstoragedrivertestwithrevision_2050865341583007314/831faae27c3129e2cc4fca71efd176db/my_family/8938881475959867683) 11/11/22 21:39:00 INFO mapred.TaskTracker: attempt_20111122213733305_0004_m_000001_0 0.0% cleanup 11/11/22 21:39:00 INFO mapred.TaskTracker: Task attempt_20111122213733305_0004_m_000001_0 is done. 
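The regionserver.Store sequence above (validate, rename, move, update store file list) is the server side of an HFile bulk load: the REVISION_1_hfiles directory written by the job holds one subdirectory per column family (my_family here), and each HFile is adopted into the region's store by rename rather than rewritten. The log does not show which client-side path the test uses to trigger this; one standard client-side route in HBase of this era is the LoadIncrementalHFiles tool, sketched below with the directory and table name as placeholders:

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.Path;
    import org.apache.hadoop.hbase.HBaseConfiguration;
    import org.apache.hadoop.hbase.client.HTable;
    import org.apache.hadoop.hbase.mapreduce.LoadIncrementalHFiles;

    public class BulkLoadExample {
        public static void main(String[] args) throws Exception {
            Configuration conf = HBaseConfiguration.create();
            // Expected layout: <dir>/<column family>/<hfile>, matching the
            // .../REVISION_1_hfiles/my_family/... paths in the log above.
            Path hfileDir = new Path(args[0]);
            HTable table = new HTable(conf, args[1]);
            try {
                // Renames each HFile into the matching region store, producing
                // the Store "Validating/Renaming/Moved" lines seen above.
                new LoadIncrementalHFiles(conf).doBulkLoad(hfileDir, table);
            } finally {
                table.close();
            }
        }
    }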
11/11/22 21:39:00 INFO mapred.TaskTracker: reported output size for attempt_20111122213733305_0004_m_000001_0 was -1 11/11/22 21:39:00 INFO mapred.TaskTracker: addFreeSlot : current free slots : 2 11/11/22 21:39:00 WARN server.NIOServerCnxn: EndOfStreamException: Unable to read additional data from client sessionid 0x133cd36705c0014, likely client has closed socket 11/11/22 21:39:00 INFO server.NIOServerCnxn: Closed socket connection for client /127.0.0.1:48602 which had sessionid 0x133cd36705c0014 11/11/22 21:39:00 WARN server.NIOServerCnxn: EndOfStreamException: Unable to read additional data from client sessionid 0x133cd36705c0015, likely client has closed socket 11/11/22 21:39:00 INFO server.NIOServerCnxn: Closed socket connection for client /127.0.0.1:48604 which had sessionid 0x133cd36705c0015 11/11/22 21:39:00 WARN util.ProcessTree: Error executing shell command org.apache.hadoop.util.Shell$ExitCodeException: kill -12379: No such process 11/11/22 21:39:00 INFO util.ProcessTree: Killing all processes in the process group 12379 with SIGTERM. Exit code 1 11/11/22 21:39:03 INFO mapred.JobInProgress: Task 'attempt_20111122213733305_0004_m_000001_0' has completed task_20111122213733305_0004_m_000001 successfully. 11/11/22 21:39:03 INFO mapred.JobInProgress: Job job_20111122213733305_0004 has completed successfully. 11/11/22 21:39:03 INFO mapred.JobInProgress$JobSummary: jobId=job_20111122213733305_0004,submitTime=1321997927132,launchTime=1321997927340,finishTime=1321997943007,numMaps=1,numSlotsPerMap=1,numReduces=0,numSlotsPerReduce=1,user=hortonas,queue=default,status=SUCCEEDED,mapSlotSeconds=7,reduceSlotsSeconds=0,clusterMapCapacity=2,clusterReduceCapacity=2 11/11/22 21:39:03 INFO mapred.JobHistory: Moving file:/homes/hortonas/hcat/hcat-trunk/storage-drivers/hbase/test_default_3510113083437113091/logs/history/localhost_1321997853671_job_20111122213733305_0004_hortonas_hbaseBulkOutputStorageDriverTestWithRevision to file:/homes/hortonas/hcat/hcat-trunk/storage-drivers/hbase/test_default_3510113083437113091/logs/history/done 11/11/22 21:39:03 INFO mapred.JobTracker: Removing task 'attempt_20111122213733305_0004_m_000000_0' 11/11/22 21:39:03 INFO mapred.JobTracker: Removing task 'attempt_20111122213733305_0004_m_000001_0' 11/11/22 21:39:03 INFO mapred.JobTracker: Removing task 'attempt_20111122213733305_0004_m_000002_0' 11/11/22 21:39:03 INFO mapred.TaskTracker: Received 'KillJobAction' for job: job_20111122213733305_0004 11/11/22 21:39:03 INFO mapred.TaskRunner: attempt_20111122213733305_0004_m_000001_0 done; removing files. 11/11/22 21:39:03 INFO mapred.IndexCache: Map ID attempt_20111122213733305_0004_m_000001_0 not found in cache 11/11/22 21:39:03 INFO mapred.TaskRunner: attempt_20111122213733305_0004_m_000000_0 done; removing files. 
11/11/22 21:39:03 INFO mapred.IndexCache: Map ID attempt_20111122213733305_0004_m_000000_0 not found in cache
11/11/22 21:39:03 INFO mapred.UserLogCleaner: Adding job_20111122213733305_0004 for user-log deletion with retainTimeStamp:1322084343046
11/11/22 21:39:03 INFO mapred.JobHistory: Moving file:/homes/hortonas/hcat/hcat-trunk/storage-drivers/hbase/test_default_3510113083437113091/logs/history/localhost_1321997853671_job_20111122213733305_0004_conf.xml to file:/homes/hortonas/hcat/hcat-trunk/storage-drivers/hbase/test_default_3510113083437113091/logs/history/done
11/11/22 21:39:03 INFO mapred.JobClient: Task Id : attempt_20111122213733305_0004_m_000001_0, Status : SUCCEEDED
attempt_20111122213733305_0004_m_000001_0: SLF4J: Class path contains multiple SLF4J bindings.
attempt_20111122213733305_0004_m_000001_0: SLF4J: Found binding in [jar:file:/homes/hortonas/hcat/hcat-trunk/hive/external/build/ivy/lib/default/slf4j-log4j12-1.6.1.jar!/org/slf4j/impl/StaticLoggerBinder.class]
attempt_20111122213733305_0004_m_000001_0: SLF4J: Found binding in [jar:file:/homes/hortonas/hcat/hcat-trunk/hive/external/build/hadoopcore/hadoop-0.20.3-CDH3-SNAPSHOT/lib/slf4j-log4j12-1.4.3.jar!/org/slf4j/impl/StaticLoggerBinder.class]
attempt_20111122213733305_0004_m_000001_0: SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
11/11/22 21:39:03 INFO mapred.JobClient: Job complete: job_20111122213733305_0004
11/11/22 21:39:03 INFO mapred.JobClient: Counters: 12
11/11/22 21:39:03 INFO mapred.JobClient:   Job Counters
11/11/22 21:39:03 INFO mapred.JobClient:     SLOTS_MILLIS_MAPS=7514
11/11/22 21:39:03 INFO mapred.JobClient:     Total time spent by all reduces waiting after reserving slots (ms)=0
11/11/22 21:39:03 INFO mapred.JobClient:     Total time spent by all maps waiting after reserving slots (ms)=0
11/11/22 21:39:03 INFO mapred.JobClient:     Rack-local map tasks=1
11/11/22 21:39:03 INFO mapred.JobClient:     Launched map tasks=1
11/11/22 21:39:03 INFO mapred.JobClient:     SLOTS_MILLIS_REDUCES=0
11/11/22 21:39:03 INFO mapred.JobClient:   FileSystemCounters
11/11/22 21:39:03 INFO mapred.JobClient:     FILE_BYTES_READ=335
11/11/22 21:39:03 INFO mapred.JobClient:     FILE_BYTES_WRITTEN=554
11/11/22 21:39:03 INFO mapred.JobClient:   Map-Reduce Framework
11/11/22 21:39:03 INFO mapred.JobClient:     Map input records=3
11/11/22 21:39:03 INFO mapred.JobClient:     Spilled Records=0
11/11/22 21:39:03 INFO mapred.JobClient:     Map output records=3
11/11/22 21:39:03 INFO mapred.JobClient:     SPLIT_RAW_BYTES=225
11/11/22 21:39:03 INFO zookeeper.ZooKeeper: Initiating client connection, connectString=127.0.0.1:44589 sessionTimeout=180000 watcher=hconnection
11/11/22 21:39:03 INFO zookeeper.ClientCnxn: Opening socket connection to server /127.0.0.1:44589
11/11/22 21:39:03 INFO server.NIOServerCnxn: Accepted socket connection from /127.0.0.1:48606
11/11/22 21:39:03 INFO zookeeper.ClientCnxn: Socket connection established to localhost/127.0.0.1:44589, initiating session
11/11/22 21:39:03 INFO server.NIOServerCnxn: Client attempting to establish new session at /127.0.0.1:48606
11/11/22 21:39:03 INFO server.NIOServerCnxn: Established session 0x133cd36705c0016 with negotiated timeout 40000 for client /127.0.0.1:48606
11/11/22 21:39:03 INFO zookeeper.ClientCnxn: Session establishment complete on server localhost/127.0.0.1:44589, sessionid = 0x133cd36705c0016, negotiated timeout = 40000
Hive history file=/tmp/hortonas/hive_job_log_hortonas_201111222137_775558420.txt
11/11/22 21:39:03 INFO exec.HiveHistory: Hive history file=/tmp/hortonas/hive_job_log_hortonas_201111222137_775558420.txt
11/11/22 21:39:03 INFO ql.Driver:
11/11/22 21:39:03 INFO parse.ParseDriver: Parsing command: CREATE DATABASE IF NOT EXISTS default LOCATION '/homes/hortonas/hcat/hcat-trunk/storage-drivers/hbase/test_default_3510113083437113091/hbaseBulkOutputStorageDriverTestWithDefaultDB/DB_hbaseBulkOutputStorageDriverTestWithDefaultDB'
11/11/22 21:39:03 INFO parse.ParseDriver: Parse Completed
11/11/22 21:39:03 INFO metastore.HiveMetaStore: 0: get_databases: default
11/11/22 21:39:03 INFO ql.Driver: Semantic Analysis Completed
11/11/22 21:39:03 INFO ql.Driver: Returning Hive schema: Schema(fieldSchemas:null, properties:null)
11/11/22 21:39:03 INFO ql.Driver:
11/11/22 21:39:03 INFO ql.Driver:
11/11/22 21:39:03 INFO ql.Driver: Starting command: CREATE DATABASE IF NOT EXISTS default LOCATION '/homes/hortonas/hcat/hcat-trunk/storage-drivers/hbase/test_default_3510113083437113091/hbaseBulkOutputStorageDriverTestWithDefaultDB/DB_hbaseBulkOutputStorageDriverTestWithDefaultDB'
11/11/22 21:39:03 INFO metastore.HiveMetaStore: 0: create_database: default /homes/hortonas/hcat/hcat-trunk/storage-drivers/hbase/test_default_3510113083437113091/hbaseBulkOutputStorageDriverTestWithDefaultDB/DB_hbaseBulkOutputStorageDriverTestWithDefaultDB null
11/11/22 21:39:03 INFO metastore.HiveMetaStore: 0: get_database: default
11/11/22 21:39:03 INFO ql.Driver: OK
11/11/22 21:39:03 INFO ql.Driver: OK
11/11/22 21:39:03 INFO ql.Driver:
11/11/22 21:39:03 INFO ql.Driver:
11/11/22 21:39:03 INFO ql.Driver:
11/11/22 21:39:03 INFO parse.ParseDriver: Parsing command: CREATE TABLE default.hbasebulkoutputstoragedrivertestwithdefaultdb_6545935884595718003(key int, english string, spanish string) STORED BY 'org.apache.hcatalog.hbase.HBaseHCatStorageHandler'TBLPROPERTIES ('hcat.isd'='org.apache.hcatalog.hbase.HBaseInputStorageDriver', 'hcat.osd'='org.apache.hcatalog.hbase.HBaseOutputStorageDriver','hbase.columns.mapping'=':key,my_family:english,my_family:spanish')
11/11/22 21:39:03 INFO parse.ParseDriver: Parse Completed
11/11/22 21:39:03 INFO parse.SemanticAnalyzer: Starting Semantic Analysis
11/11/22 21:39:03 INFO parse.SemanticAnalyzer: Creating table default.hbasebulkoutputstoragedrivertestwithdefaultdb_6545935884595718003 position=13
11/11/22 21:39:03 INFO metastore.HiveMetaStore: 0: get_table : db=default tbl=hbasebulkoutputstoragedrivertestwithdefaultdb_6545935884595718003
11/11/22 21:39:03 ERROR metadata.Hive: NoSuchObjectException(message:default.hbasebulkoutputstoragedrivertestwithdefaultdb_6545935884595718003 table not found)
	at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler$17.run(HiveMetaStore.java:1183)
	at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler$17.run(HiveMetaStore.java:1178)
	at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.executeWithRetry(HiveMetaStore.java:348)
	at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.get_table(HiveMetaStore.java:1178)
	at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.getTable(HiveMetaStoreClient.java:713)
	at org.apache.hadoop.hive.ql.metadata.Hive.getTable(Hive.java:901)
	at org.apache.hadoop.hive.ql.metadata.Hive.getTable(Hive.java:843)
	at org.apache.hcatalog.cli.SemanticAnalysis.CreateTableHook.postAnalyze(CreateTableHook.java:252)
	at org.apache.hcatalog.cli.SemanticAnalysis.HCatSemanticAnalyzer.postAnalyze(HCatSemanticAnalyzer.java:186)
	at org.apache.hadoop.hive.ql.Driver.compile(Driver.java:432)
	at org.apache.hadoop.hive.ql.Driver.compile(Driver.java:337)
	at org.apache.hadoop.hive.ql.Driver.run(Driver.java:889)
	at org.apache.hcatalog.cli.HCatDriver.run(HCatDriver.java:42)
	at org.apache.hcatalog.hbase.TestHBaseBulkOutputStorageDriver.hbaseBulkOutputStorageDriverTestWithDefaultDB(TestHBaseBulkOutputStorageDriver.java:470)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
	at java.lang.reflect.Method.invoke(Method.java:597)
	at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:44)
	at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:15)
	at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:41)
	at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:20)
	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:28)
	at org.junit.internal.runners.statements.RunAfters.evaluate(RunAfters.java:31)
	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:73)
	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:46)
	at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:180)
	at org.junit.runners.ParentRunner.access$000(ParentRunner.java:41)
	at org.junit.runners.ParentRunner$1.evaluate(ParentRunner.java:173)
	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:28)
	at org.junit.internal.runners.statements.RunAfters.evaluate(RunAfters.java:31)
	at org.junit.runners.ParentRunner.run(ParentRunner.java:220)
	at junit.framework.JUnit4TestAdapter.run(JUnit4TestAdapter.java:39)
	at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:518)
	at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:1052)
	at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:906)
11/11/22 21:39:03 INFO ql.Driver: Semantic Analysis Completed
11/11/22 21:39:03 INFO ql.Driver: Returning Hive schema: Schema(fieldSchemas:null, properties:null)
11/11/22 21:39:03 INFO ql.Driver:
11/11/22 21:39:03 INFO ql.Driver:
11/11/22 21:39:03 INFO ql.Driver: Starting command: CREATE TABLE default.hbasebulkoutputstoragedrivertestwithdefaultdb_6545935884595718003(key int, english string, spanish string) STORED BY 'org.apache.hcatalog.hbase.HBaseHCatStorageHandler'TBLPROPERTIES ('hcat.isd'='org.apache.hcatalog.hbase.HBaseInputStorageDriver', 'hcat.osd'='org.apache.hcatalog.hbase.HBaseOutputStorageDriver','hbase.columns.mapping'=':key,my_family:english,my_family:spanish')
11/11/22 21:39:03 INFO exec.DDLTask: Use StorageHandler-supplied org.apache.hadoop.hive.hbase.HBaseSerDe for table default.hbasebulkoutputstoragedrivertestwithdefaultdb_6545935884595718003
11/11/22 21:39:03 INFO hive.log: DDL: struct hbasebulkoutputstoragedrivertestwithdefaultdb_6545935884595718003 { i32 key, string english, string spanish}
11/11/22 21:39:03 INFO hive.log: DDL: struct hbasebulkoutputstoragedrivertestwithdefaultdb_6545935884595718003 { i32 key, string english, string spanish}
11/11/22 21:39:03 INFO zookeeper.ZooKeeper: Initiating client connection, connectString=127.0.0.1:44589 sessionTimeout=180000 watcher=hconnection
11/11/22 21:39:03 INFO zookeeper.ClientCnxn: Opening socket connection to server /127.0.0.1:44589
11/11/22 21:39:03 INFO server.NIOServerCnxn: Accepted socket connection from /127.0.0.1:48608
11/11/22 21:39:03 INFO
zookeeper.ClientCnxn: Socket connection established to localhost/127.0.0.1:44589, initiating session 11/11/22 21:39:03 INFO server.NIOServerCnxn: Client attempting to establish new session at /127.0.0.1:48608 11/11/22 21:39:03 INFO server.NIOServerCnxn: Established session 0x133cd36705c0017 with negotiated timeout 40000 for client /127.0.0.1:48608 11/11/22 21:39:03 INFO zookeeper.ClientCnxn: Session establishment complete on server localhost/127.0.0.1:44589, sessionid = 0x133cd36705c0017, negotiated timeout = 40000 11/11/22 21:39:03 WARN zookeeper.ZKTable: Moving table hbasebulkoutputstoragedrivertestwithdefaultdb_6545935884595718003 state to enabled but was already enabled 11/11/22 21:39:03 INFO server.PrepRequestProcessor: Got user-level KeeperException when processing sessionid:0x133cd36705c0000 type:delete cxid:0x60 zxid:0xfffffffffffffffe txntype:unknown reqpath:n/a Error Path:/hbase/table/hbasebulkoutputstoragedrivertestwithdefaultdb_6545935884595718003 Error:KeeperErrorCode = NoNode for /hbase/table/hbasebulkoutputstoragedrivertestwithdefaultdb_6545935884595718003 11/11/22 21:39:03 INFO wal.HLog: HLog configuration: blocksize=32 MB, rollsize=30.4 MB, enabled=true, flushlogentries=1, optionallogflushinternal=1000ms 11/11/22 21:39:03 INFO wal.SequenceFileLogWriter: syncFs -- HDFS-200 -- not available, dfs.support.append=false 11/11/22 21:39:03 INFO wal.HLog: New hlog /homes/hortonas/hcat/hcat-trunk/storage-drivers/hbase/test_default_3510113083437113091/hbase/hbasebulkoutputstoragedrivertestwithdefaultdb_6545935884595718003/2d8956c1cb44ebb55f255ad7835bbeb1/.logs/hlog.1321997943422 11/11/22 21:39:03 INFO wal.HLog: getNumCurrentReplicas--HDFS-826 not available; hdfs_out=org.apache.hadoop.fs.ChecksumFileSystem$ChecksumFSOutputSummer@180dd34, exception=org.apache.hadoop.fs.ChecksumFileSystem$ChecksumFSOutputSummer.getNumCurrentReplicas() 11/11/22 21:39:03 INFO regionserver.HRegion: Onlined hbasebulkoutputstoragedrivertestwithdefaultdb_6545935884595718003,,1321997943392.2d8956c1cb44ebb55f255ad7835bbeb1.; next sequenceid=1 11/11/22 21:39:03 INFO catalog.MetaEditor: Added region hbasebulkoutputstoragedrivertestwithdefaultdb_6545935884595718003,,1321997943392.2d8956c1cb44ebb55f255ad7835bbeb1. to META 11/11/22 21:39:03 INFO regionserver.HRegion: Closed hbasebulkoutputstoragedrivertestwithdefaultdb_6545935884595718003,,1321997943392.2d8956c1cb44ebb55f255ad7835bbeb1. 11/11/22 21:39:03 INFO wal.HLog: IPC Server handler 0 on 51911.logSyncer exiting 11/11/22 21:39:03 INFO master.AssignmentManager: Bulk assigning 1 region(s) round-robin across 1 server(s) 11/11/22 21:39:03 INFO master.AssignmentManager: Bulk assigning done 11/11/22 21:39:03 INFO master.AssignmentManager: hrt7n35.cc1.ygridcore.net,54808,1321997857252 unassigned znodes=1 of total=1 11/11/22 21:39:03 INFO regionserver.HRegionServer: Received request to open 1 region(s) 11/11/22 21:39:03 INFO regionserver.HRegionServer: Received request to open region: hbasebulkoutputstoragedrivertestwithdefaultdb_6545935884595718003,,1321997943392.2d8956c1cb44ebb55f255ad7835bbeb1. 11/11/22 21:39:03 INFO regionserver.HRegion: Onlined hbasebulkoutputstoragedrivertestwithdefaultdb_6545935884595718003,,1321997943392.2d8956c1cb44ebb55f255ad7835bbeb1.; next sequenceid=1 11/11/22 21:39:03 INFO catalog.MetaEditor: Updated row hbasebulkoutputstoragedrivertestwithdefaultdb_6545935884595718003,,1321997943392.2d8956c1cb44ebb55f255ad7835bbeb1. 
in region .META.,,1 with server=hrt7n35.cc1.ygridcore.net:54808, startcode=1321997857252 11/11/22 21:39:04 INFO zookeeper.ZooKeeper: Initiating client connection, connectString=127.0.0.1:44589 sessionTimeout=180000 watcher=hconnection 11/11/22 21:39:04 INFO zookeeper.ClientCnxn: Opening socket connection to server /127.0.0.1:44589 11/11/22 21:39:04 INFO server.NIOServerCnxn: Accepted socket connection from /127.0.0.1:48609 11/11/22 21:39:04 INFO zookeeper.ClientCnxn: Socket connection established to localhost/127.0.0.1:44589, initiating session 11/11/22 21:39:04 INFO server.NIOServerCnxn: Client attempting to establish new session at /127.0.0.1:48609 11/11/22 21:39:04 INFO server.NIOServerCnxn: Established session 0x133cd36705c0018 with negotiated timeout 40000 for client /127.0.0.1:48609 11/11/22 21:39:04 INFO zookeeper.ClientCnxn: Session establishment complete on server localhost/127.0.0.1:44589, sessionid = 0x133cd36705c0018, negotiated timeout = 40000 11/11/22 21:39:04 INFO metastore.HiveMetaStore: 0: create_table: db=default tbl=hbasebulkoutputstoragedrivertestwithdefaultdb_6545935884595718003 11/11/22 21:39:04 INFO ql.Driver: OK 11/11/22 21:39:04 INFO ql.Driver: OK 11/11/22 21:39:04 INFO ql.Driver: 11/11/22 21:39:04 INFO ql.Driver: 11/11/22 21:39:04 INFO metastore.HiveMetaStore: 0: get_table : db=default tbl=hbasebulkoutputstoragedrivertestwithdefaultdb_6545935884595718003 11/11/22 21:39:04 INFO metastore.HiveMetaStore: 0: Opening raw store with implemenation class:org.apache.hadoop.hive.metastore.ObjectStore 11/11/22 21:39:04 INFO metastore.ObjectStore: ObjectStore, initialize called 11/11/22 21:39:04 INFO metastore.ObjectStore: Initialized ObjectStore 11/11/22 21:39:04 INFO metastore.HiveMetaStore: 0: Shutting down the object store... 11/11/22 21:39:04 INFO metastore.HiveMetaStore: 0: Metastore shutdown complete. 11/11/22 21:39:04 WARN mapred.JobClient: Use GenericOptionsParser for parsing the arguments. Applications should implement Tool for the same. 11/11/22 21:39:04 WARN mapred.JobClient: No job jar file set. User classes may not be found. See JobConf(Class) or JobConf#setJar(String). 11/11/22 21:39:04 INFO input.FileInputFormat: Total input paths to process : 1 11/11/22 21:39:04 INFO mapred.JobTracker: Job job_20111122213733305_0005 added successfully for user 'hortonas' to queue 'default' 11/11/22 21:39:04 INFO mapred.JobTracker: Initializing job_20111122213733305_0005 11/11/22 21:39:04 INFO mapred.AuditLogger: USER=hortonas IP=127.0.0.1 OPERATION=SUBMIT_JOB TARGET=job_20111122213733305_0005 RESULT=SUCCESS 11/11/22 21:39:04 INFO mapred.JobInProgress: Initializing job_20111122213733305_0005 11/11/22 21:39:04 INFO mapred.JobClient: Running job: job_20111122213733305_0005 11/11/22 21:39:05 INFO mapred.JobInProgress: jobToken generated and stored with users keys in /tmp/hadoop-hortonas/mapred/system/job_20111122213733305_0005/jobToken 11/11/22 21:39:05 INFO mapred.JobInProgress: Input size for job job_20111122213733305_0005 = 79. Number of splits = 1 11/11/22 21:39:05 INFO mapred.JobInProgress: tip:task_20111122213733305_0005_m_000000 has split on node:/default-rack/localhost 11/11/22 21:39:05 INFO mapred.JobInProgress: Job job_20111122213733305_0005 initialized successfully with 1 map tasks and 0 reduce tasks. 11/11/22 21:39:05 INFO mapred.JobClient: map 0% reduce 0% 11/11/22 21:39:05 INFO mapred.JvmManager: JVM : jvm_20111122213733305_0004_m_-931685464 exited with exit code 0. 
Number of tasks it ran: 1 11/11/22 21:39:06 INFO mapred.JobTracker: Adding task (JOB_SETUP) 'attempt_20111122213733305_0005_m_000002_0' to tip task_20111122213733305_0005_m_000002, for tracker 'tracker_host0.foo.com:localhost/127.0.0.1:51131' 11/11/22 21:39:06 INFO mapred.TaskTracker: LaunchTaskAction (registerTask): attempt_20111122213733305_0005_m_000002_0 task's state:UNASSIGNED 11/11/22 21:39:06 INFO mapred.TaskTracker: Trying to launch : attempt_20111122213733305_0005_m_000002_0 which needs 1 slots 11/11/22 21:39:06 INFO mapred.TaskTracker: In TaskLauncher, current free slots : 2 and trying to launch attempt_20111122213733305_0005_m_000002_0 which needs 1 slots 11/11/22 21:39:06 INFO tasktracker.Localizer: User-directories for the user hortonas are already initialized on this TT. Not doing anything. 11/11/22 21:39:06 INFO mapred.JvmManager: In JvmRunner constructed JVM ID: jvm_20111122213733305_0005_m_1671975024 11/11/22 21:39:06 INFO mapred.JvmManager: JVM Runner jvm_20111122213733305_0005_m_1671975024 spawned. 11/11/22 21:39:06 INFO mapred.TaskTracker: JVM with ID: jvm_20111122213733305_0005_m_1671975024 given task: attempt_20111122213733305_0005_m_000002_0 11/11/22 21:39:07 INFO mapred.TaskTracker: attempt_20111122213733305_0005_m_000002_0 0.0% setup 11/11/22 21:39:07 INFO mapred.TaskTracker: Task attempt_20111122213733305_0005_m_000002_0 is done. 11/11/22 21:39:07 INFO mapred.TaskTracker: reported output size for attempt_20111122213733305_0005_m_000002_0 was -1 11/11/22 21:39:07 INFO mapred.TaskTracker: addFreeSlot : current free slots : 2 11/11/22 21:39:07 WARN util.ProcessTree: Error executing shell command org.apache.hadoop.util.Shell$ExitCodeException: kill -12609: No such process 11/11/22 21:39:07 INFO util.ProcessTree: Killing all processes in the process group 12609 with SIGTERM. Exit code 1 11/11/22 21:39:09 INFO mapred.JobInProgress: Task 'attempt_20111122213733305_0005_m_000002_0' has completed task_20111122213733305_0005_m_000002 successfully. 11/11/22 21:39:09 INFO mapred.JobTracker: Adding task (MAP) 'attempt_20111122213733305_0005_m_000000_0' to tip task_20111122213733305_0005_m_000000, for tracker 'tracker_host0.foo.com:localhost/127.0.0.1:51131' 11/11/22 21:39:09 INFO mapred.JobInProgress: Choosing rack-local task task_20111122213733305_0005_m_000000 11/11/22 21:39:09 INFO mapred.TaskTracker: LaunchTaskAction (registerTask): attempt_20111122213733305_0005_m_000000_0 task's state:UNASSIGNED 11/11/22 21:39:09 INFO mapred.TaskTracker: Trying to launch : attempt_20111122213733305_0005_m_000000_0 which needs 1 slots 11/11/22 21:39:09 INFO mapred.TaskTracker: Received KillTaskAction for task: attempt_20111122213733305_0005_m_000002_0 11/11/22 21:39:09 INFO mapred.TaskTracker: In TaskLauncher, current free slots : 2 and trying to launch attempt_20111122213733305_0005_m_000000_0 which needs 1 slots 11/11/22 21:39:09 INFO mapred.TaskTracker: About to purge task: attempt_20111122213733305_0005_m_000002_0 11/11/22 21:39:09 INFO tasktracker.Localizer: User-directories for the user hortonas are already initialized on this TT. Not doing anything. 11/11/22 21:39:09 INFO mapred.TaskRunner: attempt_20111122213733305_0005_m_000002_0 done; removing files. 11/11/22 21:39:09 INFO mapred.IndexCache: Map ID attempt_20111122213733305_0005_m_000002_0 not found in cache 11/11/22 21:39:09 INFO mapred.JvmManager: In JvmRunner constructed JVM ID: jvm_20111122213733305_0005_m_-1162386445 11/11/22 21:39:09 INFO mapred.JvmManager: JVM Runner jvm_20111122213733305_0005_m_-1162386445 spawned. 
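One reading of the ERROR metadata.Hive stack trace a little earlier (NoSuchObjectException raised under CreateTableHook.postAnalyze): the hook looks the new table up to decide whether it already exists, the lookup throws because the table does not exist yet, and the miss is logged with a full trace even though semantic analysis then completes and the CREATE TABLE finishes with OK. A hypothetical sketch of that probe pattern, using stand-in types rather than the real Hive metastore classes:

    // Stand-ins only -- not the actual Hive/HCatalog API.
    class NoSuchObjectException extends Exception {}
    class TableRef {}

    interface MetastoreLookup {
        TableRef getTable(String db, String tbl) throws NoSuchObjectException;
    }

    class ExistenceProbe {
        // A lookup that throws on a miss doubles as an "exists?" check; the
        // miss may still be logged at ERROR with a stack trace even though the
        // caller treats it as the normal "not found, safe to create" outcome.
        static boolean tableExists(MetastoreLookup client, String db, String tbl) {
            try {
                client.getTable(db, tbl);
                return true;
            } catch (NoSuchObjectException expected) {
                return false;
            }
        }
    }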
11/11/22 21:39:09 INFO mapred.TaskTracker: JVM with ID: jvm_20111122213733305_0005_m_-1162386445 given task: attempt_20111122213733305_0005_m_000000_0 11/11/22 21:39:09 INFO mapred.JobClient: Task Id : attempt_20111122213733305_0005_m_000002_0, Status : SUCCEEDED attempt_20111122213733305_0005_m_000002_0: SLF4J: Class path contains multiple SLF4J bindings. attempt_20111122213733305_0005_m_000002_0: SLF4J: Found binding in [jar:file:/homes/hortonas/hcat/hcat-trunk/hive/external/build/ivy/lib/default/slf4j-log4j12-1.6.1.jar!/org/slf4j/impl/StaticLoggerBinder.class] attempt_20111122213733305_0005_m_000002_0: SLF4J: Found binding in [jar:file:/homes/hortonas/hcat/hcat-trunk/hive/external/build/hadoopcore/hadoop-0.20.3-CDH3-SNAPSHOT/lib/slf4j-log4j12-1.4.3.jar!/org/slf4j/impl/StaticLoggerBinder.class] attempt_20111122213733305_0005_m_000002_0: SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation. 11/11/22 21:39:10 INFO mapred.TaskTracker: Task attempt_20111122213733305_0005_m_000000_0 is in commit-pending, task state:COMMIT_PENDING 11/11/22 21:39:10 INFO mapred.TaskTracker: attempt_20111122213733305_0005_m_000000_0 0.0% 11/11/22 21:39:12 INFO mapred.TaskTracker: Received commit task action for attempt_20111122213733305_0005_m_000000_0 11/11/22 21:39:12 INFO mapred.JvmManager: JVM : jvm_20111122213733305_0005_m_1671975024 exited with exit code 0. Number of tasks it ran: 1 11/11/22 21:39:12 INFO mapred.TaskTracker: attempt_20111122213733305_0005_m_000000_0 1.0% 11/11/22 21:39:12 INFO mapred.TaskTracker: Task attempt_20111122213733305_0005_m_000000_0 is done. 11/11/22 21:39:12 INFO mapred.TaskTracker: reported output size for attempt_20111122213733305_0005_m_000000_0 was -1 11/11/22 21:39:12 INFO mapred.TaskTracker: addFreeSlot : current free slots : 2 11/11/22 21:39:12 WARN util.ProcessTree: Error executing shell command org.apache.hadoop.util.Shell$ExitCodeException: kill -12649: No such process 11/11/22 21:39:12 INFO util.ProcessTree: Killing all processes in the process group 12649 with SIGTERM. Exit code 1 11/11/22 21:39:15 INFO mapred.JobInProgress: Task 'attempt_20111122213733305_0005_m_000000_0' has completed task_20111122213733305_0005_m_000000 successfully. 11/11/22 21:39:15 INFO mapred.JobTracker: Adding task (JOB_CLEANUP) 'attempt_20111122213733305_0005_m_000001_0' to tip task_20111122213733305_0005_m_000001, for tracker 'tracker_host0.foo.com:localhost/127.0.0.1:51131' 11/11/22 21:39:15 INFO mapred.TaskTracker: LaunchTaskAction (registerTask): attempt_20111122213733305_0005_m_000001_0 task's state:UNASSIGNED 11/11/22 21:39:15 INFO mapred.TaskTracker: Trying to launch : attempt_20111122213733305_0005_m_000001_0 which needs 1 slots 11/11/22 21:39:15 INFO mapred.TaskTracker: In TaskLauncher, current free slots : 2 and trying to launch attempt_20111122213733305_0005_m_000001_0 which needs 1 slots 11/11/22 21:39:15 INFO tasktracker.Localizer: User-directories for the user hortonas are already initialized on this TT. Not doing anything. 11/11/22 21:39:15 INFO mapred.JvmManager: In JvmRunner constructed JVM ID: jvm_20111122213733305_0005_m_-715593818 11/11/22 21:39:15 INFO mapred.JvmManager: JVM Runner jvm_20111122213733305_0005_m_-715593818 spawned. 
11/11/22 21:39:15 INFO mapred.TaskTracker: JVM with ID: jvm_20111122213733305_0005_m_-715593818 given task: attempt_20111122213733305_0005_m_000001_0 11/11/22 21:39:15 INFO mapred.JobClient: Task Id : attempt_20111122213733305_0005_m_000000_0, Status : SUCCEEDED attempt_20111122213733305_0005_m_000000_0: SLF4J: Class path contains multiple SLF4J bindings. attempt_20111122213733305_0005_m_000000_0: SLF4J: Found binding in [jar:file:/homes/hortonas/hcat/hcat-trunk/hive/external/build/ivy/lib/default/slf4j-log4j12-1.6.1.jar!/org/slf4j/impl/StaticLoggerBinder.class] attempt_20111122213733305_0005_m_000000_0: SLF4J: Found binding in [jar:file:/homes/hortonas/hcat/hcat-trunk/hive/external/build/hadoopcore/hadoop-0.20.3-CDH3-SNAPSHOT/lib/slf4j-log4j12-1.4.3.jar!/org/slf4j/impl/StaticLoggerBinder.class] attempt_20111122213733305_0005_m_000000_0: SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation. 11/11/22 21:39:16 INFO mapred.TaskTracker: attempt_20111122213733305_0005_m_000001_0 0.0% 11/11/22 21:39:16 INFO server.NIOServerCnxn: Accepted socket connection from /127.0.0.1:48618 11/11/22 21:39:16 INFO server.NIOServerCnxn: Client attempting to establish new session at /127.0.0.1:48618 11/11/22 21:39:16 INFO server.NIOServerCnxn: Established session 0x133cd36705c0019 with negotiated timeout 40000 for client /127.0.0.1:48618 11/11/22 21:39:16 INFO mapred.JobClient: map 100% reduce 0% 11/11/22 21:39:17 INFO mapred.JvmManager: JVM : jvm_20111122213733305_0005_m_-1162386445 exited with exit code 0. Number of tasks it ran: 1 11/11/22 21:39:18 INFO server.NIOServerCnxn: Accepted socket connection from /127.0.0.1:48620 11/11/22 21:39:18 INFO server.NIOServerCnxn: Client attempting to establish new session at /127.0.0.1:48620 11/11/22 21:39:18 INFO server.NIOServerCnxn: Established session 0x133cd36705c001a with negotiated timeout 40000 for client /127.0.0.1:48620 11/11/22 21:39:18 INFO regionserver.Store: Validating hfile at file:/homes/hortonas/hcat/hcat-trunk/storage-drivers/hbase/test_default_3510113083437113091/warehouse/hbasebulkoutputstoragedrivertestwithdefaultdb_6545935884595718003/REVISION_1321997944747_hfiles/my_family/2990918927448993830 for inclusion in store my_family region hbasebulkoutputstoragedrivertestwithdefaultdb_6545935884595718003,,1321997943392.2d8956c1cb44ebb55f255ad7835bbeb1. 11/11/22 21:39:18 INFO regionserver.Store: Renaming bulk load file file:/homes/hortonas/hcat/hcat-trunk/storage-drivers/hbase/test_default_3510113083437113091/warehouse/hbasebulkoutputstoragedrivertestwithdefaultdb_6545935884595718003/REVISION_1321997944747_hfiles/my_family/2990918927448993830 to file:/homes/hortonas/hcat/hcat-trunk/storage-drivers/hbase/test_default_3510113083437113091/hbase/hbasebulkoutputstoragedrivertestwithdefaultdb_6545935884595718003/2d8956c1cb44ebb55f255ad7835bbeb1/my_family/8365365183598767497 11/11/22 21:39:18 INFO regionserver.Store: Moved hfile file:/homes/hortonas/hcat/hcat-trunk/storage-drivers/hbase/test_default_3510113083437113091/warehouse/hbasebulkoutputstoragedrivertestwithdefaultdb_6545935884595718003/REVISION_1321997944747_hfiles/my_family/2990918927448993830 into store directory file:/homes/hortonas/hcat/hcat-trunk/storage-drivers/hbase/test_default_3510113083437113091/hbase/hbasebulkoutputstoragedrivertestwithdefaultdb_6545935884595718003/2d8956c1cb44ebb55f255ad7835bbeb1/my_family - updating store file list. 
11/11/22 21:39:18 INFO regionserver.Store: Successfully loaded store file file:/homes/hortonas/hcat/hcat-trunk/storage-drivers/hbase/test_default_3510113083437113091/warehouse/hbasebulkoutputstoragedrivertestwithdefaultdb_6545935884595718003/REVISION_1321997944747_hfiles/my_family/2990918927448993830 into store my_family (new location: file:/homes/hortonas/hcat/hcat-trunk/storage-drivers/hbase/test_default_3510113083437113091/hbase/hbasebulkoutputstoragedrivertestwithdefaultdb_6545935884595718003/2d8956c1cb44ebb55f255ad7835bbeb1/my_family/8365365183598767497) 11/11/22 21:39:18 INFO mapred.TaskTracker: attempt_20111122213733305_0005_m_000001_0 0.0% cleanup 11/11/22 21:39:18 INFO mapred.TaskTracker: Task attempt_20111122213733305_0005_m_000001_0 is done. 11/11/22 21:39:18 INFO mapred.TaskTracker: reported output size for attempt_20111122213733305_0005_m_000001_0 was -1 11/11/22 21:39:18 INFO mapred.TaskTracker: addFreeSlot : current free slots : 2 11/11/22 21:39:18 WARN server.NIOServerCnxn: EndOfStreamException: Unable to read additional data from client sessionid 0x133cd36705c0019, likely client has closed socket 11/11/22 21:39:18 INFO server.NIOServerCnxn: Closed socket connection for client /127.0.0.1:48618 which had sessionid 0x133cd36705c0019 11/11/22 21:39:18 WARN server.NIOServerCnxn: EndOfStreamException: Unable to read additional data from client sessionid 0x133cd36705c001a, likely client has closed socket 11/11/22 21:39:18 INFO server.NIOServerCnxn: Closed socket connection for client /127.0.0.1:48620 which had sessionid 0x133cd36705c001a 11/11/22 21:39:18 WARN util.ProcessTree: Error executing shell command org.apache.hadoop.util.Shell$ExitCodeException: kill -12690: No such process 11/11/22 21:39:18 INFO util.ProcessTree: Killing all processes in the process group 12690 with SIGTERM. Exit code 1 11/11/22 21:39:21 INFO mapred.JobInProgress: Task 'attempt_20111122213733305_0005_m_000001_0' has completed task_20111122213733305_0005_m_000001 successfully. 11/11/22 21:39:21 INFO mapred.JobInProgress: Job job_20111122213733305_0005 has completed successfully. 11/11/22 21:39:21 INFO mapred.JobInProgress$JobSummary: jobId=job_20111122213733305_0005,submitTime=1321997944764,launchTime=1321997945003,finishTime=1321997961065,numMaps=1,numSlotsPerMap=1,numReduces=0,numSlotsPerReduce=1,user=hortonas,queue=default,status=SUCCEEDED,mapSlotSeconds=7,reduceSlotsSeconds=0,clusterMapCapacity=2,clusterReduceCapacity=2 11/11/22 21:39:21 INFO mapred.JobHistory: Moving file:/homes/hortonas/hcat/hcat-trunk/storage-drivers/hbase/test_default_3510113083437113091/logs/history/localhost_1321997853671_job_20111122213733305_0005_hortonas_hbaseBulkOutputStorageDriverTestWithDefaultDB to file:/homes/hortonas/hcat/hcat-trunk/storage-drivers/hbase/test_default_3510113083437113091/logs/history/done 11/11/22 21:39:21 INFO mapred.JobTracker: Removing task 'attempt_20111122213733305_0005_m_000000_0' 11/11/22 21:39:21 INFO mapred.JobTracker: Removing task 'attempt_20111122213733305_0005_m_000001_0' 11/11/22 21:39:21 INFO mapred.JobTracker: Removing task 'attempt_20111122213733305_0005_m_000002_0' 11/11/22 21:39:21 INFO mapred.TaskTracker: Received 'KillJobAction' for job: job_20111122213733305_0005 11/11/22 21:39:21 INFO mapred.TaskRunner: attempt_20111122213733305_0005_m_000001_0 done; removing files. 11/11/22 21:39:21 INFO mapred.IndexCache: Map ID attempt_20111122213733305_0005_m_000001_0 not found in cache 11/11/22 21:39:21 INFO mapred.TaskRunner: attempt_20111122213733305_0005_m_000000_0 done; removing files. 
11/11/22 21:39:21 INFO mapred.IndexCache: Map ID attempt_20111122213733305_0005_m_000000_0 not found in cache
11/11/22 21:39:21 INFO mapred.UserLogCleaner: Adding job_20111122213733305_0005 for user-log deletion with retainTimeStamp:1322084361093
11/11/22 21:39:21 INFO mapred.JobHistory: Moving file:/homes/hortonas/hcat/hcat-trunk/storage-drivers/hbase/test_default_3510113083437113091/logs/history/localhost_1321997853671_job_20111122213733305_0005_conf.xml to file:/homes/hortonas/hcat/hcat-trunk/storage-drivers/hbase/test_default_3510113083437113091/logs/history/done
11/11/22 21:39:21 INFO mapred.JobClient: Task Id : attempt_20111122213733305_0005_m_000001_0, Status : SUCCEEDED
attempt_20111122213733305_0005_m_000001_0: SLF4J: Class path contains multiple SLF4J bindings.
attempt_20111122213733305_0005_m_000001_0: SLF4J: Found binding in [jar:file:/homes/hortonas/hcat/hcat-trunk/hive/external/build/ivy/lib/default/slf4j-log4j12-1.6.1.jar!/org/slf4j/impl/StaticLoggerBinder.class]
attempt_20111122213733305_0005_m_000001_0: SLF4J: Found binding in [jar:file:/homes/hortonas/hcat/hcat-trunk/hive/external/build/hadoopcore/hadoop-0.20.3-CDH3-SNAPSHOT/lib/slf4j-log4j12-1.4.3.jar!/org/slf4j/impl/StaticLoggerBinder.class]
attempt_20111122213733305_0005_m_000001_0: SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
11/11/22 21:39:21 INFO mapred.JobClient: Job complete: job_20111122213733305_0005
11/11/22 21:39:21 INFO mapred.JobClient: Counters: 12
11/11/22 21:39:21 INFO mapred.JobClient:   Job Counters
11/11/22 21:39:21 INFO mapred.JobClient:     SLOTS_MILLIS_MAPS=7525
11/11/22 21:39:21 INFO mapred.JobClient:     Total time spent by all reduces waiting after reserving slots (ms)=0
11/11/22 21:39:21 INFO mapred.JobClient:     Total time spent by all maps waiting after reserving slots (ms)=0
11/11/22 21:39:21 INFO mapred.JobClient:     Rack-local map tasks=1
11/11/22 21:39:21 INFO mapred.JobClient:     Launched map tasks=1
11/11/22 21:39:21 INFO mapred.JobClient:     SLOTS_MILLIS_REDUCES=0
11/11/22 21:39:21 INFO mapred.JobClient:   FileSystemCounters
11/11/22 21:39:21 INFO mapred.JobClient:     FILE_BYTES_READ=336
11/11/22 21:39:21 INFO mapred.JobClient:     FILE_BYTES_WRITTEN=554
11/11/22 21:39:21 INFO mapred.JobClient:   Map-Reduce Framework
11/11/22 21:39:21 INFO mapred.JobClient:     Map input records=3
11/11/22 21:39:21 INFO mapred.JobClient:     Spilled Records=0
11/11/22 21:39:21 INFO mapred.JobClient:     Map output records=3
11/11/22 21:39:21 INFO mapred.JobClient:     SPLIT_RAW_BYTES=226
11/11/22 21:39:21 INFO zookeeper.ZooKeeper: Initiating client connection, connectString=127.0.0.1:44589 sessionTimeout=180000 watcher=hconnection
11/11/22 21:39:21 INFO zookeeper.ClientCnxn: Opening socket connection to server /127.0.0.1:44589
11/11/22 21:39:21 INFO zookeeper.ClientCnxn: Socket connection established to localhost/127.0.0.1:44589, initiating session
11/11/22 21:39:21 INFO server.NIOServerCnxn: Accepted socket connection from /127.0.0.1:48622
11/11/22 21:39:21 INFO server.NIOServerCnxn: Client attempting to establish new session at /127.0.0.1:48622
11/11/22 21:39:21 INFO server.NIOServerCnxn: Established session 0x133cd36705c001b with negotiated timeout 40000 for client /127.0.0.1:48622
11/11/22 21:39:21 INFO zookeeper.ClientCnxn: Session establishment complete on server localhost/127.0.0.1:44589, sessionid = 0x133cd36705c001b, negotiated timeout = 40000
11/11/22 21:39:21 INFO client.HConnectionManager$HConnectionImplementation: Closed zookeeper sessionid=0x133cd36705c0001
11/11/22 21:39:21 INFO
server.PrepRequestProcessor: Processed session termination for sessionid: 0x133cd36705c0001 11/11/22 21:39:21 INFO zookeeper.ZooKeeper: Session: 0x133cd36705c0001 closed 11/11/22 21:39:21 INFO client.HConnectionManager$HConnectionImplementation: Closed zookeeper sessionid=0x133cd36705c0004 11/11/22 21:39:21 INFO server.NIOServerCnxn: Closed socket connection for client /127.0.0.1:48539 which had sessionid 0x133cd36705c0001 11/11/22 21:39:21 INFO server.PrepRequestProcessor: Processed session termination for sessionid: 0x133cd36705c0004 11/11/22 21:39:21 INFO zookeeper.ZooKeeper: Session: 0x133cd36705c0004 closed 11/11/22 21:39:21 INFO client.HConnectionManager$HConnectionImplementation: Closed zookeeper sessionid=0x133cd36705c0005 11/11/22 21:39:21 INFO server.NIOServerCnxn: Closed socket connection for client /127.0.0.1:48543 which had sessionid 0x133cd36705c0004 11/11/22 21:39:21 INFO server.PrepRequestProcessor: Processed session termination for sessionid: 0x133cd36705c0005 11/11/22 21:39:21 INFO zookeeper.ZooKeeper: Session: 0x133cd36705c0005 closed 11/11/22 21:39:21 INFO client.HConnectionManager$HConnectionImplementation: Closed zookeeper sessionid=0x133cd36705c0008 11/11/22 21:39:21 INFO server.NIOServerCnxn: Closed socket connection for client /127.0.0.1:48546 which had sessionid 0x133cd36705c0005 11/11/22 21:39:21 INFO server.PrepRequestProcessor: Processed session termination for sessionid: 0x133cd36705c0008 11/11/22 21:39:21 INFO zookeeper.ZooKeeper: Session: 0x133cd36705c0008 closed 11/11/22 21:39:21 INFO client.HConnectionManager$HConnectionImplementation: Closed zookeeper sessionid=0x133cd36705c0009 11/11/22 21:39:21 INFO server.NIOServerCnxn: Closed socket connection for client /127.0.0.1:48560 which had sessionid 0x133cd36705c0008 11/11/22 21:39:21 INFO server.PrepRequestProcessor: Processed session termination for sessionid: 0x133cd36705c0009 11/11/22 21:39:21 INFO zookeeper.ZooKeeper: Session: 0x133cd36705c0009 closed 11/11/22 21:39:21 INFO client.HConnectionManager$HConnectionImplementation: Closed zookeeper sessionid=0x133cd36705c000a 11/11/22 21:39:21 INFO server.NIOServerCnxn: Closed socket connection for client /127.0.0.1:48562 which had sessionid 0x133cd36705c0009 11/11/22 21:39:21 INFO server.PrepRequestProcessor: Processed session termination for sessionid: 0x133cd36705c000a 11/11/22 21:39:22 INFO zookeeper.ZooKeeper: Session: 0x133cd36705c000a closed 11/11/22 21:39:22 INFO client.HConnectionManager$HConnectionImplementation: Closed zookeeper sessionid=0x133cd36705c000b 11/11/22 21:39:22 INFO server.NIOServerCnxn: Closed socket connection for client /127.0.0.1:48571 which had sessionid 0x133cd36705c000a 11/11/22 21:39:22 INFO server.PrepRequestProcessor: Processed session termination for sessionid: 0x133cd36705c000b 11/11/22 21:39:22 INFO server.ZooKeeperServer: Expiring session 0x133cd36705c000f, timeout of 40000ms exceeded 11/11/22 21:39:22 INFO server.PrepRequestProcessor: Processed session termination for sessionid: 0x133cd36705c000f 11/11/22 21:39:22 INFO zookeeper.ZooKeeper: Session: 0x133cd36705c000b closed 11/11/22 21:39:22 INFO client.HConnectionManager$HConnectionImplementation: Closed zookeeper sessionid=0x133cd36705c000c 11/11/22 21:39:22 INFO server.NIOServerCnxn: Closed socket connection for client /127.0.0.1:48573 which had sessionid 0x133cd36705c000b 11/11/22 21:39:22 INFO server.PrepRequestProcessor: Processed session termination for sessionid: 0x133cd36705c000c 11/11/22 21:39:22 INFO zookeeper.ZooKeeper: Session: 0x133cd36705c000c closed 11/11/22 
21:39:22 INFO client.HConnectionManager$HConnectionImplementation: Closed zookeeper sessionid=0x133cd36705c000d 11/11/22 21:39:22 INFO server.NIOServerCnxn: Closed socket connection for client /127.0.0.1:48574 which had sessionid 0x133cd36705c000c 11/11/22 21:39:22 INFO server.PrepRequestProcessor: Processed session termination for sessionid: 0x133cd36705c000d 11/11/22 21:39:22 INFO zookeeper.ZooKeeper: Session: 0x133cd36705c000d closed 11/11/22 21:39:22 INFO client.HConnectionManager$HConnectionImplementation: Closed zookeeper sessionid=0x133cd36705c000e 11/11/22 21:39:22 INFO server.NIOServerCnxn: Closed socket connection for client /127.0.0.1:48575 which had sessionid 0x133cd36705c000d 11/11/22 21:39:22 INFO server.PrepRequestProcessor: Processed session termination for sessionid: 0x133cd36705c000e 11/11/22 21:39:22 INFO zookeeper.ZooKeeper: Session: 0x133cd36705c000e closed 11/11/22 21:39:22 INFO client.HConnectionManager$HConnectionImplementation: Closed zookeeper sessionid=0x133cd36705c0011 11/11/22 21:39:22 INFO server.NIOServerCnxn: Closed socket connection for client /127.0.0.1:48576 which had sessionid 0x133cd36705c000e 11/11/22 21:39:22 INFO server.PrepRequestProcessor: Processed session termination for sessionid: 0x133cd36705c0011 11/11/22 21:39:22 INFO zookeeper.ZooKeeper: Session: 0x133cd36705c0011 closed 11/11/22 21:39:22 INFO client.HConnectionManager$HConnectionImplementation: Closed zookeeper sessionid=0x133cd36705c0012 11/11/22 21:39:22 INFO server.NIOServerCnxn: Closed socket connection for client /127.0.0.1:48590 which had sessionid 0x133cd36705c0011 11/11/22 21:39:22 INFO server.PrepRequestProcessor: Processed session termination for sessionid: 0x133cd36705c0012 11/11/22 21:39:22 INFO zookeeper.ZooKeeper: Session: 0x133cd36705c0012 closed 11/11/22 21:39:22 INFO client.HConnectionManager$HConnectionImplementation: Closed zookeeper sessionid=0x133cd36705c0013 11/11/22 21:39:22 INFO server.NIOServerCnxn: Closed socket connection for client /127.0.0.1:48592 which had sessionid 0x133cd36705c0012 11/11/22 21:39:22 INFO server.PrepRequestProcessor: Processed session termination for sessionid: 0x133cd36705c0013 11/11/22 21:39:22 INFO zookeeper.ZooKeeper: Session: 0x133cd36705c0013 closed 11/11/22 21:39:22 INFO client.HConnectionManager$HConnectionImplementation: Closed zookeeper sessionid=0x133cd36705c0016 11/11/22 21:39:22 INFO server.NIOServerCnxn: Closed socket connection for client /127.0.0.1:48593 which had sessionid 0x133cd36705c0013 11/11/22 21:39:22 INFO server.PrepRequestProcessor: Processed session termination for sessionid: 0x133cd36705c0016 11/11/22 21:39:22 INFO zookeeper.ZooKeeper: Session: 0x133cd36705c0016 closed 11/11/22 21:39:22 INFO client.HConnectionManager$HConnectionImplementation: Closed zookeeper sessionid=0x133cd36705c0003 11/11/22 21:39:22 INFO server.NIOServerCnxn: Closed socket connection for client /127.0.0.1:48606 which had sessionid 0x133cd36705c0016 11/11/22 21:39:22 INFO server.PrepRequestProcessor: Processed session termination for sessionid: 0x133cd36705c0003 11/11/22 21:39:22 INFO zookeeper.ZooKeeper: Session: 0x133cd36705c0003 closed 11/11/22 21:39:22 INFO client.HConnectionManager$HConnectionImplementation: Closed zookeeper sessionid=0x133cd36705c0017 11/11/22 21:39:22 WARN server.NIOServerCnxn: EndOfStreamException: Unable to read additional data from client sessionid 0x133cd36705c0003, likely client has closed socket 11/11/22 21:39:22 INFO server.NIOServerCnxn: Closed socket connection for client /127.0.0.1:48541 which had sessionid 
0x133cd36705c0003 11/11/22 21:39:22 INFO server.PrepRequestProcessor: Processed session termination for sessionid: 0x133cd36705c0017 11/11/22 21:39:22 INFO zookeeper.ZooKeeper: Session: 0x133cd36705c0017 closed 11/11/22 21:39:22 INFO client.HConnectionManager$HConnectionImplementation: Closed zookeeper sessionid=0x133cd36705c0018 11/11/22 21:39:22 INFO server.NIOServerCnxn: Closed socket connection for client /127.0.0.1:48608 which had sessionid 0x133cd36705c0017 11/11/22 21:39:22 INFO server.PrepRequestProcessor: Processed session termination for sessionid: 0x133cd36705c0018 11/11/22 21:39:22 INFO zookeeper.ZooKeeper: Session: 0x133cd36705c0018 closed 11/11/22 21:39:22 INFO client.HConnectionManager$HConnectionImplementation: Closed zookeeper sessionid=0x133cd36705c001b 11/11/22 21:39:22 INFO server.NIOServerCnxn: Closed socket connection for client /127.0.0.1:48609 which had sessionid 0x133cd36705c0018 11/11/22 21:39:22 INFO server.PrepRequestProcessor: Processed session termination for sessionid: 0x133cd36705c001b 11/11/22 21:39:22 INFO zookeeper.ZooKeeper: Session: 0x133cd36705c001b closed 11/11/22 21:39:22 INFO master.HMaster: Cluster shutdown requested 11/11/22 21:39:22 INFO server.NIOServerCnxn: Closed socket connection for client /127.0.0.1:48622 which had sessionid 0x133cd36705c001b 11/11/22 21:39:22 INFO master.ServerManager: Waiting on regionserver(s) to go down hrt7n35.cc1.ygridcore.net,54808,1321997857252 11/11/22 21:39:22 INFO master.HMaster$1: hrt7n35.cc1.ygridcore.net:51911-BalancerChore exiting 11/11/22 21:39:22 INFO master.CatalogJanitor: hrt7n35.cc1.ygridcore.net:51911-CatalogJanitor exiting 11/11/22 21:39:23 INFO regionserver.HRegion: Closed hbasebulkoutputstoragedrivertest.hbasebulkoutputstoragedrivertest_885980685671171921,,1321997905575.7f155566e4f3b455c051b7d5022120be. 11/11/22 21:39:23 INFO regionserver.HRegion: Closed importsequencefiletest_6277872447333260599,,1321997886795.870cba3a31103028ca4d6c05066d8928. 11/11/22 21:39:23 INFO regionserver.HRegion: Closed hbasebulkoutputstoragedrivertestwithrevision.hbasebulkoutputstoragedrivertestwithrevision_2050865341583007314,,1321997925728.831faae27c3129e2cc4fca71efd176db. 11/11/22 21:39:23 INFO regionserver.HRegion: Closed hbasebulkoutputstoragedrivertestwithdefaultdb_6545935884595718003,,1321997943392.2d8956c1cb44ebb55f255ad7835bbeb1. 11/11/22 21:39:23 INFO regionserver.HRegion: Closed hbasebulkoutputformattest_74574805078408209,,1321997868042.18c77e39c9e413bede9dd808dcb69bae. 
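Each paired "Initiating client connection ..." / "Session: 0x... closed" run in this shutdown sequence is the ordinary ZooKeeper client lifecycle, played out as the HBase connections are torn down. A minimal sketch of that lifecycle for reference; the connect string, timeout, and watcher below are placeholders echoing the fields in the log lines, not values to reuse:

    import org.apache.zookeeper.WatchedEvent;
    import org.apache.zookeeper.Watcher;
    import org.apache.zookeeper.ZooKeeper;

    public class ZkLifecycleExample {
        public static void main(String[] args) throws Exception {
            Watcher watcher = new Watcher() {
                public void process(WatchedEvent event) { /* ignore events */ }
            };
            // Produces "Initiating client connection, connectString=...
            // sessionTimeout=... watcher=..." on the client side.
            ZooKeeper zk = new ZooKeeper("127.0.0.1:44589", 180000, watcher);
            try {
                // ... client work elided ...
            } finally {
                zk.close();   // produces the "Session: 0x... closed" line
            }
        }
    }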
11/11/22 21:39:23 INFO regionserver.HRegionServer: STOP_REGIONSERVER 11/11/22 21:39:23 INFO regionserver.HRegionServer: STOPPED: Received STOP_REGIONSERVER 11/11/22 21:39:23 INFO ipc.HBaseServer: Stopping server on 54808 11/11/22 21:39:23 INFO ipc.HBaseServer: IPC Server handler 0 on 54808: exiting 11/11/22 21:39:23 INFO ipc.HBaseServer: IPC Server handler 1 on 54808: exiting 11/11/22 21:39:23 INFO ipc.HBaseServer: PRI IPC Server handler 0 on 54808: exiting 11/11/22 21:39:23 INFO ipc.HBaseServer: IPC Server handler 2 on 54808: exiting 11/11/22 21:39:23 INFO ipc.HBaseServer: PRI IPC Server handler 2 on 54808: exiting 11/11/22 21:39:23 INFO ipc.HBaseServer: IPC Server handler 6 on 54808: exiting 11/11/22 21:39:23 INFO ipc.HBaseServer: IPC Server handler 4 on 54808: exiting 11/11/22 21:39:23 INFO ipc.HBaseServer: PRI IPC Server handler 3 on 54808: exiting 11/11/22 21:39:23 INFO ipc.HBaseServer: IPC Server handler 7 on 54808: exiting 11/11/22 21:39:23 INFO ipc.HBaseServer: IPC Server handler 8 on 54808: exiting 11/11/22 21:39:23 INFO ipc.HBaseServer: PRI IPC Server handler 4 on 54808: exiting 11/11/22 21:39:23 INFO ipc.HBaseServer: PRI IPC Server handler 5 on 54808: exiting 11/11/22 21:39:23 INFO ipc.HBaseServer: IPC Server handler 5 on 54808: exiting 11/11/22 21:39:23 INFO ipc.HBaseServer: PRI IPC Server handler 1 on 54808: exiting 11/11/22 21:39:23 INFO ipc.HBaseServer: PRI IPC Server handler 6 on 54808: exiting 11/11/22 21:39:23 INFO ipc.HBaseServer: IPC Server handler 3 on 54808: exiting 11/11/22 21:39:23 INFO ipc.HBaseServer: PRI IPC Server handler 7 on 54808: exiting 11/11/22 21:39:23 INFO ipc.HBaseServer: Stopping IPC Server listener on 54808 11/11/22 21:39:23 INFO ipc.HBaseServer: PRI IPC Server handler 8 on 54808: exiting 11/11/22 21:39:23 INFO ipc.HBaseServer: PRI IPC Server handler 9 on 54808: exiting 11/11/22 21:39:23 INFO ipc.HBaseServer: IPC Server handler 9 on 54808: exiting 11/11/22 21:39:23 INFO ipc.HBaseServer: Stopping IPC Server Responder 11/11/22 21:39:23 INFO regionserver.CompactSplitThread: RegionServer:0;hrt7n35.cc1.ygridcore.net,54808,1321997857252.compactor exiting 11/11/22 21:39:23 INFO regionserver.HRegionServer$MajorCompactionChecker: RegionServer:0;hrt7n35.cc1.ygridcore.net,54808,1321997857252.majorCompactionChecker exiting 11/11/22 21:39:23 INFO regionserver.MemStoreFlusher: RegionServer:0;hrt7n35.cc1.ygridcore.net,54808,1321997857252.cacheFlusher exiting 11/11/22 21:39:23 INFO regionserver.LogRoller: LogRoller exiting. 
11/11/22 21:39:23 INFO wal.HLog: RegionServer:0;hrt7n35.cc1.ygridcore.net,54808,1321997857252.logSyncer exiting 11/11/22 21:39:23 INFO regionserver.Store: Renaming flushed file at file:/homes/hortonas/hcat/hcat-trunk/storage-drivers/hbase/test_default_3510113083437113091/hbase/.META./1028785192/.tmp/2817286928333584636 to file:/homes/hortonas/hcat/hcat-trunk/storage-drivers/hbase/test_default_3510113083437113091/hbase/.META./1028785192/info/2889536291236041708 11/11/22 21:39:23 INFO regionserver.Store: Added file:/homes/hortonas/hcat/hcat-trunk/storage-drivers/hbase/test_default_3510113083437113091/hbase/.META./1028785192/info/2889536291236041708, entries=15, sequenceid=15, memsize=6.4k, filesize=5.2k 11/11/22 21:39:23 INFO regionserver.HRegion: Closed .META.,,1.1028785192 11/11/22 21:39:23 INFO regionserver.Store: Renaming flushed file at file:/homes/hortonas/hcat/hcat-trunk/storage-drivers/hbase/test_default_3510113083437113091/hbase/-ROOT-/70236052/.tmp/8319694740855543735 to file:/homes/hortonas/hcat/hcat-trunk/storage-drivers/hbase/test_default_3510113083437113091/hbase/-ROOT-/70236052/info/4392089267985055484 11/11/22 21:39:23 INFO regionserver.Store: Added file:/homes/hortonas/hcat/hcat-trunk/storage-drivers/hbase/test_default_3510113083437113091/hbase/-ROOT-/70236052/info/4392089267985055484, entries=2, sequenceid=16, memsize=368.0, filesize=518.0 11/11/22 21:39:23 INFO regionserver.HRegion: Closed -ROOT-,,0.70236052 11/11/22 21:39:23 INFO regionserver.HRegionServer: stopping server at: hrt7n35.cc1.ygridcore.net,54808,1321997857252 11/11/22 21:39:23 INFO regionserver.Leases: RegionServer:0;hrt7n35.cc1.ygridcore.net,54808,1321997857252 closing leases 11/11/22 21:39:23 INFO regionserver.Leases: RegionServer:0;hrt7n35.cc1.ygridcore.net,54808,1321997857252 closed leases 11/11/22 21:39:23 INFO server.PrepRequestProcessor: Processed session termination for sessionid: 0x133cd36705c0002 11/11/22 21:39:23 INFO zookeeper.RegionServerTracker: RegionServer ephemeral node deleted, processing expiration [hrt7n35.cc1.ygridcore.net,54808,1321997857252] 11/11/22 21:39:23 INFO zookeeper.ZooKeeper: Session: 0x133cd36705c0002 closed 11/11/22 21:39:23 INFO master.ServerManager: Cluster shutdown set; hrt7n35.cc1.ygridcore.net,54808,1321997857252 expired; onlineServers=0 11/11/22 21:39:23 INFO server.NIOServerCnxn: Closed socket connection for client /127.0.0.1:48540 which had sessionid 0x133cd36705c0002 11/11/22 21:39:23 INFO regionserver.HRegionServer: RegionServer:0;hrt7n35.cc1.ygridcore.net,54808,1321997857252 exiting 11/11/22 21:39:23 INFO master.HMaster: Cluster shutdown set; onlineServer=0 11/11/22 21:39:23 INFO hbase.MiniHBaseCluster: Hook closing fs=org.apache.hadoop.fs.LocalFileSystem@2a5ab9 11/11/22 21:39:23 INFO util.JVMClusterUtil: Shutdown of 1 master(s) and 1 regionserver(s) complete 11/11/22 21:39:23 INFO server.NIOServerCnxn: Closed socket connection for client /127.0.0.1:48538 which had sessionid 0x133cd36705c0000 11/11/22 21:39:23 INFO zookeeper.ClientCnxn: Unable to read additional data from server sessionid 0x133cd36705c0000, likely server has closed socket, closing socket connection and attempting reconnect 11/11/22 21:39:23 INFO server.NIOServerCnxn: NIOServerCnxn factory exited run method 11/11/22 21:39:23 INFO server.PrepRequestProcessor: PrepRequestProcessor exited loop! 11/11/22 21:39:23 INFO server.SyncRequestProcessor: SyncRequestProcessor exited! 
11/11/22 21:39:23 INFO server.FinalRequestProcessor: shutdown of request processor complete 11/11/22 21:39:23 INFO util.AsyncDiskService: Shutting down all AsyncDiskService threads... 11/11/22 21:39:23 INFO util.AsyncDiskService: All AsyncDiskService threads are terminated. 11/11/22 21:39:23 INFO util.MRAsyncDiskService: Deleting toBeDeleted directory. 11/11/22 21:39:23 INFO mapred.TaskTracker: Shutting down: Map-events fetcher for all reduce tasks on tracker_host0.foo.com:localhost/127.0.0.1:51131 11/11/22 21:39:23 INFO ipc.HBaseServer: Stopping server on 51911 11/11/22 21:39:23 INFO ipc.HBaseServer: IPC Server handler 0 on 51911: exiting 11/11/22 21:39:23 INFO ipc.HBaseServer: IPC Server handler 1 on 51911: exiting 11/11/22 21:39:23 INFO ipc.HBaseServer: IPC Server handler 6 on 51911: exiting 11/11/22 21:39:23 INFO ipc.HBaseServer: Stopping IPC Server listener on 51911 11/11/22 21:39:23 INFO master.LogCleaner: Master:0;hrt7n35.cc1.ygridcore.net:51911.oldLogCleaner exiting 11/11/22 21:39:23 INFO ipc.HBaseServer: IPC Server handler 2 on 51911: exiting 11/11/22 21:39:23 INFO ipc.HBaseServer: IPC Server handler 8 on 51911: exiting 11/11/22 21:39:23 INFO ipc.HBaseServer: Stopping IPC Server Responder 11/11/22 21:39:23 INFO ipc.HBaseServer: IPC Server handler 3 on 51911: exiting 11/11/22 21:39:23 INFO ipc.HBaseServer: IPC Server handler 9 on 51911: exiting 11/11/22 21:39:23 INFO ipc.HBaseServer: IPC Server handler 5 on 51911: exiting 11/11/22 21:39:23 INFO ipc.HBaseServer: IPC Server handler 7 on 51911: exiting 11/11/22 21:39:23 INFO ipc.HBaseServer: IPC Server handler 4 on 51911: exiting 11/11/22 21:39:23 INFO mapred.JvmManager: JVM : jvm_20111122213733305_0005_m_-715593818 exited with exit code 0. Number of tasks it ran: 1 11/11/22 21:39:23 INFO ipc.Server: Stopping server on 51131 11/11/22 21:39:23 INFO ipc.Server: IPC Server handler 0 on 51131: exiting 11/11/22 21:39:23 INFO ipc.Server: IPC Server handler 1 on 51131: exiting 11/11/22 21:39:23 INFO ipc.Server: IPC Server handler 2 on 51131: exiting 11/11/22 21:39:23 INFO ipc.Server: IPC Server handler 3 on 51131: exiting 11/11/22 21:39:23 INFO ipc.Server: Stopping IPC Server listener on 51131 11/11/22 21:39:23 INFO mapred.TaskTracker: Shutting down StatusHttpServer 11/11/22 21:39:23 INFO ipc.Server: Stopping IPC Server Responder 11/11/22 21:39:24 INFO server.SessionTrackerImpl: SessionTrackerImpl exited loop! 11/11/22 21:39:24 INFO mapred.TaskTracker: Interrupted. Closing down. 11/11/22 21:39:24 INFO util.AsyncDiskService: Shutting down all AsyncDiskService threads... 11/11/22 21:39:24 INFO util.AsyncDiskService: All AsyncDiskService threads are terminated. 11/11/22 21:39:24 INFO util.MRAsyncDiskService: Deleting toBeDeleted directory. 
11/11/22 21:39:24 INFO mapred.JobTracker: Stopping pluginDispatcher
11/11/22 21:39:24 INFO mapred.JobTracker: Stopping infoServer
11/11/22 21:39:24 INFO mapred.JobTracker: Stopping interTrackerServer
11/11/22 21:39:24 INFO ipc.Server: Stopping server on 43643
11/11/22 21:39:24 INFO ipc.Server: IPC Server handler 0 on 43643: exiting
11/11/22 21:39:24 INFO ipc.Server: Stopping IPC Server listener on 43643
11/11/22 21:39:24 INFO ipc.Server: IPC Server handler 1 on 43643: exiting
11/11/22 21:39:24 INFO ipc.Server: IPC Server handler 9 on 43643: exiting
11/11/22 21:39:24 INFO ipc.Server: IPC Server handler 2 on 43643: exiting
11/11/22 21:39:24 INFO ipc.Server: IPC Server handler 4 on 43643: exiting
11/11/22 21:39:24 INFO ipc.Server: IPC Server handler 6 on 43643: exiting
11/11/22 21:39:24 INFO ipc.Server: IPC Server handler 5 on 43643: exiting
11/11/22 21:39:24 INFO ipc.Server: IPC Server handler 8 on 43643: exiting
11/11/22 21:39:24 INFO ipc.Server: Stopping IPC Server Responder
11/11/22 21:39:24 INFO mapred.JobTracker: Stopped interTrackerServer
11/11/22 21:39:24 INFO mapred.JobTracker: Stopping expireTrackers
11/11/22 21:39:24 INFO ipc.Server: IPC Server handler 7 on 43643: exiting
11/11/22 21:39:24 INFO ipc.Server: IPC Server handler 3 on 43643: exiting
11/11/22 21:39:24 INFO mapred.JobTracker: Stopping retirer
11/11/22 21:39:24 INFO mapred.EagerTaskInitializationListener: Stopping Job Init Manager thread
11/11/22 21:39:24 INFO mapred.EagerTaskInitializationListener: JobInitManagerThread interrupted.
11/11/22 21:39:24 INFO mapred.EagerTaskInitializationListener: Shutting down thread pool
11/11/22 21:39:24 INFO mapred.JobTracker: Stopping expireLaunchingTasks
11/11/22 21:39:24 INFO mapred.JobTracker: stopped all jobtracker services
------------- ---------------- ---------------

Testcase: hbaseBulkOutputFormatTest took 18.838 sec
	FAILED
junit.framework.AssertionFailedError:
	at org.apache.hcatalog.hbase.TestHBaseBulkOutputStorageDriver.hbaseBulkOutputFormatTest(TestHBaseBulkOutputStorageDriver.java:173)

Testcase: importSequenceFileTest took 16.861 sec
Testcase: hbaseBulkOutputStorageDriverTest took 22.017 sec
Testcase: hbaseBulkOutputStorageDriverTestWithRevision took 17.669 sec
Testcase: hbaseBulkOutputStorageDriverTestWithDefaultDB took 18.635 sec
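Note on the failure above: the report shows only a bare junit.framework.AssertionFailedError with no message, which in JUnit 3 is the behavior of the message-less assertion overloads, e.g. assertTrue(condition) rather than assertTrue("reason", condition). The sketch below is a hypothetical reduction illustrating that failure shape under this assumption; it is not the actual code at TestHBaseBulkOutputStorageDriver.java:173, and the class and variable names are invented for illustration.

    import junit.framework.TestCase;

    // Hypothetical reduction, not the real test body: shows how a message-less
    // JUnit 3 assertion surfaces as a bare AssertionFailedError in the report.
    public class BareAssertionFailureExample extends TestCase {
        public void testBulkOutputShape() {
            boolean jobSucceeded = false; // stand-in for whatever condition the real test checks
            // assertTrue(boolean) attaches no message, so on failure the report
            // prints only "junit.framework.AssertionFailedError:" plus the
            // failing stack frame, exactly as seen above.
            assertTrue(jobSucceeded);
            // assertTrue("bulk output job did not produce expected rows", jobSucceeded)
            // would instead carry a diagnostic message into the report.
        }
    }

If this failure needs triage, rerunning with a message-bearing assertion (or logging the compared values before asserting) would make the report self-describing rather than a bare stack frame.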