Buildfile: build.xml

clean-test:

clean-test:

clean-test:

clean-test:

clean-test:

clean-test:

clean-test:

clean-test:

clean-test:

clean-test:

clean-test:

jar:

create-dirs:
    [mkdir] Created dir: /data/users/pyang/test/trunk/VENDOR.hive/trunk/build
    [mkdir] Created dir: /data/users/pyang/test/trunk/VENDOR.hive/trunk/build/shims
    [mkdir] Created dir: /data/users/pyang/test/trunk/VENDOR.hive/trunk/build/shims/classes
    [mkdir] Created dir: /data/users/pyang/test/trunk/VENDOR.hive/trunk/build/jexl/classes
    [mkdir] Created dir: /data/users/pyang/test/trunk/VENDOR.hive/trunk/build/hadoopcore
    [mkdir] Created dir: /data/users/pyang/test/trunk/VENDOR.hive/trunk/build/shims/test
    [mkdir] Created dir: /data/users/pyang/test/trunk/VENDOR.hive/trunk/build/shims/test/src
    [mkdir] Created dir: /data/users/pyang/test/trunk/VENDOR.hive/trunk/build/shims/test/classes

compile-ant-tasks:

create-dirs:
    [mkdir] Created dir: /data/users/pyang/test/trunk/VENDOR.hive/trunk/build/anttasks
    [mkdir] Created dir: /data/users/pyang/test/trunk/VENDOR.hive/trunk/build/anttasks/classes
    [mkdir] Created dir: /data/users/pyang/test/trunk/VENDOR.hive/trunk/build/anttasks/test
    [mkdir] Created dir: /data/users/pyang/test/trunk/VENDOR.hive/trunk/build/anttasks/test/src
    [mkdir] Created dir: /data/users/pyang/test/trunk/VENDOR.hive/trunk/build/anttasks/test/classes

init:

compile:
    [echo] Compiling: anttasks
    [javac] Compiling 2 source files to /data/users/pyang/test/trunk/VENDOR.hive/trunk/build/anttasks/classes
    [javac] Note: /data/users/pyang/test/trunk/VENDOR.hive/trunk/ant/src/org/apache/hadoop/hive/ant/QTestGenTask.java uses or overrides a deprecated API.
    [javac] Note: Recompile with -Xlint:deprecation for details.

deploy-ant-tasks:

create-dirs:

init:

compile:
    [echo] Compiling: anttasks

jar:
    [copy] Copying 1 file to /data/users/pyang/test/trunk/VENDOR.hive/trunk/build/anttasks/classes/org/apache/hadoop/hive/ant
    [jar] Building jar: /data/users/pyang/test/trunk/VENDOR.hive/trunk/build/anttasks/hive_anttasks.jar

init:

compile:

download-ivy:

init-ivy:

settings-ivy:

resolve:
    [ivy:retrieve] :: Ivy 2.0.0-rc2 - 20081028224207 :: http://ant.apache.org/ivy/ ::
:: loading settings :: file = /data/users/pyang/test/trunk/VENDOR.hive/trunk/ivy/ivysettings.xml
    [ivy:retrieve] :: resolving dependencies :: org.apache.hadoop.hive#shims;working@dev221.snc1.facebook.com
    [ivy:retrieve] confs: [default]
    [ivy:retrieve] found hadoop#core;0.17.2.1 in hadoop-resolver
    [ivy:retrieve] found hadoop#core;0.18.3 in hadoop-resolver
    [ivy:retrieve] found hadoop#core;0.19.0 in hadoop-resolver
    [ivy:retrieve] found hadoop#core;0.20.0 in hadoop-resolver
    [ivy:retrieve] :: resolution report :: resolve 186ms :: artifacts dl 14ms
        ---------------------------------------------------------------------
        |                  |            modules            ||   artifacts   |
        |       conf       | number| search|dwnlded|evicted|| number|dwnlded|
        ---------------------------------------------------------------------
        |      default     |   4   |   0   |   0   |   0   ||   4   |   0   |
        ---------------------------------------------------------------------
    [ivy:retrieve] :: retrieving :: org.apache.hadoop.hive#shims
    [ivy:retrieve] confs: [default]
    [ivy:retrieve] 4 artifacts copied, 0 already retrieved (127160kB/562ms)

install-hadoopcore-internal:
    [untar] Expanding: /data/users/pyang/test/trunk/VENDOR.hive/trunk/build/hadoopcore/hadoop-0.17.2.1.tar.gz into /data/users/pyang/test/trunk/VENDOR.hive/trunk/build/hadoopcore
    [touch] Creating /data/users/pyang/test/trunk/VENDOR.hive/trunk/build/hadoopcore/hadoop-0.17.2.1.installed

build_shims:
    [echo] Compiling shims against hadoop 0.17.2.1 (/data/users/pyang/test/trunk/VENDOR.hive/trunk/build/hadoopcore/hadoop-0.17.2.1)
    [javac] Compiling 5 source files to /data/users/pyang/test/trunk/VENDOR.hive/trunk/build/shims/classes

download-ivy:

init-ivy:

settings-ivy:

resolve:
:: loading settings :: file = /data/users/pyang/test/trunk/VENDOR.hive/trunk/ivy/ivysettings.xml
    [ivy:retrieve] :: resolving dependencies :: org.apache.hadoop.hive#shims;working@dev221.snc1.facebook.com
    [ivy:retrieve] confs: [default]
    [ivy:retrieve] found hadoop#core;0.17.2.1 in hadoop-resolver
    [ivy:retrieve] found hadoop#core;0.18.3 in hadoop-resolver
    [ivy:retrieve] found hadoop#core;0.19.0 in hadoop-resolver
    [ivy:retrieve] found hadoop#core;0.20.0 in hadoop-resolver
    [ivy:retrieve] :: resolution report :: resolve 98ms :: artifacts dl 51ms
        ---------------------------------------------------------------------
        |                  |            modules            ||   artifacts   |
        |       conf       | number| search|dwnlded|evicted|| number|dwnlded|
        ---------------------------------------------------------------------
        |      default     |   4   |   0   |   0   |   0   ||   4   |   0   |
        ---------------------------------------------------------------------
    [ivy:retrieve] :: retrieving :: org.apache.hadoop.hive#shims
    [ivy:retrieve] confs: [default]
    [ivy:retrieve] 0 artifacts copied, 4 already retrieved (0kB/5ms)

install-hadoopcore-internal:
    [untar] Expanding: /data/users/pyang/test/trunk/VENDOR.hive/trunk/build/hadoopcore/hadoop-0.18.3.tar.gz into /data/users/pyang/test/trunk/VENDOR.hive/trunk/build/hadoopcore
    [touch] Creating /data/users/pyang/test/trunk/VENDOR.hive/trunk/build/hadoopcore/hadoop-0.18.3.installed

build_shims:
    [echo] Compiling shims against hadoop 0.18.3 (/data/users/pyang/test/trunk/VENDOR.hive/trunk/build/hadoopcore/hadoop-0.18.3)
    [javac] Compiling 2 source files to /data/users/pyang/test/trunk/VENDOR.hive/trunk/build/shims/classes

download-ivy:

init-ivy:

settings-ivy:

resolve:
:: loading settings :: file = /data/users/pyang/test/trunk/VENDOR.hive/trunk/ivy/ivysettings.xml
    [ivy:retrieve] :: resolving dependencies :: org.apache.hadoop.hive#shims;working@dev221.snc1.facebook.com
    [ivy:retrieve] confs: [default]
    [ivy:retrieve] found hadoop#core;0.17.2.1 in hadoop-resolver
    [ivy:retrieve] found hadoop#core;0.18.3 in hadoop-resolver
    [ivy:retrieve] found hadoop#core;0.19.0 in hadoop-resolver
    [ivy:retrieve] found hadoop#core;0.20.0 in hadoop-resolver
    [ivy:retrieve] :: resolution report :: resolve 90ms :: artifacts dl 6ms
        ---------------------------------------------------------------------
        |                  |            modules            ||   artifacts   |
        |       conf       | number| search|dwnlded|evicted|| number|dwnlded|
        ---------------------------------------------------------------------
        |      default     |   4   |   0   |   0   |   0   ||   4   |   0   |
        ---------------------------------------------------------------------
    [ivy:retrieve] :: retrieving :: org.apache.hadoop.hive#shims
    [ivy:retrieve] confs: [default]
    [ivy:retrieve] 0 artifacts copied, 4 already retrieved (0kB/5ms)

install-hadoopcore-internal:
    [untar] Expanding: /data/users/pyang/test/trunk/VENDOR.hive/trunk/build/hadoopcore/hadoop-0.19.0.tar.gz into /data/users/pyang/test/trunk/VENDOR.hive/trunk/build/hadoopcore
    [touch] Creating /data/users/pyang/test/trunk/VENDOR.hive/trunk/build/hadoopcore/hadoop-0.19.0.installed

build_shims:
    [echo] Compiling shims against hadoop 0.19.0 (/data/users/pyang/test/trunk/VENDOR.hive/trunk/build/hadoopcore/hadoop-0.19.0)
    [javac] Compiling 2 source files to /data/users/pyang/test/trunk/VENDOR.hive/trunk/build/shims/classes

download-ivy:

init-ivy:

settings-ivy:

resolve:
:: loading settings :: file = /data/users/pyang/test/trunk/VENDOR.hive/trunk/ivy/ivysettings.xml
    [ivy:retrieve] :: resolving dependencies :: org.apache.hadoop.hive#shims;working@dev221.snc1.facebook.com
    [ivy:retrieve] confs: [default]
    [ivy:retrieve] found hadoop#core;0.17.2.1 in hadoop-resolver
    [ivy:retrieve] found hadoop#core;0.18.3 in hadoop-resolver
    [ivy:retrieve] found hadoop#core;0.19.0 in hadoop-resolver
    [ivy:retrieve] found hadoop#core;0.20.0 in hadoop-resolver
    [ivy:retrieve] :: resolution report :: resolve 78ms :: artifacts dl 7ms
        ---------------------------------------------------------------------
        |                  |            modules            ||   artifacts   |
        |       conf       | number| search|dwnlded|evicted|| number|dwnlded|
        ---------------------------------------------------------------------
        |      default     |   4   |   0   |   0   |   0   ||   4   |   0   |
        ---------------------------------------------------------------------
    [ivy:retrieve] :: retrieving :: org.apache.hadoop.hive#shims
    [ivy:retrieve] confs: [default]
    [ivy:retrieve] 0 artifacts copied, 4 already retrieved (0kB/4ms)

install-hadoopcore-internal:
    [untar] Expanding: /data/users/pyang/test/trunk/VENDOR.hive/trunk/build/hadoopcore/hadoop-0.20.0.tar.gz into /data/users/pyang/test/trunk/VENDOR.hive/trunk/build/hadoopcore
    [touch] Creating /data/users/pyang/test/trunk/VENDOR.hive/trunk/build/hadoopcore/hadoop-0.20.0.installed

build_shims:
    [echo] Compiling shims against hadoop 0.20.0 (/data/users/pyang/test/trunk/VENDOR.hive/trunk/build/hadoopcore/hadoop-0.20.0)
    [javac] Compiling 2 source files to /data/users/pyang/test/trunk/VENDOR.hive/trunk/build/shims/classes
    [javac] Note: /data/users/pyang/test/trunk/VENDOR.hive/trunk/shims/src/0.20/java/org/apache/hadoop/hive/shims/Hadoop20Shims.java uses or overrides a deprecated API.
    [javac] Note: Recompile with -Xlint:deprecation for details.
    [javac] Note: /data/users/pyang/test/trunk/VENDOR.hive/trunk/shims/src/0.20/java/org/apache/hadoop/hive/shims/Hadoop20Shims.java uses unchecked or unsafe operations.
    [javac] Note: Recompile with -Xlint:unchecked for details.
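Editor's note: the shims module above is compiled once per Hadoop release (0.17.2.1, 0.18.3, 0.19.0, 0.20.0, matching the hadoop#core revisions Ivy resolved) and the per-version classes are packaged into a single hive_shims.jar, with one implementation selected at runtime by version string. The sketch below illustrates that dispatch pattern only; apart from Hadoop20Shims, which appears in the log, the interface and class names here are illustrative assumptions, not the actual Hive shims API.

    // Illustrative sketch only -- names other than Hadoop20Shims are assumptions.
    import java.util.HashMap;
    import java.util.Map;

    public class ShimLoaderSketch {
        /** One interface, implemented once per supported Hadoop release. */
        public interface HadoopShims {
            String describe();
        }

        // Major.minor version -> implementation class, mirroring the four
        // hadoop#core versions the build compiled against above.
        private static final Map<String, String> SHIM_CLASSES = new HashMap<String, String>();
        static {
            SHIM_CLASSES.put("0.17", "org.apache.hadoop.hive.shims.Hadoop17Shims");  // assumed name
            SHIM_CLASSES.put("0.18", "org.apache.hadoop.hive.shims.Hadoop18Shims");  // assumed name
            SHIM_CLASSES.put("0.19", "org.apache.hadoop.hive.shims.Hadoop19Shims");  // assumed name
            SHIM_CLASSES.put("0.20", "org.apache.hadoop.hive.shims.Hadoop20Shims");  // seen in the log
        }

        /** Picks the shim class for a full version string such as "0.20.0". */
        public static String shimClassFor(String hadoopVersion) {
            String[] parts = hadoopVersion.split("\\.");
            String majorMinor = parts[0] + "." + parts[1];
            String cls = SHIM_CLASSES.get(majorMinor);
            if (cls == null) {
                throw new IllegalArgumentException("Unsupported Hadoop version: " + hadoopVersion);
            }
            return cls;
        }
    }

This keeps the rest of Hive compiled against a single stable interface while the version-specific incompatibilities stay inside one small module, which is why the build can untar and compile against four Hadoop cores in sequence.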
jar:
    [echo] Jar: shims
    [jar] Building jar: /data/users/pyang/test/trunk/VENDOR.hive/trunk/build/shims/hive_shims.jar

create-dirs:
    [mkdir] Created dir: /data/users/pyang/test/trunk/VENDOR.hive/trunk/build/common
    [mkdir] Created dir: /data/users/pyang/test/trunk/VENDOR.hive/trunk/build/common/classes
    [mkdir] Created dir: /data/users/pyang/test/trunk/VENDOR.hive/trunk/build/common/test
    [mkdir] Created dir: /data/users/pyang/test/trunk/VENDOR.hive/trunk/build/common/test/src
    [mkdir] Created dir: /data/users/pyang/test/trunk/VENDOR.hive/trunk/build/common/test/classes

compile-ant-tasks:

create-dirs:

init:

compile:
    [echo] Compiling: anttasks

deploy-ant-tasks:

create-dirs:

init:

compile:
    [echo] Compiling: anttasks

jar:

init:

install-hadoopcore:

install-hadoopcore-default:

download-ivy:

init-ivy:

settings-ivy:

resolve:
:: loading settings :: file = /data/users/pyang/test/trunk/VENDOR.hive/trunk/ivy/ivysettings.xml
    [ivy:retrieve] :: resolving dependencies :: org.apache.hadoop.hive#common;working@dev221.snc1.facebook.com
    [ivy:retrieve] confs: [default]
    [ivy:retrieve] found hadoop#core;0.19.0 in hadoop-resolver
    [ivy:retrieve] :: resolution report :: resolve 29ms :: artifacts dl 2ms
        ---------------------------------------------------------------------
        |                  |            modules            ||   artifacts   |
        |       conf       | number| search|dwnlded|evicted|| number|dwnlded|
        ---------------------------------------------------------------------
        |      default     |   1   |   0   |   0   |   0   ||   1   |   0   |
        ---------------------------------------------------------------------
    [ivy:retrieve] :: retrieving :: org.apache.hadoop.hive#common
    [ivy:retrieve] confs: [default]
    [ivy:retrieve] 0 artifacts copied, 1 already retrieved (0kB/3ms)

install-hadoopcore-internal:

setup:

compile:
    [echo] Compiling: common
    [javac] Compiling 5 source files to /data/users/pyang/test/trunk/VENDOR.hive/trunk/build/common/classes

jar:
    [echo] Jar: common
    [jar] Building jar: /data/users/pyang/test/trunk/VENDOR.hive/trunk/build/common/hive_common.jar

create-dirs:
    [mkdir] Created dir: /data/users/pyang/test/trunk/VENDOR.hive/trunk/build/serde
    [mkdir] Created dir: /data/users/pyang/test/trunk/VENDOR.hive/trunk/build/serde/classes
    [mkdir] Created dir: /data/users/pyang/test/trunk/VENDOR.hive/trunk/build/serde/test
    [mkdir] Created dir: /data/users/pyang/test/trunk/VENDOR.hive/trunk/build/serde/test/src
    [mkdir] Created dir: /data/users/pyang/test/trunk/VENDOR.hive/trunk/build/serde/test/classes

compile-ant-tasks:

create-dirs:

init:

compile:
    [echo] Compiling: anttasks

deploy-ant-tasks:

create-dirs:

init:

compile:
    [echo] Compiling: anttasks

jar:

init:

dynamic-serde:

compile:
    [echo] Compiling: hive
    [javac] Compiling 223 source files to /data/users/pyang/test/trunk/VENDOR.hive/trunk/build/serde/classes
    [javac] Note: Some input files use or override a deprecated API.
    [javac] Note: Recompile with -Xlint:deprecation for details.
    [javac] Note: Some input files use unchecked or unsafe operations.
    [javac] Note: Recompile with -Xlint:unchecked for details.
jar:
    [echo] Jar: serde
    [jar] Building jar: /data/users/pyang/test/trunk/VENDOR.hive/trunk/build/serde/hive_serde.jar

create-dirs:
    [mkdir] Created dir: /data/users/pyang/test/trunk/VENDOR.hive/trunk/build/metastore
    [mkdir] Created dir: /data/users/pyang/test/trunk/VENDOR.hive/trunk/build/metastore/classes
    [mkdir] Created dir: /data/users/pyang/test/trunk/VENDOR.hive/trunk/build/metastore/test
    [mkdir] Created dir: /data/users/pyang/test/trunk/VENDOR.hive/trunk/build/metastore/test/src
    [mkdir] Created dir: /data/users/pyang/test/trunk/VENDOR.hive/trunk/build/metastore/test/classes

compile-ant-tasks:

create-dirs:

init:

compile:
    [echo] Compiling: anttasks

deploy-ant-tasks:

create-dirs:

init:

compile:
    [echo] Compiling: anttasks

jar:

init:

model-compile:
    [javac] Compiling 8 source files to /data/users/pyang/test/trunk/VENDOR.hive/trunk/build/metastore/classes
    [copy] Copying 1 file to /data/users/pyang/test/trunk/VENDOR.hive/trunk/build/metastore/classes

core-compile:
    [echo] Compiling:
    [javac] Compiling 31 source files to /data/users/pyang/test/trunk/VENDOR.hive/trunk/build/metastore/classes
    [javac] Note: Some input files use unchecked or unsafe operations.
    [javac] Note: Recompile with -Xlint:unchecked for details.

model-enhance:
    [datanucleusenhancer] log4j:WARN No appenders could be found for logger (DataNucleus.Enhancer).
    [datanucleusenhancer] log4j:WARN Please initialize the log4j system properly.
    [datanucleusenhancer] DataNucleus Enhancer (version 1.1.2) : Enhancement of classes
    [datanucleusenhancer] DataNucleus Enhancer : Classpath
    [datanucleusenhancer] >> /data/users/pyang/test/trunk/VENDOR.hive/trunk/build/common/classes
    [datanucleusenhancer] >> /data/users/pyang/test/trunk/VENDOR.hive/trunk/build/serde/classes
    [datanucleusenhancer] >> /data/users/pyang/test/trunk/VENDOR.hive/trunk/build/metastore/classes
    [datanucleusenhancer] >> /data/users/pyang/test/trunk/VENDOR.hive/trunk/build/shims/classes
    [datanucleusenhancer] >> /data/users/pyang/test/trunk/VENDOR.hive/trunk/build/hadoopcore/hadoop-0.19.0/hadoop-0.19.0-core.jar
    [datanucleusenhancer] >> /data/users/pyang/test/trunk/VENDOR.hive/trunk/build/anttasks/hive_anttasks.jar
    [datanucleusenhancer] >> /data/users/pyang/test/trunk/VENDOR.hive/trunk/build/common/hive_common.jar
    [datanucleusenhancer] >> /data/users/pyang/test/trunk/VENDOR.hive/trunk/build/serde/hive_serde.jar
    [datanucleusenhancer] >> /data/users/pyang/test/trunk/VENDOR.hive/trunk/build/shims/hive_shims.jar
    [datanucleusenhancer] >> /data/users/pyang/test/trunk/VENDOR.hive/trunk/lib/asm-3.1.jar
    [datanucleusenhancer] >> /data/users/pyang/test/trunk/VENDOR.hive/trunk/lib/commons-cli-2.0-SNAPSHOT.jar
    [datanucleusenhancer] >> /data/users/pyang/test/trunk/VENDOR.hive/trunk/lib/commons-codec-1.3.jar
    [datanucleusenhancer] >> /data/users/pyang/test/trunk/VENDOR.hive/trunk/lib/commons-collections-3.2.1.jar
    [datanucleusenhancer] >> /data/users/pyang/test/trunk/VENDOR.hive/trunk/lib/commons-lang-2.4.jar
    [datanucleusenhancer] >> /data/users/pyang/test/trunk/VENDOR.hive/trunk/lib/commons-logging-1.0.4.jar
    [datanucleusenhancer] >> /data/users/pyang/test/trunk/VENDOR.hive/trunk/lib/commons-logging-api-1.0.4.jar
    [datanucleusenhancer] >> /data/users/pyang/test/trunk/VENDOR.hive/trunk/lib/datanucleus-core-1.1.2.jar
    [datanucleusenhancer] >> /data/users/pyang/test/trunk/VENDOR.hive/trunk/lib/datanucleus-enhancer-1.1.2.jar
    [datanucleusenhancer] >> /data/users/pyang/test/trunk/VENDOR.hive/trunk/lib/datanucleus-rdbms-1.1.2.jar
    [datanucleusenhancer] >> /data/users/pyang/test/trunk/VENDOR.hive/trunk/lib/derby.jar
    [datanucleusenhancer] >> /data/users/pyang/test/trunk/VENDOR.hive/trunk/lib/jdo2-api-2.3-SNAPSHOT.jar
    [datanucleusenhancer] >> /data/users/pyang/test/trunk/VENDOR.hive/trunk/lib/json.jar
    [datanucleusenhancer] >> /data/users/pyang/test/trunk/VENDOR.hive/trunk/lib/libfb303.jar
    [datanucleusenhancer] >> /data/users/pyang/test/trunk/VENDOR.hive/trunk/lib/libthrift.jar
    [datanucleusenhancer] >> /data/users/pyang/test/trunk/VENDOR.hive/trunk/lib/log4j-1.2.15.jar
    [datanucleusenhancer] >> /data/users/pyang/test/trunk/VENDOR.hive/trunk/lib/velocity-1.5.jar
    [datanucleusenhancer] >> /data/users/pyang/test/trunk/VENDOR.hive/trunk/ql/lib/antlr-2.7.7.jar
    [datanucleusenhancer] >> /data/users/pyang/test/trunk/VENDOR.hive/trunk/ql/lib/antlr-3.0.1.jar
    [datanucleusenhancer] >> /data/users/pyang/test/trunk/VENDOR.hive/trunk/ql/lib/antlr-runtime-3.0.1.jar
    [datanucleusenhancer] >> /data/users/pyang/test/trunk/VENDOR.hive/trunk/ql/lib/stringtemplate-3.1b1.jar
    [datanucleusenhancer] ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MDatabase
    [datanucleusenhancer] ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MFieldSchema
    [datanucleusenhancer] ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MType
    [datanucleusenhancer] ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MTable
    [datanucleusenhancer] ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MSerDeInfo
    [datanucleusenhancer] ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MOrder
    [datanucleusenhancer] ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MStorageDescriptor
    [datanucleusenhancer] ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MPartition
    [datanucleusenhancer] DataNucleus Enhancer completed with success for 8 classes. Timings : input=367 ms, enhance=198 ms, total=565 ms. Consult the log for full details

compile:

jar:
    [echo] Jar: metastore
    [jar] Building jar: /data/users/pyang/test/trunk/VENDOR.hive/trunk/build/metastore/hive_metastore.jar

create-dirs:
    [mkdir] Created dir: /data/users/pyang/test/trunk/VENDOR.hive/trunk/build/ql
    [mkdir] Created dir: /data/users/pyang/test/trunk/VENDOR.hive/trunk/build/ql/classes
    [mkdir] Created dir: /data/users/pyang/test/trunk/VENDOR.hive/trunk/build/ql/test
    [mkdir] Created dir: /data/users/pyang/test/trunk/VENDOR.hive/trunk/build/ql/test/src
    [mkdir] Created dir: /data/users/pyang/test/trunk/VENDOR.hive/trunk/build/ql/test/classes

compile-ant-tasks:

create-dirs:

init:

compile:
    [echo] Compiling: anttasks

deploy-ant-tasks:

create-dirs:

init:

compile:
    [echo] Compiling: anttasks

jar:

init:

ql-init:
    [mkdir] Created dir: /data/users/pyang/test/trunk/VENDOR.hive/trunk/build/ql/gen-java/org/apache/hadoop/hive/ql/parse

build-grammar:
    [echo] Building Grammar /data/users/pyang/test/trunk/VENDOR.hive/trunk/ql/src/java/org/apache/hadoop/hive/ql/parse/Hive.g ....
    [java] ANTLR Parser Generator Version 3.0.1 (August 13, 2007) 1989-2007

compile:
    [echo] Compiling: hive
    [javac] Compiling 459 source files to /data/users/pyang/test/trunk/VENDOR.hive/trunk/build/ql/classes
    [javac] Note: Some input files use or override a deprecated API.
    [javac] Note: Recompile with -Xlint:deprecation for details.
    [javac] Note: Some input files use unchecked or unsafe operations.
    [javac] Note: Recompile with -Xlint:unchecked for details.
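Editor's note: the model-enhance step above runs the DataNucleus bytecode enhancer over the eight metastore model classes (MDatabase, MTable, and so on) after javac, which is what the "ENHANCED (PersistenceCapable)" lines report. The class below is a minimal sketch of what a JDO persistence-capable class looks like; it is illustrative only and uses JDO annotations from the jdo2-api jar on the enhancer classpath, whereas Hive's actual model classes carry their own metadata.

    // Illustrative only: a minimal JDO 2 persistence-capable class, to show
    // what "ENHANCED (PersistenceCapable)" refers to. Not Hive's MDatabase.
    import javax.jdo.annotations.PersistenceCapable;
    import javax.jdo.annotations.Persistent;

    @PersistenceCapable
    public class ExampleDatabase {
        @Persistent
        private String name;          // persisted field

        @Persistent
        private String description;   // persisted field

        protected ExampleDatabase() { // no-arg constructor for the JDO runtime
        }

        public ExampleDatabase(String name, String description) {
            this.name = name;
            this.description = description;
        }

        public String getName() {
            return name;
        }
    }

The enhancer rewrites the compiled bytecode of such classes so the JDO runtime can track field reads and writes, which is why enhancement must run as a separate build step over build/metastore/classes rather than inside javac.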
jar:
    [echo] Jar: hive
    [unzip] Expanding: /data/users/pyang/test/trunk/VENDOR.hive/trunk/lib/libthrift.jar into /data/users/pyang/test/trunk/VENDOR.hive/trunk/build/thrift/classes
    [unzip] Expanding: /data/users/pyang/test/trunk/VENDOR.hive/trunk/lib/commons-lang-2.4.jar into /data/users/pyang/test/trunk/VENDOR.hive/trunk/build/commons-lang/classes
    [unzip] Expanding: /data/users/pyang/test/trunk/VENDOR.hive/trunk/lib/json.jar into /data/users/pyang/test/trunk/VENDOR.hive/trunk/build/json/classes
    [jar] Building jar: /data/users/pyang/test/trunk/VENDOR.hive/trunk/build/ql/hive_exec.jar

create-dirs:
    [mkdir] Created dir: /data/users/pyang/test/trunk/VENDOR.hive/trunk/build/cli
    [mkdir] Created dir: /data/users/pyang/test/trunk/VENDOR.hive/trunk/build/cli/classes
    [mkdir] Created dir: /data/users/pyang/test/trunk/VENDOR.hive/trunk/build/cli/test
    [mkdir] Created dir: /data/users/pyang/test/trunk/VENDOR.hive/trunk/build/cli/test/src
    [mkdir] Created dir: /data/users/pyang/test/trunk/VENDOR.hive/trunk/build/cli/test/classes

compile-ant-tasks:

create-dirs:

init:

compile:
    [echo] Compiling: anttasks

deploy-ant-tasks:

create-dirs:

init:

compile:
    [echo] Compiling: anttasks

jar:

init:

install-hadoopcore:

install-hadoopcore-default:

download-ivy:

init-ivy:

settings-ivy:

resolve:
:: loading settings :: file = /data/users/pyang/test/trunk/VENDOR.hive/trunk/ivy/ivysettings.xml
    [ivy:retrieve] :: resolving dependencies :: org.apache.hadoop.hive#cli;working@dev221.snc1.facebook.com
    [ivy:retrieve] confs: [default]
    [ivy:retrieve] found hadoop#core;0.19.0 in hadoop-resolver
    [ivy:retrieve] :: resolution report :: resolve 23ms :: artifacts dl 1ms
        ---------------------------------------------------------------------
        |                  |            modules            ||   artifacts   |
        |       conf       | number| search|dwnlded|evicted|| number|dwnlded|
        ---------------------------------------------------------------------
        |      default     |   1   |   0   |   0   |   0   ||   1   |   0   |
        ---------------------------------------------------------------------
    [ivy:retrieve] :: retrieving :: org.apache.hadoop.hive#cli
    [ivy:retrieve] confs: [default]
    [ivy:retrieve] 0 artifacts copied, 1 already retrieved (0kB/3ms)

install-hadoopcore-internal:

setup:

compile:
    [echo] Compiling: cli
    [javac] Compiling 3 source files to /data/users/pyang/test/trunk/VENDOR.hive/trunk/build/cli/classes
    [javac] Note: /data/users/pyang/test/trunk/VENDOR.hive/trunk/cli/src/java/org/apache/hadoop/hive/cli/OptionsProcessor.java uses unchecked or unsafe operations.
    [javac] Note: Recompile with -Xlint:unchecked for details.
jar:
    [echo] Jar: cli
    [jar] Building jar: /data/users/pyang/test/trunk/VENDOR.hive/trunk/build/cli/hive_cli.jar

create-dirs:
    [mkdir] Created dir: /data/users/pyang/test/trunk/VENDOR.hive/trunk/build/contrib
    [mkdir] Created dir: /data/users/pyang/test/trunk/VENDOR.hive/trunk/build/contrib/classes
    [mkdir] Created dir: /data/users/pyang/test/trunk/VENDOR.hive/trunk/build/contrib/test
    [mkdir] Created dir: /data/users/pyang/test/trunk/VENDOR.hive/trunk/build/contrib/test/src
    [mkdir] Created dir: /data/users/pyang/test/trunk/VENDOR.hive/trunk/build/contrib/test/classes

compile-ant-tasks:

create-dirs:

init:

compile:
    [echo] Compiling: anttasks

deploy-ant-tasks:

create-dirs:

init:

compile:
    [echo] Compiling: anttasks

jar:

init:

install-hadoopcore:

install-hadoopcore-default:

download-ivy:

init-ivy:

settings-ivy:

resolve:
:: loading settings :: file = /data/users/pyang/test/trunk/VENDOR.hive/trunk/ivy/ivysettings.xml
    [ivy:retrieve] :: resolving dependencies :: org.apache.hadoop.hive#contrib;working@dev221.snc1.facebook.com
    [ivy:retrieve] confs: [default]
    [ivy:retrieve] found hadoop#core;0.19.0 in hadoop-resolver
    [ivy:retrieve] :: resolution report :: resolve 25ms :: artifacts dl 1ms
        ---------------------------------------------------------------------
        |                  |            modules            ||   artifacts   |
        |       conf       | number| search|dwnlded|evicted|| number|dwnlded|
        ---------------------------------------------------------------------
        |      default     |   1   |   0   |   0   |   0   ||   1   |   0   |
        ---------------------------------------------------------------------
    [ivy:retrieve] :: retrieving :: org.apache.hadoop.hive#contrib
    [ivy:retrieve] confs: [default]
    [ivy:retrieve] 0 artifacts copied, 1 already retrieved (0kB/2ms)

install-hadoopcore-internal:

setup:

compile:
    [echo] Compiling: contrib
    [javac] Compiling 24 source files to /data/users/pyang/test/trunk/VENDOR.hive/trunk/build/contrib/classes
    [javac] Note: Some input files use or override a deprecated API.
    [javac] Note: Recompile with -Xlint:deprecation for details.
    [javac] Note: /data/users/pyang/test/trunk/VENDOR.hive/trunk/contrib/src/java/org/apache/hadoop/hive/contrib/udf/example/UDFExampleStructPrint.java uses unchecked or unsafe operations.
    [javac] Note: Recompile with -Xlint:unchecked for details.

jar:
    [echo] Jar: contrib
    [jar] Building jar: /data/users/pyang/test/trunk/VENDOR.hive/trunk/build/contrib/hive_contrib.jar

create-dirs:
    [mkdir] Created dir: /data/users/pyang/test/trunk/VENDOR.hive/trunk/build/service
    [mkdir] Created dir: /data/users/pyang/test/trunk/VENDOR.hive/trunk/build/service/classes
    [mkdir] Created dir: /data/users/pyang/test/trunk/VENDOR.hive/trunk/build/service/test
    [mkdir] Created dir: /data/users/pyang/test/trunk/VENDOR.hive/trunk/build/service/test/src
    [mkdir] Created dir: /data/users/pyang/test/trunk/VENDOR.hive/trunk/build/service/test/classes

compile-ant-tasks:

create-dirs:

init:

compile:
    [echo] Compiling: anttasks

deploy-ant-tasks:

create-dirs:

init:

compile:
    [echo] Compiling: anttasks

jar:

init:

core-compile:
    [javac] Compiling 8 source files to /data/users/pyang/test/trunk/VENDOR.hive/trunk/build/service/classes
    [javac] Note: /data/users/pyang/test/trunk/VENDOR.hive/trunk/service/src/gen-javabean/org/apache/hadoop/hive/service/ThriftHive.java uses unchecked or unsafe operations.
    [javac] Note: Recompile with -Xlint:unchecked for details.
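Editor's note: the contrib module compiled above contains example user-defined functions such as UDFExampleStructPrint. For orientation, the sketch below shows the classic Hive UDF contract those examples follow: extend the UDF base class and supply an evaluate() method that Hive resolves by signature. The class name here is hypothetical, not one of the contrib sources.

    // A minimal sketch of the classic Hive UDF contract; the class name is
    // hypothetical. Compiles against the Hive ql and Hadoop jars of this era.
    import org.apache.hadoop.hive.ql.exec.UDF;
    import org.apache.hadoop.io.Text;

    public class UDFExampleUpper extends UDF {
        private final Text result = new Text();  // reused to avoid per-row allocation

        /** Hive resolves this method by its argument types at query compile time. */
        public Text evaluate(Text input) {
            if (input == null) {
                return null;  // propagate SQL NULL
            }
            result.set(input.toString().toUpperCase());
            return result;
        }
    }

Once packaged into a jar, such a function is registered from the CLI with ADD JAR and CREATE TEMPORARY FUNCTION and can then be called like a built-in.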
compile:

jar:
    [echo] Jar: service
    [jar] Building jar: /data/users/pyang/test/trunk/VENDOR.hive/trunk/build/service/hive_service.jar

create-dirs:
    [mkdir] Created dir: /data/users/pyang/test/trunk/VENDOR.hive/trunk/build/jdbc
    [mkdir] Created dir: /data/users/pyang/test/trunk/VENDOR.hive/trunk/build/jdbc/classes
    [mkdir] Created dir: /data/users/pyang/test/trunk/VENDOR.hive/trunk/build/jdbc/test
    [mkdir] Created dir: /data/users/pyang/test/trunk/VENDOR.hive/trunk/build/jdbc/test/src
    [mkdir] Created dir: /data/users/pyang/test/trunk/VENDOR.hive/trunk/build/jdbc/test/classes

compile-ant-tasks:

create-dirs:

init:

compile:
    [echo] Compiling: anttasks

deploy-ant-tasks:

create-dirs:

init:

compile:
    [echo] Compiling: anttasks

jar:

init:

core-compile:
    [javac] Compiling 10 source files to /data/users/pyang/test/trunk/VENDOR.hive/trunk/build/jdbc/classes
    [javac] Note: Some input files use or override a deprecated API.
    [javac] Note: Recompile with -Xlint:deprecation for details.

compile:

jar:
    [echo] Jar: jdbc
    [jar] Building jar: /data/users/pyang/test/trunk/VENDOR.hive/trunk/build/jdbc/hive_jdbc.jar

create-dirs:
    [mkdir] Created dir: /data/users/pyang/test/trunk/VENDOR.hive/trunk/build/hwi
    [mkdir] Created dir: /data/users/pyang/test/trunk/VENDOR.hive/trunk/build/hwi/classes
    [mkdir] Created dir: /data/users/pyang/test/trunk/VENDOR.hive/trunk/build/hwi/test
    [mkdir] Created dir: /data/users/pyang/test/trunk/VENDOR.hive/trunk/build/hwi/test/src
    [mkdir] Created dir: /data/users/pyang/test/trunk/VENDOR.hive/trunk/build/hwi/test/classes

compile-ant-tasks:

create-dirs:

init:

compile:
    [echo] Compiling: anttasks

deploy-ant-tasks:

create-dirs:

init:

compile:
    [echo] Compiling: anttasks

jar:

init:

install-hadoopcore:

install-hadoopcore-default:

download-ivy:

init-ivy:

settings-ivy:

resolve:
:: loading settings :: file = /data/users/pyang/test/trunk/VENDOR.hive/trunk/ivy/ivysettings.xml
    [ivy:retrieve] :: resolving dependencies :: org.apache.hadoop.hive#hwi;working@dev221.snc1.facebook.com
    [ivy:retrieve] confs: [default]
    [ivy:retrieve] found hadoop#core;0.19.0 in hadoop-resolver
    [ivy:retrieve] :: resolution report :: resolve 21ms :: artifacts dl 2ms
        ---------------------------------------------------------------------
        |                  |            modules            ||   artifacts   |
        |       conf       | number| search|dwnlded|evicted|| number|dwnlded|
        ---------------------------------------------------------------------
        |      default     |   1   |   0   |   0   |   0   ||   1   |   0   |
        ---------------------------------------------------------------------
    [ivy:retrieve] :: retrieving :: org.apache.hadoop.hive#hwi
    [ivy:retrieve] confs: [default]
    [ivy:retrieve] 0 artifacts copied, 1 already retrieved (0kB/2ms)

install-hadoopcore-internal:

setup:

war:
    [jar] Building jar: /data/users/pyang/test/trunk/VENDOR.hive/trunk/build/hwi/hive_hwi.war

compile:
    [echo] Compiling: hwi
    [javac] Compiling 6 source files to /data/users/pyang/test/trunk/VENDOR.hive/trunk/build/hwi/classes

jar:
    [echo] Jar: hwi
    [jar] Building jar: /data/users/pyang/test/trunk/VENDOR.hive/trunk/build/hwi/hive_hwi.jar

test:

test:
    [echo] Nothing to do!

test:
    [echo] Nothing to do!
test-conditions:

gen-test:

create-dirs:

compile-ant-tasks:

create-dirs:

init:

compile:
    [echo] Compiling: anttasks

deploy-ant-tasks:

create-dirs:

init:

compile:
    [echo] Compiling: anttasks

jar:

init:

dynamic-serde:

compile:
    [echo] Compiling: hive
    [javac] Compiling 1 source file to /data/users/pyang/test/trunk/VENDOR.hive/trunk/build/serde/classes

compile-test:
    [javac] Compiling 19 source files to /data/users/pyang/test/trunk/VENDOR.hive/trunk/build/serde/test/classes
    [javac] Note: Some input files use or override a deprecated API.
    [javac] Note: Recompile with -Xlint:deprecation for details.
    [javac] Note: /data/users/pyang/test/trunk/VENDOR.hive/trunk/serde/src/test/org/apache/hadoop/hive/serde2/dynamic_type/TestDynamicSerDe.java uses unchecked or unsafe operations.
    [javac] Note: Recompile with -Xlint:unchecked for details.

test-jar:
    [jar] Building MANIFEST-only jar: /data/users/pyang/test/trunk/VENDOR.hive/trunk/build/serde/test/test-udfs.jar

test-init:
    [mkdir] Created dir: /data/users/pyang/test/trunk/VENDOR.hive/trunk/build/serde/test/data
    [mkdir] Created dir: /data/users/pyang/test/trunk/VENDOR.hive/trunk/build/serde/test/logs/clientpositive
    [mkdir] Created dir: /data/users/pyang/test/trunk/VENDOR.hive/trunk/build/serde/test/logs/clientnegative
    [mkdir] Created dir: /data/users/pyang/test/trunk/VENDOR.hive/trunk/build/serde/test/logs/positive
    [mkdir] Created dir: /data/users/pyang/test/trunk/VENDOR.hive/trunk/build/serde/test/logs/negative
    [copy] Copying 41 files to /data/users/pyang/test/trunk/VENDOR.hive/trunk/build/serde/test/data
    [copy] Copied 6 empty directories to 2 empty directories under /data/users/pyang/test/trunk/VENDOR.hive/trunk/build/serde/test/data

test:

test-conditions:

gen-test:

create-dirs:

compile-ant-tasks:

create-dirs:

init:

compile:
    [echo] Compiling: anttasks

deploy-ant-tasks:

create-dirs:

init:

compile:
    [echo] Compiling: anttasks

jar:

init:

model-compile:

core-compile:
    [echo] Compiling:

model-enhance:
    [datanucleusenhancer] log4j:WARN No appenders could be found for logger (DataNucleus.Enhancer).
    [datanucleusenhancer] log4j:WARN Please initialize the log4j system properly.
    [datanucleusenhancer] DataNucleus Enhancer (version 1.1.2) : Enhancement of classes
    [datanucleusenhancer] DataNucleus Enhancer : Classpath
    [datanucleusenhancer] >> /data/users/pyang/test/trunk/VENDOR.hive/trunk/build/service/classes
    [datanucleusenhancer] >> /data/users/pyang/test/trunk/VENDOR.hive/trunk/build/common/classes
    [datanucleusenhancer] >> /data/users/pyang/test/trunk/VENDOR.hive/trunk/build/serde/classes
    [datanucleusenhancer] >> /data/users/pyang/test/trunk/VENDOR.hive/trunk/build/metastore/classes
    [datanucleusenhancer] >> /data/users/pyang/test/trunk/VENDOR.hive/trunk/build/ql/classes
    [datanucleusenhancer] >> /data/users/pyang/test/trunk/VENDOR.hive/trunk/build/cli/classes
    [datanucleusenhancer] >> /data/users/pyang/test/trunk/VENDOR.hive/trunk/build/shims/classes
    [datanucleusenhancer] >> /data/users/pyang/test/trunk/VENDOR.hive/trunk/build/hwi/classes
    [datanucleusenhancer] >> /data/users/pyang/test/trunk/VENDOR.hive/trunk/build/hadoopcore/hadoop-0.19.0/hadoop-0.19.0-core.jar
    [datanucleusenhancer] >> /data/users/pyang/test/trunk/VENDOR.hive/trunk/build/anttasks/hive_anttasks.jar
    [datanucleusenhancer] >> /data/users/pyang/test/trunk/VENDOR.hive/trunk/build/cli/hive_cli.jar
    [datanucleusenhancer] >> /data/users/pyang/test/trunk/VENDOR.hive/trunk/build/common/hive_common.jar
    [datanucleusenhancer] >> /data/users/pyang/test/trunk/VENDOR.hive/trunk/build/contrib/hive_contrib.jar
    [datanucleusenhancer] >> /data/users/pyang/test/trunk/VENDOR.hive/trunk/build/hwi/hive_hwi.jar
    [datanucleusenhancer] >> /data/users/pyang/test/trunk/VENDOR.hive/trunk/build/jdbc/hive_jdbc.jar
    [datanucleusenhancer] >> /data/users/pyang/test/trunk/VENDOR.hive/trunk/build/metastore/hive_metastore.jar
    [datanucleusenhancer] >> /data/users/pyang/test/trunk/VENDOR.hive/trunk/build/ql/hive_exec.jar
    [datanucleusenhancer] >> /data/users/pyang/test/trunk/VENDOR.hive/trunk/build/serde/hive_serde.jar
    [datanucleusenhancer] >> /data/users/pyang/test/trunk/VENDOR.hive/trunk/build/service/hive_service.jar
    [datanucleusenhancer] >> /data/users/pyang/test/trunk/VENDOR.hive/trunk/build/shims/hive_shims.jar
    [datanucleusenhancer] >> /data/users/pyang/test/trunk/VENDOR.hive/trunk/lib/asm-3.1.jar
    [datanucleusenhancer] >> /data/users/pyang/test/trunk/VENDOR.hive/trunk/lib/commons-cli-2.0-SNAPSHOT.jar
    [datanucleusenhancer] >> /data/users/pyang/test/trunk/VENDOR.hive/trunk/lib/commons-codec-1.3.jar
    [datanucleusenhancer] >> /data/users/pyang/test/trunk/VENDOR.hive/trunk/lib/commons-collections-3.2.1.jar
    [datanucleusenhancer] >> /data/users/pyang/test/trunk/VENDOR.hive/trunk/lib/commons-lang-2.4.jar
    [datanucleusenhancer] >> /data/users/pyang/test/trunk/VENDOR.hive/trunk/lib/commons-logging-1.0.4.jar
    [datanucleusenhancer] >> /data/users/pyang/test/trunk/VENDOR.hive/trunk/lib/commons-logging-api-1.0.4.jar
    [datanucleusenhancer] >> /data/users/pyang/test/trunk/VENDOR.hive/trunk/lib/datanucleus-core-1.1.2.jar
    [datanucleusenhancer] >> /data/users/pyang/test/trunk/VENDOR.hive/trunk/lib/datanucleus-enhancer-1.1.2.jar
    [datanucleusenhancer] >> /data/users/pyang/test/trunk/VENDOR.hive/trunk/lib/datanucleus-rdbms-1.1.2.jar
    [datanucleusenhancer] >> /data/users/pyang/test/trunk/VENDOR.hive/trunk/lib/derby.jar
    [datanucleusenhancer] >> /data/users/pyang/test/trunk/VENDOR.hive/trunk/lib/jdo2-api-2.3-SNAPSHOT.jar
    [datanucleusenhancer] >> /data/users/pyang/test/trunk/VENDOR.hive/trunk/lib/json.jar
    [datanucleusenhancer] >> /data/users/pyang/test/trunk/VENDOR.hive/trunk/lib/libfb303.jar
    [datanucleusenhancer] >> /data/users/pyang/test/trunk/VENDOR.hive/trunk/lib/libthrift.jar
    [datanucleusenhancer] >> /data/users/pyang/test/trunk/VENDOR.hive/trunk/lib/log4j-1.2.15.jar
    [datanucleusenhancer] >> /data/users/pyang/test/trunk/VENDOR.hive/trunk/lib/velocity-1.5.jar
    [datanucleusenhancer] >> /data/users/pyang/test/trunk/VENDOR.hive/trunk/ql/lib/antlr-2.7.7.jar
    [datanucleusenhancer] >> /data/users/pyang/test/trunk/VENDOR.hive/trunk/ql/lib/antlr-3.0.1.jar
    [datanucleusenhancer] >> /data/users/pyang/test/trunk/VENDOR.hive/trunk/ql/lib/antlr-runtime-3.0.1.jar
    [datanucleusenhancer] >> /data/users/pyang/test/trunk/VENDOR.hive/trunk/ql/lib/stringtemplate-3.1b1.jar
    [datanucleusenhancer] ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MDatabase
    [datanucleusenhancer] ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MFieldSchema
    [datanucleusenhancer] ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MType
    [datanucleusenhancer] ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MTable
    [datanucleusenhancer] ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MSerDeInfo
    [datanucleusenhancer] ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MOrder
    [datanucleusenhancer] ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MStorageDescriptor
    [datanucleusenhancer] ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MPartition
    [datanucleusenhancer] DataNucleus Enhancer completed with success for 8 classes. Timings : input=396 ms, enhance=139 ms, total=535 ms. Consult the log for full details

compile:

compile-test:
    [javac] Compiling 1 source file to /data/users/pyang/test/trunk/VENDOR.hive/trunk/build/metastore/test/classes

test-jar:
    [jar] Building MANIFEST-only jar: /data/users/pyang/test/trunk/VENDOR.hive/trunk/build/metastore/test/test-udfs.jar

test-init:
    [mkdir] Created dir: /data/users/pyang/test/trunk/VENDOR.hive/trunk/build/metastore/test/data
    [mkdir] Created dir: /data/users/pyang/test/trunk/VENDOR.hive/trunk/build/metastore/test/logs/clientpositive
    [mkdir] Created dir: /data/users/pyang/test/trunk/VENDOR.hive/trunk/build/metastore/test/logs/clientnegative
    [mkdir] Created dir: /data/users/pyang/test/trunk/VENDOR.hive/trunk/build/metastore/test/logs/positive
    [mkdir] Created dir: /data/users/pyang/test/trunk/VENDOR.hive/trunk/build/metastore/test/logs/negative
    [copy] Copying 41 files to /data/users/pyang/test/trunk/VENDOR.hive/trunk/build/metastore/test/data
    [copy] Copied 6 empty directories to 2 empty directories under /data/users/pyang/test/trunk/VENDOR.hive/trunk/build/metastore/test/data

test:

test-conditions:

compile-ant-tasks:

create-dirs:

init:

compile:
    [echo] Compiling: anttasks

deploy-ant-tasks:

create-dirs:

init:

compile:
    [echo] Compiling: anttasks

jar:

test-init:
    [mkdir] Created dir: /data/users/pyang/test/trunk/VENDOR.hive/trunk/build/ql/test/data
    [mkdir] Created dir: /data/users/pyang/test/trunk/VENDOR.hive/trunk/build/ql/test/logs/clientpositive
    [mkdir] Created dir: /data/users/pyang/test/trunk/VENDOR.hive/trunk/build/ql/test/logs/clientnegative
    [mkdir] Created dir: /data/users/pyang/test/trunk/VENDOR.hive/trunk/build/ql/test/logs/positive
    [mkdir] Created dir: /data/users/pyang/test/trunk/VENDOR.hive/trunk/build/ql/test/logs/negative
    [copy] Copying 41 files to /data/users/pyang/test/trunk/VENDOR.hive/trunk/build/ql/test/data
    [copy] Copied 6 empty directories to 2 empty directories under /data/users/pyang/test/trunk/VENDOR.hive/trunk/build/ql/test/data

gen-test:
    [qtestgen] Template Path:/data/users/pyang/test/trunk/VENDOR.hive/trunk/ql/src/test/templates
    [qtestgen] Nov 25, 2009 4:24:16 PM org.apache.velocity.runtime.log.JdkLogChute log
    [qtestgen] INFO: FileResourceLoader : adding path '/data/users/pyang/test/trunk/VENDOR.hive/trunk/ql/src/test/templates'
    [qtestgen] Generated /data/users/pyang/test/trunk/VENDOR.hive/trunk/build/ql/test/src/org/apache/hadoop/hive/ql/parse/TestParse.java from template TestParse.vm
    [qtestgen] Template Path:/data/users/pyang/test/trunk/VENDOR.hive/trunk/ql/src/test/templates
    [qtestgen] Nov 25, 2009 4:24:16 PM org.apache.velocity.runtime.log.JdkLogChute log
    [qtestgen] INFO: FileResourceLoader : adding path '/data/users/pyang/test/trunk/VENDOR.hive/trunk/ql/src/test/templates'
    [qtestgen] Generated /data/users/pyang/test/trunk/VENDOR.hive/trunk/build/ql/test/src/org/apache/hadoop/hive/ql/parse/TestParseNegative.java from template TestParseNegative.vm
    [qtestgen] Template Path:/data/users/pyang/test/trunk/VENDOR.hive/trunk/ql/src/test/templates
    [qtestgen] Nov 25, 2009 4:24:16 PM org.apache.velocity.runtime.log.JdkLogChute log
    [qtestgen] INFO: FileResourceLoader : adding path '/data/users/pyang/test/trunk/VENDOR.hive/trunk/ql/src/test/templates'
    [qtestgen] Generated /data/users/pyang/test/trunk/VENDOR.hive/trunk/build/ql/test/src/org/apache/hadoop/hive/cli/TestCliDriver.java from template TestCliDriver.vm
    [qtestgen] Template Path:/data/users/pyang/test/trunk/VENDOR.hive/trunk/ql/src/test/templates
    [qtestgen] Nov 25, 2009 4:24:16 PM org.apache.velocity.runtime.log.JdkLogChute log
    [qtestgen] INFO: FileResourceLoader : adding path '/data/users/pyang/test/trunk/VENDOR.hive/trunk/ql/src/test/templates'
    [qtestgen] Generated /data/users/pyang/test/trunk/VENDOR.hive/trunk/build/ql/test/src/org/apache/hadoop/hive/cli/TestNegativeCliDriver.java from template TestNegativeCliDriver.vm

create-dirs:

init:

ql-init:

build-grammar:

compile:
    [echo] Compiling: hive
    [javac] Compiling 11 source files to /data/users/pyang/test/trunk/VENDOR.hive/trunk/build/ql/classes

compile-test:
    [javac] Compiling 25 source files to /data/users/pyang/test/trunk/VENDOR.hive/trunk/build/ql/test/classes
    [javac] Note: Some input files use or override a deprecated API.
    [javac] Note: Recompile with -Xlint:deprecation for details.
    [javac] Note: /data/users/pyang/test/trunk/VENDOR.hive/trunk/ql/src/test/org/apache/hadoop/hive/ql/io/TestFlatFileInputFormat.java uses unchecked or unsafe operations.
    [javac] Note: Recompile with -Xlint:unchecked for details.
    [javac] Compiling 4 source files to /data/users/pyang/test/trunk/VENDOR.hive/trunk/build/ql/test/classes

test-jar:
    [jar] Building jar: /data/users/pyang/test/trunk/VENDOR.hive/trunk/build/ql/test/test-udfs.jar

test:
    [junit] Running org.apache.hadoop.hive.cli.TestNegativeCliDriver
    [junit] Begin query: script_broken_pipe3.q
    [junit] Loading data to table srcpart partition {ds=2008-04-08, hr=11}
    [junit] POSTHOOK: Output: default@srcpart@ds=2008-04-08/hr=11
    [junit] OK
    [junit] Loading data to table srcpart partition {ds=2008-04-08, hr=12}
    [junit] POSTHOOK: Output: default@srcpart@ds=2008-04-08/hr=12
    [junit] OK
    [junit] Loading data to table srcpart partition {ds=2008-04-09, hr=11}
    [junit] POSTHOOK: Output: default@srcpart@ds=2008-04-09/hr=11
    [junit] OK
    [junit] Loading data to table srcpart partition {ds=2008-04-09, hr=12}
    [junit] POSTHOOK: Output: default@srcpart@ds=2008-04-09/hr=12
    [junit] OK
    [junit] POSTHOOK: Output: default@srcbucket
    [junit] OK
    [junit] Loading data to table srcbucket
    [junit] POSTHOOK: Output: default@srcbucket
    [junit] OK
    [junit] Loading data to table srcbucket
    [junit] POSTHOOK: Output: default@srcbucket
    [junit] OK
    [junit] Loading data to table src
    [junit] POSTHOOK: Output: default@src
    [junit] OK
    [junit] Loading data to table src1
    [junit] POSTHOOK: Output: default@src1
    [junit] OK
    [junit] Loading data to table src_sequencefile
    [junit] POSTHOOK: Output: default@src_sequencefile
    [junit] OK
    [junit] Loading data to table src_thrift
    [junit] POSTHOOK: Output: default@src_thrift
    [junit] OK
    [junit] Loading data to table src_json
    [junit] POSTHOOK: Output: default@src_json
    [junit] OK
    [junit] plan = /tmp/plan60153.xml
    [junit] Number of reduce tasks determined at compile time: 1
    [junit] 09/11/25 16:24:29 INFO exec.ExecDriver: Number of reduce tasks determined at compile time: 1
    [junit] In order to change the average load for a reducer (in bytes):
    [junit] 09/11/25 16:24:29 INFO exec.ExecDriver: In order to change the average load for a reducer (in bytes):
    [junit]   set hive.exec.reducers.bytes.per.reducer=<number>
    [junit] 09/11/25 16:24:29 INFO exec.ExecDriver:   set hive.exec.reducers.bytes.per.reducer=<number>
    [junit] In order to limit the maximum number of reducers:
    [junit] 09/11/25 16:24:29 INFO exec.ExecDriver: In order to limit the maximum number of reducers:
    [junit]   set hive.exec.reducers.max=<number>
    [junit] 09/11/25 16:24:29 INFO exec.ExecDriver:   set hive.exec.reducers.max=<number>
    [junit] In order to set a constant number of reducers:
    [junit] 09/11/25 16:24:29 INFO exec.ExecDriver: In order to set a constant number of reducers:
    [junit]   set mapred.reduce.tasks=<number>
    [junit] 09/11/25 16:24:29 INFO exec.ExecDriver:   set mapred.reduce.tasks=<number>
    [junit] 09/11/25 16:24:29 INFO exec.ExecDriver: Using org.apache.hadoop.hive.ql.io.HiveInputFormat
    [junit] 09/11/25 16:24:29 INFO exec.ExecDriver: adding libjars: file:///data/users/pyang/test/trunk/VENDOR.hive/trunk/build/ql/test/test-udfs.jar
    [junit] 09/11/25 16:24:29 INFO exec.ExecDriver: Processing alias tmp:src
    [junit] 09/11/25 16:24:29 INFO exec.ExecDriver: Adding input file file:/data/users/pyang/test/trunk/VENDOR.hive/trunk/build/ql/test/data/warehouse/src
    [junit] 09/11/25 16:24:29 INFO jvm.JvmMetrics: Initializing JVM Metrics with processName=JobTracker, sessionId=
    [junit] 09/11/25 16:24:29 WARN mapred.JobClient: Use GenericOptionsParser for parsing the arguments. Applications should implement Tool for the same.
    [junit] 09/11/25 16:24:29 INFO mapred.FileInputFormat: Total input paths to process : 1
    [junit] Job running in-process (local Hadoop)
    [junit] 09/11/25 16:24:29 INFO exec.ExecDriver: Job running in-process (local Hadoop)
    [junit] 09/11/25 16:24:29 INFO mapred.FileInputFormat: Total input paths to process : 1
    [junit] 09/11/25 16:24:29 INFO mapred.MapTask: numReduceTasks: 1
    [junit] 09/11/25 16:24:29 INFO mapred.MapTask: io.sort.mb = 100
    [junit] 09/11/25 16:24:29 INFO mapred.MapTask: data buffer = 79691776/99614720
    [junit] 09/11/25 16:24:29 INFO mapred.MapTask: record buffer = 262144/327680
    [junit] 09/11/25 16:24:30 INFO ExecMapper: maximum memory = 932118528
    [junit] 09/11/25 16:24:30 INFO ExecMapper: conf classpath = [file:/tmp/hadoop-pyang/hadoop-unjar18237/, file:/data/users/pyang/test/trunk/VENDOR.hive/trunk/build/ql/hive_exec.jar, file:/tmp/hadoop-pyang/hadoop-unjar18237/classes, file:/data/users/pyang/test/trunk/VENDOR.hive/trunk/build/ql/test/test-udfs.jar]
    [junit] 09/11/25 16:24:30 INFO ExecMapper: thread classpath = [file:/tmp/hadoop-pyang/hadoop-unjar18237/, file:/data/users/pyang/test/trunk/VENDOR.hive/trunk/build/ql/hive_exec.jar, file:/tmp/hadoop-pyang/hadoop-unjar18237/classes, file:/data/users/pyang/test/trunk/VENDOR.hive/trunk/build/ql/test/test-udfs.jar]
    [junit] 09/11/25 16:24:30 INFO exec.MapOperator: Adding alias tmp:src to work list for file /data/users/pyang/test/trunk/VENDOR.hive/trunk/build/ql/test/data/warehouse/src/kv1.txt
    [junit] 09/11/25 16:24:30 INFO exec.MapOperator: dump TS struct
    [junit] 09/11/25 16:24:30 INFO ExecMapper:
    [junit] Id =18
    [junit]
    [junit] Id =0
    [junit]
    [junit] Id =1
    [junit]
    [junit] Id =2
    [junit]
    [junit] Id =3
    [junit] Id = 2 null<\Parent>
    [junit] <\RS>
    [junit] <\Children>
    [junit] Id = 1 null<\Parent>
    [junit] <\LIM>
    [junit] <\Children>
    [junit] Id = 0 null<\Parent>
    [junit] <\SEL>
    [junit] <\Children>
    [junit] Id = 18 null<\Parent>
    [junit] <\TS>
    [junit] <\Children>
    [junit] <\MAP>
    [junit] 09/11/25 16:24:30 INFO exec.MapOperator: Initializing Self 18 MAP
    [junit] 09/11/25 16:24:30 INFO exec.TableScanOperator: Initializing Self 0 TS
    [junit] 09/11/25 16:24:30 INFO exec.TableScanOperator: Operator 0 TS initialized
    [junit] 09/11/25 16:24:30 INFO exec.TableScanOperator: Initializing children of 0 TS
    [junit] 09/11/25 16:24:30 INFO exec.SelectOperator: Initializing child 1 SEL
    [junit] 09/11/25 16:24:30 INFO exec.SelectOperator: Initializing Self 1 SEL
    [junit] 09/11/25 16:24:30 INFO exec.SelectOperator: SELECT struct
    [junit] 09/11/25 16:24:30 INFO exec.SelectOperator: Operator 1 SEL initialized
    [junit] 09/11/25 16:24:30 INFO exec.SelectOperator: Initializing children of 1 SEL
    [junit] 09/11/25 16:24:30 INFO exec.LimitOperator: Initializing child 2 LIM
    [junit] 09/11/25 16:24:30 INFO exec.LimitOperator: Initializing Self 2 LIM
    [junit] 09/11/25 16:24:30 INFO exec.LimitOperator: Operator 2 LIM initialized
    [junit] 09/11/25 16:24:30 INFO exec.LimitOperator: Initializing children of 2 LIM
    [junit] 09/11/25 16:24:30 INFO exec.ReduceSinkOperator: Initializing child 3 RS
    [junit] 09/11/25 16:24:30 INFO exec.ReduceSinkOperator: Initializing Self 3 RS
    [junit] 09/11/25 16:24:30 INFO exec.ReduceSinkOperator: Using tag = -1
    [junit] 09/11/25 16:24:30 INFO exec.ReduceSinkOperator: Operator 3 RS initialized
    [junit] 09/11/25 16:24:30 INFO exec.ReduceSinkOperator: Initialization Done 3 RS
    [junit] 09/11/25 16:24:30 INFO exec.LimitOperator: Initialization Done 2 LIM
    [junit] 09/11/25 16:24:30 INFO exec.SelectOperator: Initialization Done 1 SEL
    [junit] 09/11/25 16:24:30 INFO exec.TableScanOperator: Initialization Done 0 TS
    [junit] 09/11/25 16:24:30 INFO exec.MapOperator: Initialization Done 18 MAP
    [junit] 09/11/25 16:24:30 INFO exec.MapOperator: 18 forwarding 1 rows
    [junit] 09/11/25 16:24:30 INFO exec.TableScanOperator: 0 forwarding 1 rows
    [junit] 09/11/25 16:24:30 INFO exec.SelectOperator: 1 forwarding 1 rows
    [junit] 09/11/25 16:24:30 INFO exec.LimitOperator: 2 forwarding 1 rows
    [junit] 09/11/25 16:24:30 INFO ExecMapper: ExecMapper: processing 1 rows: used memory = 120773992
    [junit] 09/11/25 16:24:30 INFO exec.MapOperator: 18 finished. closing...
    [junit] 09/11/25 16:24:30 INFO exec.MapOperator: 18 forwarded 5 rows
    [junit] 09/11/25 16:24:30 INFO exec.MapOperator: DESERIALIZE_ERRORS:0
    [junit] 09/11/25 16:24:30 INFO exec.TableScanOperator: 0 finished. closing...
    [junit] 09/11/25 16:24:30 INFO exec.TableScanOperator: 0 forwarded 4 rows
    [junit] 09/11/25 16:24:30 INFO exec.SelectOperator: 1 finished. closing...
    [junit] 09/11/25 16:24:30 INFO exec.SelectOperator: 1 forwarded 3 rows
    [junit] 09/11/25 16:24:30 INFO exec.LimitOperator: 2 finished. closing...
    [junit] 09/11/25 16:24:30 INFO exec.LimitOperator: 2 forwarded 1 rows
    [junit] 09/11/25 16:24:30 INFO exec.ReduceSinkOperator: 3 finished. closing...
    [junit] 09/11/25 16:24:30 INFO exec.ReduceSinkOperator: 3 forwarded 0 rows
    [junit] 09/11/25 16:24:30 INFO exec.LimitOperator: 2 Close done
    [junit] 09/11/25 16:24:30 INFO exec.SelectOperator: 1 Close done
    [junit] 09/11/25 16:24:30 INFO exec.TableScanOperator: 0 Close done
    [junit] 09/11/25 16:24:30 INFO exec.MapOperator: 18 Close done
    [junit] 09/11/25 16:24:30 INFO ExecMapper: ExecMapper: processed 5 rows: used memory = 120773992
    [junit] 09/11/25 16:24:30 INFO mapred.MapTask: Starting flush of map output
    [junit] 09/11/25 16:24:30 INFO mapred.MapTask: Finished spill 0
    [junit] 09/11/25 16:24:30 INFO mapred.TaskRunner: Task:attempt_local_0001_m_000000_0 is done. And is in the process of commiting
    [junit] 09/11/25 16:24:30 INFO mapred.LocalJobRunner: file:/data/users/pyang/test/trunk/VENDOR.hive/trunk/build/ql/test/data/warehouse/src/kv1.txt:0+5812
    [junit] 09/11/25 16:24:30 INFO mapred.TaskRunner: Task 'attempt_local_0001_m_000000_0' done.
    [junit] 09/11/25 16:24:30 INFO ExecReducer: maximum memory = 932118528
    [junit] 09/11/25 16:24:30 INFO ExecReducer: conf classpath = [file:/tmp/hadoop-pyang/hadoop-unjar18237/, file:/data/users/pyang/test/trunk/VENDOR.hive/trunk/build/ql/hive_exec.jar, file:/tmp/hadoop-pyang/hadoop-unjar18237/classes, file:/data/users/pyang/test/trunk/VENDOR.hive/trunk/build/ql/test/test-udfs.jar]
    [junit] 09/11/25 16:24:30 INFO ExecReducer: thread classpath = [file:/tmp/hadoop-pyang/hadoop-unjar18237/, file:/data/users/pyang/test/trunk/VENDOR.hive/trunk/build/ql/hive_exec.jar, file:/tmp/hadoop-pyang/hadoop-unjar18237/classes, file:/data/users/pyang/test/trunk/VENDOR.hive/trunk/build/ql/test/test-udfs.jar]
    [junit] 09/11/25 16:24:30 INFO ExecReducer:
    [junit] Id =4
    [junit]
    [junit] Id =5
    [junit]
    [junit] Id =6
    [junit]
    [junit] Id =7
    [junit]
    [junit] Id =8
    [junit] Id = 7 null<\Parent>
    [junit] <\FS>
    [junit] <\Children>
    [junit] Id = 6 null<\Parent>
    [junit] <\SCR>
    [junit] <\Children>
    [junit] Id = 5 null<\Parent>
    [junit] <\SEL>
    [junit] <\Children>
    [junit] Id = 4 null<\Parent>
    [junit] <\LIM>
    [junit] <\Children>
    [junit] <\OP>
    [junit] 09/11/25 16:24:30 INFO exec.ExtractOperator: Initializing Self 4 OP
    [junit] 09/11/25 16:24:30 INFO exec.ExtractOperator: Operator 4 OP initialized
    [junit] 09/11/25 16:24:30 INFO exec.ExtractOperator: Initializing children of 4 OP
    [junit] 09/11/25 16:24:30 INFO exec.LimitOperator: Initializing child 5 LIM
    [junit] 09/11/25 16:24:30 INFO exec.LimitOperator: Initializing Self 5 LIM
    [junit] 09/11/25 16:24:30 INFO exec.LimitOperator: Operator 5 LIM initialized
    [junit] 09/11/25 16:24:30 INFO exec.LimitOperator: Initializing children of 5 LIM
    [junit] 09/11/25 16:24:30 INFO exec.SelectOperator: Initializing child 6 SEL
    [junit] 09/11/25 16:24:30 INFO exec.SelectOperator: Initializing Self 6 SEL
    [junit] 09/11/25 16:24:30 INFO exec.SelectOperator: SELECT struct<_col0:string,_col1:string>
    [junit] 09/11/25 16:24:30 INFO exec.SelectOperator: Operator 6 SEL initialized
    [junit] 09/11/25 16:24:30 INFO exec.SelectOperator: Initializing children of 6 SEL
    [junit] 09/11/25 16:24:30 INFO exec.ScriptOperator: Initializing child 7 SCR
    [junit] 09/11/25 16:24:30 INFO exec.ScriptOperator: Initializing Self 7 SCR
    [junit] 09/11/25 16:24:30 INFO exec.ScriptOperator: Executing [/bin/false]
    [junit] 09/11/25 16:24:30 INFO exec.ScriptOperator: tablename=null
    [junit] 09/11/25 16:24:30 INFO exec.ScriptOperator: partname=null
    [junit] 09/11/25 16:24:30 INFO exec.ScriptOperator: alias=null
    [junit] 09/11/25 16:24:30 INFO exec.ScriptOperator: Operator 7 SCR initialized
    [junit] 09/11/25 16:24:30 INFO exec.ScriptOperator: Initializing children of 7 SCR
    [junit] 09/11/25 16:24:30 INFO exec.FileSinkOperator: Initializing child 8 FS
    [junit] 09/11/25 16:24:30 INFO exec.FileSinkOperator: Initializing Self 8 FS
    [junit] 09/11/25 16:24:30 INFO exec.FileSinkOperator: Writing to temp file: FS file:/data/users/pyang/test/trunk/VENDOR.hive/trunk/build/ql/tmp/283253683/_tmp.10001/_tmp.attempt_local_0001_r_000000_0
    [junit] 09/11/25 16:24:30 INFO exec.FileSinkOperator: Operator 8 FS initialized
    [junit] 09/11/25 16:24:30 INFO exec.FileSinkOperator: Initialization Done 8 FS
    [junit] 09/11/25 16:24:30 INFO exec.ScriptOperator: Initialization Done 7 SCR
    [junit] 09/11/25 16:24:30 INFO exec.SelectOperator: Initialization Done 6 SEL
    [junit] 09/11/25 16:24:30 INFO exec.LimitOperator: Initialization Done 5 LIM
    [junit] 09/11/25 16:24:30 INFO exec.ExtractOperator: Initialization Done 4 OP
    [junit] 09/11/25 16:24:30 INFO exec.ScriptOperator: StreamThread OutputProcessor done
    [junit] 09/11/25 16:24:30 INFO exec.ScriptOperator: StreamThread ErrorProcessor done
    [junit] 09/11/25 16:24:30 INFO mapred.Merger: Merging 1 sorted segments
    [junit] 09/11/25 16:24:30 INFO mapred.Merger: Down to the last merge-pass, with 1 segments left of total size: 26 bytes
    [junit] 09/11/25 16:24:30 INFO ExecReducer: ExecReducer: processing 1 rows: used memory = 9250384
    [junit] 09/11/25 16:24:30 INFO exec.ExtractOperator: 4 forwarding 1 rows
    [junit] 09/11/25 16:24:30 INFO exec.LimitOperator: 5 forwarding 1 rows
    [junit] 09/11/25 16:24:30 INFO exec.SelectOperator: 6 forwarding 1 rows
    [junit] 09/11/25 16:24:30 INFO ExecReducer: ExecReducer: processed 1 rows: used memory = 9250384
    [junit] 09/11/25 16:24:30 INFO exec.ExtractOperator: 4 finished. closing...
    [junit] 09/11/25 16:24:30 INFO exec.ExtractOperator: 4 forwarded 1 rows
    [junit] 09/11/25 16:24:30 INFO exec.LimitOperator: 5 finished. closing...
    [junit] 09/11/25 16:24:30 INFO exec.LimitOperator: 5 forwarded 1 rows
    [junit] 09/11/25 16:24:30 INFO exec.SelectOperator: 6 finished. closing...
    [junit] 09/11/25 16:24:30 INFO exec.SelectOperator: 6 forwarded 1 rows
    [junit] 09/11/25 16:24:30 WARN exec.ScriptOperator: Got broken pipe: ignoring exception
    [junit] 09/11/25 16:24:30 ERROR exec.ScriptOperator: Script failed with code 1
    [junit] 09/11/25 16:24:30 INFO exec.ScriptOperator: 7 finished. closing...
    [junit] 09/11/25 16:24:30 INFO exec.ScriptOperator: 7 forwarded 0 rows
    [junit] 09/11/25 16:24:30 INFO exec.ScriptOperator: SERIALIZE_ERRORS:0
    [junit] 09/11/25 16:24:30 INFO exec.ScriptOperator: DESERIALIZE_ERRORS:0
    [junit] 09/11/25 16:24:30 INFO exec.FileSinkOperator: 8 finished. closing...
    [junit] 09/11/25 16:24:30 INFO exec.FileSinkOperator: 8 forwarded 0 rows
    [junit] 09/11/25 16:24:30 INFO exec.ScriptOperator: 7 Close done
    [junit] org.apache.hadoop.hive.ql.metadata.HiveException: Hit error while closing ..
    [junit]     at org.apache.hadoop.hive.ql.exec.ScriptOperator.close(ScriptOperator.java:382)
    [junit]     at org.apache.hadoop.hive.ql.exec.Operator.close(Operator.java:452)
    [junit]     at org.apache.hadoop.hive.ql.exec.Operator.close(Operator.java:452)
    [junit]     at org.apache.hadoop.hive.ql.exec.Operator.close(Operator.java:452)
    [junit]     at org.apache.hadoop.hive.ql.exec.ExecReducer.close(ExecReducer.java:258)
    [junit]     at org.apache.hadoop.mapred.ReduceTask.run(ReduceTask.java:440)
    [junit]     at org.apache.hadoop.mapred.LocalJobRunner$Job.run(LocalJobRunner.java:170)
    [junit] org.apache.hadoop.hive.ql.metadata.HiveException: Hit error while closing ..
    [junit]     at org.apache.hadoop.hive.ql.exec.ScriptOperator.close(ScriptOperator.java:382)
    [junit]     at org.apache.hadoop.hive.ql.exec.Operator.close(Operator.java:452)
    [junit]     at org.apache.hadoop.hive.ql.exec.Operator.close(Operator.java:452)
    [junit]     at org.apache.hadoop.hive.ql.exec.Operator.close(Operator.java:452)
    [junit]     at org.apache.hadoop.hive.ql.exec.ExecReducer.close(ExecReducer.java:258)
    [junit]     at org.apache.hadoop.mapred.ReduceTask.run(ReduceTask.java:440)
    [junit]     at org.apache.hadoop.mapred.LocalJobRunner$Job.run(LocalJobRunner.java:170)
    [junit] org.apache.hadoop.hive.ql.metadata.HiveException: Hit error while closing ..
    [junit]     at org.apache.hadoop.hive.ql.exec.ScriptOperator.close(ScriptOperator.java:382)
    [junit]     at org.apache.hadoop.hive.ql.exec.Operator.close(Operator.java:452)
    [junit]     at org.apache.hadoop.hive.ql.exec.Operator.close(Operator.java:452)
    [junit]     at org.apache.hadoop.hive.ql.exec.Operator.close(Operator.java:452)
    [junit]     at org.apache.hadoop.hive.ql.exec.ExecReducer.close(ExecReducer.java:258)
    [junit]     at org.apache.hadoop.mapred.ReduceTask.run(ReduceTask.java:440)
    [junit]     at org.apache.hadoop.mapred.LocalJobRunner$Job.run(LocalJobRunner.java:170)
    [junit] 09/11/25 16:24:30 ERROR ExecReducer: Hit error while closing operators - failing tree
    [junit] 09/11/25 16:24:30 WARN mapred.LocalJobRunner: job_local_0001
    [junit] java.lang.RuntimeException: Error while closing operators: Hit error while closing ..
    [junit]     at org.apache.hadoop.hive.ql.exec.ExecReducer.close(ExecReducer.java:266)
    [junit]     at org.apache.hadoop.mapred.ReduceTask.run(ReduceTask.java:440)
    [junit]     at org.apache.hadoop.mapred.LocalJobRunner$Job.run(LocalJobRunner.java:170)
    [junit] Caused by: org.apache.hadoop.hive.ql.metadata.HiveException: Hit error while closing ..
    [junit]     at org.apache.hadoop.hive.ql.exec.ScriptOperator.close(ScriptOperator.java:382)
    [junit]     at org.apache.hadoop.hive.ql.exec.Operator.close(Operator.java:452)
    [junit]     at org.apache.hadoop.hive.ql.exec.Operator.close(Operator.java:452)
    [junit]     at org.apache.hadoop.hive.ql.exec.Operator.close(Operator.java:452)
    [junit]     at org.apache.hadoop.hive.ql.exec.ExecReducer.close(ExecReducer.java:258)
    [junit]     ... 2 more
    [junit] 2009-11-25 04:24:30,825 map = 100%, reduce = 0%
    [junit] 09/11/25 16:24:30 INFO exec.ExecDriver: 2009-11-25 04:24:30,825 map = 100%, reduce = 0%
    [junit] Ended Job = job_local_0001 with errors
    [junit] 09/11/25 16:24:30 ERROR exec.ExecDriver: Ended Job = job_local_0001 with errors
    [junit] Job Failed
    [junit] diff -a -I \(file:\)\|\(/tmp/.*\) -I lastUpdateTime -I lastAccessTime -I owner -I transient_lastDdlTime /data/users/pyang/test/trunk/VENDOR.hive/trunk/build/ql/test/logs/clientnegative/script_broken_pipe3.q.out /data/users/pyang/test/trunk/VENDOR.hive/trunk/ql/src/test/results/clientnegative/script_broken_pipe3.q.out
    [junit] Done query: script_broken_pipe3.q
    [junit] Tests run: 1, Failures: 0, Errors: 0, Time elapsed: 12.836 sec

test:
    [echo] Nothing to do!
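Editor's note: script_broken_pipe3.q is a negative test, so the run above passed: the query streams rows through /bin/false, which exits immediately with status 1, producing the "Got broken pipe" warning and the "Script failed with code 1" error, and the resulting output file matched the expected .q.out via diff. The following self-contained Java sketch reproduces that failure mode outside Hive; it is illustrative only and is not Hive's ScriptOperator code.

    // Self-contained sketch of the failure mode this negative test exercises:
    // /bin/false exits immediately, so writing rows to its stdin eventually
    // raises an IOException ("Broken pipe" on Linux), and the nonzero exit
    // code is what actually fails the job, matching the WARN/ERROR pair above.
    import java.io.IOException;
    import java.io.OutputStream;

    public class BrokenPipeDemo {
        public static void main(String[] args) throws Exception {
            Process p = new ProcessBuilder("/bin/false").start();
            OutputStream stdin = p.getOutputStream();
            try {
                for (int i = 0; i < 1000; i++) {
                    stdin.write("row\n".getBytes());
                    stdin.flush();  // forces the broken-pipe error to surface
                }
            } catch (IOException e) {
                // Hive logs "Got broken pipe: ignoring exception" at this point,
                // then checks the script's exit status when closing the operator.
                System.out.println("Got broken pipe: ignoring exception");
            }
            System.out.println("Script exit code: " + p.waitFor());  // 1 for /bin/false
        }
    }

The design point the test verifies is that the broken pipe alone is tolerated; only the script's exit status decides whether the operator tree fails on close.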
test-conditions:

compile-ant-tasks:

create-dirs:

init:

compile:
     [echo] Compiling: anttasks

deploy-ant-tasks:

create-dirs:

init:

compile:
     [echo] Compiling: anttasks

jar:

test-init:
    [mkdir] Created dir: /data/users/pyang/test/trunk/VENDOR.hive/trunk/build/contrib/test/data
    [mkdir] Created dir: /data/users/pyang/test/trunk/VENDOR.hive/trunk/build/contrib/test/logs/clientpositive
    [mkdir] Created dir: /data/users/pyang/test/trunk/VENDOR.hive/trunk/build/contrib/test/logs/clientnegative
    [mkdir] Created dir: /data/users/pyang/test/trunk/VENDOR.hive/trunk/build/contrib/test/logs/positive
    [mkdir] Created dir: /data/users/pyang/test/trunk/VENDOR.hive/trunk/build/contrib/test/logs/negative
     [copy] Copying 41 files to /data/users/pyang/test/trunk/VENDOR.hive/trunk/build/contrib/test/data
     [copy] Copied 6 empty directories to 2 empty directories under /data/users/pyang/test/trunk/VENDOR.hive/trunk/build/contrib/test/data

gen-test:
    [mkdir] Created dir: /data/users/pyang/test/trunk/VENDOR.hive/trunk/build/contrib/test/src/org/apache/hadoop/hive/ql/parse
    [mkdir] Created dir: /data/users/pyang/test/trunk/VENDOR.hive/trunk/build/contrib/test/src/org/apache/hadoop/hive/cli
    [mkdir] Created dir: /data/users/pyang/test/trunk/VENDOR.hive/trunk/build/contrib/test/logs/contribpositive
    [mkdir] Created dir: /data/users/pyang/test/trunk/VENDOR.hive/trunk/build/contrib/test/logs/contribnegative
    [mkdir] Created dir: /data/users/pyang/test/trunk/VENDOR.hive/trunk/build/contrib/test/logs/contribclientpositive
    [mkdir] Created dir: /data/users/pyang/test/trunk/VENDOR.hive/trunk/build/contrib/test/logs/contribclientnegative
 [qtestgen] Template Path:/data/users/pyang/test/trunk/VENDOR.hive/trunk/ql/src/test/templates
 [qtestgen] Nov 25, 2009 4:24:31 PM org.apache.velocity.runtime.log.JdkLogChute log
 [qtestgen] INFO: FileResourceLoader : adding path '/data/users/pyang/test/trunk/VENDOR.hive/trunk/ql/src/test/templates'
 [qtestgen] Generated /data/users/pyang/test/trunk/VENDOR.hive/trunk/build/contrib/test/src/org/apache/hadoop/hive/ql/parse/TestContribParse.java from template TestParse.vm
 [qtestgen] Template Path:/data/users/pyang/test/trunk/VENDOR.hive/trunk/ql/src/test/templates
 [qtestgen] Nov 25, 2009 4:24:31 PM org.apache.velocity.runtime.log.JdkLogChute log
 [qtestgen] INFO: FileResourceLoader : adding path '/data/users/pyang/test/trunk/VENDOR.hive/trunk/ql/src/test/templates'
 [qtestgen] Generated /data/users/pyang/test/trunk/VENDOR.hive/trunk/build/contrib/test/src/org/apache/hadoop/hive/ql/parse/TestContribParseNegative.java from template TestParseNegative.vm
 [qtestgen] Template Path:/data/users/pyang/test/trunk/VENDOR.hive/trunk/ql/src/test/templates
 [qtestgen] Nov 25, 2009 4:24:31 PM org.apache.velocity.runtime.log.JdkLogChute log
 [qtestgen] INFO: FileResourceLoader : adding path '/data/users/pyang/test/trunk/VENDOR.hive/trunk/ql/src/test/templates'
 [qtestgen] Generated /data/users/pyang/test/trunk/VENDOR.hive/trunk/build/contrib/test/src/org/apache/hadoop/hive/cli/TestContribCliDriver.java from template TestCliDriver.vm
 [qtestgen] Template Path:/data/users/pyang/test/trunk/VENDOR.hive/trunk/ql/src/test/templates
 [qtestgen] Nov 25, 2009 4:24:31 PM org.apache.velocity.runtime.log.JdkLogChute log
 [qtestgen] INFO: FileResourceLoader : adding path '/data/users/pyang/test/trunk/VENDOR.hive/trunk/ql/src/test/templates'
 [qtestgen] Generated /data/users/pyang/test/trunk/VENDOR.hive/trunk/build/contrib/test/src/org/apache/hadoop/hive/cli/TestContribNegativeCliDriver.java from template TestNegativeCliDriver.vm

create-dirs:

init:
install-hadoopcore:

install-hadoopcore-default:

download-ivy:

init-ivy:

settings-ivy:

resolve:
:: loading settings :: file = /data/users/pyang/test/trunk/VENDOR.hive/trunk/ivy/ivysettings.xml
[ivy:retrieve] :: resolving dependencies :: org.apache.hadoop.hive#contrib;working@dev221.snc1.facebook.com
[ivy:retrieve] 	confs: [default]
[ivy:retrieve] 	found hadoop#core;0.19.0 in hadoop-resolver
[ivy:retrieve] :: resolution report :: resolve 23ms :: artifacts dl 1ms
	---------------------------------------------------------------------
	|                  |            modules            ||   artifacts   |
	|       conf       | number| search|dwnlded|evicted|| number|dwnlded|
	---------------------------------------------------------------------
	|      default     |   1   |   0   |   0   |   0   ||   1   |   0   |
	---------------------------------------------------------------------
[ivy:retrieve] :: retrieving :: org.apache.hadoop.hive#contrib
[ivy:retrieve] 	confs: [default]
[ivy:retrieve] 	0 artifacts copied, 1 already retrieved (0kB/3ms)

install-hadoopcore-internal:

setup:

compile:
     [echo] Compiling: contrib

compile-test:
    [javac] Compiling 1 source file to /data/users/pyang/test/trunk/VENDOR.hive/trunk/build/contrib/test/classes
    [javac] Compiling 4 source files to /data/users/pyang/test/trunk/VENDOR.hive/trunk/build/contrib/test/classes

jar:
     [echo] Jar: contrib

test-jar:
      [jar] Building MANIFEST-only jar: /data/users/pyang/test/trunk/VENDOR.hive/trunk/build/contrib/test/test-udfs.jar

test:

test-conditions:

gen-test:

create-dirs:

compile-ant-tasks:

create-dirs:

init:

compile:
     [echo] Compiling: anttasks

deploy-ant-tasks:

create-dirs:

init:

compile:
     [echo] Compiling: anttasks

jar:

init:

core-compile:

compile:

compile-test:
    [javac] Compiling 1 source file to /data/users/pyang/test/trunk/VENDOR.hive/trunk/build/service/test/classes

test-jar:
      [jar] Building MANIFEST-only jar: /data/users/pyang/test/trunk/VENDOR.hive/trunk/build/service/test/test-udfs.jar

test-init:
    [mkdir] Created dir: /data/users/pyang/test/trunk/VENDOR.hive/trunk/build/service/test/data
    [mkdir] Created dir: /data/users/pyang/test/trunk/VENDOR.hive/trunk/build/service/test/logs/clientpositive
    [mkdir] Created dir: /data/users/pyang/test/trunk/VENDOR.hive/trunk/build/service/test/logs/clientnegative
    [mkdir] Created dir: /data/users/pyang/test/trunk/VENDOR.hive/trunk/build/service/test/logs/positive
    [mkdir] Created dir: /data/users/pyang/test/trunk/VENDOR.hive/trunk/build/service/test/logs/negative
     [copy] Copying 41 files to /data/users/pyang/test/trunk/VENDOR.hive/trunk/build/service/test/data
     [copy] Copied 6 empty directories to 2 empty directories under /data/users/pyang/test/trunk/VENDOR.hive/trunk/build/service/test/data

test:

test-conditions:

gen-test:

create-dirs:

compile-ant-tasks:

create-dirs:

init:

compile:
     [echo] Compiling: anttasks

deploy-ant-tasks:

create-dirs:

init:

compile:
     [echo] Compiling: anttasks

jar:

init:

core-compile:

compile:

compile-test:
    [javac] Compiling 1 source file to /data/users/pyang/test/trunk/VENDOR.hive/trunk/build/jdbc/test/classes

test-jar:
      [jar] Building MANIFEST-only jar: /data/users/pyang/test/trunk/VENDOR.hive/trunk/build/jdbc/test/test-udfs.jar

test-init:
    [mkdir] Created dir: /data/users/pyang/test/trunk/VENDOR.hive/trunk/build/jdbc/test/data
    [mkdir] Created dir: /data/users/pyang/test/trunk/VENDOR.hive/trunk/build/jdbc/test/logs/clientpositive
    [mkdir] Created dir: /data/users/pyang/test/trunk/VENDOR.hive/trunk/build/jdbc/test/logs/clientnegative
    [mkdir] Created dir: /data/users/pyang/test/trunk/VENDOR.hive/trunk/build/jdbc/test/logs/positive
    [mkdir] Created dir: /data/users/pyang/test/trunk/VENDOR.hive/trunk/build/jdbc/test/logs/negative
     [copy] Copying 41 files to /data/users/pyang/test/trunk/VENDOR.hive/trunk/build/jdbc/test/data
     [copy] Copied 6 empty directories to 2 empty directories under /data/users/pyang/test/trunk/VENDOR.hive/trunk/build/jdbc/test/data

test:

test-conditions:

gen-test:

create-dirs:

compile-ant-tasks:

create-dirs:

init:

compile:
     [echo] Compiling: anttasks

deploy-ant-tasks:

create-dirs:

init:

compile:
     [echo] Compiling: anttasks

jar:

init:

install-hadoopcore:

install-hadoopcore-default:

download-ivy:

init-ivy:

settings-ivy:

resolve:
:: loading settings :: file = /data/users/pyang/test/trunk/VENDOR.hive/trunk/ivy/ivysettings.xml
[ivy:retrieve] :: resolving dependencies :: org.apache.hadoop.hive#hwi;working@dev221.snc1.facebook.com
[ivy:retrieve] 	confs: [default]
[ivy:retrieve] 	found hadoop#core;0.19.0 in hadoop-resolver
[ivy:retrieve] :: resolution report :: resolve 21ms :: artifacts dl 1ms
	---------------------------------------------------------------------
	|                  |            modules            ||   artifacts   |
	|       conf       | number| search|dwnlded|evicted|| number|dwnlded|
	---------------------------------------------------------------------
	|      default     |   1   |   0   |   0   |   0   ||   1   |   0   |
	---------------------------------------------------------------------
[ivy:retrieve] :: retrieving :: org.apache.hadoop.hive#hwi
[ivy:retrieve] 	confs: [default]
[ivy:retrieve] 	0 artifacts copied, 1 already retrieved (0kB/3ms)

install-hadoopcore-internal:

setup:

war:

compile:
     [echo] Compiling: hwi

compile-test:
    [javac] Compiling 2 source files to /data/users/pyang/test/trunk/VENDOR.hive/trunk/build/hwi/test/classes

test-jar:
      [jar] Building MANIFEST-only jar: /data/users/pyang/test/trunk/VENDOR.hive/trunk/build/hwi/test/test-udfs.jar

test-init:
    [mkdir] Created dir: /data/users/pyang/test/trunk/VENDOR.hive/trunk/build/hwi/test/data
    [mkdir] Created dir: /data/users/pyang/test/trunk/VENDOR.hive/trunk/build/hwi/test/logs/clientpositive
    [mkdir] Created dir: /data/users/pyang/test/trunk/VENDOR.hive/trunk/build/hwi/test/logs/clientnegative
    [mkdir] Created dir: /data/users/pyang/test/trunk/VENDOR.hive/trunk/build/hwi/test/logs/positive
    [mkdir] Created dir: /data/users/pyang/test/trunk/VENDOR.hive/trunk/build/hwi/test/logs/negative
     [copy] Copying 41 files to /data/users/pyang/test/trunk/VENDOR.hive/trunk/build/hwi/test/data
     [copy] Copied 6 empty directories to 2 empty directories under /data/users/pyang/test/trunk/VENDOR.hive/trunk/build/hwi/test/data

test:

check-word-size:

check-thrift-home:

test:

BUILD SUCCESSFUL
Total time: 51 seconds