Details
- Type: Bug
- Status: Resolved
- Priority: Major
- Resolution: Fixed
Description
HIVE-3140 changed the behavior of 'DESCRIBE table;' to be like 'DESCRIBE FORMATTED table;'. It also introduced changes so that 'DESCRIBE table;' no longer prints a header. However, JDBC/ODBC calls still get fields padded with spaces for the 'DESCRIBE table;' query.
Since JDBC/ODBC results are not for direct human consumption, the space padding should not be done for HiveServer2.
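To make the client-side symptom concrete, here is a minimal JDBC sketch of reading 'DESCRIBE table;' output through HiveServer2; the connection URL, credentials, and table name are assumptions for illustration and not part of this issue:
{code:java}
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

public class DescribePaddingExample {
  public static void main(String[] args) throws Exception {
    // Assumed local HiveServer2 endpoint and table name; adjust as needed.
    Class.forName("org.apache.hive.jdbc.HiveDriver");
    try (Connection con = DriverManager.getConnection(
             "jdbc:hive2://localhost:10000/default", "", "");
         Statement stmt = con.createStatement();
         ResultSet res = stmt.executeQuery("DESCRIBE my_table")) {
      while (res.next()) {
        // Columns of DESCRIBE output: col_name, data_type, comment.
        // Brackets make any trailing space padding visible; before this fix,
        // clients had to trim() each field to get clean values.
        System.out.println("[" + res.getString(1) + "] [" + res.getString(2) + "]");
      }
    }
  }
}
{code}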
Attachments
- HIVE-4545.2.patch (20 kB) - Thejas Nair
- HIVE-4545.3.patch (21 kB) - Thejas Nair
- HIVE-4545.4.patch (19 kB) - Thejas Nair
- HIVE-4545.5.patch (105 kB) - Vaibhav Gumashta
- HIVE-4545-1.patch (22 kB) - Thejas Nair
Activity
HIVE-4545.3.patch - updates the test case to remove .trim() before comparison in two more places.
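For context, a hedged sketch of the kind of assertion involved (the table and column names below are placeholders, not the exact lines in TestJdbcDriver2): with the server no longer padding DESCRIBE results, the test can compare the returned values directly instead of calling .trim() first.
{code:java}
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

import org.junit.Test;
import static org.junit.Assert.assertEquals;
import static org.junit.Assert.assertTrue;

public class DescribeNoPaddingTest {
  @Test
  public void describeOutputIsNotPadded() throws Exception {
    // Assumed HiveServer2 endpoint and a table whose first column is an int
    // named 'col1'; both are placeholders for illustration.
    try (Connection con = DriverManager.getConnection(
             "jdbc:hive2://localhost:10000/default", "", "");
         Statement stmt = con.createStatement();
         ResultSet res = stmt.executeQuery("describe some_test_table")) {
      assertTrue("DESCRIBE returned no rows", res.next());
      // Previously the comparison needed res.getString(1).trim(); with padding
      // removed on the server side, the raw value can be compared directly.
      assertEquals("col1", res.getString(1));
      assertEquals("int", res.getString(2));
    }
  }
}
{code}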
Overall: -1 no tests executed
Here are the results of testing the latest attachment:
https://issues.apache.org/jira/secure/attachment/12596693/HIVE-4545.3.patch
Test results: https://builds.apache.org/job/PreCommit-HIVE-Build/332/testReport
Console output: https://builds.apache.org/job/PreCommit-HIVE-Build/332/console
Messages:
Executing org.apache.hive.ptest.execution.PrepPhase
Tests failed with: NonZeroExitCodeException: Command 'bash /data/hive-ptest/working/scratch/source-prep.sh' failed with exit status 1 and output:
+ [[ -n '' ]]
+ export 'ANT_OPTS=-Xmx1g -XX:MaxPermSize=256m -Dhttp.proxyHost=localhost -Dhttp.proxyPort=3128'
+ ANT_OPTS='-Xmx1g -XX:MaxPermSize=256m -Dhttp.proxyHost=localhost -Dhttp.proxyPort=3128'
+ cd /data/hive-ptest/working/
+ tee /data/hive-ptest/logs/PreCommit-HIVE-Build-332/source-prep.txt
+ mkdir -p maven ivy
+ [[ svn = \s\v\n ]]
+ [[ -n '' ]]
+ [[ -d apache-svn-trunk-source ]]
+ [[ ! -d apache-svn-trunk-source/.svn ]]
+ [[ ! -d apache-svn-trunk-source ]]
+ cd apache-svn-trunk-source
+ svn revert -R .
svn: Working copy '.' locked
svn: run 'svn cleanup' to remove locks (type 'svn help cleanup' for details)
+ exit 1
This message is automatically generated.
The build process was stuck on another build, and it looks like I killed this one by accident. I'll kick off another build for this.
Overall: -1 no tests executed
Here are the results of testing the latest attachment:
https://issues.apache.org/jira/secure/attachment/12596693/HIVE-4545.3.patch
Test results: https://builds.apache.org/job/PreCommit-HIVE-Build/333/testReport
Console output: https://builds.apache.org/job/PreCommit-HIVE-Build/333/console
Messages:
Executing org.apache.hive.ptest.execution.PrepPhase
Tests failed with: NonZeroExitCodeException: Command 'bash /data/hive-ptest/working/scratch/source-prep.sh' failed with exit status 1 and output:
+ [[ -n '' ]]
+ export 'ANT_OPTS=-Xmx1g -XX:MaxPermSize=256m -Dhttp.proxyHost=localhost -Dhttp.proxyPort=3128'
+ ANT_OPTS='-Xmx1g -XX:MaxPermSize=256m -Dhttp.proxyHost=localhost -Dhttp.proxyPort=3128'
+ cd /data/hive-ptest/working/
+ tee /data/hive-ptest/logs/PreCommit-HIVE-Build-333/source-prep.txt
+ mkdir -p maven ivy
+ [[ svn = \s\v\n ]]
+ [[ -n '' ]]
+ [[ -d apache-svn-trunk-source ]]
+ [[ ! -d apache-svn-trunk-source/.svn ]]
+ [[ ! -d apache-svn-trunk-source ]]
+ cd apache-svn-trunk-source
+ svn revert -R .
Reverted 'jdbc/src/test/org/apache/hive/jdbc/TestJdbcDriver2.java'
Reverted 'common/src/java/org/apache/hadoop/hive/conf/HiveConf.java'
Reverted 'service/src/java/org/apache/hive/service/cli/session/HiveSessionImpl.java'
Reverted 'ql/src/java/org/apache/hadoop/hive/ql/metadata/formatting/MetaDataFormatter.java'
Reverted 'ql/src/java/org/apache/hadoop/hive/ql/metadata/formatting/TextMetaDataFormatter.java'
Reverted 'ql/src/java/org/apache/hadoop/hive/ql/metadata/formatting/MetaDataFormatUtils.java'
Reverted 'ql/src/java/org/apache/hadoop/hive/ql/metadata/formatting/JsonMetaDataFormatter.java'
Reverted 'ql/src/java/org/apache/hadoop/hive/ql/exec/DDLTask.java'
++ awk '{print $2}'
++ egrep -v '^X|^Performing status on external'
++ svn status --no-ignore
+ rm -rf build common/src/java/org/apache/hadoop/hive/conf/HiveConf.java.orig
+ svn update
Fetching external item into 'hcatalog/src/test/e2e/harness'
External at revision 1511908.
At revision 1511907.
+ patchCommandPath=/data/hive-ptest/working/scratch/smart-apply-patch.sh
+ patchFilePath=/data/hive-ptest/working/scratch/build.patch
+ [[ -f /data/hive-ptest/working/scratch/build.patch ]]
+ chmod +x /data/hive-ptest/working/scratch/smart-apply-patch.sh
+ /data/hive-ptest/working/scratch/smart-apply-patch.sh /data/hive-ptest/working/scratch/build.patch
Going to apply patch with: patch -p1
patching file common/src/java/org/apache/hadoop/hive/conf/HiveConf.java
Hunk #1 succeeded at 723 (offset 1 line).
patching file jdbc/src/test/org/apache/hive/jdbc/TestJdbcDriver2.java patching file ql/src/java/org/apache/hadoop/hive/ql/exec/DDLTask.java patching file ql/src/java/org/apache/hadoop/hive/ql/metadata/formatting/JsonMetaDataFormatter.java patching file ql/src/java/org/apache/hadoop/hive/ql/metadata/formatting/MetaDataFormatUtils.java patching file ql/src/java/org/apache/hadoop/hive/ql/metadata/formatting/MetaDataFormatter.java patching file ql/src/java/org/apache/hadoop/hive/ql/metadata/formatting/TextMetaDataFormatter.java patching file service/src/java/org/apache/hive/service/cli/session/HiveSessionImpl.java + [[ true == \t\r\u\e ]] + rm -rf /data/hive-ptest/working/ivy /data/hive-ptest/working/maven + mkdir /data/hive-ptest/working/ivy /data/hive-ptest/working/maven + ant -Dtest.continue.on.failure=true -Dtest.silent=false -Divy.default.ivy.user.dir=/data/hive-ptest/working/ivy -Dmvn.local.repo=/data/hive-ptest/working/maven clean package test -Dtestcase=nothing Buildfile: /data/hive-ptest/working/apache-svn-trunk-source/build.xml clean: [echo] Project: hive clean: [echo] Project: anttasks clean: [echo] Project: shims clean: [echo] Project: common clean: [echo] Project: serde clean: [echo] Project: metastore clean: [echo] Project: ql clean: [echo] Project: contrib clean: [echo] Project: service clean: [echo] Project: cli clean: [echo] Project: jdbc clean: [echo] Project: beeline clean: [echo] Project: hwi clean: [echo] Project: hbase-handler clean: [echo] Project: testutils clean: [echo] hcatalog clean: [echo] hcatalog-core clean: [echo] hcatalog-pig-adapter clean: [echo] hcatalog-server-extensions clean: [echo] webhcat clean: [echo] webhcat-java-client clean: clean: [echo] shims clean: [echo] Project: odbc [exec] rm -rf /data/hive-ptest/working/apache-svn-trunk-source/build/odbc /data/hive-ptest/working/apache-svn-trunk-source/build/service/objs /data/hive-ptest/working/apache-svn-trunk-source/build/ql/objs /data/hive-ptest/working/apache-svn-trunk-source/build/metastore/objs clean-online: [echo] Project: hive clean-offline: ivy-init-dirs: [echo] Project: hive [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/build/ivy [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/build/ivy/lib [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/build/ivy/report [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/build/ivy/maven ivy-download: [echo] Project: hive [get] Getting: http://repo2.maven.org/maven2/org/apache/ivy/ivy/2.1.0/ivy-2.1.0.jar [get] To: /data/hive-ptest/working/apache-svn-trunk-source/build/ivy/lib/ivy-2.1.0.jar ivy-probe-antlib: [echo] Project: hive ivy-init-antlib: [echo] Project: hive compile-ant-tasks: [echo] Project: hive create-dirs: [echo] Project: anttasks [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/build/anttasks [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/build/anttasks/classes [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/build/jexl/classes [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/build/hadoopcore [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/build/anttasks/test [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/build/anttasks/test/src [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/build/anttasks/test/classes [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/build/anttasks/test/resources [copy] Warning: 
/data/hive-ptest/working/apache-svn-trunk-source/ant/src/test/resources does not exist. init: [echo] Project: anttasks ivy-init-settings: [echo] Project: anttasks ivy-resolve: [echo] Project: anttasks [ivy:resolve] :: Ivy 2.1.0 - 20090925235825 :: http://ant.apache.org/ivy/ :: [ivy:resolve] :: loading settings :: file = /data/hive-ptest/working/apache-svn-trunk-source/ivy/ivysettings.xml [ivy:resolve] :: resolving dependencies :: org.apache.hive#hive-anttasks;0.12.0-SNAPSHOT [ivy:resolve] confs: [default] [ivy:resolve] found commons-lang#commons-lang;2.4 in maven2 [ivy:resolve] found velocity#velocity;1.5 in maven2 [ivy:resolve] downloading http://repo1.maven.org/maven2/commons-lang/commons-lang/2.4/commons-lang-2.4.jar ... [ivy:resolve] ..... (255kB) [ivy:resolve] .. (0kB) [ivy:resolve] [SUCCESSFUL ] commons-lang#commons-lang;2.4!commons-lang.jar (17ms) [ivy:resolve] downloading http://repo1.maven.org/maven2/velocity/velocity/1.5/velocity-1.5.jar ... [ivy:resolve] ....... (382kB) [ivy:resolve] .. (0kB) [ivy:resolve] [SUCCESSFUL ] velocity#velocity;1.5!velocity.jar (16ms) [ivy:resolve] :: resolution report :: resolve 4070ms :: artifacts dl 49ms --------------------------------------------------------------------- | | modules || artifacts | | conf | number| search|dwnlded|evicted|| number|dwnlded| --------------------------------------------------------------------- | default | 2 | 2 | 2 | 0 || 2 | 2 | --------------------------------------------------------------------- [ivy:report] Processing /data/hive-ptest/working/apache-svn-trunk-source/build/ivy/resolution-cache/org.apache.hive-hive-anttasks-default.xml to /data/hive-ptest/working/apache-svn-trunk-source/build/ivy/report/org.apache.hive-hive-anttasks-default.html ivy-retrieve: [echo] Project: anttasks [ivy:retrieve] :: retrieving :: org.apache.hive#hive-anttasks [ivy:retrieve] confs: [default] [ivy:retrieve] 2 artifacts copied, 0 already retrieved (638kB/6ms) compile: [echo] anttasks [javac] /data/hive-ptest/working/apache-svn-trunk-source/ant/build.xml:38: warning: 'includeantruntime' was not set, defaulting to build.sysclasspath=last; set to false for repeatable builds [javac] Compiling 3 source files to /data/hive-ptest/working/apache-svn-trunk-source/build/anttasks/classes [javac] Note: /data/hive-ptest/working/apache-svn-trunk-source/ant/src/org/apache/hadoop/hive/ant/QTestGenTask.java uses or overrides a deprecated API. [javac] Note: Recompile with -Xlint:deprecation for details. [javac] Note: /data/hive-ptest/working/apache-svn-trunk-source/ant/src/org/apache/hadoop/hive/ant/DistinctElementsClassPath.java uses unchecked or unsafe operations. [javac] Note: Recompile with -Xlint:unchecked for details. deploy-ant-tasks: [echo] Project: hive create-dirs: [echo] Project: anttasks [copy] Warning: /data/hive-ptest/working/apache-svn-trunk-source/ant/src/test/resources does not exist. 
init: [echo] Project: anttasks ivy-init-settings: [echo] Project: anttasks ivy-resolve: [echo] Project: anttasks [ivy:resolve] :: loading settings :: file = /data/hive-ptest/working/apache-svn-trunk-source/ivy/ivysettings.xml [ivy:resolve] :: resolving dependencies :: org.apache.hive#hive-anttasks;0.12.0-SNAPSHOT [ivy:resolve] confs: [default] [ivy:resolve] found commons-lang#commons-lang;2.4 in maven2 [ivy:resolve] found velocity#velocity;1.5 in maven2 [ivy:resolve] :: resolution report :: resolve 102ms :: artifacts dl 2ms --------------------------------------------------------------------- | | modules || artifacts | | conf | number| search|dwnlded|evicted|| number|dwnlded| --------------------------------------------------------------------- | default | 2 | 0 | 0 | 0 || 2 | 0 | --------------------------------------------------------------------- [ivy:report] Processing /data/hive-ptest/working/apache-svn-trunk-source/build/ivy/resolution-cache/org.apache.hive-hive-anttasks-default.xml to /data/hive-ptest/working/apache-svn-trunk-source/build/ivy/report/org.apache.hive-hive-anttasks-default.html ivy-retrieve: [echo] Project: anttasks [ivy:retrieve] :: retrieving :: org.apache.hive#hive-anttasks [ivy:retrieve] confs: [default] [ivy:retrieve] 0 artifacts copied, 2 already retrieved (0kB/9ms) compile: [echo] anttasks [javac] /data/hive-ptest/working/apache-svn-trunk-source/ant/build.xml:38: warning: 'includeantruntime' was not set, defaulting to build.sysclasspath=last; set to false for repeatable builds jar: [echo] anttasks [copy] Copying 1 file to /data/hive-ptest/working/apache-svn-trunk-source/build/anttasks/classes/org/apache/hadoop/hive/ant [jar] Building jar: /data/hive-ptest/working/apache-svn-trunk-source/build/anttasks/hive-anttasks-0.12.0-SNAPSHOT.jar init: [echo] Project: hive create-dirs: [echo] Project: anttasks [copy] Warning: /data/hive-ptest/working/apache-svn-trunk-source/ant/src/test/resources does not exist. init: [echo] Project: anttasks create-dirs: [echo] Project: shims [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/build/shims [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/build/shims/classes [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/build/shims/test [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/build/shims/test/src [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/build/shims/test/classes [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/build/shims/test/resources [copy] Warning: /data/hive-ptest/working/apache-svn-trunk-source/shims/src/test/resources does not exist. 
init: [echo] Project: shims create-dirs: [echo] Project: common [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/build/common [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/build/common/classes [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/build/common/test [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/build/common/test/src [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/build/common/test/classes [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/build/common/test/resources [copy] Copying 4 files to /data/hive-ptest/working/apache-svn-trunk-source/build/common/test/resources init: [echo] Project: common create-dirs: [echo] Project: serde [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/build/serde [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/build/serde/classes [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/build/serde/test [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/build/serde/test/src [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/build/serde/test/classes [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/build/serde/test/resources [copy] Warning: /data/hive-ptest/working/apache-svn-trunk-source/serde/src/test/resources does not exist. init: [echo] Project: serde create-dirs: [echo] Project: metastore [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/build/metastore [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/build/metastore/classes [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/build/metastore/test [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/build/metastore/test/src [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/build/metastore/test/classes [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/build/metastore/test/resources [copy] Warning: /data/hive-ptest/working/apache-svn-trunk-source/metastore/src/test/resources does not exist. 
init: [echo] Project: metastore create-dirs: [echo] Project: ql [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/build/ql [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/build/ql/classes [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/build/ql/test [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/build/ql/test/src [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/build/ql/test/classes [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/build/ql/test/resources [copy] Copying 2 files to /data/hive-ptest/working/apache-svn-trunk-source/build/ql/test/resources init: [echo] Project: ql create-dirs: [echo] Project: contrib [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/build/contrib [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/build/contrib/classes [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/build/contrib/test [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/build/contrib/test/src [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/build/contrib/test/classes [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/build/contrib/test/resources [copy] Warning: /data/hive-ptest/working/apache-svn-trunk-source/contrib/src/test/resources does not exist. init: [echo] Project: contrib create-dirs: [echo] Project: service [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/build/service [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/build/service/classes [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/build/service/test [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/build/service/test/src [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/build/service/test/classes [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/build/service/test/resources [copy] Warning: /data/hive-ptest/working/apache-svn-trunk-source/service/src/test/resources does not exist. init: [echo] Project: service create-dirs: [echo] Project: cli [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/build/cli [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/build/cli/classes [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/build/cli/test [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/build/cli/test/src [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/build/cli/test/classes [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/build/cli/test/resources [copy] Warning: /data/hive-ptest/working/apache-svn-trunk-source/cli/src/test/resources does not exist. init: [echo] Project: cli create-dirs: [echo] Project: jdbc [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/build/jdbc [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/build/jdbc/classes [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/build/jdbc/test [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/build/jdbc/test/src [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/build/jdbc/test/classes [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/build/jdbc/test/resources [copy] Warning: /data/hive-ptest/working/apache-svn-trunk-source/jdbc/src/test/resources does not exist. 
init: [echo] Project: jdbc create-dirs: [echo] Project: beeline [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/build/beeline [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/build/beeline/classes [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/build/beeline/test [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/build/beeline/test/src [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/build/beeline/test/classes [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/build/beeline/test/resources [copy] Warning: /data/hive-ptest/working/apache-svn-trunk-source/beeline/src/test/resources does not exist. init: [echo] Project: beeline create-dirs: [echo] Project: hwi [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/build/hwi [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/build/hwi/classes [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/build/hwi/test [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/build/hwi/test/src [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/build/hwi/test/classes [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/build/hwi/test/resources [copy] Warning: /data/hive-ptest/working/apache-svn-trunk-source/hwi/src/test/resources does not exist. init: [echo] Project: hwi create-dirs: [echo] Project: hbase-handler [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/build/hbase-handler [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/build/hbase-handler/classes [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/build/hbase-handler/test [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/build/hbase-handler/test/src [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/build/hbase-handler/test/classes [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/build/hbase-handler/test/resources [copy] Warning: /data/hive-ptest/working/apache-svn-trunk-source/hbase-handler/src/test/resources does not exist. init: [echo] Project: hbase-handler create-dirs: [echo] Project: testutils [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/build/testutils [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/build/testutils/classes [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/build/testutils/test [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/build/testutils/test/src [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/build/testutils/test/classes [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/build/testutils/test/resources [copy] Warning: /data/hive-ptest/working/apache-svn-trunk-source/testutils/src/test/resources does not exist. 
init: [echo] Project: testutils init: [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/build/hcatalog-0.12.0-SNAPSHOT jar: [echo] Project: hive ivy-init-settings: [echo] Project: shims check-ivy: [echo] Project: shims ivy-resolve: [echo] Project: shims [ivy:resolve] :: loading settings :: file = /data/hive-ptest/working/apache-svn-trunk-source/ivy/ivysettings.xml [ivy:resolve] :: resolving dependencies :: org.apache.hive#hive-shims;0.12.0-SNAPSHOT [ivy:resolve] confs: [default] [ivy:resolve] found org.apache.zookeeper#zookeeper;3.4.3 in maven2 [ivy:resolve] found org.apache.thrift#libthrift;0.9.0 in maven2 [ivy:resolve] found commons-logging#commons-logging;1.0.4 in maven2 [ivy:resolve] found commons-logging#commons-logging-api;1.0.4 in maven2 [ivy:resolve] found org.codehaus.jackson#jackson-core-asl;1.8.8 in maven2 [ivy:resolve] found org.codehaus.jackson#jackson-mapper-asl;1.8.8 in maven2 [ivy:resolve] found log4j#log4j;1.2.16 in maven2 [ivy:resolve] found com.google.guava#guava;11.0.2 in maven2 [ivy:resolve] found commons-io#commons-io;2.4 in maven2 [ivy:resolve] downloading http://repo1.maven.org/maven2/org/apache/zookeeper/zookeeper/3.4.3/zookeeper-3.4.3.jar ... [ivy:resolve] ............... (749kB) [ivy:resolve] .. (0kB) [ivy:resolve] [SUCCESSFUL ] org.apache.zookeeper#zookeeper;3.4.3!zookeeper.jar (37ms) [ivy:resolve] downloading http://repo1.maven.org/maven2/org/apache/thrift/libthrift/0.9.0/libthrift-0.9.0.jar ... [ivy:resolve] ....... (339kB) [ivy:resolve] .. (0kB) [ivy:resolve] [SUCCESSFUL ] org.apache.thrift#libthrift;0.9.0!libthrift.jar (28ms) [ivy:resolve] downloading http://repo1.maven.org/maven2/commons-logging/commons-logging/1.0.4/commons-logging-1.0.4.jar ... [ivy:resolve] .. (37kB) [ivy:resolve] .. (0kB) [ivy:resolve] [SUCCESSFUL ] commons-logging#commons-logging;1.0.4!commons-logging.jar (23ms) [ivy:resolve] downloading http://repo1.maven.org/maven2/commons-logging/commons-logging-api/1.0.4/commons-logging-api-1.0.4.jar ... [ivy:resolve] .. (25kB) [ivy:resolve] .. (0kB) [ivy:resolve] [SUCCESSFUL ] commons-logging#commons-logging-api;1.0.4!commons-logging-api.jar (28ms) [ivy:resolve] downloading http://repo1.maven.org/maven2/org/codehaus/jackson/jackson-core-asl/1.8.8/jackson-core-asl-1.8.8.jar ... [ivy:resolve] ..... (222kB) [ivy:resolve] .. (0kB) [ivy:resolve] [SUCCESSFUL ] org.codehaus.jackson#jackson-core-asl;1.8.8!jackson-core-asl.jar (25ms) [ivy:resolve] downloading http://repo1.maven.org/maven2/org/codehaus/jackson/jackson-mapper-asl/1.8.8/jackson-mapper-asl-1.8.8.jar ... [ivy:resolve] ............ (652kB) [ivy:resolve] .. (0kB) [ivy:resolve] [SUCCESSFUL ] org.codehaus.jackson#jackson-mapper-asl;1.8.8!jackson-mapper-asl.jar (33ms) [ivy:resolve] downloading http://repo1.maven.org/maven2/log4j/log4j/1.2.16/log4j-1.2.16.jar ... [ivy:resolve] ......... (470kB) [ivy:resolve] .. (0kB) [ivy:resolve] [SUCCESSFUL ] log4j#log4j;1.2.16!log4j.jar(bundle) (31ms) [ivy:resolve] downloading http://repo1.maven.org/maven2/com/google/guava/guava/11.0.2/guava-11.0.2.jar ... [ivy:resolve] ............................ (1609kB) [ivy:resolve] .. (0kB) [ivy:resolve] [SUCCESSFUL ] com.google.guava#guava;11.0.2!guava.jar (49ms) [ivy:resolve] downloading http://repo1.maven.org/maven2/commons-io/commons-io/2.4/commons-io-2.4.jar ... [ivy:resolve] .... (180kB) [ivy:resolve] .. 
(0kB) [ivy:resolve] [SUCCESSFUL ] commons-io#commons-io;2.4!commons-io.jar (25ms) [ivy:resolve] :: resolution report :: resolve 8139ms :: artifacts dl 315ms --------------------------------------------------------------------- | | modules || artifacts | | conf | number| search|dwnlded|evicted|| number|dwnlded| --------------------------------------------------------------------- | default | 9 | 9 | 9 | 0 || 9 | 9 | --------------------------------------------------------------------- [ivy:report] Processing /data/hive-ptest/working/apache-svn-trunk-source/build/ivy/resolution-cache/org.apache.hive-hive-shims-default.xml to /data/hive-ptest/working/apache-svn-trunk-source/build/ivy/report/org.apache.hive-hive-shims-default.html make-pom: [echo] Project: shims [echo] Writing POM to /data/hive-ptest/working/apache-svn-trunk-source/build/shims/pom.xml [ivy:makepom] DEPRECATED: 'ivy.conf.file' is deprecated, use 'ivy.settings.file' instead [ivy:makepom] :: loading settings :: file = /data/hive-ptest/working/apache-svn-trunk-source/ivy/ivysettings.xml create-dirs: [echo] Project: shims [copy] Warning: /data/hive-ptest/working/apache-svn-trunk-source/shims/src/test/resources does not exist. init: [echo] Project: shims ivy-retrieve: [echo] Project: shims [ivy:retrieve] :: retrieving :: org.apache.hive#hive-shims [ivy:retrieve] confs: [default] [ivy:retrieve] 9 artifacts copied, 0 already retrieved (4287kB/35ms) compile: [echo] Project: shims [echo] Building shims 0.20 build-shims: [echo] Project: shims [echo] Compiling /data/hive-ptest/working/apache-svn-trunk-source/shims/src/common/java;/data/hive-ptest/working/apache-svn-trunk-source/shims/src/0.20/java against hadoop 0.20.2 (/data/hive-ptest/working/apache-svn-trunk-source/build/hadoopcore/hadoop-0.20.2) ivy-init-settings: [echo] Project: shims ivy-resolve-hadoop-shim: [echo] Project: shims [ivy:resolve] :: loading settings :: file = /data/hive-ptest/working/apache-svn-trunk-source/ivy/ivysettings.xml [ivy:resolve] :: resolving dependencies :: org.apache.hive#hive-shims;0.12.0-SNAPSHOT [ivy:resolve] confs: [hadoop0.20.shim] [ivy:resolve] found org.apache.hadoop#hadoop-core;0.20.2 in maven2 [ivy:resolve] found commons-cli#commons-cli;1.2 in maven2 [ivy:resolve] found xmlenc#xmlenc;0.52 in maven2 [ivy:resolve] found commons-httpclient#commons-httpclient;3.0.1 in maven2 [ivy:resolve] found commons-logging#commons-logging;1.0.3 in maven2 [ivy:resolve] found commons-codec#commons-codec;1.3 in maven2 [ivy:resolve] found commons-net#commons-net;1.4.1 in maven2 [ivy:resolve] found oro#oro;2.0.8 in maven2 [ivy:resolve] found org.mortbay.jetty#jetty;6.1.14 in maven2 [ivy:resolve] found org.mortbay.jetty#jetty-util;6.1.14 in maven2 [ivy:resolve] found org.mortbay.jetty#servlet-api-2.5;6.1.14 in maven2 [ivy:resolve] found tomcat#jasper-runtime;5.5.12 in maven2 [ivy:resolve] found tomcat#jasper-compiler;5.5.12 in maven2 [ivy:resolve] found org.mortbay.jetty#jsp-api-2.1;6.1.14 in maven2 [ivy:resolve] found org.mortbay.jetty#jsp-2.1;6.1.14 in maven2 [ivy:resolve] found org.eclipse.jdt#core;3.1.1 in maven2 [ivy:resolve] found ant#ant;1.6.5 in maven2 [ivy:resolve] found commons-el#commons-el;1.0 in maven2 [ivy:resolve] found net.java.dev.jets3t#jets3t;0.7.1 in maven2 [ivy:resolve] found commons-logging#commons-logging;1.1.1 in maven2 [ivy:resolve] found net.sf.kosmosfs#kfs;0.3 in maven2 [ivy:resolve] found junit#junit;4.5 in maven2 [ivy:resolve] found hsqldb#hsqldb;1.8.0.10 in maven2 [ivy:resolve] found org.apache.hadoop#hadoop-tools;0.20.2 in maven2 
[ivy:resolve] found org.apache.hadoop#hadoop-test;0.20.2 in maven2 [ivy:resolve] found org.apache.ftpserver#ftplet-api;1.0.0 in maven2 [ivy:resolve] found org.apache.mina#mina-core;2.0.0-M5 in maven2 [ivy:resolve] found org.slf4j#slf4j-api;1.5.2 in maven2 [ivy:resolve] found org.apache.ftpserver#ftpserver-core;1.0.0 in maven2 [ivy:resolve] found org.apache.ftpserver#ftpserver-deprecated;1.0.0-M2 in maven2 [ivy:resolve] downloading http://repo1.maven.org/maven2/org/apache/hadoop/hadoop-core/0.20.2/hadoop-core-0.20.2.jar ... [ivy:resolve] ............................................ (2624kB) [ivy:resolve] .. (0kB) [ivy:resolve] [SUCCESSFUL ] org.apache.hadoop#hadoop-core;0.20.2!hadoop-core.jar (65ms) [ivy:resolve] downloading http://repo1.maven.org/maven2/org/apache/hadoop/hadoop-tools/0.20.2/hadoop-tools-0.20.2.jar ... [ivy:resolve] ... (68kB) [ivy:resolve] .. (0kB) [ivy:resolve] [SUCCESSFUL ] org.apache.hadoop#hadoop-tools;0.20.2!hadoop-tools.jar (22ms) [ivy:resolve] downloading http://repo1.maven.org/maven2/org/apache/hadoop/hadoop-test/0.20.2/hadoop-test-0.20.2.jar ... [ivy:resolve] ........................... (1527kB) [ivy:resolve] .. (0kB) [ivy:resolve] [SUCCESSFUL ] org.apache.hadoop#hadoop-test;0.20.2!hadoop-test.jar (47ms) [ivy:resolve] downloading http://repo1.maven.org/maven2/commons-cli/commons-cli/1.2/commons-cli-1.2.jar ... [ivy:resolve] .. (40kB) [ivy:resolve] .. (0kB) [ivy:resolve] [SUCCESSFUL ] commons-cli#commons-cli;1.2!commons-cli.jar (22ms) [ivy:resolve] downloading http://repo1.maven.org/maven2/xmlenc/xmlenc/0.52/xmlenc-0.52.jar ... [ivy:resolve] .. (14kB) [ivy:resolve] .. (0kB) [ivy:resolve] [SUCCESSFUL ] xmlenc#xmlenc;0.52!xmlenc.jar (21ms) [ivy:resolve] downloading http://repo1.maven.org/maven2/commons-httpclient/commons-httpclient/3.0.1/commons-httpclient-3.0.1.jar ... [ivy:resolve] ...... (273kB) [ivy:resolve] .. (0kB) [ivy:resolve] [SUCCESSFUL ] commons-httpclient#commons-httpclient;3.0.1!commons-httpclient.jar (35ms) [ivy:resolve] downloading http://repo1.maven.org/maven2/commons-codec/commons-codec/1.3/commons-codec-1.3.jar ... [ivy:resolve] .. (45kB) [ivy:resolve] .. (0kB) [ivy:resolve] [SUCCESSFUL ] commons-codec#commons-codec;1.3!commons-codec.jar (24ms) [ivy:resolve] downloading http://repo1.maven.org/maven2/commons-net/commons-net/1.4.1/commons-net-1.4.1.jar ... [ivy:resolve] .... (176kB) [ivy:resolve] .. (0kB) [ivy:resolve] [SUCCESSFUL ] commons-net#commons-net;1.4.1!commons-net.jar (24ms) [ivy:resolve] downloading http://repo1.maven.org/maven2/org/mortbay/jetty/jetty/6.1.14/jetty-6.1.14.jar ... [ivy:resolve] ......... (504kB) [ivy:resolve] .. (0kB) [ivy:resolve] [SUCCESSFUL ] org.mortbay.jetty#jetty;6.1.14!jetty.jar (31ms) [ivy:resolve] downloading http://repo1.maven.org/maven2/org/mortbay/jetty/jetty-util/6.1.14/jetty-util-6.1.14.jar ... [ivy:resolve] .... (159kB) [ivy:resolve] .. (0kB) [ivy:resolve] [SUCCESSFUL ] org.mortbay.jetty#jetty-util;6.1.14!jetty-util.jar (25ms) [ivy:resolve] downloading http://repo1.maven.org/maven2/tomcat/jasper-runtime/5.5.12/jasper-runtime-5.5.12.jar ... [ivy:resolve] ... (74kB) [ivy:resolve] .. (0kB) [ivy:resolve] [SUCCESSFUL ] tomcat#jasper-runtime;5.5.12!jasper-runtime.jar (23ms) [ivy:resolve] downloading http://repo1.maven.org/maven2/tomcat/jasper-compiler/5.5.12/jasper-compiler-5.5.12.jar ... [ivy:resolve] ........ (395kB) [ivy:resolve] .. 
(0kB) [ivy:resolve] [SUCCESSFUL ] tomcat#jasper-compiler;5.5.12!jasper-compiler.jar (28ms) [ivy:resolve] downloading http://repo1.maven.org/maven2/org/mortbay/jetty/jsp-api-2.1/6.1.14/jsp-api-2.1-6.1.14.jar ... [ivy:resolve] .... (131kB) [ivy:resolve] .. (0kB) [ivy:resolve] [SUCCESSFUL ] org.mortbay.jetty#jsp-api-2.1;6.1.14!jsp-api-2.1.jar (23ms) [ivy:resolve] downloading http://repo1.maven.org/maven2/org/mortbay/jetty/jsp-2.1/6.1.14/jsp-2.1-6.1.14.jar ... [ivy:resolve] .................. (1000kB) [ivy:resolve] .. (0kB) [ivy:resolve] [SUCCESSFUL ] org.mortbay.jetty#jsp-2.1;6.1.14!jsp-2.1.jar (46ms) [ivy:resolve] downloading http://repo1.maven.org/maven2/commons-el/commons-el/1.0/commons-el-1.0.jar ... [ivy:resolve] ... (109kB) [ivy:resolve] .. (0kB) [ivy:resolve] [SUCCESSFUL ] commons-el#commons-el;1.0!commons-el.jar (23ms) [ivy:resolve] downloading http://repo1.maven.org/maven2/net/java/dev/jets3t/jets3t/0.7.1/jets3t-0.7.1.jar ... [ivy:resolve] ....... (368kB) [ivy:resolve] .. (0kB) [ivy:resolve] [SUCCESSFUL ] net.java.dev.jets3t#jets3t;0.7.1!jets3t.jar (27ms) [ivy:resolve] downloading http://repo1.maven.org/maven2/org/mortbay/jetty/servlet-api-2.5/6.1.14/servlet-api-2.5-6.1.14.jar ... [ivy:resolve] .... (129kB) [ivy:resolve] .. (0kB) [ivy:resolve] [SUCCESSFUL ] org.mortbay.jetty#servlet-api-2.5;6.1.14!servlet-api-2.5.jar (24ms) [ivy:resolve] downloading http://repo1.maven.org/maven2/net/sf/kosmosfs/kfs/0.3/kfs-0.3.jar ... [ivy:resolve] .. (11kB) [ivy:resolve] .. (0kB) [ivy:resolve] [SUCCESSFUL ] net.sf.kosmosfs#kfs;0.3!kfs.jar (21ms) [ivy:resolve] downloading http://repo1.maven.org/maven2/junit/junit/4.5/junit-4.5.jar ... [ivy:resolve] ..... (194kB) [ivy:resolve] .. (0kB) [ivy:resolve] [SUCCESSFUL ] junit#junit;4.5!junit.jar (24ms) [ivy:resolve] downloading http://repo1.maven.org/maven2/hsqldb/hsqldb/1.8.0.10/hsqldb-1.8.0.10.jar ... [ivy:resolve] ............ (690kB) [ivy:resolve] .. (0kB) [ivy:resolve] [SUCCESSFUL ] hsqldb#hsqldb;1.8.0.10!hsqldb.jar (32ms) [ivy:resolve] downloading http://repo1.maven.org/maven2/oro/oro/2.0.8/oro-2.0.8.jar ... [ivy:resolve] .. (63kB) [ivy:resolve] .. (0kB) [ivy:resolve] [SUCCESSFUL ] oro#oro;2.0.8!oro.jar (22ms) [ivy:resolve] downloading http://repo1.maven.org/maven2/org/eclipse/jdt/core/3.1.1/core-3.1.1.jar ... [ivy:resolve] .................................................................. (3483kB) [ivy:resolve] .. (0kB) [ivy:resolve] [SUCCESSFUL ] org.eclipse.jdt#core;3.1.1!core.jar (82ms) [ivy:resolve] downloading http://repo1.maven.org/maven2/ant/ant/1.6.5/ant-1.6.5.jar ... [ivy:resolve] .................. (1009kB) [ivy:resolve] .. (0kB) [ivy:resolve] [SUCCESSFUL ] ant#ant;1.6.5!ant.jar (39ms) [ivy:resolve] downloading http://repo1.maven.org/maven2/commons-logging/commons-logging/1.1.1/commons-logging-1.1.1.jar ... [ivy:resolve] .. (59kB) [ivy:resolve] .. (0kB) [ivy:resolve] [SUCCESSFUL ] commons-logging#commons-logging;1.1.1!commons-logging.jar (27ms) [ivy:resolve] downloading http://repo1.maven.org/maven2/org/apache/ftpserver/ftplet-api/1.0.0/ftplet-api-1.0.0.jar ... [ivy:resolve] .. (22kB) [ivy:resolve] .. (0kB) [ivy:resolve] [SUCCESSFUL ] org.apache.ftpserver#ftplet-api;1.0.0!ftplet-api.jar(bundle) (21ms) [ivy:resolve] downloading http://repo1.maven.org/maven2/org/apache/mina/mina-core/2.0.0-M5/mina-core-2.0.0-M5.jar ... [ivy:resolve] ........... (622kB) [ivy:resolve] .. 
(0kB) [ivy:resolve] [SUCCESSFUL ] org.apache.mina#mina-core;2.0.0-M5!mina-core.jar(bundle) (33ms) [ivy:resolve] downloading http://repo1.maven.org/maven2/org/apache/ftpserver/ftpserver-core/1.0.0/ftpserver-core-1.0.0.jar ... [ivy:resolve] ...... (264kB) [ivy:resolve] .. (0kB) [ivy:resolve] [SUCCESSFUL ] org.apache.ftpserver#ftpserver-core;1.0.0!ftpserver-core.jar(bundle) (26ms) [ivy:resolve] downloading http://repo1.maven.org/maven2/org/apache/ftpserver/ftpserver-deprecated/1.0.0-M2/ftpserver-deprecated-1.0.0-M2.jar ... [ivy:resolve] .. (31kB) [ivy:resolve] .. (0kB) [ivy:resolve] [SUCCESSFUL ] org.apache.ftpserver#ftpserver-deprecated;1.0.0-M2!ftpserver-deprecated.jar (21ms) [ivy:resolve] downloading http://repo1.maven.org/maven2/org/slf4j/slf4j-api/1.5.2/slf4j-api-1.5.2.jar ... [ivy:resolve] .. (16kB) [ivy:resolve] .. (0kB) [ivy:resolve] [SUCCESSFUL ] org.slf4j#slf4j-api;1.5.2!slf4j-api.jar (22ms) [ivy:resolve] :: resolution report :: resolve 30307ms :: artifacts dl 957ms [ivy:resolve] :: evicted modules: [ivy:resolve] junit#junit;3.8.1 by [junit#junit;4.5] in [hadoop0.20.shim] [ivy:resolve] commons-logging#commons-logging;1.0.3 by [commons-logging#commons-logging;1.1.1] in [hadoop0.20.shim] [ivy:resolve] commons-codec#commons-codec;1.2 by [commons-codec#commons-codec;1.3] in [hadoop0.20.shim] [ivy:resolve] commons-httpclient#commons-httpclient;3.1 by [commons-httpclient#commons-httpclient;3.0.1] in [hadoop0.20.shim] [ivy:resolve] org.apache.mina#mina-core;2.0.0-M4 by [org.apache.mina#mina-core;2.0.0-M5] in [hadoop0.20.shim] [ivy:resolve] org.apache.ftpserver#ftplet-api;1.0.0-M2 by [org.apache.ftpserver#ftplet-api;1.0.0] in [hadoop0.20.shim] [ivy:resolve] org.apache.ftpserver#ftpserver-core;1.0.0-M2 by [org.apache.ftpserver#ftpserver-core;1.0.0] in [hadoop0.20.shim] [ivy:resolve] org.apache.mina#mina-core;2.0.0-M2 by [org.apache.mina#mina-core;2.0.0-M5] in [hadoop0.20.shim] --------------------------------------------------------------------- | | modules || artifacts | | conf | number| search|dwnlded|evicted|| number|dwnlded| --------------------------------------------------------------------- | hadoop0.20.shim | 37 | 30 | 30 | 8 || 29 | 29 | --------------------------------------------------------------------- ivy-retrieve-hadoop-shim: [echo] Project: shims [ivy:retrieve] :: retrieving :: org.apache.hive#hive-shims [ivy:retrieve] confs: [hadoop0.20.shim] [ivy:retrieve] 29 artifacts copied, 0 already retrieved (14115kB/90ms) [javac] Compiling 17 source files to /data/hive-ptest/working/apache-svn-trunk-source/build/shims/classes [javac] Note: Some input files use or override a deprecated API. [javac] Note: Recompile with -Xlint:deprecation for details. [javac] Note: /data/hive-ptest/working/apache-svn-trunk-source/shims/src/0.20/java/org/apache/hadoop/hive/shims/Hadoop20Shims.java uses unchecked or unsafe operations. [javac] Note: Recompile with -Xlint:unchecked for details. 
[echo] Building shims 0.20S build-shims: [echo] Project: shims [echo] Compiling /data/hive-ptest/working/apache-svn-trunk-source/shims/src/common/java;/data/hive-ptest/working/apache-svn-trunk-source/shims/src/common-secure/java;/data/hive-ptest/working/apache-svn-trunk-source/shims/src/0.20S/java against hadoop 1.1.2 (/data/hive-ptest/working/apache-svn-trunk-source/build/hadoopcore/hadoop-1.1.2) ivy-init-settings: [echo] Project: shims ivy-resolve-hadoop-shim: [echo] Project: shims [ivy:resolve] :: loading settings :: file = /data/hive-ptest/working/apache-svn-trunk-source/ivy/ivysettings.xml [ivy:resolve] :: resolving dependencies :: org.apache.hive#hive-shims;0.12.0-SNAPSHOT [ivy:resolve] confs: [hadoop0.20S.shim] [ivy:resolve] found org.apache.hadoop#hadoop-core;1.1.2 in maven2 [ivy:resolve] found commons-cli#commons-cli;1.2 in maven2 [ivy:resolve] found xmlenc#xmlenc;0.52 in maven2 [ivy:resolve] found com.sun.jersey#jersey-core;1.8 in maven2 [ivy:resolve] found com.sun.jersey#jersey-json;1.8 in maven2 [ivy:resolve] found org.codehaus.jettison#jettison;1.1 in maven2 [ivy:resolve] found stax#stax-api;1.0.1 in maven2 [ivy:resolve] found com.sun.xml.bind#jaxb-impl;2.2.3-1 in maven2 [ivy:resolve] found javax.xml.bind#jaxb-api;2.2.2 in maven2 [ivy:resolve] found javax.xml.stream#stax-api;1.0-2 in maven2 [ivy:resolve] found javax.activation#activation;1.1 in maven2 [ivy:resolve] found org.codehaus.jackson#jackson-core-asl;1.7.1 in maven2 [ivy:resolve] found org.codehaus.jackson#jackson-mapper-asl;1.7.1 in maven2 [ivy:resolve] found org.codehaus.jackson#jackson-jaxrs;1.7.1 in maven2 [ivy:resolve] found org.codehaus.jackson#jackson-xc;1.7.1 in maven2 [ivy:resolve] found com.sun.jersey#jersey-server;1.8 in maven2 [ivy:resolve] found asm#asm;3.1 in maven2 [ivy:resolve] found commons-io#commons-io;2.1 in maven2 [ivy:resolve] found commons-httpclient#commons-httpclient;3.0.1 in maven2 [ivy:resolve] found junit#junit;3.8.1 in maven2 [ivy:resolve] found commons-logging#commons-logging;1.0.3 in maven2 [ivy:resolve] found commons-codec#commons-codec;1.4 in maven2 [ivy:resolve] found org.apache.commons#commons-math;2.1 in maven2 [ivy:resolve] found commons-configuration#commons-configuration;1.6 in maven2 [ivy:resolve] found commons-collections#commons-collections;3.2.1 in maven2 [ivy:resolve] found commons-lang#commons-lang;2.4 in maven2 [ivy:resolve] found commons-logging#commons-logging;1.1.1 in maven2 [ivy:resolve] found commons-digester#commons-digester;1.8 in maven2 [ivy:resolve] found commons-beanutils#commons-beanutils;1.7.0 in maven2 [ivy:resolve] found commons-beanutils#commons-beanutils-core;1.8.0 in maven2 [ivy:resolve] found commons-net#commons-net;1.4.1 in maven2 [ivy:resolve] found oro#oro;2.0.8 in maven2 [ivy:resolve] found org.mortbay.jetty#jetty;6.1.26 in maven2 [ivy:resolve] found org.mortbay.jetty#jetty-util;6.1.26 in maven2 [ivy:resolve] found org.mortbay.jetty#servlet-api;2.5-20081211 in maven2 [ivy:resolve] found tomcat#jasper-runtime;5.5.12 in maven2 [ivy:resolve] found tomcat#jasper-compiler;5.5.12 in maven2 [ivy:resolve] found org.mortbay.jetty#jsp-api-2.1;6.1.14 in maven2 [ivy:resolve] found org.mortbay.jetty#servlet-api-2.5;6.1.14 in maven2 [ivy:resolve] found org.mortbay.jetty#jsp-2.1;6.1.14 in maven2 [ivy:resolve] found org.eclipse.jdt#core;3.1.1 in maven2 [ivy:resolve] found ant#ant;1.6.5 in maven2 [ivy:resolve] found commons-el#commons-el;1.0 in maven2 [ivy:resolve] found net.java.dev.jets3t#jets3t;0.6.1 in maven2 [ivy:resolve] found hsqldb#hsqldb;1.8.0.10 in maven2 
[ivy:resolve] found org.codehaus.jackson#jackson-mapper-asl;1.8.8 in maven2 [ivy:resolve] found org.codehaus.jackson#jackson-core-asl;1.8.8 in maven2 [ivy:resolve] found org.apache.hadoop#hadoop-tools;1.1.2 in maven2 [ivy:resolve] found org.apache.hadoop#hadoop-test;1.1.2 in maven2 [ivy:resolve] found org.apache.ftpserver#ftplet-api;1.0.0 in maven2 [ivy:resolve] found org.apache.mina#mina-core;2.0.0-M5 in maven2 [ivy:resolve] found org.slf4j#slf4j-api;1.5.2 in maven2 [ivy:resolve] found org.apache.ftpserver#ftpserver-core;1.0.0 in maven2 [ivy:resolve] found org.apache.ftpserver#ftpserver-deprecated;1.0.0-M2 in maven2 [ivy:resolve] downloading http://repo1.maven.org/maven2/org/apache/hadoop/hadoop-core/1.1.2/hadoop-core-1.1.2.jar ... [ivy:resolve] .............................................................................. (3941kB) [ivy:resolve] .. (0kB) [ivy:resolve] [SUCCESSFUL ] org.apache.hadoop#hadoop-core;1.1.2!hadoop-core.jar (76ms) [ivy:resolve] downloading http://repo1.maven.org/maven2/org/apache/hadoop/hadoop-tools/1.1.2/hadoop-tools-1.1.2.jar ... [ivy:resolve] ...... (299kB) [ivy:resolve] .. (0kB) [ivy:resolve] [SUCCESSFUL ] org.apache.hadoop#hadoop-tools;1.1.2!hadoop-tools.jar (24ms) [ivy:resolve] downloading http://repo1.maven.org/maven2/org/apache/hadoop/hadoop-test/1.1.2/hadoop-test-1.1.2.jar ... [ivy:resolve] .............................................. (2712kB) [ivy:resolve] .. (0kB) [ivy:resolve] [SUCCESSFUL ] org.apache.hadoop#hadoop-test;1.1.2!hadoop-test.jar (93ms) [ivy:resolve] downloading http://repo1.maven.org/maven2/com/sun/jersey/jersey-core/1.8/jersey-core-1.8.jar ... [ivy:resolve] ........ (447kB) [ivy:resolve] .. (0kB) [ivy:resolve] [SUCCESSFUL ] com.sun.jersey#jersey-core;1.8!jersey-core.jar(bundle) (20ms) [ivy:resolve] downloading http://repo1.maven.org/maven2/com/sun/jersey/jersey-json/1.8/jersey-json-1.8.jar ... [ivy:resolve] .... (144kB) [ivy:resolve] .. (0kB) [ivy:resolve] [SUCCESSFUL ] com.sun.jersey#jersey-json;1.8!jersey-json.jar(bundle) (13ms) [ivy:resolve] downloading http://repo1.maven.org/maven2/com/sun/jersey/jersey-server/1.8/jersey-server-1.8.jar ... [ivy:resolve] ............. (678kB) [ivy:resolve] .. (0kB) [ivy:resolve] [SUCCESSFUL ] com.sun.jersey#jersey-server;1.8!jersey-server.jar(bundle) (25ms) [ivy:resolve] downloading http://repo1.maven.org/maven2/commons-io/commons-io/2.1/commons-io-2.1.jar ... [ivy:resolve] .... (159kB) [ivy:resolve] .. (0kB) [ivy:resolve] [SUCCESSFUL ] commons-io#commons-io;2.1!commons-io.jar (13ms) [ivy:resolve] downloading http://repo1.maven.org/maven2/commons-codec/commons-codec/1.4/commons-codec-1.4.jar ... [ivy:resolve] .. (56kB) [ivy:resolve] .. (0kB) [ivy:resolve] [SUCCESSFUL ] commons-codec#commons-codec;1.4!commons-codec.jar (11ms) [ivy:resolve] downloading http://repo1.maven.org/maven2/org/apache/commons/commons-math/2.1/commons-math-2.1.jar ... [ivy:resolve] .............. (812kB) [ivy:resolve] .. (0kB) [ivy:resolve] [SUCCESSFUL ] org.apache.commons#commons-math;2.1!commons-math.jar (40ms) [ivy:resolve] downloading http://repo1.maven.org/maven2/commons-configuration/commons-configuration/1.6/commons-configuration-1.6.jar ... [ivy:resolve] ...... (291kB) [ivy:resolve] .. (0kB) [ivy:resolve] [SUCCESSFUL ] commons-configuration#commons-configuration;1.6!commons-configuration.jar (38ms) [ivy:resolve] downloading http://repo1.maven.org/maven2/org/mortbay/jetty/jetty/6.1.26/jetty-6.1.26.jar ... [ivy:resolve] .......... (527kB) [ivy:resolve] .. 
(0kB) [ivy:resolve] [SUCCESSFUL ] org.mortbay.jetty#jetty;6.1.26!jetty.jar (19ms) [ivy:resolve] downloading http://repo1.maven.org/maven2/org/mortbay/jetty/jetty-util/6.1.26/jetty-util-6.1.26.jar ... [ivy:resolve] .... (172kB) [ivy:resolve] .. (0kB) [ivy:resolve] [SUCCESSFUL ] org.mortbay.jetty#jetty-util;6.1.26!jetty-util.jar (13ms) [ivy:resolve] downloading http://repo1.maven.org/maven2/net/java/dev/jets3t/jets3t/0.6.1/jets3t-0.6.1.jar ... [ivy:resolve] ...... (314kB) [ivy:resolve] .. (0kB) [ivy:resolve] [SUCCESSFUL ] net.java.dev.jets3t#jets3t;0.6.1!jets3t.jar (15ms) [ivy:resolve] downloading http://repo1.maven.org/maven2/org/codehaus/jettison/jettison/1.1/jettison-1.1.jar ... [ivy:resolve] ... (66kB) [ivy:resolve] .. (0kB) [ivy:resolve] [SUCCESSFUL ] org.codehaus.jettison#jettison;1.1!jettison.jar(bundle) (18ms) [ivy:resolve] downloading http://repo1.maven.org/maven2/com/sun/xml/bind/jaxb-impl/2.2.3-1/jaxb-impl-2.2.3-1.jar ... [ivy:resolve] ............... (869kB) [ivy:resolve] .. (0kB) [ivy:resolve] [SUCCESSFUL ] com.sun.xml.bind#jaxb-impl;2.2.3-1!jaxb-impl.jar (44ms) [ivy:resolve] downloading http://repo1.maven.org/maven2/org/codehaus/jackson/jackson-jaxrs/1.7.1/jackson-jaxrs-1.7.1.jar ... [ivy:resolve] .. (17kB) [ivy:resolve] .. (0kB) [ivy:resolve] [SUCCESSFUL ] org.codehaus.jackson#jackson-jaxrs;1.7.1!jackson-jaxrs.jar (10ms) [ivy:resolve] downloading http://repo1.maven.org/maven2/org/codehaus/jackson/jackson-xc/1.7.1/jackson-xc-1.7.1.jar ... [ivy:resolve] .. (30kB) [ivy:resolve] .. (0kB) [ivy:resolve] [SUCCESSFUL ] org.codehaus.jackson#jackson-xc;1.7.1!jackson-xc.jar (11ms) [ivy:resolve] downloading http://repo1.maven.org/maven2/stax/stax-api/1.0.1/stax-api-1.0.1.jar ... [ivy:resolve] .. (25kB) [ivy:resolve] .. (0kB) [ivy:resolve] [SUCCESSFUL ] stax#stax-api;1.0.1!stax-api.jar (10ms) [ivy:resolve] downloading http://repo1.maven.org/maven2/javax/xml/bind/jaxb-api/2.2.2/jaxb-api-2.2.2.jar ... [ivy:resolve] ... (102kB) [ivy:resolve] .. (0kB) [ivy:resolve] [SUCCESSFUL ] javax.xml.bind#jaxb-api;2.2.2!jaxb-api.jar (11ms) [ivy:resolve] downloading http://repo1.maven.org/maven2/javax/xml/stream/stax-api/1.0-2/stax-api-1.0-2.jar ... [ivy:resolve] .. (22kB) [ivy:resolve] .. (0kB) [ivy:resolve] [SUCCESSFUL ] javax.xml.stream#stax-api;1.0-2!stax-api.jar (11ms) [ivy:resolve] downloading http://repo1.maven.org/maven2/javax/activation/activation/1.1/activation-1.1.jar ... [ivy:resolve] .. (61kB) [ivy:resolve] .. (0kB) [ivy:resolve] [SUCCESSFUL ] javax.activation#activation;1.1!activation.jar (15ms) [ivy:resolve] downloading http://repo1.maven.org/maven2/asm/asm/3.1/asm-3.1.jar ... [ivy:resolve] .. (42kB) [ivy:resolve] .. (0kB) [ivy:resolve] [SUCCESSFUL ] asm#asm;3.1!asm.jar (11ms) [ivy:resolve] downloading http://repo1.maven.org/maven2/junit/junit/3.8.1/junit-3.8.1.jar ... [ivy:resolve] ... (118kB) [ivy:resolve] .. (0kB) [ivy:resolve] [SUCCESSFUL ] junit#junit;3.8.1!junit.jar (12ms) [ivy:resolve] downloading http://repo1.maven.org/maven2/commons-collections/commons-collections/3.2.1/commons-collections-3.2.1.jar ... [ivy:resolve] .......... (561kB) [ivy:resolve] .. (0kB) [ivy:resolve] [SUCCESSFUL ] commons-collections#commons-collections;3.2.1!commons-collections.jar (19ms) [ivy:resolve] downloading http://repo1.maven.org/maven2/commons-digester/commons-digester/1.8/commons-digester-1.8.jar ... [ivy:resolve] .... (140kB) [ivy:resolve] .. 
(0kB) [ivy:resolve] [SUCCESSFUL ] commons-digester#commons-digester;1.8!commons-digester.jar (12ms) [ivy:resolve] downloading http://repo1.maven.org/maven2/commons-beanutils/commons-beanutils-core/1.8.0/commons-beanutils-core-1.8.0.jar ... [ivy:resolve] ..... (201kB) [ivy:resolve] .. (0kB) [ivy:resolve] [SUCCESSFUL ] commons-beanutils#commons-beanutils-core;1.8.0!commons-beanutils-core.jar (12ms) [ivy:resolve] downloading http://repo1.maven.org/maven2/commons-beanutils/commons-beanutils/1.7.0/commons-beanutils-1.7.0.jar ... [ivy:resolve] .... (184kB) [ivy:resolve] .. (0kB) [ivy:resolve] [SUCCESSFUL ] commons-beanutils#commons-beanutils;1.7.0!commons-beanutils.jar (14ms) [ivy:resolve] downloading http://repo1.maven.org/maven2/org/mortbay/jetty/servlet-api/2.5-20081211/servlet-api-2.5-20081211.jar ... [ivy:resolve] .... (130kB) [ivy:resolve] .. (0kB) [ivy:resolve] [SUCCESSFUL ] org.mortbay.jetty#servlet-api;2.5-20081211!servlet-api.jar (33ms) [ivy:resolve] :: resolution report :: resolve 24830ms :: artifacts dl 742ms [ivy:resolve] :: evicted modules: [ivy:resolve] org.codehaus.jackson#jackson-core-asl;1.7.1 by [org.codehaus.jackson#jackson-core-asl;1.8.8] in [hadoop0.20S.shim] [ivy:resolve] org.codehaus.jackson#jackson-mapper-asl;1.7.1 by [org.codehaus.jackson#jackson-mapper-asl;1.8.8] in [hadoop0.20S.shim] [ivy:resolve] commons-logging#commons-logging;1.0.3 by [commons-logging#commons-logging;1.1.1] in [hadoop0.20S.shim] [ivy:resolve] commons-codec#commons-codec;1.2 by [commons-codec#commons-codec;1.4] in [hadoop0.20S.shim] [ivy:resolve] commons-logging#commons-logging;1.1 by [commons-logging#commons-logging;1.1.1] in [hadoop0.20S.shim] [ivy:resolve] commons-codec#commons-codec;1.3 by [commons-codec#commons-codec;1.4] in [hadoop0.20S.shim] [ivy:resolve] commons-httpclient#commons-httpclient;3.1 by [commons-httpclient#commons-httpclient;3.0.1] in [hadoop0.20S.shim] [ivy:resolve] org.apache.mina#mina-core;2.0.0-M4 by [org.apache.mina#mina-core;2.0.0-M5] in [hadoop0.20S.shim] [ivy:resolve] org.apache.ftpserver#ftplet-api;1.0.0-M2 by [org.apache.ftpserver#ftplet-api;1.0.0] in [hadoop0.20S.shim] [ivy:resolve] org.apache.ftpserver#ftpserver-core;1.0.0-M2 by [org.apache.ftpserver#ftpserver-core;1.0.0] in [hadoop0.20S.shim] [ivy:resolve] org.apache.mina#mina-core;2.0.0-M2 by [org.apache.mina#mina-core;2.0.0-M5] in [hadoop0.20S.shim] --------------------------------------------------------------------- | | modules || artifacts | | conf | number| search|dwnlded|evicted|| number|dwnlded| --------------------------------------------------------------------- | hadoop0.20S.shim | 62 | 30 | 30 | 11 || 51 | 28 | --------------------------------------------------------------------- ivy-retrieve-hadoop-shim: [echo] Project: shims [ivy:retrieve] :: retrieving :: org.apache.hive#hive-shims [ivy:retrieve] confs: [hadoop0.20S.shim] [ivy:retrieve] 51 artifacts copied, 0 already retrieved (22876kB/92ms) [javac] Compiling 14 source files to /data/hive-ptest/working/apache-svn-trunk-source/build/shims/classes [javac] /data/hive-ptest/working/apache-svn-trunk-source/shims/src/common-secure/java/org/apache/hadoop/hive/thrift/DBTokenStore.java:35: warning: non-varargs call of varargs method with inexact argument type for last parameter; [javac] cast to java.lang.Class<?> for a varargs call [javac] cast to java.lang.Class<?>[] for a non-varargs call and to suppress this warning [javac] return (String[])invokeOnRawStore("getMasterKeys", null, null); [javac] ^ [javac] 
/data/hive-ptest/working/apache-svn-trunk-source/shims/src/common-secure/java/org/apache/hadoop/hive/thrift/DBTokenStore.java:78: warning: non-varargs call of varargs method with inexact argument type for last parameter; [javac] cast to java.lang.Class<?> for a varargs call [javac] cast to java.lang.Class<?>[] for a non-varargs call and to suppress this warning [javac] List<String> tokenIdents = (List<String>)invokeOnRawStore("getAllTokenIdentifiers", null, null); [javac] ^ [javac] Note: Some input files use or override a deprecated API. [javac] Note: Recompile with -Xlint:deprecation for details. [javac] Note: Some input files use unchecked or unsafe operations. [javac] Note: Recompile with -Xlint:unchecked for details. [javac] 2 warnings [echo] Building shims 0.23 build-shims: [echo] Project: shims [echo] Compiling /data/hive-ptest/working/apache-svn-trunk-source/shims/src/common/java;/data/hive-ptest/working/apache-svn-trunk-source/shims/src/common-secure/java;/data/hive-ptest/working/apache-svn-trunk-source/shims/src/0.23/java against hadoop 2.0.5-alpha (/data/hive-ptest/working/apache-svn-trunk-source/build/hadoopcore/hadoop-2.0.5-alpha) ivy-init-settings: [echo] Project: shims ivy-resolve-hadoop-shim: [echo] Project: shims [ivy:resolve] :: loading settings :: file = /data/hive-ptest/working/apache-svn-trunk-source/ivy/ivysettings.xml [ivy:resolve] :: resolving dependencies :: org.apache.hive#hive-shims;0.12.0-SNAPSHOT [ivy:resolve] confs: [hadoop0.23.shim] [ivy:resolve] found org.apache.hadoop#hadoop-common;2.0.5-alpha in maven2 [ivy:resolve] found org.apache.hadoop#hadoop-annotations;2.0.5-alpha in maven2 [ivy:resolve] found com.google.guava#guava;11.0.2 in maven2 [ivy:resolve] found com.google.code.findbugs#jsr305;1.3.9 in maven2 [ivy:resolve] found commons-cli#commons-cli;1.2 in maven2 [ivy:resolve] found org.apache.commons#commons-math;2.1 in maven2 [ivy:resolve] found xmlenc#xmlenc;0.52 in maven2 [ivy:resolve] found commons-httpclient#commons-httpclient;3.1 in maven2 [ivy:resolve] found commons-logging#commons-logging;1.1.1 in maven2 [ivy:resolve] found commons-codec#commons-codec;1.4 in maven2 [ivy:resolve] found commons-io#commons-io;2.1 in maven2 [ivy:resolve] found commons-net#commons-net;3.1 in maven2 [ivy:resolve] found javax.servlet#servlet-api;2.5 in maven2 [ivy:resolve] found org.mortbay.jetty#jetty;6.1.26 in maven2 [ivy:resolve] found org.mortbay.jetty#jetty-util;6.1.26 in maven2 [ivy:resolve] found com.sun.jersey#jersey-core;1.8 in maven2 [ivy:resolve] found com.sun.jersey#jersey-json;1.8 in maven2 [ivy:resolve] found org.codehaus.jettison#jettison;1.1 in maven2 [ivy:resolve] found stax#stax-api;1.0.1 in maven2 [ivy:resolve] found com.sun.xml.bind#jaxb-impl;2.2.3-1 in maven2 [ivy:resolve] found javax.xml.bind#jaxb-api;2.2.2 in maven2 [ivy:resolve] found javax.activation#activation;1.1 in maven2 [ivy:resolve] found org.codehaus.jackson#jackson-core-asl;1.8.8 in maven2 [ivy:resolve] found org.codehaus.jackson#jackson-mapper-asl;1.8.8 in maven2 [ivy:resolve] found org.codehaus.jackson#jackson-jaxrs;1.8.8 in maven2 [ivy:resolve] found org.codehaus.jackson#jackson-xc;1.8.8 in maven2 [ivy:resolve] found com.sun.jersey#jersey-server;1.8 in maven2 [ivy:resolve] found asm#asm;3.2 in maven2 [ivy:resolve] found log4j#log4j;1.2.17 in maven2 [ivy:resolve] found net.java.dev.jets3t#jets3t;0.6.1 in maven2 [ivy:resolve] found commons-lang#commons-lang;2.5 in maven2 [ivy:resolve] found commons-configuration#commons-configuration;1.6 in maven2 [ivy:resolve] found 
commons-collections#commons-collections;3.2.1 in maven2 [ivy:resolve] found commons-digester#commons-digester;1.8 in maven2 [ivy:resolve] found commons-beanutils#commons-beanutils;1.7.0 in maven2 [ivy:resolve] found commons-beanutils#commons-beanutils-core;1.8.0 in maven2 [ivy:resolve] found org.slf4j#slf4j-api;1.6.1 in maven2 [ivy:resolve] found org.apache.avro#avro;1.5.3 in maven2 [ivy:resolve] found com.thoughtworks.paranamer#paranamer;2.3 in maven2 [ivy:resolve] found org.xerial.snappy#snappy-java;1.0.3.2 in maven2 [ivy:resolve] found net.sf.kosmosfs#kfs;0.3 in maven2 [ivy:resolve] found com.google.protobuf#protobuf-java;2.4.0a in maven2 [ivy:resolve] found org.apache.hadoop#hadoop-auth;2.0.5-alpha in maven2 [ivy:resolve] found org.slf4j#slf4j-log4j12;1.6.1 in maven2 [ivy:resolve] found com.jcraft#jsch;0.1.42 in maven2 [ivy:resolve] found org.apache.zookeeper#zookeeper;3.4.2 in maven2 [ivy:resolve] found tomcat#jasper-compiler;5.5.23 in maven2 [ivy:resolve] found tomcat#jasper-runtime;5.5.23 in maven2 [ivy:resolve] found commons-el#commons-el;1.0 in maven2 [ivy:resolve] found javax.servlet.jsp#jsp-api;2.1 in maven2 [ivy:resolve] found org.apache.hadoop#hadoop-mapreduce-client-core;2.0.5-alpha in maven2 [ivy:resolve] found org.apache.hadoop#hadoop-yarn-common;2.0.5-alpha in maven2 [ivy:resolve] found org.apache.hadoop#hadoop-yarn-api;2.0.5-alpha in maven2 [ivy:resolve] found com.google.inject.extensions#guice-servlet;3.0 in maven2 [ivy:resolve] found com.google.inject#guice;3.0 in maven2 [ivy:resolve] found javax.inject#javax.inject;1 in maven2 [ivy:resolve] found aopalliance#aopalliance;1.0 in maven2 [ivy:resolve] found org.sonatype.sisu.inject#cglib;2.2.1-v20090111 in maven2 [ivy:resolve] found io.netty#netty;3.5.11.Final in maven2 [ivy:resolve] found com.sun.jersey.jersey-test-framework#jersey-test-framework-grizzly2;1.8 in maven2 [ivy:resolve] found com.sun.jersey.contribs#jersey-guice;1.8 in maven2 [ivy:resolve] found org.apache.hadoop#hadoop-archives;2.0.5-alpha in maven2 [ivy:resolve] found org.apache.hadoop#hadoop-hdfs;2.0.5-alpha in maven2 [ivy:resolve] found org.apache.hadoop#hadoop-mapreduce-client-jobclient;2.0.5-alpha in maven2 [ivy:resolve] found org.apache.hadoop#hadoop-mapreduce-client-common;2.0.5-alpha in maven2 [ivy:resolve] found org.apache.hadoop#hadoop-yarn-client;2.0.5-alpha in maven2 [ivy:resolve] found org.apache.hadoop#hadoop-yarn-server-common;2.0.5-alpha in maven2 [ivy:resolve] found org.apache.hadoop#hadoop-yarn-server-tests;2.0.5-alpha in maven2 [ivy:resolve] found org.apache.hadoop#hadoop-yarn-server-nodemanager;2.0.5-alpha in maven2 [ivy:resolve] found org.apache.hadoop#hadoop-yarn-server-resourcemanager;2.0.5-alpha in maven2 [ivy:resolve] found org.apache.hadoop#hadoop-yarn-server-web-proxy;2.0.5-alpha in maven2 [ivy:resolve] found org.apache.hadoop#hadoop-mapreduce-client-app;2.0.5-alpha in maven2 [ivy:resolve] found org.apache.hadoop#hadoop-mapreduce-client-shuffle;2.0.5-alpha in maven2 [ivy:resolve] found org.apache.hadoop#hadoop-mapreduce-client-hs;2.0.5-alpha in maven2 [ivy:resolve] downloading http://repo1.maven.org/maven2/org/apache/hadoop/hadoop-common/2.0.5-alpha/hadoop-common-2.0.5-alpha.jar ... [ivy:resolve] ...................................... (2295kB) [ivy:resolve] .. (0kB) [ivy:resolve] [SUCCESSFUL ] org.apache.hadoop#hadoop-common;2.0.5-alpha!hadoop-common.jar (54ms) [ivy:resolve] downloading http://repo1.maven.org/maven2/org/apache/hadoop/hadoop-common/2.0.5-alpha/hadoop-common-2.0.5-alpha-tests.jar ... 
[ivy:resolve] .................... (1151kB) [ivy:resolve] .. (0kB) [ivy:resolve] [SUCCESSFUL ] org.apache.hadoop#hadoop-common;2.0.5-alpha!hadoop-common.jar(tests) (59ms) [ivy:resolve] downloading http://repo1.maven.org/maven2/org/apache/hadoop/hadoop-mapreduce-client-core/2.0.5-alpha/hadoop-mapreduce-client-core-2.0.5-alpha.jar ... [ivy:resolve] ...................... (1325kB) [ivy:resolve] .. (0kB) [ivy:resolve] [SUCCESSFUL ] org.apache.hadoop#hadoop-mapreduce-client-core;2.0.5-alpha!hadoop-mapreduce-client-core.jar (43ms) [ivy:resolve] downloading http://repo1.maven.org/maven2/org/apache/hadoop/hadoop-archives/2.0.5-alpha/hadoop-archives-2.0.5-alpha.jar ... [ivy:resolve] .. (20kB) [ivy:resolve] .. (0kB) [ivy:resolve] [SUCCESSFUL ] org.apache.hadoop#hadoop-archives;2.0.5-alpha!hadoop-archives.jar (23ms) [ivy:resolve] downloading http://repo1.maven.org/maven2/org/apache/hadoop/hadoop-hdfs/2.0.5-alpha/hadoop-hdfs-2.0.5-alpha.jar ... [ivy:resolve] ................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................... (4241kB) [ivy:resolve] .. (0kB) [ivy:resolve] [SUCCESSFUL ] org.apache.hadoop#hadoop-hdfs;2.0.5-alpha!hadoop-hdfs.jar (353ms) [ivy:resolve] downloading http://repo1.maven.org/maven2/org/apache/hadoop/hadoop-hdfs/2.0.5-alpha/hadoop-hdfs-2.0.5-alpha-tests.jar ... [ivy:resolve] ............................ (1631kB) [ivy:resolve] .. (0kB) [ivy:resolve] [SUCCESSFUL ] org.apache.hadoop#hadoop-hdfs;2.0.5-alpha!hadoop-hdfs.jar(tests) (72ms) [ivy:resolve] downloading http://repo1.maven.org/maven2/org/apache/hadoop/hadoop-mapreduce-client-jobclient/2.0.5-alpha/hadoop-mapreduce-client-jobclient-2.0.5-alpha-tests.jar ... [ivy:resolve] ....................... (1350kB) [ivy:resolve] .. (0kB) [ivy:resolve] [SUCCESSFUL ] org.apache.hadoop#hadoop-mapreduce-client-jobclient;2.0.5-alpha!hadoop-mapreduce-client-jobclient.jar(tests) (91ms) [ivy:resolve] downloading http://repo1.maven.org/maven2/org/apache/hadoop/hadoop-mapreduce-client-jobclient/2.0.5-alpha/hadoop-mapreduce-client-jobclient-2.0.5-alpha.jar ... [ivy:resolve] .. (32kB) [ivy:resolve] .. 
(0kB) [ivy:resolve] [SUCCESSFUL ] org.apache.hadoop#hadoop-mapreduce-client-jobclient;2.0.5-alpha!hadoop-mapreduce-client-jobclient.jar (62ms) [ivy:resolve] downloading http://repo1.maven.org/maven2/org/apache/hadoop/hadoop-mapreduce-client-common/2.0.5-alpha/hadoop-mapreduce-client-common-2.0.5-alpha.jar ... [ivy:resolve] ........... (579kB) [ivy:resolve] .. (0kB) [ivy:resolve] [SUCCESSFUL ] org.apache.hadoop#hadoop-mapreduce-client-common;2.0.5-alpha!hadoop-mapreduce-client-common.jar (43ms) [ivy:resolve] downloading http://repo1.maven.org/maven2/org/apache/hadoop/hadoop-yarn-server-tests/2.0.5-alpha/hadoop-yarn-server-tests-2.0.5-alpha-tests.jar ... [ivy:resolve] .. (39kB) [ivy:resolve] .. (0kB) [ivy:resolve] [SUCCESSFUL ] org.apache.hadoop#hadoop-yarn-server-tests;2.0.5-alpha!hadoop-yarn-server-tests.jar(tests) (72ms) [ivy:resolve] downloading http://repo1.maven.org/maven2/org/apache/hadoop/hadoop-mapreduce-client-app/2.0.5-alpha/hadoop-mapreduce-client-app-2.0.5-alpha.jar ... [ivy:resolve] ......... (463kB) [ivy:resolve] .. (0kB) [ivy:resolve] [SUCCESSFUL ] org.apache.hadoop#hadoop-mapreduce-client-app;2.0.5-alpha!hadoop-mapreduce-client-app.jar (39ms) [ivy:resolve] downloading http://repo1.maven.org/maven2/org/apache/hadoop/hadoop-mapreduce-client-hs/2.0.5-alpha/hadoop-mapreduce-client-hs-2.0.5-alpha.jar ... [ivy:resolve] ... (111kB) [ivy:resolve] .. (0kB) [ivy:resolve] [SUCCESSFUL ] org.apache.hadoop#hadoop-mapreduce-client-hs;2.0.5-alpha!hadoop-mapreduce-client-hs.jar (24ms) [ivy:resolve] downloading http://repo1.maven.org/maven2/org/apache/hadoop/hadoop-annotations/2.0.5-alpha/hadoop-annotations-2.0.5-alpha.jar ... [ivy:resolve] .. (16kB) [ivy:resolve] .. (0kB) [ivy:resolve] [SUCCESSFUL ] org.apache.hadoop#hadoop-annotations;2.0.5-alpha!hadoop-annotations.jar (22ms) [ivy:resolve] downloading http://repo1.maven.org/maven2/commons-httpclient/commons-httpclient/3.1/commons-httpclient-3.1.jar ... [ivy:resolve] ...... (297kB) [ivy:resolve] .. (0kB) [ivy:resolve] [SUCCESSFUL ] commons-httpclient#commons-httpclient;3.1!commons-httpclient.jar (27ms) [ivy:resolve] downloading http://repo1.maven.org/maven2/commons-net/commons-net/3.1/commons-net-3.1.jar ... [ivy:resolve] ...... (266kB) [ivy:resolve] .. (0kB) [ivy:resolve] [SUCCESSFUL ] commons-net#commons-net;3.1!commons-net.jar (47ms) [ivy:resolve] downloading http://repo1.maven.org/maven2/javax/servlet/servlet-api/2.5/servlet-api-2.5.jar ... [ivy:resolve] ... (102kB) [ivy:resolve] .. (0kB) [ivy:resolve] [SUCCESSFUL ] javax.servlet#servlet-api;2.5!servlet-api.jar (35ms) [ivy:resolve] downloading http://repo1.maven.org/maven2/log4j/log4j/1.2.17/log4j-1.2.17.jar ... [ivy:resolve] ......... (478kB) [ivy:resolve] .. (0kB) [ivy:resolve] [SUCCESSFUL ] log4j#log4j;1.2.17!log4j.jar(bundle) (38ms) [ivy:resolve] downloading http://repo1.maven.org/maven2/commons-lang/commons-lang/2.5/commons-lang-2.5.jar ... [ivy:resolve] ...... (272kB) [ivy:resolve] .. (0kB) [ivy:resolve] [SUCCESSFUL ] commons-lang#commons-lang;2.5!commons-lang.jar (36ms) [ivy:resolve] downloading http://repo1.maven.org/maven2/org/slf4j/slf4j-api/1.6.1/slf4j-api-1.6.1.jar ... [ivy:resolve] .. (24kB) [ivy:resolve] .. (0kB) [ivy:resolve] [SUCCESSFUL ] org.slf4j#slf4j-api;1.6.1!slf4j-api.jar (32ms) [ivy:resolve] downloading http://repo1.maven.org/maven2/org/apache/avro/avro/1.5.3/avro-1.5.3.jar ... [ivy:resolve] ...... (257kB) [ivy:resolve] .. 
(0kB) [ivy:resolve] [SUCCESSFUL ] org.apache.avro#avro;1.5.3!avro.jar (35ms) [ivy:resolve] downloading http://repo1.maven.org/maven2/com/google/protobuf/protobuf-java/2.4.0a/protobuf-java-2.4.0a.jar ... [ivy:resolve] ........ (439kB) [ivy:resolve] .. (0kB) [ivy:resolve] [SUCCESSFUL ] com.google.protobuf#protobuf-java;2.4.0a!protobuf-java.jar (39ms) [ivy:resolve] downloading http://repo1.maven.org/maven2/org/apache/hadoop/hadoop-auth/2.0.5-alpha/hadoop-auth-2.0.5-alpha.jar ... [ivy:resolve] .. (46kB) [ivy:resolve] .. (0kB) [ivy:resolve] [SUCCESSFUL ] org.apache.hadoop#hadoop-auth;2.0.5-alpha!hadoop-auth.jar (23ms) [ivy:resolve] downloading http://repo1.maven.org/maven2/com/jcraft/jsch/0.1.42/jsch-0.1.42.jar ... [ivy:resolve] .... (181kB) [ivy:resolve] .. (0kB) [ivy:resolve] [SUCCESSFUL ] com.jcraft#jsch;0.1.42!jsch.jar (29ms) [ivy:resolve] downloading http://repo1.maven.org/maven2/org/apache/zookeeper/zookeeper/3.4.2/zookeeper-3.4.2.jar ... [ivy:resolve] .............. (746kB) [ivy:resolve] .. (0kB) [ivy:resolve] [SUCCESSFUL ] org.apache.zookeeper#zookeeper;3.4.2!zookeeper.jar (45ms) [ivy:resolve] downloading http://repo1.maven.org/maven2/com/google/code/findbugs/jsr305/1.3.9/jsr305-1.3.9.jar ... [ivy:resolve] .. (32kB) [ivy:resolve] .. (0kB) [ivy:resolve] [SUCCESSFUL ] com.google.code.findbugs#jsr305;1.3.9!jsr305.jar (33ms) [ivy:resolve] downloading http://repo1.maven.org/maven2/org/codehaus/jackson/jackson-jaxrs/1.8.8/jackson-jaxrs-1.8.8.jar ... [ivy:resolve] .. (17kB) [ivy:resolve] .. (0kB) [ivy:resolve] [SUCCESSFUL ] org.codehaus.jackson#jackson-jaxrs;1.8.8!jackson-jaxrs.jar (27ms) [ivy:resolve] downloading http://repo1.maven.org/maven2/org/codehaus/jackson/jackson-xc/1.8.8/jackson-xc-1.8.8.jar ... [ivy:resolve] .. (31kB) [ivy:resolve] .. (0kB) [ivy:resolve] [SUCCESSFUL ] org.codehaus.jackson#jackson-xc;1.8.8!jackson-xc.jar (72ms) [ivy:resolve] downloading http://repo1.maven.org/maven2/asm/asm/3.2/asm-3.2.jar ... [ivy:resolve] .. (42kB) [ivy:resolve] .. (0kB) [ivy:resolve] [SUCCESSFUL ] asm#asm;3.2!asm.jar (22ms) [ivy:resolve] downloading http://repo1.maven.org/maven2/com/thoughtworks/paranamer/paranamer/2.3/paranamer-2.3.jar ... [ivy:resolve] .. (28kB) [ivy:resolve] .. (0kB) [ivy:resolve] [SUCCESSFUL ] com.thoughtworks.paranamer#paranamer;2.3!paranamer.jar (31ms) [ivy:resolve] downloading http://repo1.maven.org/maven2/org/xerial/snappy/snappy-java/1.0.3.2/snappy-java-1.0.3.2.jar ... [ivy:resolve] ................. (972kB) [ivy:resolve] .. (0kB) [ivy:resolve] [SUCCESSFUL ] org.xerial.snappy#snappy-java;1.0.3.2!snappy-java.jar(bundle) (46ms) [ivy:resolve] downloading http://repo1.maven.org/maven2/org/slf4j/slf4j-log4j12/1.6.1/slf4j-log4j12-1.6.1.jar ... [ivy:resolve] .. (9kB) [ivy:resolve] .. (0kB) [ivy:resolve] [SUCCESSFUL ] org.slf4j#slf4j-log4j12;1.6.1!slf4j-log4j12.jar (26ms) [ivy:resolve] downloading http://repo1.maven.org/maven2/tomcat/jasper-compiler/5.5.23/jasper-compiler-5.5.23.jar ... [ivy:resolve] ........ (398kB) [ivy:resolve] .. (0kB) [ivy:resolve] [SUCCESSFUL ] tomcat#jasper-compiler;5.5.23!jasper-compiler.jar (39ms) [ivy:resolve] downloading http://repo1.maven.org/maven2/tomcat/jasper-runtime/5.5.23/jasper-runtime-5.5.23.jar ... [ivy:resolve] ... (75kB) [ivy:resolve] .. (0kB) [ivy:resolve] [SUCCESSFUL ] tomcat#jasper-runtime;5.5.23!jasper-runtime.jar (32ms) [ivy:resolve] downloading http://repo1.maven.org/maven2/javax/servlet/jsp/jsp-api/2.1/jsp-api-2.1.jar ... [ivy:resolve] ... (98kB) [ivy:resolve] .. 
(0kB) [ivy:resolve] [SUCCESSFUL ] javax.servlet.jsp#jsp-api;2.1!jsp-api.jar (31ms) [ivy:resolve] downloading http://repo1.maven.org/maven2/org/apache/hadoop/hadoop-yarn-common/2.0.5-alpha/hadoop-yarn-common-2.0.5-alpha.jar ... [ivy:resolve] .................. (1050kB) [ivy:resolve] .. (0kB) [ivy:resolve] [SUCCESSFUL ] org.apache.hadoop#hadoop-yarn-common;2.0.5-alpha!hadoop-yarn-common.jar (46ms) [ivy:resolve] downloading http://repo1.maven.org/maven2/com/google/inject/extensions/guice-servlet/3.0/guice-servlet-3.0.jar ... [ivy:resolve] .. (63kB) [ivy:resolve] .. (0kB) [ivy:resolve] [SUCCESSFUL ] com.google.inject.extensions#guice-servlet;3.0!guice-servlet.jar (28ms) [ivy:resolve] downloading http://repo1.maven.org/maven2/io/netty/netty/3.5.11.Final/netty-3.5.11.Final.jar ... [ivy:resolve] ................... (1106kB) [ivy:resolve] .. (0kB) [ivy:resolve] [SUCCESSFUL ] io.netty#netty;3.5.11.Final!netty.jar(bundle) (57ms) [ivy:resolve] downloading http://repo1.maven.org/maven2/org/apache/hadoop/hadoop-yarn-api/2.0.5-alpha/hadoop-yarn-api-2.0.5-alpha.jar ... [ivy:resolve] ................. (1014kB) [ivy:resolve] .. (0kB) [ivy:resolve] [SUCCESSFUL ] org.apache.hadoop#hadoop-yarn-api;2.0.5-alpha!hadoop-yarn-api.jar (43ms) [ivy:resolve] downloading http://repo1.maven.org/maven2/com/google/inject/guice/3.0/guice-3.0.jar ... [ivy:resolve] ............. (693kB) [ivy:resolve] .. (0kB) [ivy:resolve] [SUCCESSFUL ] com.google.inject#guice;3.0!guice.jar (36ms) [ivy:resolve] downloading http://repo1.maven.org/maven2/com/sun/jersey/jersey-test-framework/jersey-test-framework-grizzly2/1.8/jersey-test-framework-grizzly2-1.8.jar ... [ivy:resolve] .. (12kB) [ivy:resolve] .. (0kB) [ivy:resolve] [SUCCESSFUL ] com.sun.jersey.jersey-test-framework#jersey-test-framework-grizzly2;1.8!jersey-test-framework-grizzly2.jar (31ms) [ivy:resolve] downloading http://repo1.maven.org/maven2/com/sun/jersey/contribs/jersey-guice/1.8/jersey-guice-1.8.jar ... [ivy:resolve] .. (14kB) [ivy:resolve] .. (0kB) [ivy:resolve] [SUCCESSFUL ] com.sun.jersey.contribs#jersey-guice;1.8!jersey-guice.jar (31ms) [ivy:resolve] downloading http://repo1.maven.org/maven2/javax/inject/javax.inject/1/javax.inject-1.jar ... [ivy:resolve] .. (2kB) [ivy:resolve] .. (0kB) [ivy:resolve] [SUCCESSFUL ] javax.inject#javax.inject;1!javax.inject.jar (33ms) [ivy:resolve] downloading http://repo1.maven.org/maven2/aopalliance/aopalliance/1.0/aopalliance-1.0.jar ... [ivy:resolve] .. (4kB) [ivy:resolve] .. (0kB) [ivy:resolve] [SUCCESSFUL ] aopalliance#aopalliance;1.0!aopalliance.jar (22ms) [ivy:resolve] downloading http://repo1.maven.org/maven2/org/sonatype/sisu/inject/cglib/2.2.1-v20090111/cglib-2.2.1-v20090111.jar ... [ivy:resolve] ...... (272kB) [ivy:resolve] .. (0kB) [ivy:resolve] [SUCCESSFUL ] org.sonatype.sisu.inject#cglib;2.2.1-v20090111!cglib.jar (29ms) [ivy:resolve] downloading http://repo1.maven.org/maven2/org/apache/hadoop/hadoop-yarn-client/2.0.5-alpha/hadoop-yarn-client-2.0.5-alpha.jar ... [ivy:resolve] .. (28kB) [ivy:resolve] .. (0kB) [ivy:resolve] [SUCCESSFUL ] org.apache.hadoop#hadoop-yarn-client;2.0.5-alpha!hadoop-yarn-client.jar (40ms) [ivy:resolve] downloading http://repo1.maven.org/maven2/org/apache/hadoop/hadoop-yarn-server-common/2.0.5-alpha/hadoop-yarn-server-common-2.0.5-alpha.jar ... [ivy:resolve] .... (148kB) [ivy:resolve] .. 
(0kB) [ivy:resolve] [SUCCESSFUL ] org.apache.hadoop#hadoop-yarn-server-common;2.0.5-alpha!hadoop-yarn-server-common.jar (35ms) [ivy:resolve] downloading http://repo1.maven.org/maven2/org/apache/hadoop/hadoop-yarn-server-nodemanager/2.0.5-alpha/hadoop-yarn-server-nodemanager-2.0.5-alpha.jar ... [ivy:resolve] ........ (404kB) [ivy:resolve] .. (0kB) [ivy:resolve] [SUCCESSFUL ] org.apache.hadoop#hadoop-yarn-server-nodemanager;2.0.5-alpha!hadoop-yarn-server-nodemanager.jar (39ms) [ivy:resolve] downloading http://repo1.maven.org/maven2/org/apache/hadoop/hadoop-yarn-server-resourcemanager/2.0.5-alpha/hadoop-yarn-server-resourcemanager-2.0.5-alpha.jar ... [ivy:resolve] .......... (517kB) [ivy:resolve] .. (0kB) [ivy:resolve] [SUCCESSFUL ] org.apache.hadoop#hadoop-yarn-server-resourcemanager;2.0.5-alpha!hadoop-yarn-server-resourcemanager.jar (63ms) [ivy:resolve] downloading http://repo1.maven.org/maven2/org/apache/hadoop/hadoop-yarn-server-web-proxy/2.0.5-alpha/hadoop-yarn-server-web-proxy-2.0.5-alpha.jar ... [ivy:resolve] .. (24kB) [ivy:resolve] .. (0kB) [ivy:resolve] [SUCCESSFUL ] org.apache.hadoop#hadoop-yarn-server-web-proxy;2.0.5-alpha!hadoop-yarn-server-web-proxy.jar (31ms) [ivy:resolve] downloading http://repo1.maven.org/maven2/org/apache/hadoop/hadoop-mapreduce-client-shuffle/2.0.5-alpha/hadoop-mapreduce-client-shuffle-2.0.5-alpha.jar ... [ivy:resolve] .. (20kB) [ivy:resolve] .. (0kB) [ivy:resolve] [SUCCESSFUL ] org.apache.hadoop#hadoop-mapreduce-client-shuffle;2.0.5-alpha!hadoop-mapreduce-client-shuffle.jar (28ms) [ivy:resolve] :: resolution report :: resolve 55392ms :: artifacts dl 2424ms --------------------------------------------------------------------- | | modules || artifacts | | conf | number| search|dwnlded|evicted|| number|dwnlded| --------------------------------------------------------------------- | hadoop0.23.shim | 74 | 47 | 47 | 0 || 77 | 50 | --------------------------------------------------------------------- ivy-retrieve-hadoop-shim: [echo] Project: shims [ivy:retrieve] :: retrieving :: org.apache.hive#hive-shims [ivy:retrieve] confs: [hadoop0.23.shim] [ivy:retrieve] 77 artifacts copied, 0 already retrieved (31997kB/132ms) [javac] Compiling 2 source files to /data/hive-ptest/working/apache-svn-trunk-source/build/shims/classes [javac] Note: /data/hive-ptest/working/apache-svn-trunk-source/shims/src/0.23/java/org/apache/hadoop/hive/shims/Hadoop23Shims.java uses or overrides a deprecated API. [javac] Note: Recompile with -Xlint:deprecation for details. 
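The DBTokenStore.java warning at the top of this log is the standard javac complaint about passing a bare null as the last argument of a varargs method. Below is a minimal, self-contained sketch of how the warning arises and how the two casts the compiler suggests silence it; it is not the actual DBTokenStore code, and the invoke helper is made up purely for illustration.

import java.util.Arrays;

public class VarargsWarningDemo {

    // Hypothetical reflection-style helper with a Class<?>... varargs parameter,
    // loosely modelled on the invokeOnRawStore call flagged in the build log.
    static Object invoke(String methodName, Object[] args, Class<?>... argTypes) {
        return methodName + " / " + Arrays.toString(args) + " / " + Arrays.toString(argTypes);
    }

    public static void main(String[] args) {
        // invoke("getAllTokenIdentifiers", null, null);
        //   -> javac: non-varargs call of varargs method with inexact argument type for last parameter

        // Treated as a varargs call with a single null element:
        Object a = invoke("getAllTokenIdentifiers", null, (Class<?>) null);

        // Treated as a non-varargs call (null array); this is the cast that suppresses the warning:
        Object b = invoke("getAllTokenIdentifiers", null, (Class<?>[]) null);

        System.out.println(a);
        System.out.println(b);
    }
}

Casting to Class<?>[] is what the compiler means by "a non-varargs call", and that is the variant that makes the warning go away without changing behavior.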
jar: [echo] Project: shims [jar] Building jar: /data/hive-ptest/working/apache-svn-trunk-source/build/shims/hive-shims-0.12.0-SNAPSHOT.jar :: delivering :: org.apache.hive#hive-shims;0.12.0-SNAPSHOT :: 0.12.0-SNAPSHOT :: integration :: Thu Aug 08 14:12:06 EDT 2013 delivering ivy file to /data/hive-ptest/working/apache-svn-trunk-source/build/shims/ivy-0.12.0-SNAPSHOT.xml :: publishing :: org.apache.hive#hive-shims published hive-shims to /data/hive-ptest/working/ivy/local/org.apache.hive/hive-shims/0.12.0-SNAPSHOT/jars/hive-shims.jar published ivy to /data/hive-ptest/working/ivy/local/org.apache.hive/hive-shims/0.12.0-SNAPSHOT/ivys/ivy.xml ivy-init-settings: [echo] Project: common check-ivy: [echo] Project: common ivy-resolve: [echo] Project: common [ivy:resolve] :: loading settings :: file = /data/hive-ptest/working/apache-svn-trunk-source/ivy/ivysettings.xml [ivy:resolve] :: resolving dependencies :: org.apache.hive#hive-common;0.12.0-SNAPSHOT [ivy:resolve] confs: [default] [ivy:resolve] found org.apache.hive#hive-shims;0.12.0-SNAPSHOT in local [ivy:resolve] found commons-cli#commons-cli;1.2 in maven2 [ivy:resolve] found org.apache.commons#commons-compress;1.4.1 in maven2 [ivy:resolve] found org.tukaani#xz;1.0 in maven2 [ivy:resolve] found commons-lang#commons-lang;2.4 in maven2 [ivy:resolve] found log4j#log4j;1.2.16 in maven2 [ivy:resolve] downloading /data/hive-ptest/working/ivy/local/org.apache.hive/hive-shims/0.12.0-SNAPSHOT/jars/hive-shims.jar ... [ivy:resolve] .... (128kB) [ivy:resolve] .. (0kB) [ivy:resolve] [SUCCESSFUL ] org.apache.hive#hive-shims;0.12.0-SNAPSHOT!hive-shims.jar (4ms) [ivy:resolve] downloading http://repo1.maven.org/maven2/org/apache/commons/commons-compress/1.4.1/commons-compress-1.4.1.jar ... [ivy:resolve] ..... (235kB) [ivy:resolve] .. (0kB) [ivy:resolve] [SUCCESSFUL ] org.apache.commons#commons-compress;1.4.1!commons-compress.jar (24ms) [ivy:resolve] downloading http://repo1.maven.org/maven2/org/tukaani/xz/1.0/xz-1.0.jar ... [ivy:resolve] ... (92kB) [ivy:resolve] .. 
(0kB) [ivy:resolve] [SUCCESSFUL ] org.tukaani#xz;1.0!xz.jar (22ms) [ivy:resolve] :: resolution report :: resolve 1632ms :: artifacts dl 57ms --------------------------------------------------------------------- | | modules || artifacts | | conf | number| search|dwnlded|evicted|| number|dwnlded| --------------------------------------------------------------------- | default | 6 | 3 | 3 | 0 || 6 | 3 | --------------------------------------------------------------------- [ivy:report] Processing /data/hive-ptest/working/apache-svn-trunk-source/build/ivy/resolution-cache/org.apache.hive-hive-common-default.xml to /data/hive-ptest/working/apache-svn-trunk-source/build/ivy/report/org.apache.hive-hive-common-default.html make-pom: [echo] Project: common [echo] Writing POM to /data/hive-ptest/working/apache-svn-trunk-source/build/common/pom.xml [ivy:makepom] DEPRECATED: 'ivy.conf.file' is deprecated, use 'ivy.settings.file' instead [ivy:makepom] :: loading settings :: file = /data/hive-ptest/working/apache-svn-trunk-source/ivy/ivysettings.xml create-dirs: [echo] Project: common init: [echo] Project: common setup: [echo] Project: common ivy-retrieve: [echo] Project: common [ivy:retrieve] :: retrieving :: org.apache.hive#hive-common [ivy:retrieve] confs: [default] [ivy:retrieve] 4 artifacts copied, 2 already retrieved (497kB/13ms) compile: [echo] Project: common [javac] Compiling 25 source files to /data/hive-ptest/working/apache-svn-trunk-source/build/common/classes [javac] Note: /data/hive-ptest/working/apache-svn-trunk-source/common/src/java/org/apache/hadoop/hive/common/ObjectPair.java uses unchecked or unsafe operations. [javac] Note: Recompile with -Xlint:unchecked for details. [copy] Copying 1 file to /data/hive-ptest/working/apache-svn-trunk-source/build/common/classes jar: [echo] Project: common [jar] Building jar: /data/hive-ptest/working/apache-svn-trunk-source/build/common/hive-common-0.12.0-SNAPSHOT.jar :: delivering :: org.apache.hive#hive-common;0.12.0-SNAPSHOT :: 0.12.0-SNAPSHOT :: integration :: Thu Aug 08 14:12:16 EDT 2013 delivering ivy file to /data/hive-ptest/working/apache-svn-trunk-source/build/common/ivy-0.12.0-SNAPSHOT.xml :: publishing :: org.apache.hive#hive-common published hive-common to /data/hive-ptest/working/ivy/local/org.apache.hive/hive-common/0.12.0-SNAPSHOT/jars/hive-common.jar published ivy to /data/hive-ptest/working/ivy/local/org.apache.hive/hive-common/0.12.0-SNAPSHOT/ivys/ivy.xml ivy-init-settings: [echo] Project: serde check-ivy: [echo] Project: serde ivy-resolve: [echo] Project: serde [ivy:resolve] :: loading settings :: file = /data/hive-ptest/working/apache-svn-trunk-source/ivy/ivysettings.xml [ivy:resolve] :: resolving dependencies :: org.apache.hive#hive-serde;0.12.0-SNAPSHOT [ivy:resolve] confs: [default] [ivy:resolve] found org.apache.hive#hive-common;0.12.0-SNAPSHOT in local [ivy:resolve] found org.apache.hive#hive-shims;0.12.0-SNAPSHOT in local [ivy:resolve] found commons-cli#commons-cli;1.2 in maven2 [ivy:resolve] found org.apache.commons#commons-compress;1.4.1 in maven2 [ivy:resolve] found org.tukaani#xz;1.0 in maven2 [ivy:resolve] found commons-lang#commons-lang;2.4 in maven2 [ivy:resolve] found log4j#log4j;1.2.16 in maven2 [ivy:resolve] found org.slf4j#slf4j-api;1.6.1 in maven2 [ivy:resolve] found org.slf4j#slf4j-log4j12;1.6.1 in maven2 [ivy:resolve] found org.mockito#mockito-all;1.8.2 in maven2 [ivy:resolve] found org.apache.thrift#libfb303;0.9.0 in maven2 [ivy:resolve] found commons-codec#commons-codec;1.4 in maven2 [ivy:resolve] found 
org.apache.avro#avro;1.7.1 in maven2 [ivy:resolve] found org.apache.avro#avro-mapred;1.7.1 in maven2 [ivy:resolve] downloading /data/hive-ptest/working/ivy/local/org.apache.hive/hive-common/0.12.0-SNAPSHOT/jars/hive-common.jar ... [ivy:resolve] ... (95kB) [ivy:resolve] .. (0kB) [ivy:resolve] [SUCCESSFUL ] org.apache.hive#hive-common;0.12.0-SNAPSHOT!hive-common.jar (4ms) [ivy:resolve] downloading http://repo1.maven.org/maven2/org/mockito/mockito-all/1.8.2/mockito-all-1.8.2.jar ... [ivy:resolve] ...................... (1315kB) [ivy:resolve] .. (0kB) [ivy:resolve] [SUCCESSFUL ] org.mockito#mockito-all;1.8.2!mockito-all.jar (59ms) [ivy:resolve] downloading http://repo1.maven.org/maven2/org/apache/thrift/libfb303/0.9.0/libfb303-0.9.0.jar ... [ivy:resolve] ...... (268kB) [ivy:resolve] .. (0kB) [ivy:resolve] [SUCCESSFUL ] org.apache.thrift#libfb303;0.9.0!libfb303.jar (25ms) [ivy:resolve] downloading http://repo1.maven.org/maven2/org/apache/avro/avro/1.7.1/avro-1.7.1.jar ... [ivy:resolve] ...... (290kB) [ivy:resolve] .. (0kB) [ivy:resolve] [SUCCESSFUL ] org.apache.avro#avro;1.7.1!avro.jar (25ms) [ivy:resolve] downloading http://repo1.maven.org/maven2/org/apache/avro/avro-mapred/1.7.1/avro-mapred-1.7.1.jar ... [ivy:resolve] .... (164kB) [ivy:resolve] .. (0kB) [ivy:resolve] [SUCCESSFUL ] org.apache.avro#avro-mapred;1.7.1!avro-mapred.jar (23ms) [ivy:resolve] :: resolution report :: resolve 5164ms :: artifacts dl 154ms --------------------------------------------------------------------- | | modules || artifacts | | conf | number| search|dwnlded|evicted|| number|dwnlded| --------------------------------------------------------------------- | default | 14 | 5 | 5 | 0 || 14 | 5 | --------------------------------------------------------------------- [ivy:report] Processing /data/hive-ptest/working/apache-svn-trunk-source/build/ivy/resolution-cache/org.apache.hive-hive-serde-default.xml to /data/hive-ptest/working/apache-svn-trunk-source/build/ivy/report/org.apache.hive-hive-serde-default.html make-pom: [echo] Project: serde [echo] Writing POM to /data/hive-ptest/working/apache-svn-trunk-source/build/serde/pom.xml [ivy:makepom] DEPRECATED: 'ivy.conf.file' is deprecated, use 'ivy.settings.file' instead [ivy:makepom] :: loading settings :: file = /data/hive-ptest/working/apache-svn-trunk-source/ivy/ivysettings.xml create-dirs: [echo] Project: serde [copy] Warning: /data/hive-ptest/working/apache-svn-trunk-source/serde/src/test/resources does not exist. init: [echo] Project: serde ivy-retrieve: [echo] Project: serde [ivy:retrieve] :: retrieving :: org.apache.hive#hive-serde [ivy:retrieve] confs: [default] [ivy:retrieve] 8 artifacts copied, 6 already retrieved (2227kB/11ms) dynamic-serde: compile: [echo] Project: serde [javac] Compiling 325 source files to /data/hive-ptest/working/apache-svn-trunk-source/build/serde/classes [javac] Note: Some input files use or override a deprecated API. [javac] Note: Recompile with -Xlint:deprecation for details. [javac] Note: Some input files use unchecked or unsafe operations. [javac] Note: Recompile with -Xlint:unchecked for details. 
[javac] Creating empty /data/hive-ptest/working/apache-svn-trunk-source/build/serde/classes/org/apache/hadoop/hive/serde2/typeinfo/package-info.class jar: [echo] Project: serde [jar] Building jar: /data/hive-ptest/working/apache-svn-trunk-source/build/serde/hive-serde-0.12.0-SNAPSHOT.jar :: delivering :: org.apache.hive#hive-serde;0.12.0-SNAPSHOT :: 0.12.0-SNAPSHOT :: integration :: Thu Aug 08 14:12:30 EDT 2013 delivering ivy file to /data/hive-ptest/working/apache-svn-trunk-source/build/serde/ivy-0.12.0-SNAPSHOT.xml :: publishing :: org.apache.hive#hive-serde published hive-serde to /data/hive-ptest/working/ivy/local/org.apache.hive/hive-serde/0.12.0-SNAPSHOT/jars/hive-serde.jar published ivy to /data/hive-ptest/working/ivy/local/org.apache.hive/hive-serde/0.12.0-SNAPSHOT/ivys/ivy.xml ivy-init-settings: [echo] Project: metastore check-ivy: [echo] Project: metastore ivy-resolve: [echo] Project: metastore [ivy:resolve] :: loading settings :: file = /data/hive-ptest/working/apache-svn-trunk-source/ivy/ivysettings.xml [ivy:resolve] :: resolving dependencies :: org.apache.hive#hive-metastore;0.12.0-SNAPSHOT [ivy:resolve] confs: [default] [ivy:resolve] found org.apache.hive#hive-serde;0.12.0-SNAPSHOT in local [ivy:resolve] found org.apache.hive#hive-common;0.12.0-SNAPSHOT in local [ivy:resolve] found org.apache.hive#hive-shims;0.12.0-SNAPSHOT in local [ivy:resolve] found commons-cli#commons-cli;1.2 in maven2 [ivy:resolve] found org.apache.commons#commons-compress;1.4.1 in maven2 [ivy:resolve] found org.tukaani#xz;1.0 in maven2 [ivy:resolve] found commons-lang#commons-lang;2.4 in maven2 [ivy:resolve] found log4j#log4j;1.2.16 in maven2 [ivy:resolve] found org.slf4j#slf4j-api;1.6.1 in maven2 [ivy:resolve] found org.slf4j#slf4j-log4j12;1.6.1 in maven2 [ivy:resolve] found org.mockito#mockito-all;1.8.2 in maven2 [ivy:resolve] found org.apache.thrift#libfb303;0.9.0 in maven2 [ivy:resolve] found commons-codec#commons-codec;1.4 in maven2 [ivy:resolve] found org.apache.avro#avro;1.7.1 in maven2 [ivy:resolve] found org.apache.avro#avro-mapred;1.7.1 in maven2 [ivy:resolve] found org.antlr#antlr;3.4 in maven2 [ivy:resolve] found org.antlr#antlr-runtime;3.4 in maven2 [ivy:resolve] found org.antlr#ST4;4.0.4 in maven2 [ivy:resolve] found com.jolbox#bonecp;0.7.1.RELEASE in maven2 [ivy:resolve] found com.google.guava#guava;r08 in maven2 [ivy:resolve] found commons-pool#commons-pool;1.5.4 in maven2 [ivy:resolve] found org.datanucleus#datanucleus-api-jdo;3.2.1 in maven2 [ivy:resolve] found org.datanucleus#datanucleus-core;3.2.2 in maven2 [ivy:resolve] found org.datanucleus#datanucleus-rdbms;3.2.1 in maven2 [ivy:resolve] found javax.jdo#jdo-api;3.0.1 in maven2 [ivy:resolve] found org.apache.derby#derby;10.4.2.0 in maven2 [ivy:resolve] downloading /data/hive-ptest/working/ivy/local/org.apache.hive/hive-serde/0.12.0-SNAPSHOT/jars/hive-serde.jar ... [ivy:resolve] ............ (660kB) [ivy:resolve] .. (0kB) [ivy:resolve] [SUCCESSFUL ] org.apache.hive#hive-serde;0.12.0-SNAPSHOT!hive-serde.jar (18ms) [ivy:resolve] downloading http://repo1.maven.org/maven2/org/antlr/antlr/3.4/antlr-3.4.jar ... [ivy:resolve] ................... (1086kB) [ivy:resolve] .. (0kB) [ivy:resolve] [SUCCESSFUL ] org.antlr#antlr;3.4!antlr.jar (31ms) [ivy:resolve] downloading http://repo1.maven.org/maven2/org/antlr/antlr-runtime/3.4/antlr-runtime-3.4.jar ... [ivy:resolve] .... (160kB) [ivy:resolve] .. 
(0kB) [ivy:resolve] [SUCCESSFUL ] org.antlr#antlr-runtime;3.4!antlr-runtime.jar (23ms) [ivy:resolve] downloading http://repo1.maven.org/maven2/org/antlr/ST4/4.0.4/ST4-4.0.4.jar ... [ivy:resolve] ..... (231kB) [ivy:resolve] .. (0kB) [ivy:resolve] [SUCCESSFUL ] org.antlr#ST4;4.0.4!ST4.jar (25ms) [ivy:resolve] downloading http://repo1.maven.org/maven2/com/jolbox/bonecp/0.7.1.RELEASE/bonecp-0.7.1.RELEASE.jar ... [ivy:resolve] ... (112kB) [ivy:resolve] .. (0kB) [ivy:resolve] [SUCCESSFUL ] com.jolbox#bonecp;0.7.1.RELEASE!bonecp.jar(bundle) (23ms) [ivy:resolve] downloading http://repo1.maven.org/maven2/commons-pool/commons-pool/1.5.4/commons-pool-1.5.4.jar ... [ivy:resolve] ... (93kB) [ivy:resolve] .. (0kB) [ivy:resolve] [SUCCESSFUL ] commons-pool#commons-pool;1.5.4!commons-pool.jar (28ms) [ivy:resolve] downloading http://repo1.maven.org/maven2/org/datanucleus/datanucleus-api-jdo/3.2.1/datanucleus-api-jdo-3.2.1.jar ... [ivy:resolve] ....... (329kB) [ivy:resolve] .. (0kB) [ivy:resolve] [SUCCESSFUL ] org.datanucleus#datanucleus-api-jdo;3.2.1!datanucleus-api-jdo.jar (15ms) [ivy:resolve] downloading http://repo1.maven.org/maven2/org/datanucleus/datanucleus-core/3.2.2/datanucleus-core-3.2.2.jar ... [ivy:resolve] ............................. (1759kB) [ivy:resolve] .. (0kB) [ivy:resolve] [SUCCESSFUL ] org.datanucleus#datanucleus-core;3.2.2!datanucleus-core.jar (34ms) [ivy:resolve] downloading http://repo1.maven.org/maven2/org/datanucleus/datanucleus-rdbms/3.2.1/datanucleus-rdbms-3.2.1.jar ... [ivy:resolve] .............................. (1728kB) [ivy:resolve] .. (0kB) [ivy:resolve] [SUCCESSFUL ] org.datanucleus#datanucleus-rdbms;3.2.1!datanucleus-rdbms.jar (33ms) [ivy:resolve] downloading http://repo1.maven.org/maven2/javax/jdo/jdo-api/3.0.1/jdo-api-3.0.1.jar ... [ivy:resolve] ..... (196kB) [ivy:resolve] .. (0kB) [ivy:resolve] [SUCCESSFUL ] javax.jdo#jdo-api;3.0.1!jdo-api.jar (24ms) [ivy:resolve] downloading http://repo1.maven.org/maven2/org/apache/derby/derby/10.4.2.0/derby-10.4.2.0.jar ... [ivy:resolve] ........................................ (2389kB) [ivy:resolve] .. (0kB) [ivy:resolve] [SUCCESSFUL ] org.apache.derby#derby;10.4.2.0!derby.jar (60ms) [ivy:resolve] downloading http://repo1.maven.org/maven2/com/google/guava/guava/r08/guava-r08.jar ... [ivy:resolve] ................... (1088kB) [ivy:resolve] .. 
(0kB) [ivy:resolve] [SUCCESSFUL ] com.google.guava#guava;r08!guava.jar (45ms) [ivy:resolve] :: resolution report :: resolve 7882ms :: artifacts dl 386ms [ivy:resolve] :: evicted modules: [ivy:resolve] org.slf4j#slf4j-api;1.5.10 by [org.slf4j#slf4j-api;1.6.1] in [default] --------------------------------------------------------------------- | | modules || artifacts | | conf | number| search|dwnlded|evicted|| number|dwnlded| --------------------------------------------------------------------- | default | 27 | 12 | 12 | 1 || 26 | 12 | --------------------------------------------------------------------- [ivy:report] Processing /data/hive-ptest/working/apache-svn-trunk-source/build/ivy/resolution-cache/org.apache.hive-hive-metastore-default.xml to /data/hive-ptest/working/apache-svn-trunk-source/build/ivy/report/org.apache.hive-hive-metastore-default.html make-pom: [echo] Project: metastore [echo] Writing POM to /data/hive-ptest/working/apache-svn-trunk-source/build/metastore/pom.xml [ivy:makepom] DEPRECATED: 'ivy.conf.file' is deprecated, use 'ivy.settings.file' instead [ivy:makepom] :: loading settings :: file = /data/hive-ptest/working/apache-svn-trunk-source/ivy/ivysettings.xml create-dirs: [echo] Project: metastore [copy] Warning: /data/hive-ptest/working/apache-svn-trunk-source/metastore/src/test/resources does not exist. init: [echo] Project: metastore metastore-init: [echo] Project: metastore [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/build/metastore/gen/antlr/gen-java/org/apache/hadoop/hive/metastore/parser ivy-retrieve: [echo] Project: metastore [ivy:retrieve] :: retrieving :: org.apache.hive#hive-metastore [ivy:retrieve] confs: [default] [ivy:retrieve] 12 artifacts copied, 14 already retrieved (9836kB/29ms) build-grammar: [echo] Project: metastore [echo] Building Grammar /data/hive-ptest/working/apache-svn-trunk-source/metastore/src/java/org/apache/hadoop/hive/metastore/parser/Filter.g .... model-compile: [echo] Project: metastore [javac] Compiling 24 source files to /data/hive-ptest/working/apache-svn-trunk-source/build/metastore/classes [copy] Copying 1 file to /data/hive-ptest/working/apache-svn-trunk-source/build/metastore/classes core-compile: [echo] Project: metastore [javac] Compiling 104 source files to /data/hive-ptest/working/apache-svn-trunk-source/build/metastore/classes [javac] Note: Some input files use or override a deprecated API. [javac] Note: Recompile with -Xlint:deprecation for details. [javac] Note: Some input files use unchecked or unsafe operations. [javac] Note: Recompile with -Xlint:unchecked for details. [javac] Creating empty /data/hive-ptest/working/apache-svn-trunk-source/build/metastore/classes/org/apache/hadoop/hive/metastore/parser/package-info.class model-enhance: [echo] Project: metastore [datanucleusenhancer] log4j:WARN No appenders could be found for logger (DataNucleus.General). [datanucleusenhancer] log4j:WARN Please initialize the log4j system properly. [datanucleusenhancer] log4j:WARN See http://logging.apache.org/log4j/1.2/faq.html#noconfig for more info. 
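The log4j warnings just above only mean that no appender has been configured for the DataNucleus.General logger when the enhancer runs; they do not affect the build. A minimal sketch, illustrative only and not part of the Hive build scripts, of the usual log4j 1.2 bootstrap that makes the "No appenders could be found" message disappear:

import org.apache.log4j.BasicConfigurator;
import org.apache.log4j.Level;
import org.apache.log4j.Logger;

public class Log4jBootstrap {
    public static void main(String[] args) {
        // Installs a ConsoleAppender on the root logger, so loggers such as
        // "DataNucleus.General" no longer trigger the "no appenders" warning.
        BasicConfigurator.configure();
        Logger.getRootLogger().setLevel(Level.INFO);

        Logger.getLogger("DataNucleus.General").info("log4j configured");
    }
}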
[datanucleusenhancer] DataNucleus Enhancer (version 3.2.2) for API "JDO" using JRE "1.6" [datanucleusenhancer] DataNucleus Enhancer : Classpath [datanucleusenhancer] >> /data/hive-ptest/working/apache-svn-trunk-source/build/service/classes [datanucleusenhancer] >> /data/hive-ptest/working/apache-svn-trunk-source/build/common/classes [datanucleusenhancer] >> /data/hive-ptest/working/apache-svn-trunk-source/build/serde/classes [datanucleusenhancer] >> /data/hive-ptest/working/apache-svn-trunk-source/build/metastore/classes [datanucleusenhancer] >> /data/hive-ptest/working/apache-svn-trunk-source/build/ql/classes [datanucleusenhancer] >> /data/hive-ptest/working/apache-svn-trunk-source/build/beeline/classes [datanucleusenhancer] >> /data/hive-ptest/working/apache-svn-trunk-source/build/cli/classes [datanucleusenhancer] >> /data/hive-ptest/working/apache-svn-trunk-source/build/shims/classes [datanucleusenhancer] >> /data/hive-ptest/working/apache-svn-trunk-source/build/hwi/classes [datanucleusenhancer] >> /data/hive-ptest/working/apache-svn-trunk-source/build/jdbc/classes [datanucleusenhancer] >> /data/hive-ptest/working/apache-svn-trunk-source/build/hbase-handler/classes [datanucleusenhancer] >> /data/hive-ptest/working/apache-svn-trunk-source/build/anttasks/hive-anttasks-0.12.0-SNAPSHOT.jar [datanucleusenhancer] >> /data/hive-ptest/working/apache-svn-trunk-source/build/common/hive-common-0.12.0-SNAPSHOT.jar [datanucleusenhancer] >> /data/hive-ptest/working/apache-svn-trunk-source/build/serde/hive-serde-0.12.0-SNAPSHOT.jar [datanucleusenhancer] >> /data/hive-ptest/working/apache-svn-trunk-source/build/shims/hive-shims-0.12.0-SNAPSHOT.jar [datanucleusenhancer] >> /data/hive-ptest/working/apache-svn-trunk-source/build/ivy/lib/hadoop0.20S.shim/activation-1.1.jar [datanucleusenhancer] >> /data/hive-ptest/working/apache-svn-trunk-source/build/ivy/lib/hadoop0.20S.shim/ant-1.6.5.jar [datanucleusenhancer] >> /data/hive-ptest/working/apache-svn-trunk-source/build/ivy/lib/hadoop0.20S.shim/asm-3.1.jar [datanucleusenhancer] >> /data/hive-ptest/working/apache-svn-trunk-source/build/ivy/lib/hadoop0.20S.shim/commons-beanutils-1.7.0.jar [datanucleusenhancer] >> /data/hive-ptest/working/apache-svn-trunk-source/build/ivy/lib/hadoop0.20S.shim/commons-beanutils-core-1.8.0.jar [datanucleusenhancer] >> /data/hive-ptest/working/apache-svn-trunk-source/build/ivy/lib/hadoop0.20S.shim/commons-cli-1.2.jar [datanucleusenhancer] >> /data/hive-ptest/working/apache-svn-trunk-source/build/ivy/lib/hadoop0.20S.shim/commons-codec-1.4.jar [datanucleusenhancer] >> /data/hive-ptest/working/apache-svn-trunk-source/build/ivy/lib/hadoop0.20S.shim/commons-collections-3.2.1.jar [datanucleusenhancer] >> /data/hive-ptest/working/apache-svn-trunk-source/build/ivy/lib/hadoop0.20S.shim/commons-configuration-1.6.jar [datanucleusenhancer] >> /data/hive-ptest/working/apache-svn-trunk-source/build/ivy/lib/hadoop0.20S.shim/commons-digester-1.8.jar [datanucleusenhancer] >> /data/hive-ptest/working/apache-svn-trunk-source/build/ivy/lib/hadoop0.20S.shim/commons-el-1.0.jar [datanucleusenhancer] >> /data/hive-ptest/working/apache-svn-trunk-source/build/ivy/lib/hadoop0.20S.shim/commons-httpclient-3.0.1.jar [datanucleusenhancer] >> /data/hive-ptest/working/apache-svn-trunk-source/build/ivy/lib/hadoop0.20S.shim/commons-io-2.1.jar [datanucleusenhancer] >> /data/hive-ptest/working/apache-svn-trunk-source/build/ivy/lib/hadoop0.20S.shim/commons-lang-2.4.jar [datanucleusenhancer] >> 
/data/hive-ptest/working/apache-svn-trunk-source/build/ivy/lib/hadoop0.20S.shim/commons-logging-1.1.1.jar [datanucleusenhancer] >> /data/hive-ptest/working/apache-svn-trunk-source/build/ivy/lib/hadoop0.20S.shim/commons-math-2.1.jar [datanucleusenhancer] >> /data/hive-ptest/working/apache-svn-trunk-source/build/ivy/lib/hadoop0.20S.shim/commons-net-1.4.1.jar [datanucleusenhancer] >> /data/hive-ptest/working/apache-svn-trunk-source/build/ivy/lib/hadoop0.20S.shim/core-3.1.1.jar [datanucleusenhancer] >> /data/hive-ptest/working/apache-svn-trunk-source/build/ivy/lib/hadoop0.20S.shim/ftplet-api-1.0.0.jar [datanucleusenhancer] >> /data/hive-ptest/working/apache-svn-trunk-source/build/ivy/lib/hadoop0.20S.shim/ftpserver-core-1.0.0.jar [datanucleusenhancer] >> /data/hive-ptest/working/apache-svn-trunk-source/build/ivy/lib/hadoop0.20S.shim/ftpserver-deprecated-1.0.0-M2.jar [datanucleusenhancer] >> /data/hive-ptest/working/apache-svn-trunk-source/build/ivy/lib/hadoop0.20S.shim/hadoop-core-1.1.2.jar [datanucleusenhancer] >> /data/hive-ptest/working/apache-svn-trunk-source/build/ivy/lib/hadoop0.20S.shim/hadoop-test-1.1.2.jar [datanucleusenhancer] >> /data/hive-ptest/working/apache-svn-trunk-source/build/ivy/lib/hadoop0.20S.shim/hadoop-tools-1.1.2.jar [datanucleusenhancer] >> /data/hive-ptest/working/apache-svn-trunk-source/build/ivy/lib/hadoop0.20S.shim/hsqldb-1.8.0.10.jar [datanucleusenhancer] >> /data/hive-ptest/working/apache-svn-trunk-source/build/ivy/lib/hadoop0.20S.shim/jackson-core-asl-1.8.8.jar [datanucleusenhancer] >> /data/hive-ptest/working/apache-svn-trunk-source/build/ivy/lib/hadoop0.20S.shim/jackson-jaxrs-1.7.1.jar [datanucleusenhancer] >> /data/hive-ptest/working/apache-svn-trunk-source/build/ivy/lib/hadoop0.20S.shim/jackson-mapper-asl-1.8.8.jar [datanucleusenhancer] >> /data/hive-ptest/working/apache-svn-trunk-source/build/ivy/lib/hadoop0.20S.shim/jackson-xc-1.7.1.jar [datanucleusenhancer] >> /data/hive-ptest/working/apache-svn-trunk-source/build/ivy/lib/hadoop0.20S.shim/jasper-compiler-5.5.12.jar [datanucleusenhancer] >> /data/hive-ptest/working/apache-svn-trunk-source/build/ivy/lib/hadoop0.20S.shim/jasper-runtime-5.5.12.jar [datanucleusenhancer] >> /data/hive-ptest/working/apache-svn-trunk-source/build/ivy/lib/hadoop0.20S.shim/jaxb-api-2.2.2.jar [datanucleusenhancer] >> /data/hive-ptest/working/apache-svn-trunk-source/build/ivy/lib/hadoop0.20S.shim/jaxb-impl-2.2.3-1.jar [datanucleusenhancer] >> /data/hive-ptest/working/apache-svn-trunk-source/build/ivy/lib/hadoop0.20S.shim/jersey-core-1.8.jar [datanucleusenhancer] >> /data/hive-ptest/working/apache-svn-trunk-source/build/ivy/lib/hadoop0.20S.shim/jersey-json-1.8.jar [datanucleusenhancer] >> /data/hive-ptest/working/apache-svn-trunk-source/build/ivy/lib/hadoop0.20S.shim/jersey-server-1.8.jar [datanucleusenhancer] >> /data/hive-ptest/working/apache-svn-trunk-source/build/ivy/lib/hadoop0.20S.shim/jets3t-0.6.1.jar [datanucleusenhancer] >> /data/hive-ptest/working/apache-svn-trunk-source/build/ivy/lib/hadoop0.20S.shim/jettison-1.1.jar [datanucleusenhancer] >> /data/hive-ptest/working/apache-svn-trunk-source/build/ivy/lib/hadoop0.20S.shim/jetty-6.1.26.jar [datanucleusenhancer] >> /data/hive-ptest/working/apache-svn-trunk-source/build/ivy/lib/hadoop0.20S.shim/jetty-util-6.1.26.jar [datanucleusenhancer] >> /data/hive-ptest/working/apache-svn-trunk-source/build/ivy/lib/hadoop0.20S.shim/jsp-2.1-6.1.14.jar [datanucleusenhancer] >> /data/hive-ptest/working/apache-svn-trunk-source/build/ivy/lib/hadoop0.20S.shim/jsp-api-2.1-6.1.14.jar 
[datanucleusenhancer] >> /data/hive-ptest/working/apache-svn-trunk-source/build/ivy/lib/hadoop0.20S.shim/junit-3.8.1.jar [datanucleusenhancer] >> /data/hive-ptest/working/apache-svn-trunk-source/build/ivy/lib/hadoop0.20S.shim/mina-core-2.0.0-M5.jar [datanucleusenhancer] >> /data/hive-ptest/working/apache-svn-trunk-source/build/ivy/lib/hadoop0.20S.shim/oro-2.0.8.jar [datanucleusenhancer] >> /data/hive-ptest/working/apache-svn-trunk-source/build/ivy/lib/hadoop0.20S.shim/servlet-api-2.5-20081211.jar [datanucleusenhancer] >> /data/hive-ptest/working/apache-svn-trunk-source/build/ivy/lib/hadoop0.20S.shim/servlet-api-2.5-6.1.14.jar [datanucleusenhancer] >> /data/hive-ptest/working/apache-svn-trunk-source/build/ivy/lib/hadoop0.20S.shim/slf4j-api-1.5.2.jar [datanucleusenhancer] >> /data/hive-ptest/working/apache-svn-trunk-source/build/ivy/lib/hadoop0.20S.shim/stax-api-1.0-2.jar [datanucleusenhancer] >> /data/hive-ptest/working/apache-svn-trunk-source/build/ivy/lib/hadoop0.20S.shim/stax-api-1.0.1.jar [datanucleusenhancer] >> /data/hive-ptest/working/apache-svn-trunk-source/build/ivy/lib/hadoop0.20S.shim/xmlenc-0.52.jar [datanucleusenhancer] >> /data/hive-ptest/working/apache-svn-trunk-source/build/ivy/lib/default/ST4-4.0.4.jar [datanucleusenhancer] >> /data/hive-ptest/working/apache-svn-trunk-source/build/ivy/lib/default/antlr-3.4.jar [datanucleusenhancer] >> /data/hive-ptest/working/apache-svn-trunk-source/build/ivy/lib/default/antlr-runtime-3.4.jar [datanucleusenhancer] >> /data/hive-ptest/working/apache-svn-trunk-source/build/ivy/lib/default/avro-1.7.1.jar [datanucleusenhancer] >> /data/hive-ptest/working/apache-svn-trunk-source/build/ivy/lib/default/avro-mapred-1.7.1.jar [datanucleusenhancer] >> /data/hive-ptest/working/apache-svn-trunk-source/build/ivy/lib/default/bonecp-0.7.1.RELEASE.jar [datanucleusenhancer] >> /data/hive-ptest/working/apache-svn-trunk-source/build/ivy/lib/default/commons-cli-1.2.jar [datanucleusenhancer] >> /data/hive-ptest/working/apache-svn-trunk-source/build/ivy/lib/default/commons-codec-1.4.jar [datanucleusenhancer] >> /data/hive-ptest/working/apache-svn-trunk-source/build/ivy/lib/default/commons-compress-1.4.1.jar [datanucleusenhancer] >> /data/hive-ptest/working/apache-svn-trunk-source/build/ivy/lib/default/commons-io-2.4.jar [datanucleusenhancer] >> /data/hive-ptest/working/apache-svn-trunk-source/build/ivy/lib/default/commons-lang-2.4.jar [datanucleusenhancer] >> /data/hive-ptest/working/apache-svn-trunk-source/build/ivy/lib/default/commons-logging-1.0.4.jar [datanucleusenhancer] >> /data/hive-ptest/working/apache-svn-trunk-source/build/ivy/lib/default/commons-logging-api-1.0.4.jar [datanucleusenhancer] >> /data/hive-ptest/working/apache-svn-trunk-source/build/ivy/lib/default/commons-pool-1.5.4.jar [datanucleusenhancer] >> /data/hive-ptest/working/apache-svn-trunk-source/build/ivy/lib/default/datanucleus-api-jdo-3.2.1.jar [datanucleusenhancer] >> /data/hive-ptest/working/apache-svn-trunk-source/build/ivy/lib/default/datanucleus-core-3.2.2.jar [datanucleusenhancer] >> /data/hive-ptest/working/apache-svn-trunk-source/build/ivy/lib/default/datanucleus-rdbms-3.2.1.jar [datanucleusenhancer] >> /data/hive-ptest/working/apache-svn-trunk-source/build/ivy/lib/default/derby-10.4.2.0.jar [datanucleusenhancer] >> /data/hive-ptest/working/apache-svn-trunk-source/build/ivy/lib/default/guava-11.0.2.jar [datanucleusenhancer] >> /data/hive-ptest/working/apache-svn-trunk-source/build/ivy/lib/default/guava-r08.jar [datanucleusenhancer] >> 
/data/hive-ptest/working/apache-svn-trunk-source/build/ivy/lib/default/hive-common-0.12.0-SNAPSHOT.jar [datanucleusenhancer] >> /data/hive-ptest/working/apache-svn-trunk-source/build/ivy/lib/default/hive-serde-0.12.0-SNAPSHOT.jar [datanucleusenhancer] >> /data/hive-ptest/working/apache-svn-trunk-source/build/ivy/lib/default/hive-shims-0.12.0-SNAPSHOT.jar [datanucleusenhancer] >> /data/hive-ptest/working/apache-svn-trunk-source/build/ivy/lib/default/jackson-core-asl-1.8.8.jar [datanucleusenhancer] >> /data/hive-ptest/working/apache-svn-trunk-source/build/ivy/lib/default/jackson-mapper-asl-1.8.8.jar [datanucleusenhancer] >> /data/hive-ptest/working/apache-svn-trunk-source/build/ivy/lib/default/jdo-api-3.0.1.jar [datanucleusenhancer] >> /data/hive-ptest/working/apache-svn-trunk-source/build/ivy/lib/default/libfb303-0.9.0.jar [datanucleusenhancer] >> /data/hive-ptest/working/apache-svn-trunk-source/build/ivy/lib/default/libthrift-0.9.0.jar [datanucleusenhancer] >> /data/hive-ptest/working/apache-svn-trunk-source/build/ivy/lib/default/log4j-1.2.16.jar [datanucleusenhancer] >> /data/hive-ptest/working/apache-svn-trunk-source/build/ivy/lib/default/mockito-all-1.8.2.jar [datanucleusenhancer] >> /data/hive-ptest/working/apache-svn-trunk-source/build/ivy/lib/default/slf4j-api-1.6.1.jar [datanucleusenhancer] >> /data/hive-ptest/working/apache-svn-trunk-source/build/ivy/lib/default/slf4j-log4j12-1.6.1.jar [datanucleusenhancer] >> /data/hive-ptest/working/apache-svn-trunk-source/build/ivy/lib/default/velocity-1.5.jar [datanucleusenhancer] >> /data/hive-ptest/working/apache-svn-trunk-source/build/ivy/lib/default/xz-1.0.jar [datanucleusenhancer] >> /data/hive-ptest/working/apache-svn-trunk-source/build/ivy/lib/default/zookeeper-3.4.3.jar [datanucleusenhancer] ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MDatabase [datanucleusenhancer] ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MFieldSchema [datanucleusenhancer] ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MType [datanucleusenhancer] ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MTable [datanucleusenhancer] ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MSerDeInfo [datanucleusenhancer] ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MOrder [datanucleusenhancer] ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MColumnDescriptor [datanucleusenhancer] ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MStringList [datanucleusenhancer] ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MStorageDescriptor [datanucleusenhancer] ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MPartition [datanucleusenhancer] ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MIndex [datanucleusenhancer] ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MRole [datanucleusenhancer] ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MRoleMap [datanucleusenhancer] ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MGlobalPrivilege [datanucleusenhancer] ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MDBPrivilege [datanucleusenhancer] ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MTablePrivilege [datanucleusenhancer] ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MPartitionPrivilege 
[datanucleusenhancer] ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MTableColumnPrivilege [datanucleusenhancer] ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MPartitionColumnPrivilege [datanucleusenhancer] ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MPartitionEvent [datanucleusenhancer] ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MMasterKey [datanucleusenhancer] ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MDelegationToken [datanucleusenhancer] ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MTableColumnStatistics [datanucleusenhancer] ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MPartitionColumnStatistics [datanucleusenhancer] DataNucleus Enhancer completed with success for 24 classes. Timings : input=703 ms, enhance=1142 ms, total=1845 ms. Consult the log for full details compile: [echo] Project: metastore jar: [echo] Project: metastore [jar] Building jar: /data/hive-ptest/working/apache-svn-trunk-source/build/metastore/hive-metastore-0.12.0-SNAPSHOT.jar :: delivering :: org.apache.hive#hive-metastore;0.12.0-SNAPSHOT :: 0.12.0-SNAPSHOT :: integration :: Thu Aug 08 14:13:07 EDT 2013 delivering ivy file to /data/hive-ptest/working/apache-svn-trunk-source/build/metastore/ivy-0.12.0-SNAPSHOT.xml :: publishing :: org.apache.hive#hive-metastore published hive-metastore to /data/hive-ptest/working/ivy/local/org.apache.hive/hive-metastore/0.12.0-SNAPSHOT/jars/hive-metastore.jar published ivy to /data/hive-ptest/working/ivy/local/org.apache.hive/hive-metastore/0.12.0-SNAPSHOT/ivys/ivy.xml ivy-init-settings: [echo] Project: ql check-ivy: [echo] Project: ql ivy-resolve: [echo] Project: ql [ivy:resolve] :: loading settings :: file = /data/hive-ptest/working/apache-svn-trunk-source/ivy/ivysettings.xml [ivy:resolve] :: resolving dependencies :: org.apache.hive#hive-exec;0.12.0-SNAPSHOT [ivy:resolve] confs: [default] [ivy:resolve] found org.apache.hive#hive-metastore;0.12.0-SNAPSHOT in local [ivy:resolve] found org.apache.hive#hive-serde;0.12.0-SNAPSHOT in local [ivy:resolve] found org.apache.hive#hive-common;0.12.0-SNAPSHOT in local [ivy:resolve] found org.apache.hive#hive-shims;0.12.0-SNAPSHOT in local [ivy:resolve] found commons-cli#commons-cli;1.2 in maven2 [ivy:resolve] found org.apache.commons#commons-compress;1.4.1 in maven2 [ivy:resolve] found org.tukaani#xz;1.0 in maven2 [ivy:resolve] found commons-lang#commons-lang;2.4 in maven2 [ivy:resolve] found log4j#log4j;1.2.16 in maven2 [ivy:resolve] found org.slf4j#slf4j-api;1.6.1 in maven2 [ivy:resolve] found org.slf4j#slf4j-log4j12;1.6.1 in maven2 [ivy:resolve] found org.mockito#mockito-all;1.8.2 in maven2 [ivy:resolve] found org.apache.thrift#libfb303;0.9.0 in maven2 [ivy:resolve] found commons-codec#commons-codec;1.4 in maven2 [ivy:resolve] found org.apache.avro#avro;1.7.1 in maven2 [ivy:resolve] found org.apache.avro#avro-mapred;1.7.1 in maven2 [ivy:resolve] found org.antlr#antlr;3.4 in maven2 [ivy:resolve] found org.antlr#antlr-runtime;3.4 in maven2 [ivy:resolve] found org.antlr#ST4;4.0.4 in maven2 [ivy:resolve] found com.jolbox#bonecp;0.7.1.RELEASE in maven2 [ivy:resolve] found com.google.guava#guava;r08 in maven2 [ivy:resolve] found commons-pool#commons-pool;1.5.4 in maven2 [ivy:resolve] found org.datanucleus#datanucleus-api-jdo;3.2.1 in maven2 [ivy:resolve] found org.datanucleus#datanucleus-core;3.2.2 in maven2 [ivy:resolve] found 
org.datanucleus#datanucleus-rdbms;3.2.1, javax.jdo#jdo-api;3.0.1, org.apache.derby#derby;10.4.2.0, com.google.protobuf#protobuf-java;2.4.1, org.iq80.snappy#snappy;0.2, org.json#json;20090211, commons-collections;3.2.1, commons-configuration;1.6, com.googlecode.javaewah#JavaEWAH;0.3.2, javolution;5.5.1 and jline;0.9.94 in maven2. [ivy:resolve] downloaded hive-metastore-0.12.0-SNAPSHOT and the missing Maven Central artifacts (protobuf-java, snappy, json, JavaEWAH, javolution, jline). Resolution report: resolve 5550ms, artifacts dl 179ms; 36 modules (7 downloaded, 1 evicted: org.slf4j#slf4j-api;1.5.10 by 1.6.1), 35 artifacts (7 downloaded).
make-pom, create-dirs, init, ql-init and ivy-retrieve for project ql completed, and build-grammar compiled /data/hive-ptest/working/apache-svn-trunk-source/ql/src/java/org/apache/hadoop/hive/ql/parse/Hive.g. (The grammar build emitted its usual long run of ANTLR warning(200) messages of the form "Decision can match input such as ... using multiple alternatives; alternative(s) ... were disabled for that input" for HiveParser.g, SelectClauseParser.g, FromClauseParser.g and IdentifiersParser.g; these are long-standing grammar ambiguity warnings, not something introduced by this patch.)
Compilation of ql then failed:
compile: [echo] Project: ql
[javac] Compiling 904 source files to /data/hive-ptest/working/apache-svn-trunk-source/build/ql/classes
[javac] /data/hive-ptest/working/apache-svn-trunk-source/ql/src/java/org/apache/hadoop/hive/ql/exec/MapOperator.java:177: cannot find symbol: class LinkedHashMap
[javac] MapOperator.java:205: cannot find symbol: variable partTblObjectInspectorConverter
[javac] MapOperator.java:242: cannot find symbol: variable ctx
[javac] MapOperator.java:243: cannot find symbol: variable ctx
[javac] MapOperator.java:377: initObjectInspector(MapWork, Configuration, String, Map<TableDesc, StructObjectInspector>) cannot be applied to (Configuration, MapInputPath, Map<TableDesc, StructObjectInspector>)
[javac] 5 errors
BUILD FAILED /data/hive-ptest/working/apache-svn-trunk-source/build.xml:327 -> build.xml:166 -> build.xml:168 -> ql/build.xml:198: Compile failed; see the compiler error output for details. Total time: 4 minutes 4 seconds + exit 1 '
This message is automatically generated.
Overall: -1 at least one tests failed
Here are the results of testing the latest attachment:
https://issues.apache.org/jira/secure/attachment/12596693/HIVE-4545.3.patch
ERROR: -1 due to 2 failed/errored test(s), 2773 tests executed
Failed tests:
org.apache.hcatalog.mapreduce.TestSequenceFileReadWrite.testTextTableWriteReadMR
org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver_ppd_vc
Test results: https://builds.apache.org/job/PreCommit-HIVE-Build/348/testReport
Console output: https://builds.apache.org/job/PreCommit-HIVE-Build/348/console
Messages:
Executing org.apache.hive.ptest.execution.PrepPhase Executing org.apache.hive.ptest.execution.ExecutionPhase Executing org.apache.hive.ptest.execution.ReportingPhase Tests failed with: TestsFailedException: 2 tests failed
This message is automatically generated.
ppd_vc.q is fixed in HIVE-5033, but the TestSequenceFileReadWrite.testTextTableWriteReadMR failure still looks like something that should be fixed.
TestSequenceFileReadWrite.testTextTableWriteReadMR passed on my box. Making it patch available again to see if this is reproducible/related to this change.
To clarify: I am making it Patch Available again to kick off the pre-commit tests once more and check whether the failure is related to this patch.
brocknoland Canceling this patch and making it Patch Available again hasn't kicked off a new pre-commit test. Does the pre-commit test kickoff require a patch file newer than the previous one that the tests were run on?
Long story short, I don't think that will work. The script that kicks off these pre-commit jobs is shared between a bunch of projects and is generally in need of a rewrite. The JIRA HADOOP-9765 is on my queue.
If you'd like a new run, either:
1) Submit a new patch
2) Click "Build" here, entering your JIRA number as a parameter: https://builds.apache.org/job/PreCommit-HIVE-Build/ (any committer on any project can do that)
brocknoland Thanks for the advice! And thanks again for setting up the pre-commit builds!
It's ok. It seems a bit clunky to have:
HIVE_HUMAN_FRIENDLY_FORMAT("hive.human.friendly.format", true),
Maybe we simply need a 'describe terse' or something.
appodictic Without this patch, 'describe table' in beeline running against hive server2 returns something like
'a ','int ','None '
instead of
'a','int',''
While it might make sense for the Hive CLI to add space padding and replace empty comments with None, that is not a good idea for a JDBC application; for example, a JDBC-based GUI tool will take care of formatting on its side.
The goal is to be able to get 'raw' data from the familiar DDL commands, without any changes applied to make it more human readable (see the sketch below).
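To make the client-side impact concrete, here is a minimal JDBC sketch; the connection URL, credentials and table name are placeholders, not values from this issue. It shows why the padding hurts programmatic consumers: every field has to be trimmed before it can be compared.

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

public class DescribeTableExample {
  public static void main(String[] args) throws Exception {
    // Hypothetical connection details for a local HiveServer2 instance.
    Class.forName("org.apache.hive.jdbc.HiveDriver");
    Connection con = DriverManager.getConnection(
        "jdbc:hive2://localhost:10000/default", "user", "");
    Statement stmt = con.createStatement();
    ResultSet rs = stmt.executeQuery("DESCRIBE mytable");
    while (rs.next()) {
      String colName = rs.getString(1); // e.g. "a              " when padded
      String colType = rs.getString(2); // e.g. "int            " when padded
      String comment = rs.getString(3); // e.g. "None           " when padded
      // With padded output the client must trim every field before comparing;
      // with raw output a plain equals() works as a JDBC client would expect.
      if ("a".equals(colName.trim())) {
        System.out.println(colName.trim() + " : " + colType.trim() + " : "
            + (comment == null ? "" : comment.trim()));
      }
    }
    rs.close();
    stmt.close();
    con.close();
  }
}

With unpadded output the trim() calls become unnecessary and the empty comment comes back as an empty string rather than a padded "None".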
Overall: -1 at least one tests failed
Here are the results of testing the latest attachment:
https://issues.apache.org/jira/secure/attachment/12596693/HIVE-4545.3.patch
ERROR: -1 due to 1 failed/errored test(s), 2887 tests executed
Failed tests:
org.apache.hadoop.hive.cli.TestNegativeCliDriver.testNegativeCliDriver_udtf_not_supported2
Test results: https://builds.apache.org/job/PreCommit-HIVE-Build/467/testReport
Console output: https://builds.apache.org/job/PreCommit-HIVE-Build/467/console
Messages:
Executing org.apache.hive.ptest.execution.PrepPhase Executing org.apache.hive.ptest.execution.ExecutionPhase Executing org.apache.hive.ptest.execution.ReportingPhase Tests failed with: TestsFailedException: 1 tests failed
This message is automatically generated.
The TestNegativeCliDriver.testNegativeCliDriver_udtf_not_supported2 failure is unrelated to this change; it has been failing consistently since the HIVE-2608 commit.
I'm wondering if we need this flag at all. From your description it seems that you just want different behavior in HS2. Isn't there a way to just check for that? Is there a reason you want the ability to set it to false in CLI cases?
Comments posted on RB.
hagleitn I agree. I think I should not add a new configuration for this. Unformatted output is right for HS2 and formatted output is right for hive cli.
But right now there is no way to know whether the query is running under the CLI or HS2. I am thinking of adding a SessionState method that can be set, and that other parts of the code can use, to determine whether it is under the CLI or HS2 (see the sketch below).
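For illustration only, here is a minimal self-contained sketch of the kind of session-level flag being discussed; the class and method names below are invented and are not the actual Hive SessionState API.

public class SessionFlagSketch {
  // true when the query is being served by HiveServer2 rather than the CLI
  private static final ThreadLocal<Boolean> hiveServer2Query =
      ThreadLocal.withInitial(() -> Boolean.FALSE);

  public static void setHiveServer2Query(boolean value) {
    hiveServer2Query.set(value);
  }

  public static boolean isHiveServer2Query() {
    return hiveServer2Query.get();
  }

  public static void main(String[] args) {
    // The HS2 session layer would set the flag when it opens a session ...
    setHiveServer2Query(true);
    // ... and the DDL output formatters would consult it to decide on padding.
    System.out.println("served from HiveServer2? " + isHiveServer2Query());
  }
}

The point of such a flag is that HiveServer2 sets it once per session, so the metadata formatters can branch on it without needing a user-visible configuration property.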
New RB link: https://reviews.apache.org/r/18390/ (couldn't update the previous one, as Thejas was its creator).
This patch gets rid of the new config that was introduced in the previous patch (per hagleitn's feedback) by adding a way to detect whether the query is being served from HiveServer2.
hagleitn Whenever you get time, the jira is up for review. Thanks in advance!
Overall: -1 at least one tests failed
Here are the results of testing the latest attachment:
https://issues.apache.org/jira/secure/attachment/12630444/HIVE-4545.5.patch
ERROR: -1 due to 1 failed/errored test(s), 5177 tests executed
Failed tests:
org.apache.hadoop.hive.cli.TestMinimrCliDriver.testCliDriver_bucketizedhiveinputformat
Test results: http://bigtop01.cloudera.org:8080/job/PreCommit-HIVE-Build/1472/testReport
Console output: http://bigtop01.cloudera.org:8080/job/PreCommit-HIVE-Build/1472/console
Messages:
Executing org.apache.hive.ptest.execution.PrepPhase Executing org.apache.hive.ptest.execution.ExecutionPhase Executing org.apache.hive.ptest.execution.ReportingPhase Tests exited with: TestsFailedException: 1 tests failed
This message is automatically generated.
ATTACHMENT ID: 12630444
HIVE-4545-1.patch - introduces a hive.human.friendly.format config, set to true by default. HiveServer2 sets it to false. When it is false, 'describe table' and 'show columns' output is not space padded (a rough sketch of the idea follows).
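As an illustration of the behavior this config toggles (not the actual patch code: the stand-in below reads a system property instead of going through HiveConf, and the field width is arbitrary):

public class PaddingToggleSketch {
  // Illustrative stand-in for reading hive.human.friendly.format from the conf.
  static boolean humanFriendly =
      Boolean.parseBoolean(System.getProperty("hive.human.friendly.format", "true"));

  static String formatField(String value, int width) {
    if (value == null || value.isEmpty()) {
      value = humanFriendly ? "None" : "";  // CLI shows "None" for empty comments
    }
    // Right-pad to a fixed width only in human-friendly mode.
    return humanFriendly ? String.format("%-" + width + "s", value) : value;
  }

  public static void main(String[] args) {
    System.out.println("[" + formatField("a", 8) + "]"
        + "[" + formatField("int", 8) + "]"
        + "[" + formatField("", 8) + "]");
  }
}

Run with -Dhive.human.friendly.format=false this prints [a][int][], matching the 'raw' output wanted for HS2; with the default it prints the padded, None-filled form the CLI shows.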