HIVE-5706

Move a few numeric UDFs to generic implementations

    Details

    • Type: Task
    • Status: Resolved
    • Priority: Major
    • Resolution: Fixed
    • Affects Version/s: 0.12.0
    • Fix Version/s: 0.13.0
    • Component/s: UDF
    • Labels: None

      Description

      This is a follow-up JIRA for HIVE-5356 to reduce the review scope. It will cover UDFOPPositive, UDFOPNegative, UDFCeil, UDFFloor, and UDFPower.
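
      As context for the change, the reflection-based UDF classes listed above are replaced by
      GenericUDF implementations that resolve argument types once in initialize() and reuse a
      writable result object in evaluate(); the new classes (for example GenericUDFOPNegative
      extending GenericUDFBaseUnary, per the file list in the QA log below) are then registered
      in FunctionRegistry in place of the old ones. The sketch below is illustrative only and is
      not code from the patch: the class name is invented, and it handles a single DOUBLE
      argument, whereas the committed classes cover the full set of numeric types, including
      decimal.

      // Minimal GenericUDF sketch for a unary numeric operator (illustrative, not the patch).
      import org.apache.hadoop.hive.ql.exec.UDFArgumentException;
      import org.apache.hadoop.hive.ql.exec.UDFArgumentLengthException;
      import org.apache.hadoop.hive.ql.metadata.HiveException;
      import org.apache.hadoop.hive.ql.udf.generic.GenericUDF;
      import org.apache.hadoop.hive.serde2.io.DoubleWritable;
      import org.apache.hadoop.hive.serde2.objectinspector.ObjectInspector;
      import org.apache.hadoop.hive.serde2.objectinspector.ObjectInspectorConverters;
      import org.apache.hadoop.hive.serde2.objectinspector.ObjectInspectorConverters.Converter;
      import org.apache.hadoop.hive.serde2.objectinspector.primitive.PrimitiveObjectInspectorFactory;

      public class GenericUDFNegateSketch extends GenericUDF {

        private transient Converter converter;
        private final DoubleWritable result = new DoubleWritable();

        @Override
        public ObjectInspector initialize(ObjectInspector[] arguments) throws UDFArgumentException {
          if (arguments.length != 1) {
            throw new UDFArgumentLengthException("Exactly one argument is expected.");
          }
          // Resolve the argument type once here, instead of via per-row reflection.
          converter = ObjectInspectorConverters.getConverter(
              arguments[0], PrimitiveObjectInspectorFactory.writableDoubleObjectInspector);
          return PrimitiveObjectInspectorFactory.writableDoubleObjectInspector;
        }

        @Override
        public Object evaluate(DeferredObject[] arguments) throws HiveException {
          Object value = arguments[0].get();
          if (value == null) {
            return null;                  // NULL in, NULL out
          }
          DoubleWritable d = (DoubleWritable) converter.convert(value);
          if (d == null) {
            return null;                  // input not convertible to DOUBLE
          }
          result.set(-d.get());           // reuse the output writable across rows
          return result;
        }

        @Override
        public String getDisplayString(String[] children) {
          return "(- " + children[0] + ")";
        }
      }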

      Attachments

      1. HIVE-5706.7.patch
        124 kB
        Xuefu Zhang
      2. HIVE-5706.6.patch
        124 kB
        Xuefu Zhang
      3. HIVE-5706.5.patch
        124 kB
        Xuefu Zhang
      4. HIVE-5706.4.patch
        48 kB
        Xuefu Zhang
      5. HIVE-5706.3.patch
        122 kB
        Xuefu Zhang
      6. HIVE-5706.2.patch
        122 kB
        Xuefu Zhang
      7. HIVE-5706.1.patch
        120 kB
        Xuefu Zhang
      8. HIVE-5706.patch
        55 kB
        Xuefu Zhang

        Issue Links

          Activity

          Brock Noland added a comment -

          Thank you for your contribution Xuefu! I have committed this to trunk!

          Hive QA added a comment -

          Overall: +1 all checks pass

          Here are the results of testing the latest attachment:
          https://issues.apache.org/jira/secure/attachment/12615888/HIVE-5706.7.patch

          SUCCESS: +1 4735 tests passed

          Test results: http://bigtop01.cloudera.org:8080/job/PreCommit-HIVE-Build/452/testReport
          Console output: http://bigtop01.cloudera.org:8080/job/PreCommit-HIVE-Build/452/console

          Messages:

          Executing org.apache.hive.ptest.execution.PrepPhase
          Executing org.apache.hive.ptest.execution.ExecutionPhase
          Executing org.apache.hive.ptest.execution.ReportingPhase
          

          This message is automatically generated.

          ATTACHMENT ID: 12615888

          Brock Noland added a comment -

          +1

          Xuefu Zhang added a comment -

          Patch #7 is identical to #6 except that it uses IllegalStateException instead of RuntimeException in one case, to accommodate an RB request.
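
          For illustration only (this is not the actual diff, and the class, enum, and method
          below are invented for the example), the kind of change described is swapping a bare
          RuntimeException for an IllegalStateException in a branch that should be unreachable
          once argument types have been validated:

          public class ExceptionSwapSketch {
            enum NumericType { DOUBLE, LONG }

            static double negateDouble(NumericType type, double value) {
              switch (type) {
                case DOUBLE:
                  return -value;
                default:
                  // Was: throw new RuntimeException("Unexpected type: " + type);
                  // IllegalStateException flags an internal invariant violation.
                  throw new IllegalStateException("Unexpected type: " + type);
              }
            }
          }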

          Hive QA added a comment -

          Overall: +1 all checks pass

          Here are the results of testing the latest attachment:
          https://issues.apache.org/jira/secure/attachment/12615781/HIVE-5706.6.patch

          SUCCESS: +1 4732 tests passed

          Test results: http://bigtop01.cloudera.org:8080/job/PreCommit-HIVE-Build/442/testReport
          Console output: http://bigtop01.cloudera.org:8080/job/PreCommit-HIVE-Build/442/console

          Messages:

          Executing org.apache.hive.ptest.execution.PrepPhase
          Executing org.apache.hive.ptest.execution.ExecutionPhase
          Executing org.apache.hive.ptest.execution.ReportingPhase
          

          This message is automatically generated.

          ATTACHMENT ID: 12615781

          Xuefu Zhang added a comment -

          Patch #6 is equivalent to #5; it was rebased and re-uploaded to kick off testing after the build was fixed.

          Hive QA added a comment -

          Overall: -1 no tests executed

          Here are the results of testing the latest attachment:
          https://issues.apache.org/jira/secure/attachment/12615765/HIVE-5706.5.patch

          Test results: http://bigtop01.cloudera.org:8080/job/PreCommit-HIVE-Build/441/testReport
          Console output: http://bigtop01.cloudera.org:8080/job/PreCommit-HIVE-Build/441/console

          Messages:

          Executing org.apache.hive.ptest.execution.PrepPhase
          Tests failed with: NonZeroExitCodeException: Command 'bash /data/hive-ptest/working/scratch/source-prep.sh' failed with exit status 1 and output '+ [[ -n '' ]]
          + export 'ANT_OPTS=-Xmx1g -XX:MaxPermSize=256m -Dhttp.proxyHost=localhost -Dhttp.proxyPort=3128'
          + ANT_OPTS='-Xmx1g -XX:MaxPermSize=256m -Dhttp.proxyHost=localhost -Dhttp.proxyPort=3128'
          + export 'M2_OPTS=-Xmx1g -XX:MaxPermSize=256m -Dhttp.proxyHost=localhost -Dhttp.proxyPort=3128'
          + M2_OPTS='-Xmx1g -XX:MaxPermSize=256m -Dhttp.proxyHost=localhost -Dhttp.proxyPort=3128'
          + cd /data/hive-ptest/working/
          + tee /data/hive-ptest/logs/PreCommit-HIVE-Build-441/source-prep.txt
          + [[ false == \t\r\u\e ]]
          + mkdir -p maven ivy
          + [[ svn = \s\v\n ]]
          + [[ -n '' ]]
          + [[ -d apache-svn-trunk-source ]]
          + [[ ! -d apache-svn-trunk-source/.svn ]]
          + [[ ! -d apache-svn-trunk-source ]]
          + cd apache-svn-trunk-source
          + svn revert -R .
          ++ awk '{print $2}'
          ++ egrep -v '^X|^Performing status on external'
          ++ svn status --no-ignore
          + rm -rf target datanucleus.log ant/target shims/target shims/0.20/target shims/assembly/target shims/0.20S/target shims/0.23/target shims/common/target shims/common-secure/target packaging/target hbase-handler/target testutils/target jdbc/target metastore/target itests/target itests/hcatalog-unit/target itests/test-serde/target itests/qtest/target itests/hive-unit/target itests/custom-serde/target itests/util/target hcatalog/target hcatalog/storage-handlers/hbase/target hcatalog/server-extensions/target hcatalog/core/target hcatalog/webhcat/svr/target hcatalog/webhcat/java-client/target hcatalog/hcatalog-pig-adapter/target hwi/target common/target common/src/gen service/target contrib/target serde/target beeline/target odbc/target cli/target ql/dependency-reduced-pom.xml ql/target
          + svn update
          U    ql/src/test/queries/clientpositive/insert_into3.q
          U    ql/src/test/results/compiler/plan/input_testxpath.q.xml
          U    ql/src/test/results/compiler/plan/input_part1.q.xml
          U    ql/src/test/results/compiler/plan/input1.q.xml
          U    ql/src/test/results/compiler/plan/input2.q.xml
          U    ql/src/test/results/compiler/plan/input3.q.xml
          U    ql/src/test/results/compiler/plan/input4.q.xml
          U    ql/src/test/results/compiler/plan/input5.q.xml
          U    ql/src/test/results/compiler/plan/input6.q.xml
          U    ql/src/test/results/compiler/plan/input_testxpath2.q.xml
          U    ql/src/test/results/compiler/plan/input7.q.xml
          U    ql/src/test/results/compiler/plan/input8.q.xml
          U    ql/src/test/results/compiler/plan/input_testsequencefile.q.xml
          U    ql/src/test/results/compiler/plan/input9.q.xml
          U    ql/src/test/results/compiler/plan/udf1.q.xml
          U    ql/src/test/results/compiler/plan/input20.q.xml
          U    ql/src/test/results/compiler/plan/udf4.q.xml
          U    ql/src/test/results/compiler/plan/sample1.q.xml
          U    ql/src/test/results/compiler/plan/sample2.q.xml
          U    ql/src/test/results/compiler/plan/udf6.q.xml
          U    ql/src/test/results/compiler/plan/sample3.q.xml
          U    ql/src/test/results/compiler/plan/sample4.q.xml
          U    ql/src/test/results/compiler/plan/sample5.q.xml
          U    ql/src/test/results/compiler/plan/sample6.q.xml
          U    ql/src/test/results/compiler/plan/sample7.q.xml
          U    ql/src/test/results/compiler/plan/groupby1.q.xml
          U    ql/src/test/results/compiler/plan/groupby2.q.xml
          U    ql/src/test/results/compiler/plan/udf_case.q.xml
          U    ql/src/test/results/compiler/plan/groupby3.q.xml
          U    ql/src/test/results/compiler/plan/subq.q.xml
          U    ql/src/test/results/compiler/plan/cast1.q.xml
          U    ql/src/test/results/compiler/plan/groupby4.q.xml
          U    ql/src/test/results/compiler/plan/groupby5.q.xml
          U    ql/src/test/results/compiler/plan/groupby6.q.xml
          U    ql/src/test/results/compiler/plan/join1.q.xml
          U    ql/src/test/results/compiler/plan/join2.q.xml
          U    ql/src/test/results/compiler/plan/join3.q.xml
          U    ql/src/test/results/compiler/plan/join4.q.xml
          U    ql/src/test/results/compiler/plan/join5.q.xml
          U    ql/src/test/results/compiler/plan/join6.q.xml
          U    ql/src/test/results/compiler/plan/case_sensitivity.q.xml
          U    ql/src/test/results/compiler/plan/join7.q.xml
          U    ql/src/test/results/compiler/plan/join8.q.xml
          U    ql/src/test/results/compiler/plan/union.q.xml
          U    ql/src/test/results/compiler/plan/udf_when.q.xml
          U    ql/src/test/results/clientpositive/insert_into3.q.out
          U    ql/src/test/org/apache/hadoop/hive/ql/testutil/OperatorTestUtils.java
          U    ql/src/test/org/apache/hadoop/hive/ql/exec/TestOperators.java
          U    ql/src/test/org/apache/hadoop/hive/ql/exec/vector/TestVectorGroupByOperator.java
          U    ql/src/java/org/apache/hadoop/hive/ql/QueryPlan.java
          U    ql/src/java/org/apache/hadoop/hive/ql/parse/MapReduceCompiler.java
          U    ql/src/java/org/apache/hadoop/hive/ql/parse/SemanticAnalyzer.java
          U    ql/src/java/org/apache/hadoop/hive/ql/ErrorMsg.java
          A    ql/src/java/org/apache/hadoop/hive/ql/metadata/HiveFatalException.java
          U    ql/src/java/org/apache/hadoop/hive/ql/exec/DemuxOperator.java
          U    ql/src/java/org/apache/hadoop/hive/ql/exec/mr/MapredLocalTask.java
          U    ql/src/java/org/apache/hadoop/hive/ql/exec/mr/HadoopJobExecHelper.java
          U    ql/src/java/org/apache/hadoop/hive/ql/exec/mr/HadoopJobExecHook.java
          U    ql/src/java/org/apache/hadoop/hive/ql/exec/mr/ExecReducer.java
          U    ql/src/java/org/apache/hadoop/hive/ql/exec/mr/ExecDriver.java
          U    ql/src/java/org/apache/hadoop/hive/ql/exec/Operator.java
          U    ql/src/java/org/apache/hadoop/hive/ql/exec/AbstractMapJoinOperator.java
          U    ql/src/java/org/apache/hadoop/hive/ql/exec/ReduceSinkOperator.java
          U    ql/src/java/org/apache/hadoop/hive/ql/exec/CommonJoinOperator.java
          U    ql/src/java/org/apache/hadoop/hive/ql/exec/MuxOperator.java
          U    ql/src/java/org/apache/hadoop/hive/ql/exec/GroupByOperator.java
          U    ql/src/java/org/apache/hadoop/hive/ql/exec/FetchOperator.java
          U    ql/src/java/org/apache/hadoop/hive/ql/exec/MapJoinOperator.java
          U    ql/src/java/org/apache/hadoop/hive/ql/exec/SMBMapJoinOperator.java
          U    ql/src/java/org/apache/hadoop/hive/ql/exec/vector/VectorFileSinkOperator.java
          U    ql/src/java/org/apache/hadoop/hive/ql/exec/OperatorFactory.java
          U    ql/src/java/org/apache/hadoop/hive/ql/exec/Utilities.java
          U    ql/src/java/org/apache/hadoop/hive/ql/exec/FileSinkOperator.java
          U    ql/src/java/org/apache/hadoop/hive/ql/io/rcfile/stats/PartialScanTask.java
          U    ql/src/java/org/apache/hadoop/hive/ql/io/rcfile/merge/BlockMergeTask.java
          U    ql/src/java/org/apache/hadoop/hive/ql/io/rcfile/truncate/ColumnTruncateTask.java
          U    conf/hive-default.xml.template
          U    data/conf/hive-site.xml
          U    common/src/java/org/apache/hadoop/hive/conf/HiveConf.java
          
          Fetching external item into 'hcatalog/src/test/e2e/harness'
          Updated external to revision 1545504.
          
          Updated to revision 1545504.
          + patchCommandPath=/data/hive-ptest/working/scratch/smart-apply-patch.sh
          + patchFilePath=/data/hive-ptest/working/scratch/build.patch
          + [[ -f /data/hive-ptest/working/scratch/build.patch ]]
          + chmod +x /data/hive-ptest/working/scratch/smart-apply-patch.sh
          + /data/hive-ptest/working/scratch/smart-apply-patch.sh /data/hive-ptest/working/scratch/build.patch
          Going to apply patch with: patch -p0
          patching file ql/src/java/org/apache/hadoop/hive/ql/exec/FunctionRegistry.java
          patching file ql/src/java/org/apache/hadoop/hive/ql/exec/vector/VectorizationContext.java
          patching file ql/src/java/org/apache/hadoop/hive/ql/optimizer/physical/Vectorizer.java
          patching file ql/src/java/org/apache/hadoop/hive/ql/udf/UDFBaseNumericUnaryOp.java
          patching file ql/src/java/org/apache/hadoop/hive/ql/udf/UDFCeil.java
          patching file ql/src/java/org/apache/hadoop/hive/ql/udf/UDFFloor.java
          patching file ql/src/java/org/apache/hadoop/hive/ql/udf/UDFOPNegative.java
          patching file ql/src/java/org/apache/hadoop/hive/ql/udf/UDFOPPositive.java
          patching file ql/src/java/org/apache/hadoop/hive/ql/udf/UDFPower.java
          patching file ql/src/java/org/apache/hadoop/hive/ql/udf/generic/GenericUDFBaseUnary.java
          patching file ql/src/java/org/apache/hadoop/hive/ql/udf/generic/GenericUDFCeil.java
          patching file ql/src/java/org/apache/hadoop/hive/ql/udf/generic/GenericUDFFloor.java
          patching file ql/src/java/org/apache/hadoop/hive/ql/udf/generic/GenericUDFFloorCeilBase.java
          patching file ql/src/java/org/apache/hadoop/hive/ql/udf/generic/GenericUDFOPNegative.java
          patching file ql/src/java/org/apache/hadoop/hive/ql/udf/generic/GenericUDFOPPositive.java
          patching file ql/src/java/org/apache/hadoop/hive/ql/udf/generic/GenericUDFPower.java
          patching file ql/src/test/org/apache/hadoop/hive/ql/exec/vector/TestVectorizationContext.java
          patching file ql/src/test/org/apache/hadoop/hive/ql/udf/generic/TestGenericUDFCeil.java
          patching file ql/src/test/org/apache/hadoop/hive/ql/udf/generic/TestGenericUDFFloor.java
          patching file ql/src/test/org/apache/hadoop/hive/ql/udf/generic/TestGenericUDFOPNegative.java
          patching file ql/src/test/org/apache/hadoop/hive/ql/udf/generic/TestGenericUDFOPPositive.java
          patching file ql/src/test/org/apache/hadoop/hive/ql/udf/generic/TestGenericUDFPower.java
          patching file ql/src/test/results/clientpositive/decimal_udf.q.out
          patching file ql/src/test/results/clientpositive/literal_decimal.q.out
          patching file ql/src/test/results/clientpositive/udf4.q.out
          patching file ql/src/test/results/clientpositive/udf7.q.out
          patching file ql/src/test/results/clientpositive/vectorization_short_regress.q.out
          patching file ql/src/test/results/clientpositive/vectorized_math_funcs.q.out
          patching file ql/src/test/results/compiler/plan/udf4.q.xml
          Hunk #1 succeeded at 591 (offset -16 lines).
          Hunk #2 succeeded at 669 (offset -16 lines).
          Hunk #3 succeeded at 679 (offset -16 lines).
          Hunk #4 succeeded at 704 (offset -16 lines).
          Hunk #5 succeeded at 729 (offset -16 lines).
          Hunk #6 succeeded at 758 (offset -16 lines).
          Hunk #7 succeeded at 818 (offset -16 lines).
          Hunk #8 succeeded at 875 (offset -16 lines).
          Hunk #9 succeeded at 904 (offset -16 lines).
          Hunk #10 succeeded at 914 (offset -16 lines).
          Hunk #11 succeeded at 939 (offset -16 lines).
          Hunk #12 succeeded at 978 (offset -16 lines).
          Hunk #13 succeeded at 1048 (offset -16 lines).
          + [[ maven == \m\a\v\e\n ]]
          + rm -rf /data/hive-ptest/working/maven/org/apache/hive
          + mvn -B clean install -DskipTests -Dmaven.repo.local=/data/hive-ptest/working/maven
          [INFO] Scanning for projects...
          [INFO] ------------------------------------------------------------------------
          [INFO] Reactor Build Order:
          [INFO] 
          [INFO] Hive
          [INFO] Hive Ant Utilities
          [INFO] Hive Shims Common
          [INFO] Hive Shims 0.20
          [INFO] Hive Shims Secure Common
          [INFO] Hive Shims 0.20S
          [INFO] Hive Shims 0.23
          [INFO] Hive Shims
          [INFO] Hive Common
          [INFO] Hive Serde
          [INFO] Hive Metastore
          [INFO] Hive Query Language
          [INFO] Hive Service
          [INFO] Hive JDBC
          [INFO] Hive Beeline
          [INFO] Hive CLI
          [INFO] Hive Contrib
          [INFO] Hive HBase Handler
          [INFO] Hive HCatalog
          [INFO] Hive HCatalog Core
          [INFO] Hive HCatalog Pig Adapter
          [INFO] Hive HCatalog Server Extensions
          [INFO] Hive HCatalog Webhcat Java Client
          [INFO] Hive HCatalog Webhcat
          [INFO] Hive HCatalog HBase Storage Handler
          [INFO] Hive HWI
          [INFO] Hive ODBC
          [INFO] Hive Shims Aggregator
          [INFO] Hive TestUtils
          [INFO] Hive Packaging
          [INFO]                                                                         
          [INFO] ------------------------------------------------------------------------
          [INFO] Building Hive 0.13.0-SNAPSHOT
          [INFO] ------------------------------------------------------------------------
          [INFO] 
          [INFO] --- maven-clean-plugin:2.5:clean (default-clean) @ hive ---
          [INFO] Deleting /data/hive-ptest/working/apache-svn-trunk-source (includes = [datanucleus.log, derby.log], excludes = [])
          [INFO] 
          [INFO] --- maven-antrun-plugin:1.7:run (define-classpath) @ hive ---
          [INFO] Executing tasks
          
          main:
          [INFO] Executed tasks
          [INFO] 
          [INFO] --- maven-antrun-plugin:1.7:run (setup-test-dirs) @ hive ---
          [INFO] Executing tasks
          
          main:
              [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/target/tmp
              [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/target/warehouse
              [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/target/tmp/conf
               [copy] Copying 4 files to /data/hive-ptest/working/apache-svn-trunk-source/target/tmp/conf
          [INFO] Executed tasks
          [INFO] 
          [INFO] --- maven-install-plugin:2.4:install (default-install) @ hive ---
          [INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/pom.xml to /data/hive-ptest/working/maven/org/apache/hive/hive/0.13.0-SNAPSHOT/hive-0.13.0-SNAPSHOT.pom
          [INFO]                                                                         
          [INFO] ------------------------------------------------------------------------
          [INFO] Building Hive Ant Utilities 0.13.0-SNAPSHOT
          [INFO] ------------------------------------------------------------------------
          [INFO] 
          [INFO] --- maven-clean-plugin:2.5:clean (default-clean) @ hive-ant ---
          [INFO] Deleting /data/hive-ptest/working/apache-svn-trunk-source/ant (includes = [datanucleus.log, derby.log], excludes = [])
          [INFO] 
          [INFO] --- maven-resources-plugin:2.5:resources (default-resources) @ hive-ant ---
          [debug] execute contextualize
          [INFO] Using 'UTF-8' encoding to copy filtered resources.
          [INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-trunk-source/ant/src/main/resources
          [INFO] 
          [INFO] --- maven-antrun-plugin:1.7:run (define-classpath) @ hive-ant ---
          [INFO] Executing tasks
          
          main:
          [INFO] Executed tasks
          [INFO] 
          [INFO] --- maven-compiler-plugin:3.1:compile (default-compile) @ hive-ant ---
          [INFO] Compiling 5 source files to /data/hive-ptest/working/apache-svn-trunk-source/ant/target/classes
          [WARNING] Note: /data/hive-ptest/working/apache-svn-trunk-source/ant/src/org/apache/hadoop/hive/ant/QTestGenTask.java uses or overrides a deprecated API.
          [WARNING] Note: Recompile with -Xlint:deprecation for details.
          [WARNING] Note: /data/hive-ptest/working/apache-svn-trunk-source/ant/src/org/apache/hadoop/hive/ant/DistinctElementsClassPath.java uses unchecked or unsafe operations.
          [WARNING] Note: Recompile with -Xlint:unchecked for details.
          [INFO] 
          [INFO] --- maven-resources-plugin:2.5:testResources (default-testResources) @ hive-ant ---
          [debug] execute contextualize
          [INFO] Using 'UTF-8' encoding to copy filtered resources.
          [INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-trunk-source/ant/src/test/resources
          [INFO] 
          [INFO] --- maven-antrun-plugin:1.7:run (setup-test-dirs) @ hive-ant ---
          [INFO] Executing tasks
          
          main:
              [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/ant/target/tmp
              [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/ant/target/warehouse
              [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/ant/target/tmp/conf
               [copy] Copying 4 files to /data/hive-ptest/working/apache-svn-trunk-source/ant/target/tmp/conf
          [INFO] Executed tasks
          [INFO] 
          [INFO] --- maven-compiler-plugin:3.1:testCompile (default-testCompile) @ hive-ant ---
          [INFO] No sources to compile
          [INFO] 
          [INFO] --- maven-surefire-plugin:2.16:test (default-test) @ hive-ant ---
          [INFO] Tests are skipped.
          [INFO] 
          [INFO] --- maven-jar-plugin:2.2:jar (default-jar) @ hive-ant ---
          [INFO] Building jar: /data/hive-ptest/working/apache-svn-trunk-source/ant/target/hive-ant-0.13.0-SNAPSHOT.jar
          [INFO] 
          [INFO] --- maven-install-plugin:2.4:install (default-install) @ hive-ant ---
          [INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/ant/target/hive-ant-0.13.0-SNAPSHOT.jar to /data/hive-ptest/working/maven/org/apache/hive/hive-ant/0.13.0-SNAPSHOT/hive-ant-0.13.0-SNAPSHOT.jar
          [INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/ant/pom.xml to /data/hive-ptest/working/maven/org/apache/hive/hive-ant/0.13.0-SNAPSHOT/hive-ant-0.13.0-SNAPSHOT.pom
          [INFO]                                                                         
          [INFO] ------------------------------------------------------------------------
          [INFO] Building Hive Shims Common 0.13.0-SNAPSHOT
          [INFO] ------------------------------------------------------------------------
          [INFO] 
          [INFO] --- maven-clean-plugin:2.5:clean (default-clean) @ hive-shims-common ---
          [INFO] Deleting /data/hive-ptest/working/apache-svn-trunk-source/shims/common (includes = [datanucleus.log, derby.log], excludes = [])
          [INFO] 
          [INFO] --- maven-resources-plugin:2.5:resources (default-resources) @ hive-shims-common ---
          [debug] execute contextualize
          [INFO] Using 'UTF-8' encoding to copy filtered resources.
          [INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-trunk-source/shims/common/src/main/resources
          [INFO] 
          [INFO] --- maven-antrun-plugin:1.7:run (define-classpath) @ hive-shims-common ---
          [INFO] Executing tasks
          
          main:
          [INFO] Executed tasks
          [INFO] 
          [INFO] --- maven-compiler-plugin:3.1:compile (default-compile) @ hive-shims-common ---
          [INFO] Compiling 15 source files to /data/hive-ptest/working/apache-svn-trunk-source/shims/common/target/classes
          [WARNING] Note: Some input files use or override a deprecated API.
          [WARNING] Note: Recompile with -Xlint:deprecation for details.
          [INFO] 
          [INFO] --- maven-resources-plugin:2.5:testResources (default-testResources) @ hive-shims-common ---
          [debug] execute contextualize
          [INFO] Using 'UTF-8' encoding to copy filtered resources.
          [INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-trunk-source/shims/common/src/test/resources
          [INFO] 
          [INFO] --- maven-antrun-plugin:1.7:run (setup-test-dirs) @ hive-shims-common ---
          [INFO] Executing tasks
          
          main:
              [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/shims/common/target/tmp
              [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/shims/common/target/warehouse
              [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/shims/common/target/tmp/conf
               [copy] Copying 4 files to /data/hive-ptest/working/apache-svn-trunk-source/shims/common/target/tmp/conf
          [INFO] Executed tasks
          [INFO] 
          [INFO] --- maven-compiler-plugin:3.1:testCompile (default-testCompile) @ hive-shims-common ---
          [INFO] No sources to compile
          [INFO] 
          [INFO] --- maven-surefire-plugin:2.16:test (default-test) @ hive-shims-common ---
          [INFO] Tests are skipped.
          [INFO] 
          [INFO] --- maven-jar-plugin:2.2:jar (default-jar) @ hive-shims-common ---
          [INFO] Building jar: /data/hive-ptest/working/apache-svn-trunk-source/shims/common/target/hive-shims-common-0.13.0-SNAPSHOT.jar
          [INFO] 
          [INFO] --- maven-install-plugin:2.4:install (default-install) @ hive-shims-common ---
          [INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/shims/common/target/hive-shims-common-0.13.0-SNAPSHOT.jar to /data/hive-ptest/working/maven/org/apache/hive/shims/hive-shims-common/0.13.0-SNAPSHOT/hive-shims-common-0.13.0-SNAPSHOT.jar
          [INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/shims/common/pom.xml to /data/hive-ptest/working/maven/org/apache/hive/shims/hive-shims-common/0.13.0-SNAPSHOT/hive-shims-common-0.13.0-SNAPSHOT.pom
          [INFO]                                                                         
          [INFO] ------------------------------------------------------------------------
          [INFO] Building Hive Shims 0.20 0.13.0-SNAPSHOT
          [INFO] ------------------------------------------------------------------------
          [INFO] 
          [INFO] --- maven-clean-plugin:2.5:clean (default-clean) @ hive-shims-0.20 ---
          [INFO] Deleting /data/hive-ptest/working/apache-svn-trunk-source/shims/0.20 (includes = [datanucleus.log, derby.log], excludes = [])
          [INFO] 
          [INFO] --- maven-resources-plugin:2.5:resources (default-resources) @ hive-shims-0.20 ---
          [debug] execute contextualize
          [INFO] Using 'UTF-8' encoding to copy filtered resources.
          [INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-trunk-source/shims/0.20/src/main/resources
          [INFO] 
          [INFO] --- maven-antrun-plugin:1.7:run (define-classpath) @ hive-shims-0.20 ---
          [INFO] Executing tasks
          
          main:
          [INFO] Executed tasks
          [INFO] 
          [INFO] --- maven-compiler-plugin:3.1:compile (default-compile) @ hive-shims-0.20 ---
          [INFO] Compiling 2 source files to /data/hive-ptest/working/apache-svn-trunk-source/shims/0.20/target/classes
          [WARNING] Note: /data/hive-ptest/working/apache-svn-trunk-source/shims/0.20/src/main/java/org/apache/hadoop/hive/shims/Hadoop20Shims.java uses or overrides a deprecated API.
          [WARNING] Note: Recompile with -Xlint:deprecation for details.
          [WARNING] Note: /data/hive-ptest/working/apache-svn-trunk-source/shims/0.20/src/main/java/org/apache/hadoop/hive/shims/Hadoop20Shims.java uses unchecked or unsafe operations.
          [WARNING] Note: Recompile with -Xlint:unchecked for details.
          [INFO] 
          [INFO] --- maven-resources-plugin:2.5:testResources (default-testResources) @ hive-shims-0.20 ---
          [debug] execute contextualize
          [INFO] Using 'UTF-8' encoding to copy filtered resources.
          [INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-trunk-source/shims/0.20/src/test/resources
          [INFO] 
          [INFO] --- maven-antrun-plugin:1.7:run (setup-test-dirs) @ hive-shims-0.20 ---
          [INFO] Executing tasks
          
          main:
              [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/shims/0.20/target/tmp
              [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/shims/0.20/target/warehouse
              [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/shims/0.20/target/tmp/conf
               [copy] Copying 4 files to /data/hive-ptest/working/apache-svn-trunk-source/shims/0.20/target/tmp/conf
          [INFO] Executed tasks
          [INFO] 
          [INFO] --- maven-compiler-plugin:3.1:testCompile (default-testCompile) @ hive-shims-0.20 ---
          [INFO] No sources to compile
          [INFO] 
          [INFO] --- maven-surefire-plugin:2.16:test (default-test) @ hive-shims-0.20 ---
          [INFO] Tests are skipped.
          [INFO] 
          [INFO] --- maven-jar-plugin:2.2:jar (default-jar) @ hive-shims-0.20 ---
          [INFO] Building jar: /data/hive-ptest/working/apache-svn-trunk-source/shims/0.20/target/hive-shims-0.20-0.13.0-SNAPSHOT.jar
          [INFO] 
          [INFO] --- maven-install-plugin:2.4:install (default-install) @ hive-shims-0.20 ---
          [INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/shims/0.20/target/hive-shims-0.20-0.13.0-SNAPSHOT.jar to /data/hive-ptest/working/maven/org/apache/hive/shims/hive-shims-0.20/0.13.0-SNAPSHOT/hive-shims-0.20-0.13.0-SNAPSHOT.jar
          [INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/shims/0.20/pom.xml to /data/hive-ptest/working/maven/org/apache/hive/shims/hive-shims-0.20/0.13.0-SNAPSHOT/hive-shims-0.20-0.13.0-SNAPSHOT.pom
          [INFO]                                                                         
          [INFO] ------------------------------------------------------------------------
          [INFO] Building Hive Shims Secure Common 0.13.0-SNAPSHOT
          [INFO] ------------------------------------------------------------------------
          [INFO] 
          [INFO] --- maven-clean-plugin:2.5:clean (default-clean) @ hive-shims-common-secure ---
          [INFO] Deleting /data/hive-ptest/working/apache-svn-trunk-source/shims/common-secure (includes = [datanucleus.log, derby.log], excludes = [])
          [INFO] 
          [INFO] --- maven-resources-plugin:2.5:resources (default-resources) @ hive-shims-common-secure ---
          [debug] execute contextualize
          [INFO] Using 'UTF-8' encoding to copy filtered resources.
          [INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-trunk-source/shims/common-secure/src/main/resources
          [INFO] 
          [INFO] --- maven-antrun-plugin:1.7:run (define-classpath) @ hive-shims-common-secure ---
          [INFO] Executing tasks
          
          main:
          [INFO] Executed tasks
          [INFO] 
          [INFO] --- maven-compiler-plugin:3.1:compile (default-compile) @ hive-shims-common-secure ---
          [INFO] Compiling 12 source files to /data/hive-ptest/working/apache-svn-trunk-source/shims/common-secure/target/classes
          [WARNING] Note: /data/hive-ptest/working/apache-svn-trunk-source/shims/common-secure/src/main/java/org/apache/hadoop/hive/shims/HadoopShimsSecure.java uses or overrides a deprecated API.
          [WARNING] Note: Recompile with -Xlint:deprecation for details.
          [WARNING] Note: Some input files use unchecked or unsafe operations.
          [WARNING] Note: Recompile with -Xlint:unchecked for details.
          [INFO] 
          [INFO] --- maven-resources-plugin:2.5:testResources (default-testResources) @ hive-shims-common-secure ---
          [debug] execute contextualize
          [INFO] Using 'UTF-8' encoding to copy filtered resources.
          [INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-trunk-source/shims/common-secure/src/test/resources
          [INFO] 
          [INFO] --- maven-antrun-plugin:1.7:run (setup-test-dirs) @ hive-shims-common-secure ---
          [INFO] Executing tasks
          
          main:
              [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/shims/common-secure/target/tmp
              [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/shims/common-secure/target/warehouse
              [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/shims/common-secure/target/tmp/conf
               [copy] Copying 4 files to /data/hive-ptest/working/apache-svn-trunk-source/shims/common-secure/target/tmp/conf
          [INFO] Executed tasks
          [INFO] 
          [INFO] --- maven-compiler-plugin:3.1:testCompile (default-testCompile) @ hive-shims-common-secure ---
          [INFO] No sources to compile
          [INFO] 
          [INFO] --- maven-surefire-plugin:2.16:test (default-test) @ hive-shims-common-secure ---
          [INFO] Tests are skipped.
          [INFO] 
          [INFO] --- maven-jar-plugin:2.2:jar (default-jar) @ hive-shims-common-secure ---
          [INFO] Building jar: /data/hive-ptest/working/apache-svn-trunk-source/shims/common-secure/target/hive-shims-common-secure-0.13.0-SNAPSHOT.jar
          [INFO] 
          [INFO] --- maven-install-plugin:2.4:install (default-install) @ hive-shims-common-secure ---
          [INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/shims/common-secure/target/hive-shims-common-secure-0.13.0-SNAPSHOT.jar to /data/hive-ptest/working/maven/org/apache/hive/shims/hive-shims-common-secure/0.13.0-SNAPSHOT/hive-shims-common-secure-0.13.0-SNAPSHOT.jar
          [INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/shims/common-secure/pom.xml to /data/hive-ptest/working/maven/org/apache/hive/shims/hive-shims-common-secure/0.13.0-SNAPSHOT/hive-shims-common-secure-0.13.0-SNAPSHOT.pom
          [INFO]                                                                         
          [INFO] ------------------------------------------------------------------------
          [INFO] Building Hive Shims 0.20S 0.13.0-SNAPSHOT
          [INFO] ------------------------------------------------------------------------
          [INFO] 
          [INFO] --- maven-clean-plugin:2.5:clean (default-clean) @ hive-shims-0.20S ---
          [INFO] Deleting /data/hive-ptest/working/apache-svn-trunk-source/shims/0.20S (includes = [datanucleus.log, derby.log], excludes = [])
          [INFO] 
          [INFO] --- maven-resources-plugin:2.5:resources (default-resources) @ hive-shims-0.20S ---
          [debug] execute contextualize
          [INFO] Using 'UTF-8' encoding to copy filtered resources.
          [INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-trunk-source/shims/0.20S/src/main/resources
          [INFO] 
          [INFO] --- maven-antrun-plugin:1.7:run (define-classpath) @ hive-shims-0.20S ---
          [INFO] Executing tasks
          
          main:
          [INFO] Executed tasks
          [INFO] 
          [INFO] --- maven-compiler-plugin:3.1:compile (default-compile) @ hive-shims-0.20S ---
          [INFO] Compiling 3 source files to /data/hive-ptest/working/apache-svn-trunk-source/shims/0.20S/target/classes
          [WARNING] Note: /data/hive-ptest/working/apache-svn-trunk-source/shims/0.20S/src/main/java/org/apache/hadoop/hive/shims/Hadoop20SShims.java uses or overrides a deprecated API.
          [WARNING] Note: Recompile with -Xlint:deprecation for details.
          [INFO] 
          [INFO] --- maven-resources-plugin:2.5:testResources (default-testResources) @ hive-shims-0.20S ---
          [debug] execute contextualize
          [INFO] Using 'UTF-8' encoding to copy filtered resources.
          [INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-trunk-source/shims/0.20S/src/test/resources
          [INFO] 
          [INFO] --- maven-antrun-plugin:1.7:run (setup-test-dirs) @ hive-shims-0.20S ---
          [INFO] Executing tasks
          
          main:
              [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/shims/0.20S/target/tmp
              [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/shims/0.20S/target/warehouse
              [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/shims/0.20S/target/tmp/conf
               [copy] Copying 4 files to /data/hive-ptest/working/apache-svn-trunk-source/shims/0.20S/target/tmp/conf
          [INFO] Executed tasks
          [INFO] 
          [INFO] --- maven-compiler-plugin:3.1:testCompile (default-testCompile) @ hive-shims-0.20S ---
          [INFO] No sources to compile
          [INFO] 
          [INFO] --- maven-surefire-plugin:2.16:test (default-test) @ hive-shims-0.20S ---
          [INFO] Tests are skipped.
          [INFO] 
          [INFO] --- maven-jar-plugin:2.2:jar (default-jar) @ hive-shims-0.20S ---
          [INFO] Building jar: /data/hive-ptest/working/apache-svn-trunk-source/shims/0.20S/target/hive-shims-0.20S-0.13.0-SNAPSHOT.jar
          [INFO] 
          [INFO] --- maven-install-plugin:2.4:install (default-install) @ hive-shims-0.20S ---
          [INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/shims/0.20S/target/hive-shims-0.20S-0.13.0-SNAPSHOT.jar to /data/hive-ptest/working/maven/org/apache/hive/shims/hive-shims-0.20S/0.13.0-SNAPSHOT/hive-shims-0.20S-0.13.0-SNAPSHOT.jar
          [INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/shims/0.20S/pom.xml to /data/hive-ptest/working/maven/org/apache/hive/shims/hive-shims-0.20S/0.13.0-SNAPSHOT/hive-shims-0.20S-0.13.0-SNAPSHOT.pom
          [INFO]                                                                         
          [INFO] ------------------------------------------------------------------------
          [INFO] Building Hive Shims 0.23 0.13.0-SNAPSHOT
          [INFO] ------------------------------------------------------------------------
          [INFO] 
          [INFO] --- maven-clean-plugin:2.5:clean (default-clean) @ hive-shims-0.23 ---
          [INFO] Deleting /data/hive-ptest/working/apache-svn-trunk-source/shims/0.23 (includes = [datanucleus.log, derby.log], excludes = [])
          [INFO] 
          [INFO] --- maven-resources-plugin:2.5:resources (default-resources) @ hive-shims-0.23 ---
          [debug] execute contextualize
          [INFO] Using 'UTF-8' encoding to copy filtered resources.
          [INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-trunk-source/shims/0.23/src/main/resources
          [INFO] 
          [INFO] --- maven-antrun-plugin:1.7:run (define-classpath) @ hive-shims-0.23 ---
          [INFO] Executing tasks
          
          main:
          [INFO] Executed tasks
          [INFO] 
          [INFO] --- maven-compiler-plugin:3.1:compile (default-compile) @ hive-shims-0.23 ---
          [INFO] Compiling 3 source files to /data/hive-ptest/working/apache-svn-trunk-source/shims/0.23/target/classes
          [WARNING] Note: /data/hive-ptest/working/apache-svn-trunk-source/shims/0.23/src/main/java/org/apache/hadoop/hive/shims/Hadoop23Shims.java uses or overrides a deprecated API.
          [WARNING] Note: Recompile with -Xlint:deprecation for details.
          [INFO] 
          [INFO] --- maven-resources-plugin:2.5:testResources (default-testResources) @ hive-shims-0.23 ---
          [debug] execute contextualize
          [INFO] Using 'UTF-8' encoding to copy filtered resources.
          [INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-trunk-source/shims/0.23/src/test/resources
          [INFO] 
          [INFO] --- maven-antrun-plugin:1.7:run (setup-test-dirs) @ hive-shims-0.23 ---
          [INFO] Executing tasks
          
          main:
              [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/shims/0.23/target/tmp
              [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/shims/0.23/target/warehouse
              [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/shims/0.23/target/tmp/conf
               [copy] Copying 4 files to /data/hive-ptest/working/apache-svn-trunk-source/shims/0.23/target/tmp/conf
          [INFO] Executed tasks
          [INFO] 
          [INFO] --- maven-compiler-plugin:3.1:testCompile (default-testCompile) @ hive-shims-0.23 ---
          [INFO] No sources to compile
          [INFO] 
          [INFO] --- maven-surefire-plugin:2.16:test (default-test) @ hive-shims-0.23 ---
          [INFO] Tests are skipped.
          [INFO] 
          [INFO] --- maven-jar-plugin:2.2:jar (default-jar) @ hive-shims-0.23 ---
          [INFO] Building jar: /data/hive-ptest/working/apache-svn-trunk-source/shims/0.23/target/hive-shims-0.23-0.13.0-SNAPSHOT.jar
          [INFO] 
          [INFO] --- maven-install-plugin:2.4:install (default-install) @ hive-shims-0.23 ---
          [INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/shims/0.23/target/hive-shims-0.23-0.13.0-SNAPSHOT.jar to /data/hive-ptest/working/maven/org/apache/hive/shims/hive-shims-0.23/0.13.0-SNAPSHOT/hive-shims-0.23-0.13.0-SNAPSHOT.jar
          [INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/shims/0.23/pom.xml to /data/hive-ptest/working/maven/org/apache/hive/shims/hive-shims-0.23/0.13.0-SNAPSHOT/hive-shims-0.23-0.13.0-SNAPSHOT.pom
          [INFO]                                                                         
          [INFO] ------------------------------------------------------------------------
          [INFO] Building Hive Shims 0.13.0-SNAPSHOT
          [INFO] ------------------------------------------------------------------------
          [INFO] 
          [INFO] --- maven-clean-plugin:2.5:clean (default-clean) @ hive-shims ---
          [INFO] Deleting /data/hive-ptest/working/apache-svn-trunk-source/shims/assembly (includes = [datanucleus.log, derby.log], excludes = [])
          [INFO] 
          [INFO] --- maven-resources-plugin:2.5:resources (default-resources) @ hive-shims ---
          [debug] execute contextualize
          [INFO] Using 'UTF-8' encoding to copy filtered resources.
          [INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-trunk-source/shims/assembly/src/main/resources
          [INFO] 
          [INFO] --- maven-antrun-plugin:1.7:run (define-classpath) @ hive-shims ---
          [INFO] Executing tasks
          
          main:
          [INFO] Executed tasks
          [INFO] 
          [INFO] --- maven-compiler-plugin:3.1:compile (default-compile) @ hive-shims ---
          [INFO] No sources to compile
          [INFO] 
          [INFO] --- maven-resources-plugin:2.5:testResources (default-testResources) @ hive-shims ---
          [debug] execute contextualize
          [INFO] Using 'UTF-8' encoding to copy filtered resources.
          [INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-trunk-source/shims/assembly/src/test/resources
          [INFO] 
          [INFO] --- maven-antrun-plugin:1.7:run (setup-test-dirs) @ hive-shims ---
          [INFO] Executing tasks
          
          main:
              [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/shims/assembly/target/tmp
              [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/shims/assembly/target/warehouse
              [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/shims/assembly/target/tmp/conf
               [copy] Copying 4 files to /data/hive-ptest/working/apache-svn-trunk-source/shims/assembly/target/tmp/conf
          [INFO] Executed tasks
          [INFO] 
          [INFO] --- maven-compiler-plugin:3.1:testCompile (default-testCompile) @ hive-shims ---
          [INFO] No sources to compile
          [INFO] 
          [INFO] --- maven-surefire-plugin:2.16:test (default-test) @ hive-shims ---
          [INFO] Tests are skipped.
          [INFO] 
          [INFO] --- maven-jar-plugin:2.2:jar (default-jar) @ hive-shims ---
          [WARNING] JAR will be empty - no content was marked for inclusion!
          [INFO] Building jar: /data/hive-ptest/working/apache-svn-trunk-source/shims/assembly/target/hive-shims-0.13.0-SNAPSHOT.jar
          [INFO] 
          [INFO] --- maven-assembly-plugin:2.3:single (uberjar) @ hive-shims ---
          [INFO] Reading assembly descriptor: src/assemble/uberjar.xml
          [WARNING] Artifact: org.apache.hive:hive-shims:jar:0.13.0-SNAPSHOT references the same file as the assembly destination file. Moving it to a temporary location for inclusion.
          [INFO] META-INF/MANIFEST.MF already added, skipping
          [INFO] META-INF/MANIFEST.MF already added, skipping
          [INFO] META-INF/MANIFEST.MF already added, skipping
          [INFO] META-INF/MANIFEST.MF already added, skipping
          [INFO] META-INF/MANIFEST.MF already added, skipping
          [INFO] Building jar: /data/hive-ptest/working/apache-svn-trunk-source/shims/assembly/target/hive-shims-0.13.0-SNAPSHOT.jar
          [INFO] META-INF/MANIFEST.MF already added, skipping
          [INFO] META-INF/MANIFEST.MF already added, skipping
          [INFO] META-INF/MANIFEST.MF already added, skipping
          [INFO] META-INF/MANIFEST.MF already added, skipping
          [INFO] META-INF/MANIFEST.MF already added, skipping
          [WARNING] Configuration options: 'appendAssemblyId' is set to false, and 'classifier' is missing.
          Instead of attaching the assembly file: /data/hive-ptest/working/apache-svn-trunk-source/shims/assembly/target/hive-shims-0.13.0-SNAPSHOT.jar, it will become the file for main project artifact.
          NOTE: If multiple descriptors or descriptor-formats are provided for this project, the value of this file will be non-deterministic!
          [WARNING] Replacing pre-existing project main-artifact file: /data/hive-ptest/working/apache-svn-trunk-source/shims/assembly/target/archive-tmp/hive-shims-0.13.0-SNAPSHOT.jar
          with assembly file: /data/hive-ptest/working/apache-svn-trunk-source/shims/assembly/target/hive-shims-0.13.0-SNAPSHOT.jar
          [INFO] 
          [INFO] --- maven-install-plugin:2.4:install (default-install) @ hive-shims ---
          [INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/shims/assembly/target/hive-shims-0.13.0-SNAPSHOT.jar to /data/hive-ptest/working/maven/org/apache/hive/hive-shims/0.13.0-SNAPSHOT/hive-shims-0.13.0-SNAPSHOT.jar
          [INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/shims/assembly/pom.xml to /data/hive-ptest/working/maven/org/apache/hive/hive-shims/0.13.0-SNAPSHOT/hive-shims-0.13.0-SNAPSHOT.pom
          [INFO]                                                                         
          [INFO] ------------------------------------------------------------------------
          [INFO] Building Hive Common 0.13.0-SNAPSHOT
          [INFO] ------------------------------------------------------------------------
          [INFO] 
          [INFO] --- maven-clean-plugin:2.5:clean (default-clean) @ hive-common ---
          [INFO] Deleting /data/hive-ptest/working/apache-svn-trunk-source/common (includes = [datanucleus.log, derby.log], excludes = [])
          [INFO] 
          [INFO] --- maven-antrun-plugin:1.7:run (generate-version-annotation) @ hive-common ---
          [INFO] Executing tasks
          
          main:
          [INFO] Executed tasks
          [INFO] 
          [INFO] --- build-helper-maven-plugin:1.8:add-source (add-source) @ hive-common ---
          [INFO] Source directory: /data/hive-ptest/working/apache-svn-trunk-source/common/src/gen added.
          [INFO] 
          [INFO] --- maven-resources-plugin:2.5:resources (default-resources) @ hive-common ---
          [debug] execute contextualize
          [INFO] Using 'UTF-8' encoding to copy filtered resources.
          [INFO] Copying 1 resource
          [INFO] 
          [INFO] --- maven-antrun-plugin:1.7:run (define-classpath) @ hive-common ---
          [INFO] Executing tasks
          
          main:
          [INFO] Executed tasks
          [INFO] 
          [INFO] --- maven-compiler-plugin:3.1:compile (default-compile) @ hive-common ---
          [INFO] Compiling 31 source files to /data/hive-ptest/working/apache-svn-trunk-source/common/target/classes
          [WARNING] Note: /data/hive-ptest/working/apache-svn-trunk-source/common/src/java/org/apache/hadoop/hive/common/ObjectPair.java uses unchecked or unsafe operations.
          [WARNING] Note: Recompile with -Xlint:unchecked for details.
          [INFO] 
          [INFO] --- maven-resources-plugin:2.5:testResources (default-testResources) @ hive-common ---
          [debug] execute contextualize
          [INFO] Using 'UTF-8' encoding to copy filtered resources.
          [INFO] Copying 4 resources
          [INFO] 
          [INFO] --- maven-antrun-plugin:1.7:run (setup-test-dirs) @ hive-common ---
          [INFO] Executing tasks
          
          main:
              [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/common/target/tmp
              [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/common/target/warehouse
              [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/common/target/tmp/conf
               [copy] Copying 4 files to /data/hive-ptest/working/apache-svn-trunk-source/common/target/tmp/conf
          [INFO] Executed tasks
          [INFO] 
          [INFO] --- maven-compiler-plugin:3.1:testCompile (default-testCompile) @ hive-common ---
          [INFO] Compiling 8 source files to /data/hive-ptest/working/apache-svn-trunk-source/common/target/test-classes
          [INFO] 
          [INFO] --- maven-surefire-plugin:2.16:test (default-test) @ hive-common ---
          [INFO] Tests are skipped.
          [INFO] 
          [INFO] --- maven-jar-plugin:2.2:jar (default-jar) @ hive-common ---
          [INFO] Building jar: /data/hive-ptest/working/apache-svn-trunk-source/common/target/hive-common-0.13.0-SNAPSHOT.jar
          [INFO] 
          [INFO] --- maven-install-plugin:2.4:install (default-install) @ hive-common ---
          [INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/common/target/hive-common-0.13.0-SNAPSHOT.jar to /data/hive-ptest/working/maven/org/apache/hive/hive-common/0.13.0-SNAPSHOT/hive-common-0.13.0-SNAPSHOT.jar
          [INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/common/pom.xml to /data/hive-ptest/working/maven/org/apache/hive/hive-common/0.13.0-SNAPSHOT/hive-common-0.13.0-SNAPSHOT.pom
          [INFO]                                                                         
          [INFO] ------------------------------------------------------------------------
          [INFO] Building Hive Serde 0.13.0-SNAPSHOT
          [INFO] ------------------------------------------------------------------------
          [INFO] 
          [INFO] --- maven-clean-plugin:2.5:clean (default-clean) @ hive-serde ---
          [INFO] Deleting /data/hive-ptest/working/apache-svn-trunk-source/serde (includes = [datanucleus.log, derby.log], excludes = [])
          [INFO] 
          [INFO] --- build-helper-maven-plugin:1.8:add-source (add-source) @ hive-serde ---
          [INFO] Source directory: /data/hive-ptest/working/apache-svn-trunk-source/serde/src/gen/protobuf/gen-java added.
          [INFO] Source directory: /data/hive-ptest/working/apache-svn-trunk-source/serde/src/gen/thrift/gen-javabean added.
          [INFO] 
          [INFO] --- maven-resources-plugin:2.5:resources (default-resources) @ hive-serde ---
          [debug] execute contextualize
          [INFO] Using 'UTF-8' encoding to copy filtered resources.
          [INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-trunk-source/serde/src/main/resources
          [INFO] 
          [INFO] --- maven-antrun-plugin:1.7:run (define-classpath) @ hive-serde ---
          [INFO] Executing tasks
          
          main:
          [INFO] Executed tasks
          [INFO] 
          [INFO] --- maven-compiler-plugin:3.1:compile (default-compile) @ hive-serde ---
          [INFO] Compiling 351 source files to /data/hive-ptest/working/apache-svn-trunk-source/serde/target/classes
          [WARNING] Note: Some input files use or override a deprecated API.
          [WARNING] Note: Recompile with -Xlint:deprecation for details.
          [WARNING] Note: Some input files use unchecked or unsafe operations.
          [WARNING] Note: Recompile with -Xlint:unchecked for details.
          [INFO] 
          [INFO] --- maven-resources-plugin:2.5:testResources (default-testResources) @ hive-serde ---
          [debug] execute contextualize
          [INFO] Using 'UTF-8' encoding to copy filtered resources.
          [INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-trunk-source/serde/src/test/resources
          [INFO] 
          [INFO] --- maven-antrun-plugin:1.7:run (setup-test-dirs) @ hive-serde ---
          [INFO] Executing tasks
          
          main:
              [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/serde/target/tmp
              [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/serde/target/warehouse
              [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/serde/target/tmp/conf
               [copy] Copying 4 files to /data/hive-ptest/working/apache-svn-trunk-source/serde/target/tmp/conf
          [INFO] Executed tasks
          [INFO] 
          [INFO] --- maven-compiler-plugin:3.1:testCompile (default-testCompile) @ hive-serde ---
          [INFO] Compiling 41 source files to /data/hive-ptest/working/apache-svn-trunk-source/serde/target/test-classes
          [WARNING] Note: Some input files use or override a deprecated API.
          [WARNING] Note: Recompile with -Xlint:deprecation for details.
          [WARNING] Note: Some input files use unchecked or unsafe operations.
          [WARNING] Note: Recompile with -Xlint:unchecked for details.
          [INFO] 
          [INFO] --- maven-surefire-plugin:2.16:test (default-test) @ hive-serde ---
          [INFO] Tests are skipped.
          [INFO] 
          [INFO] --- maven-jar-plugin:2.2:jar (default-jar) @ hive-serde ---
          [INFO] Building jar: /data/hive-ptest/working/apache-svn-trunk-source/serde/target/hive-serde-0.13.0-SNAPSHOT.jar
          [INFO] 
          [INFO] --- maven-install-plugin:2.4:install (default-install) @ hive-serde ---
          [INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/serde/target/hive-serde-0.13.0-SNAPSHOT.jar to /data/hive-ptest/working/maven/org/apache/hive/hive-serde/0.13.0-SNAPSHOT/hive-serde-0.13.0-SNAPSHOT.jar
          [INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/serde/pom.xml to /data/hive-ptest/working/maven/org/apache/hive/hive-serde/0.13.0-SNAPSHOT/hive-serde-0.13.0-SNAPSHOT.pom
          [INFO]                                                                         
          [INFO] ------------------------------------------------------------------------
          [INFO] Building Hive Metastore 0.13.0-SNAPSHOT
          [INFO] ------------------------------------------------------------------------
          [INFO] 
          [INFO] --- maven-clean-plugin:2.5:clean (default-clean) @ hive-metastore ---
          [INFO] Deleting /data/hive-ptest/working/apache-svn-trunk-source/metastore (includes = [datanucleus.log, derby.log], excludes = [])
          [INFO] 
          [INFO] --- build-helper-maven-plugin:1.8:add-source (add-source) @ hive-metastore ---
          [INFO] Source directory: /data/hive-ptest/working/apache-svn-trunk-source/metastore/src/model added.
          [INFO] Source directory: /data/hive-ptest/working/apache-svn-trunk-source/metastore/src/gen/thrift/gen-javabean added.
          [INFO] 
          [INFO] --- antlr3-maven-plugin:3.4:antlr (default) @ hive-metastore ---
          [INFO] ANTLR: Processing source directory /data/hive-ptest/working/apache-svn-trunk-source/metastore/src/java
          ANTLR Parser Generator  Version 3.4
          org/apache/hadoop/hive/metastore/parser/Filter.g
          [INFO] 
          [INFO] --- maven-resources-plugin:2.5:resources (default-resources) @ hive-metastore ---
          [debug] execute contextualize
          [INFO] Using 'UTF-8' encoding to copy filtered resources.
          [INFO] Copying 1 resource
          [INFO] 
          [INFO] --- maven-antrun-plugin:1.7:run (define-classpath) @ hive-metastore ---
          [INFO] Executing tasks
          
          main:
          [INFO] Executed tasks
          [INFO] 
          [INFO] --- maven-compiler-plugin:3.1:compile (default-compile) @ hive-metastore ---
          [INFO] Compiling 132 source files to /data/hive-ptest/working/apache-svn-trunk-source/metastore/target/classes
          [WARNING] Note: Some input files use or override a deprecated API.
          [WARNING] Note: Recompile with -Xlint:deprecation for details.
          [WARNING] Note: Some input files use unchecked or unsafe operations.
          [WARNING] Note: Recompile with -Xlint:unchecked for details.
          [INFO] 
          [INFO] --- datanucleus-maven-plugin:3.3.0-release:enhance (default) @ hive-metastore ---
          [INFO] DataNucleus Enhancer (version 3.2.2) for API "JDO" using JRE "1.6"
          DataNucleus Enhancer : Classpath
          >>  /data/hive-ptest/working/maven/org/datanucleus/datanucleus-maven-plugin/3.3.0-release/datanucleus-maven-plugin-3.3.0-release.jar
          >>  /data/hive-ptest/working/maven/org/datanucleus/datanucleus-core/3.2.2/datanucleus-core-3.2.2.jar
          >>  /data/hive-ptest/working/maven/org/codehaus/plexus/plexus-utils/3.0.8/plexus-utils-3.0.8.jar
          >>  /data/hive-ptest/working/maven/org/codehaus/plexus/plexus-component-annotations/1.5.5/plexus-component-annotations-1.5.5.jar
          >>  /data/hive-ptest/working/maven/org/sonatype/sisu/sisu-inject-bean/2.3.0/sisu-inject-bean-2.3.0.jar
          >>  /data/hive-ptest/working/maven/org/sonatype/sisu/sisu-guice/3.1.0/sisu-guice-3.1.0-no_aop.jar
          >>  /data/hive-ptest/working/maven/org/sonatype/sisu/sisu-guava/0.9.9/sisu-guava-0.9.9.jar
          >>  /data/hive-ptest/working/maven/org/apache/xbean/xbean-reflect/3.4/xbean-reflect-3.4.jar
          >>  /data/hive-ptest/working/maven/log4j/log4j/1.2.12/log4j-1.2.12.jar
          >>  /data/hive-ptest/working/maven/commons-logging/commons-logging-api/1.1/commons-logging-api-1.1.jar
          >>  /data/hive-ptest/working/maven/com/google/collections/google-collections/1.0/google-collections-1.0.jar
          >>  /data/hive-ptest/working/maven/junit/junit/3.8.2/junit-3.8.2.jar
          >>  /data/hive-ptest/working/apache-svn-trunk-source/metastore/target/classes
          >>  /data/hive-ptest/working/apache-svn-trunk-source/serde/target/hive-serde-0.13.0-SNAPSHOT.jar
          >>  /data/hive-ptest/working/apache-svn-trunk-source/common/target/hive-common-0.13.0-SNAPSHOT.jar
          >>  /data/hive-ptest/working/maven/org/apache/commons/commons-compress/1.4.1/commons-compress-1.4.1.jar
          >>  /data/hive-ptest/working/maven/org/tukaani/xz/1.0/xz-1.0.jar
          >>  /data/hive-ptest/working/maven/commons-codec/commons-codec/1.4/commons-codec-1.4.jar
          >>  /data/hive-ptest/working/maven/org/apache/avro/avro/1.7.5/avro-1.7.5.jar
          >>  /data/hive-ptest/working/maven/org/codehaus/jackson/jackson-core-asl/1.9.2/jackson-core-asl-1.9.2.jar
          >>  /data/hive-ptest/working/maven/com/thoughtworks/paranamer/paranamer/2.3/paranamer-2.3.jar
          >>  /data/hive-ptest/working/maven/org/xerial/snappy/snappy-java/1.0.5/snappy-java-1.0.5.jar
          >>  /data/hive-ptest/working/apache-svn-trunk-source/shims/assembly/target/hive-shims-0.13.0-SNAPSHOT.jar
          >>  /data/hive-ptest/working/maven/com/google/guava/guava/11.0.2/guava-11.0.2.jar
          >>  /data/hive-ptest/working/maven/com/google/code/findbugs/jsr305/1.3.9/jsr305-1.3.9.jar
          >>  /data/hive-ptest/working/maven/commons-cli/commons-cli/1.2/commons-cli-1.2.jar
          >>  /data/hive-ptest/working/maven/commons-lang/commons-lang/2.4/commons-lang-2.4.jar
          >>  /data/hive-ptest/working/maven/commons-logging/commons-logging/1.1.3/commons-logging-1.1.3.jar
          >>  /data/hive-ptest/working/maven/org/apache/derby/derby/10.4.2.0/derby-10.4.2.0.jar
          >>  /data/hive-ptest/working/maven/org/datanucleus/datanucleus-api-jdo/3.2.1/datanucleus-api-jdo-3.2.1.jar
          >>  /data/hive-ptest/working/maven/org/datanucleus/datanucleus-rdbms/3.2.1/datanucleus-rdbms-3.2.1.jar
          >>  /data/hive-ptest/working/maven/javax/jdo/jdo-api/3.0.1/jdo-api-3.0.1.jar
          >>  /data/hive-ptest/working/maven/javax/transaction/jta/1.1/jta-1.1.jar
          >>  /data/hive-ptest/working/maven/org/antlr/antlr-runtime/3.4/antlr-runtime-3.4.jar
          >>  /data/hive-ptest/working/maven/org/antlr/stringtemplate/3.2.1/stringtemplate-3.2.1.jar
          >>  /data/hive-ptest/working/maven/antlr/antlr/2.7.7/antlr-2.7.7.jar
          >>  /data/hive-ptest/working/maven/org/apache/thrift/libfb303/0.9.0/libfb303-0.9.0.jar
          >>  /data/hive-ptest/working/maven/org/apache/thrift/libthrift/0.9.0/libthrift-0.9.0.jar
          >>  /data/hive-ptest/working/maven/org/apache/httpcomponents/httpclient/4.2.5/httpclient-4.2.5.jar
          >>  /data/hive-ptest/working/maven/org/apache/httpcomponents/httpcore/4.2.4/httpcore-4.2.4.jar
          >>  /data/hive-ptest/working/maven/org/apache/hadoop/hadoop-core/1.2.1/hadoop-core-1.2.1.jar
          >>  /data/hive-ptest/working/maven/xmlenc/xmlenc/0.52/xmlenc-0.52.jar
          >>  /data/hive-ptest/working/maven/com/sun/jersey/jersey-core/1.14/jersey-core-1.14.jar
          >>  /data/hive-ptest/working/maven/com/sun/jersey/jersey-json/1.14/jersey-json-1.14.jar
          >>  /data/hive-ptest/working/maven/org/codehaus/jettison/jettison/1.1/jettison-1.1.jar
          >>  /data/hive-ptest/working/maven/stax/stax-api/1.0.1/stax-api-1.0.1.jar
          >>  /data/hive-ptest/working/maven/com/sun/xml/bind/jaxb-impl/2.2.3-1/jaxb-impl-2.2.3-1.jar
          >>  /data/hive-ptest/working/maven/javax/xml/bind/jaxb-api/2.2.2/jaxb-api-2.2.2.jar
          >>  /data/hive-ptest/working/maven/javax/xml/stream/stax-api/1.0-2/stax-api-1.0-2.jar
          >>  /data/hive-ptest/working/maven/javax/activation/activation/1.1/activation-1.1.jar
          >>  /data/hive-ptest/working/maven/org/codehaus/jackson/jackson-jaxrs/1.9.2/jackson-jaxrs-1.9.2.jar
          >>  /data/hive-ptest/working/maven/org/codehaus/jackson/jackson-xc/1.9.2/jackson-xc-1.9.2.jar
          >>  /data/hive-ptest/working/maven/com/sun/jersey/jersey-server/1.14/jersey-server-1.14.jar
          >>  /data/hive-ptest/working/maven/asm/asm/3.1/asm-3.1.jar
          >>  /data/hive-ptest/working/maven/commons-io/commons-io/2.4/commons-io-2.4.jar
          >>  /data/hive-ptest/working/maven/commons-httpclient/commons-httpclient/3.0.1/commons-httpclient-3.0.1.jar
          >>  /data/hive-ptest/working/maven/org/apache/commons/commons-math/2.1/commons-math-2.1.jar
          >>  /data/hive-ptest/working/maven/commons-configuration/commons-configuration/1.6/commons-configuration-1.6.jar
          >>  /data/hive-ptest/working/maven/commons-collections/commons-collections/3.2.1/commons-collections-3.2.1.jar
          >>  /data/hive-ptest/working/maven/commons-digester/commons-digester/1.8/commons-digester-1.8.jar
          >>  /data/hive-ptest/working/maven/commons-beanutils/commons-beanutils/1.7.0/commons-beanutils-1.7.0.jar
          >>  /data/hive-ptest/working/maven/commons-beanutils/commons-beanutils-core/1.8.0/commons-beanutils-core-1.8.0.jar
          >>  /data/hive-ptest/working/maven/commons-net/commons-net/1.4.1/commons-net-1.4.1.jar
          >>  /data/hive-ptest/working/maven/org/mortbay/jetty/jetty/6.1.26/jetty-6.1.26.jar
          >>  /data/hive-ptest/working/maven/org/mortbay/jetty/servlet-api/2.5-20081211/servlet-api-2.5-20081211.jar
          >>  /data/hive-ptest/working/maven/org/mortbay/jetty/jetty-util/6.1.26/jetty-util-6.1.26.jar
          >>  /data/hive-ptest/working/maven/tomcat/jasper-runtime/5.5.12/jasper-runtime-5.5.12.jar
          >>  /data/hive-ptest/working/maven/tomcat/jasper-compiler/5.5.12/jasper-compiler-5.5.12.jar
          >>  /data/hive-ptest/working/maven/org/mortbay/jetty/jsp-api-2.1/6.1.14/jsp-api-2.1-6.1.14.jar
          >>  /data/hive-ptest/working/maven/org/mortbay/jetty/servlet-api-2.5/6.1.14/servlet-api-2.5-6.1.14.jar
          >>  /data/hive-ptest/working/maven/org/mortbay/jetty/jsp-2.1/6.1.14/jsp-2.1-6.1.14.jar
          >>  /data/hive-ptest/working/maven/ant/ant/1.6.5/ant-1.6.5.jar
          >>  /data/hive-ptest/working/maven/commons-el/commons-el/1.0/commons-el-1.0.jar
          >>  /data/hive-ptest/working/maven/net/java/dev/jets3t/jets3t/0.6.1/jets3t-0.6.1.jar
          >>  /data/hive-ptest/working/maven/hsqldb/hsqldb/1.8.0.10/hsqldb-1.8.0.10.jar
          >>  /data/hive-ptest/working/maven/oro/oro/2.0.8/oro-2.0.8.jar
          >>  /data/hive-ptest/working/maven/org/eclipse/jdt/core/3.1.1/core-3.1.1.jar
          >>  /data/hive-ptest/working/maven/org/codehaus/jackson/jackson-mapper-asl/1.9.2/jackson-mapper-asl-1.9.2.jar
          >>  /data/hive-ptest/working/maven/org/slf4j/slf4j-api/1.7.5/slf4j-api-1.7.5.jar
          >>  /data/hive-ptest/working/maven/org/slf4j/slf4j-log4j12/1.7.5/slf4j-log4j12-1.7.5.jar
          >>  /data/hive-ptest/working/maven/log4j/log4j/1.2.16/log4j-1.2.16.jar
          ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MDatabase
          ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MFieldSchema
          ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MType
          ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MTable
          ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MSerDeInfo
          ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MOrder
          ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MColumnDescriptor
          ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MStringList
          ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MStorageDescriptor
          ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MPartition
          ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MIndex
          ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MRole
          ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MRoleMap
          ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MGlobalPrivilege
          ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MDBPrivilege
          ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MTablePrivilege
          ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MPartitionPrivilege
          ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MTableColumnPrivilege
          ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MPartitionColumnPrivilege
          ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MPartitionEvent
          ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MMasterKey
          ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MDelegationToken
          ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MTableColumnStatistics
          ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MPartitionColumnStatistics
          ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MVersionTable
          DataNucleus Enhancer completed with success for 25 classes. Timings : input=607 ms, enhance=944 ms, total=1551 ms. Consult the log for full details
          
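(Aside: the "ENHANCED (PersistenceCapable)" lines above are the DataNucleus build-time enhancer instrumenting the metastore's JDO model classes so the object store can persist them. As a minimal sketch only — the class and field names below are hypothetical, and the real Hive model classes declare their persistence metadata in JDO metadata files (package.jdo) rather than via annotations — an annotation-based class that the enhancer would process looks roughly like this:)

    import javax.jdo.annotations.IdGeneratorStrategy;
    import javax.jdo.annotations.PersistenceCapable;
    import javax.jdo.annotations.Persistent;
    import javax.jdo.annotations.PrimaryKey;

    // Hypothetical model class, analogous in spirit to the MDatabase/MTable
    // classes listed in the log. The enhancer rewrites the bytecode of such
    // classes at build time so that they implement
    // javax.jdo.spi.PersistenceCapable, which is what each "ENHANCED" line
    // is reporting.
    @PersistenceCapable
    public class MExampleDatabase {

      @PrimaryKey
      @Persistent(valueStrategy = IdGeneratorStrategy.NATIVE)
      private long id;            // surrogate key generated by the datastore

      @Persistent
      private String name;        // database name

      @Persistent
      private String locationUri; // warehouse location for the database

      public String getName() { return name; }
      public void setName(String name) { this.name = name; }

      public String getLocationUri() { return locationUri; }
      public void setLocationUri(String locationUri) { this.locationUri = locationUri; }
    }
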
          [INFO] 
          [INFO] --- maven-resources-plugin:2.5:testResources (default-testResources) @ hive-metastore ---
          [debug] execute contextualize
          [INFO] Using 'UTF-8' encoding to copy filtered resources.
          [INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-trunk-source/metastore/src/test/resources
          [INFO] 
          [INFO] --- maven-antrun-plugin:1.7:run (setup-test-dirs) @ hive-metastore ---
          [INFO] Executing tasks
          
          main:
              [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/metastore/target/tmp
              [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/metastore/target/warehouse
              [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/metastore/target/tmp/conf
               [copy] Copying 4 files to /data/hive-ptest/working/apache-svn-trunk-source/metastore/target/tmp/conf
          [INFO] Executed tasks
          [INFO] 
          [INFO] --- maven-compiler-plugin:3.1:testCompile (default-testCompile) @ hive-metastore ---
          [INFO] Compiling 10 source files to /data/hive-ptest/working/apache-svn-trunk-source/metastore/target/test-classes
          [INFO] 
          [INFO] --- maven-surefire-plugin:2.16:test (default-test) @ hive-metastore ---
          [INFO] Tests are skipped.
          [INFO] 
          [INFO] --- maven-jar-plugin:2.2:jar (default-jar) @ hive-metastore ---
          [INFO] Building jar: /data/hive-ptest/working/apache-svn-trunk-source/metastore/target/hive-metastore-0.13.0-SNAPSHOT.jar
          [INFO] 
          [INFO] --- maven-jar-plugin:2.2:test-jar (default) @ hive-metastore ---
          [INFO] Building jar: /data/hive-ptest/working/apache-svn-trunk-source/metastore/target/hive-metastore-0.13.0-SNAPSHOT-tests.jar
          [INFO] 
          [INFO] --- maven-install-plugin:2.4:install (default-install) @ hive-metastore ---
          [INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/metastore/target/hive-metastore-0.13.0-SNAPSHOT.jar to /data/hive-ptest/working/maven/org/apache/hive/hive-metastore/0.13.0-SNAPSHOT/hive-metastore-0.13.0-SNAPSHOT.jar
          [INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/metastore/pom.xml to /data/hive-ptest/working/maven/org/apache/hive/hive-metastore/0.13.0-SNAPSHOT/hive-metastore-0.13.0-SNAPSHOT.pom
          [INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/metastore/target/hive-metastore-0.13.0-SNAPSHOT-tests.jar to /data/hive-ptest/working/maven/org/apache/hive/hive-metastore/0.13.0-SNAPSHOT/hive-metastore-0.13.0-SNAPSHOT-tests.jar
          [INFO]                                                                         
          [INFO] ------------------------------------------------------------------------
          [INFO] Building Hive Query Language 0.13.0-SNAPSHOT
          [INFO] ------------------------------------------------------------------------
          [INFO] 
          [INFO] --- maven-clean-plugin:2.5:clean (default-clean) @ hive-exec ---
          [INFO] Deleting /data/hive-ptest/working/apache-svn-trunk-source/ql (includes = [datanucleus.log, derby.log], excludes = [])
          [INFO] 
          [INFO] --- maven-antrun-plugin:1.7:run (generate-sources) @ hive-exec ---
          [INFO] Executing tasks
          
          main:
              [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/ql/target/generated-sources/java/org/apache/hadoop/hive/ql/exec/vector/expressions/gen
              [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/ql/target/generated-sources/java/org/apache/hadoop/hive/ql/exec/vector/expressions/aggregates/gen
              [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/ql/target/generated-test-sources/java/org/apache/hadoop/hive/ql/exec/vector/expressions/gen
          Generating vector expression code
          Generating vector expression test code
          [INFO] Executed tasks
          [INFO] 
          [INFO] --- build-helper-maven-plugin:1.8:add-source (add-source) @ hive-exec ---
          [INFO] Source directory: /data/hive-ptest/working/apache-svn-trunk-source/ql/src/gen/protobuf/gen-java added.
          [INFO] Source directory: /data/hive-ptest/working/apache-svn-trunk-source/ql/src/gen/thrift/gen-javabean added.
          [INFO] Source directory: /data/hive-ptest/working/apache-svn-trunk-source/ql/target/generated-sources/java added.
          [INFO] 
          [INFO] --- antlr3-maven-plugin:3.4:antlr (default) @ hive-exec ---
          [INFO] ANTLR: Processing source directory /data/hive-ptest/working/apache-svn-trunk-source/ql/src/java
          ANTLR Parser Generator  Version 3.4
          org/apache/hadoop/hive/ql/parse/HiveLexer.g
          org/apache/hadoop/hive/ql/parse/HiveParser.g
          warning(200): org/apache/hadoop/hive/ql/parse/HiveParser.g:872:5: 
          Decision can match input such as "Identifier KW_RENAME KW_TO" using multiple alternatives: 1, 10
          
          As a result, alternative(s) 10 were disabled for that input
          warning(200): org/apache/hadoop/hive/ql/parse/HiveParser.g:1177:5: 
          Decision can match input such as "KW_TEXTFILE" using multiple alternatives: 2, 6
          
          As a result, alternative(s) 6 were disabled for that input
          warning(200): org/apache/hadoop/hive/ql/parse/HiveParser.g:1177:5: 
          Decision can match input such as "KW_SEQUENCEFILE" using multiple alternatives: 1, 6
          
          As a result, alternative(s) 6 were disabled for that input
          warning(200): org/apache/hadoop/hive/ql/parse/HiveParser.g:1177:5: 
          Decision can match input such as "KW_ORCFILE" using multiple alternatives: 4, 6
          
          As a result, alternative(s) 6 were disabled for that input
          warning(200): org/apache/hadoop/hive/ql/parse/HiveParser.g:1177:5: 
          Decision can match input such as "KW_RCFILE" using multiple alternatives: 3, 6
          
          As a result, alternative(s) 6 were disabled for that input
          warning(200): org/apache/hadoop/hive/ql/parse/HiveParser.g:1190:23: 
          Decision can match input such as "KW_ELEM_TYPE" using multiple alternatives: 1, 4
          
          As a result, alternative(s) 4 were disabled for that input
          warning(200): org/apache/hadoop/hive/ql/parse/HiveParser.g:1190:23: 
          Decision can match input such as "KW_KEY_TYPE" using multiple alternatives: 2, 4
          
          As a result, alternative(s) 4 were disabled for that input
          warning(200): org/apache/hadoop/hive/ql/parse/HiveParser.g:1190:23: 
          Decision can match input such as "KW_VALUE_TYPE" using multiple alternatives: 3, 4
          
          As a result, alternative(s) 4 were disabled for that input
          warning(200): org/apache/hadoop/hive/ql/parse/HiveParser.g:1197:23: 
          Decision can match input such as "KW_ELEM_TYPE" using multiple alternatives: 1, 4
          
          As a result, alternative(s) 4 were disabled for that input
          warning(200): org/apache/hadoop/hive/ql/parse/HiveParser.g:1197:23: 
          Decision can match input such as "KW_VALUE_TYPE" using multiple alternatives: 3, 4
          
          As a result, alternative(s) 4 were disabled for that input
          warning(200): org/apache/hadoop/hive/ql/parse/HiveParser.g:1197:23: 
          Decision can match input such as "KW_KEY_TYPE" using multiple alternatives: 2, 4
          
          As a result, alternative(s) 4 were disabled for that input
          warning(200): org/apache/hadoop/hive/ql/parse/HiveParser.g:1215:29: 
          Decision can match input such as "KW_PRETTY {KW_ADD..KW_AFTER, KW_ALTER..KW_ANALYZE, KW_ARCHIVE..KW_CASCADE, KW_CHANGE, KW_CLUSTER..KW_COLLECTION, KW_COLUMNS..KW_CREATE, KW_CUBE, KW_CURSOR..KW_DATA, KW_DATABASES..KW_DISABLE, KW_DISTRIBUTE..KW_ELEM_TYPE, KW_ENABLE, KW_ESCAPED, KW_EXCLUSIVE..KW_EXPORT, KW_EXTERNAL..KW_FLOAT, KW_FOR..KW_FORMATTED, KW_FULL, KW_FUNCTIONS..KW_GROUPING, KW_HOLD_DDLTIME..KW_IDXPROPERTIES, KW_IGNORE..KW_ITEMS, KW_KEYS..KW_LEFT, KW_LIKE..KW_LONG, KW_MAPJOIN..KW_MINUS, KW_MSCK..KW_NOSCAN, KW_NO_DROP..KW_OFFLINE, KW_OPTION, KW_ORCFILE..KW_OUTPUTFORMAT, KW_OVERWRITE, KW_PARTITIONED..KW_PLUS, KW_PRETTY..KW_RECORDWRITER, KW_REGEXP..KW_SCHEMAS, KW_SEMI..KW_TABLES, KW_TBLPROPERTIES..KW_TEXTFILE, KW_TIMESTAMP..KW_TOUCH, KW_TRIGGER..KW_UNARCHIVE, KW_UNDO..KW_UNIONTYPE, KW_UNLOCK..KW_VALUE_TYPE, KW_VIEW, KW_WHILE, KW_WITH}" using multiple alternatives: 3, 4
          
          As a result, alternative(s) 4 were disabled for that input
          warning(200): org/apache/hadoop/hive/ql/parse/HiveParser.g:1215:29: 
          Decision can match input such as "KW_FORMATTED {KW_ADD..KW_AFTER, KW_ALTER..KW_ANALYZE, KW_ARCHIVE..KW_CASCADE, KW_CHANGE, KW_CLUSTER..KW_COLLECTION, KW_COLUMNS..KW_CREATE, KW_CUBE, KW_CURSOR..KW_DATA, KW_DATABASES..KW_DISABLE, KW_DISTRIBUTE..KW_ELEM_TYPE, KW_ENABLE, KW_ESCAPED, KW_EXCLUSIVE..KW_EXPORT, KW_EXTERNAL..KW_FLOAT, KW_FOR..KW_FORMATTED, KW_FULL, KW_FUNCTIONS..KW_GROUPING, KW_HOLD_DDLTIME..KW_IDXPROPERTIES, KW_IGNORE..KW_ITEMS, KW_KEYS..KW_LEFT, KW_LIKE..KW_LONG, KW_MAPJOIN..KW_MINUS, KW_MSCK..KW_NOSCAN, KW_NO_DROP..KW_OFFLINE, KW_OPTION, KW_ORCFILE..KW_OUTPUTFORMAT, KW_OVERWRITE, KW_PARTITIONED..KW_PLUS, KW_PRETTY..KW_RECORDWRITER, KW_REGEXP..KW_SCHEMAS, KW_SEMI..KW_TABLES, KW_TBLPROPERTIES..KW_TEXTFILE, KW_TIMESTAMP..KW_TOUCH, KW_TRIGGER..KW_UNARCHIVE, KW_UNDO..KW_UNIONTYPE, KW_UNLOCK..KW_VALUE_TYPE, KW_VIEW, KW_WHILE, KW_WITH}" using multiple alternatives: 1, 4
          
          As a result, alternative(s) 4 were disabled for that input
          warning(200): org/apache/hadoop/hive/ql/parse/HiveParser.g:1215:29: 
          Decision can match input such as "KW_PRETTY Identifier" using multiple alternatives: 3, 4
          
          As a result, alternative(s) 4 were disabled for that input
          warning(200): org/apache/hadoop/hive/ql/parse/HiveParser.g:1215:29: 
          Decision can match input such as "KW_FORMATTED Identifier" using multiple alternatives: 1, 4
          
          As a result, alternative(s) 4 were disabled for that input
          warning(200): org/apache/hadoop/hive/ql/parse/HiveParser.g:1215:29: 
          Decision can match input such as "KW_PRETTY KW_PARTITION" using multiple alternatives: 3, 4
          
          As a result, alternative(s) 4 were disabled for that input
          warning(200): org/apache/hadoop/hive/ql/parse/HiveParser.g:1215:29: 
          Decision can match input such as "KW_FORMATTED KW_PARTITION" using multiple alternatives: 1, 4
          
          As a result, alternative(s) 4 were disabled for that input
          warning(200): org/apache/hadoop/hive/ql/parse/HiveParser.g:1486:116: 
          Decision can match input such as "KW_STORED KW_AS KW_DIRECTORIES" using multiple alternatives: 1, 2
          
          As a result, alternative(s) 2 were disabled for that input
          warning(200): org/apache/hadoop/hive/ql/parse/HiveParser.g:1609:5: 
          Decision can match input such as "KW_STORED KW_AS KW_RCFILE" using multiple alternatives: 3, 7
          
          As a result, alternative(s) 7 were disabled for that input
          warning(200): org/apache/hadoop/hive/ql/parse/HiveParser.g:1609:5: 
          Decision can match input such as "KW_STORED KW_AS KW_SEQUENCEFILE" using multiple alternatives: 1, 7
          
          As a result, alternative(s) 7 were disabled for that input
          warning(200): org/apache/hadoop/hive/ql/parse/HiveParser.g:1609:5: 
          Decision can match input such as "KW_STORED KW_AS KW_ORCFILE" using multiple alternatives: 4, 7
          
          As a result, alternative(s) 7 were disabled for that input
          warning(200): org/apache/hadoop/hive/ql/parse/HiveParser.g:1609:5: 
          Decision can match input such as "KW_STORED KW_AS KW_TEXTFILE" using multiple alternatives: 2, 7
          
          As a result, alternative(s) 7 were disabled for that input
          warning(200): org/apache/hadoop/hive/ql/parse/HiveParser.g:1609:5: 
          Decision can match input such as "KW_STORED KW_AS KW_INPUTFORMAT" using multiple alternatives: 5, 7
          
          As a result, alternative(s) 7 were disabled for that input
          warning(200): SelectClauseParser.g:149:5: 
          Decision can match input such as "KW_NULL DOT {KW_ADD..KW_AFTER, KW_ALTER..KW_ANALYZE, KW_ARCHIVE..KW_CASCADE, KW_CHANGE, KW_CLUSTER..KW_COLLECTION, KW_COLUMNS..KW_CREATE, KW_CUBE, KW_CURSOR..KW_DATA, KW_DATABASES..KW_DISABLE, KW_DISTRIBUTE..KW_ELEM_TYPE, KW_ENABLE, KW_ESCAPED, KW_EXCLUSIVE..KW_EXPORT, KW_EXTERNAL..KW_FLOAT, KW_FOR..KW_FORMATTED, KW_FULL, KW_FUNCTIONS..KW_GROUPING, KW_HOLD_DDLTIME..KW_IDXPROPERTIES, KW_IGNORE..KW_ITEMS, KW_KEYS..KW_LEFT, KW_LIKE..KW_LONG, KW_MAPJOIN..KW_MINUS, KW_MSCK..KW_NOSCAN, KW_NO_DROP..KW_OFFLINE, KW_OPTION, KW_ORCFILE..KW_OUTPUTFORMAT, KW_OVERWRITE, KW_PARTITION..KW_PLUS, KW_PRETTY..KW_RECORDWRITER, KW_REGEXP..KW_SCHEMAS, KW_SEMI..KW_TABLES, KW_TBLPROPERTIES..KW_TEXTFILE, KW_TIMESTAMP..KW_TOUCH, KW_TRIGGER..KW_UNARCHIVE, KW_UNDO..KW_UNIONTYPE, KW_UNLOCK..KW_VALUE_TYPE, KW_VIEW, KW_WHILE, KW_WITH}" using multiple alternatives: 1, 2
          
          As a result, alternative(s) 2 were disabled for that input
          warning(200): SelectClauseParser.g:149:5: 
          Decision can match input such as "KW_NULL DOT Identifier" using multiple alternatives: 1, 2
          
          As a result, alternative(s) 2 were disabled for that input
          warning(200): FromClauseParser.g:127:2: 
          Decision can match input such as "KW_LATERAL KW_VIEW KW_OUTER" using multiple alternatives: 1, 2
          
          As a result, alternative(s) 2 were disabled for that input
          warning(200): FromClauseParser.g:179:25: 
          Decision can match input such as "LPAREN StringLiteral EQUAL" using multiple alternatives: 1, 2
          
          As a result, alternative(s) 2 were disabled for that input
          warning(200): FromClauseParser.g:179:25: 
          Decision can match input such as "LPAREN StringLiteral COMMA" using multiple alternatives: 1, 2
          
          As a result, alternative(s) 2 were disabled for that input
          warning(200): FromClauseParser.g:179:25: 
          Decision can match input such as "LPAREN StringLiteral RPAREN" using multiple alternatives: 1, 2
          
          As a result, alternative(s) 2 were disabled for that input
          warning(200): FromClauseParser.g:179:68: 
          Decision can match input such as "Identifier LPAREN BigintLiteral" using multiple alternatives: 1, 2
          
          As a result, alternative(s) 2 were disabled for that input
          warning(200): FromClauseParser.g:179:68: 
          Decision can match input such as "Identifier LPAREN KW_CAST" using multiple alternatives: 1, 2
          
          As a result, alternative(s) 2 were disabled for that input
          warning(200): FromClauseParser.g:179:68: 
          Decision can match input such as "Identifier LPAREN KW_IF" using multiple alternatives: 1, 2
          
          As a result, alternative(s) 2 were disabled for that input
          warning(200): FromClauseParser.g:179:68: 
          Decision can match input such as "Identifier LPAREN CharSetName" using multiple alternatives: 1, 2
          
          As a result, alternative(s) 2 were disabled for that input
          warning(200): FromClauseParser.g:179:68: 
          Decision can match input such as "Identifier LPAREN LPAREN" using multiple alternatives: 1, 2
          
          As a result, alternative(s) 2 were disabled for that input
          warning(200): FromClauseParser.g:179:68: 
          Decision can match input such as "Identifier LPAREN KW_EXISTS" using multiple alternatives: 1, 2
          
          As a result, alternative(s) 2 were disabled for that input
          warning(200): FromClauseParser.g:179:68: 
          Decision can match input such as "Identifier LPAREN KW_CASE" using multiple alternatives: 1, 2
          
          As a result, alternative(s) 2 were disabled for that input
          warning(200): FromClauseParser.g:179:68: 
          Decision can match input such as "Identifier LPAREN TinyintLiteral" using multiple alternatives: 1, 2
          
          As a result, alternative(s) 2 were disabled for that input
          warning(200): FromClauseParser.g:179:68: 
          Decision can match input such as "Identifier LPAREN KW_NULL" using multiple alternatives: 1, 2
          
          As a result, alternative(s) 2 were disabled for that input
          warning(200): FromClauseParser.g:179:68: 
          Decision can match input such as "Identifier LPAREN SmallintLiteral" using multiple alternatives: 1, 2
          
          As a result, alternative(s) 2 were disabled for that input
          warning(200): FromClauseParser.g:179:68: 
          Decision can match input such as "Identifier LPAREN KW_FALSE" using multiple alternatives: 1, 2
          
          As a result, alternative(s) 2 were disabled for that input
          warning(200): FromClauseParser.g:179:68: 
          Decision can match input such as "Identifier LPAREN KW_UNIONTYPE" using multiple alternatives: 1, 2
          
          As a result, alternative(s) 2 were disabled for that input
          warning(200): FromClauseParser.g:179:68: 
          Decision can match input such as "Identifier LPAREN Number" using multiple alternatives: 1, 2
          
          As a result, alternative(s) 2 were disabled for that input
          warning(200): FromClauseParser.g:179:68: 
          Decision can match input such as "Identifier LPAREN Identifier" using multiple alternatives: 1, 2
          
          As a result, alternative(s) 2 were disabled for that input
          warning(200): FromClauseParser.g:179:68: 
          Decision can match input such as "Identifier LPAREN StringLiteral" using multiple alternatives: 1, 2
          
          As a result, alternative(s) 2 were disabled for that input
          warning(200): FromClauseParser.g:179:68: 
          Decision can match input such as "Identifier LPAREN DecimalLiteral" using multiple alternatives: 1, 2
          
          As a result, alternative(s) 2 were disabled for that input
          warning(200): FromClauseParser.g:179:68: 
          Decision can match input such as "Identifier LPAREN {KW_ADD..KW_AFTER, KW_ALTER..KW_ANALYZE, KW_ARCHIVE, KW_AS..KW_CASCADE, KW_CHANGE, KW_CLUSTER..KW_COLLECTION, KW_COLUMNS..KW_CREATE, KW_CUBE, KW_CURSOR..KW_DATA, KW_DATABASES, KW_DATETIME..KW_DISABLE, KW_DISTRIBUTE..KW_ELEM_TYPE, KW_ENABLE, KW_ESCAPED, KW_EXCLUSIVE, KW_EXPLAIN..KW_EXPORT, KW_EXTERNAL, KW_FETCH..KW_FLOAT, KW_FOR..KW_FORMATTED, KW_FULL, KW_FUNCTIONS..KW_GROUPING, KW_HOLD_DDLTIME..KW_IDXPROPERTIES, KW_IGNORE..KW_ITEMS, KW_KEYS..KW_LEFT, KW_LIKE..KW_LONG, KW_MAPJOIN..KW_MINUS, KW_MSCK..KW_NOSCAN, KW_NO_DROP, KW_OF..KW_OFFLINE, KW_OPTION, KW_ORCFILE..KW_OUTPUTFORMAT, KW_OVERWRITE, KW_PARTITION..KW_PLUS, KW_PRETTY..KW_RECORDWRITER, KW_REGEXP..KW_SCHEMAS, KW_SEMI..KW_STRING, KW_TABLE..KW_TABLES, KW_TBLPROPERTIES..KW_TEXTFILE, KW_TIMESTAMP..KW_TOUCH, KW_TRIGGER, KW_TRUNCATE..KW_UNARCHIVE, KW_UNDO..KW_UNION, KW_UNLOCK..KW_VALUE_TYPE, KW_VIEW, KW_WHILE, KW_WITH}" using multiple alternatives: 1, 2
          
          As a result, alternative(s) 2 were disabled for that input
          warning(200): FromClauseParser.g:179:68: 
          Decision can match input such as "Identifier LPAREN KW_STRUCT" using multiple alternatives: 1, 2
          
          As a result, alternative(s) 2 were disabled for that input
          warning(200): FromClauseParser.g:179:68: 
          Decision can match input such as "Identifier LPAREN KW_DATE" using multiple alternatives: 1, 2
          
          As a result, alternative(s) 2 were disabled for that input
          warning(200): FromClauseParser.g:179:68: 
          Decision can match input such as "Identifier LPAREN KW_NOT" using multiple alternatives: 1, 2
          
          As a result, alternative(s) 2 were disabled for that input
          warning(200): FromClauseParser.g:179:68: 
          Decision can match input such as "Identifier LPAREN {MINUS, PLUS, TILDE}" using multiple alternatives: 1, 2
          
          As a result, alternative(s) 2 were disabled for that input
          warning(200): FromClauseParser.g:179:68: 
          Decision can match input such as "Identifier LPAREN KW_MAP" using multiple alternatives: 1, 2
          
          As a result, alternative(s) 2 were disabled for that input
          warning(200): FromClauseParser.g:179:68: 
          Decision can match input such as "Identifier LPAREN KW_TRUE" using multiple alternatives: 1, 2
          
          As a result, alternative(s) 2 were disabled for that input
          warning(200): FromClauseParser.g:179:68: 
          Decision can match input such as "Identifier LPAREN KW_ARRAY" using multiple alternatives: 1, 2
          
          As a result, alternative(s) 2 were disabled for that input
          warning(200): FromClauseParser.g:237:16: 
          Decision can match input such as "Identifier LPAREN BigintLiteral" using multiple alternatives: 1, 2
          
          As a result, alternative(s) 2 were disabled for that input
          warning(200): FromClauseParser.g:237:16: 
          Decision can match input such as "Identifier LPAREN KW_CAST" using multiple alternatives: 1, 2
          
          As a result, alternative(s) 2 were disabled for that input
          warning(200): FromClauseParser.g:237:16: 
          Decision can match input such as "Identifier LPAREN KW_IF" using multiple alternatives: 1, 2
          
          As a result, alternative(s) 2 were disabled for that input
          warning(200): FromClauseParser.g:237:16: 
          Decision can match input such as "Identifier LPAREN CharSetName" using multiple alternatives: 1, 2
          
          As a result, alternative(s) 2 were disabled for that input
          warning(200): FromClauseParser.g:237:16: 
          Decision can match input such as "Identifier LPAREN LPAREN" using multiple alternatives: 1, 2
          
          As a result, alternative(s) 2 were disabled for that input
          warning(200): FromClauseParser.g:237:16: 
          Decision can match input such as "Identifier LPAREN KW_EXISTS" using multiple alternatives: 1, 2
          
          As a result, alternative(s) 2 were disabled for that input
          warning(200): FromClauseParser.g:237:16: 
          Decision can match input such as "Identifier LPAREN KW_CASE" using multiple alternatives: 1, 2
          
          As a result, alternative(s) 2 were disabled for that input
          warning(200): FromClauseParser.g:237:16: 
          Decision can match input such as "Identifier LPAREN TinyintLiteral" using multiple alternatives: 1, 2
          
          As a result, alternative(s) 2 were disabled for that input
          warning(200): FromClauseParser.g:237:16: 
          Decision can match input such as "Identifier LPAREN KW_NULL" using multiple alternatives: 1, 2
          
          As a result, alternative(s) 2 were disabled for that input
          warning(200): FromClauseParser.g:237:16: 
          Decision can match input such as "Identifier LPAREN SmallintLiteral" using multiple alternatives: 1, 2
          
          As a result, alternative(s) 2 were disabled for that input
          warning(200): FromClauseParser.g:237:16: 
          Decision can match input such as "Identifier LPAREN KW_FALSE" using multiple alternatives: 1, 2
          
          As a result, alternative(s) 2 were disabled for that input
          warning(200): FromClauseParser.g:237:16: 
          Decision can match input such as "Identifier LPAREN KW_UNIONTYPE" using multiple alternatives: 1, 2
          
          As a result, alternative(s) 2 were disabled for that input
          warning(200): FromClauseParser.g:237:16: 
          Decision can match input such as "Identifier LPAREN Number" using multiple alternatives: 1, 2
          
          As a result, alternative(s) 2 were disabled for that input
          warning(200): FromClauseParser.g:237:16: 
          Decision can match input such as "Identifier LPAREN Identifier" using multiple alternatives: 1, 2
          
          As a result, alternative(s) 2 were disabled for that input
          warning(200): FromClauseParser.g:237:16: 
          Decision can match input such as "Identifier LPAREN StringLiteral" using multiple alternatives: 1, 2
          
          As a result, alternative(s) 2 were disabled for that input
          warning(200): FromClauseParser.g:237:16: 
          Decision can match input such as "Identifier LPAREN DecimalLiteral" using multiple alternatives: 1, 2
          
          As a result, alternative(s) 2 were disabled for that input
          warning(200): FromClauseParser.g:237:16: 
          Decision can match input such as "Identifier LPAREN {KW_ADD..KW_AFTER, KW_ALTER..KW_ANALYZE, KW_ARCHIVE, KW_AS..KW_CASCADE, KW_CHANGE, KW_CLUSTER..KW_COLLECTION, KW_COLUMNS..KW_CREATE, KW_CUBE, KW_CURSOR..KW_DATA, KW_DATABASES, KW_DATETIME..KW_DISABLE, KW_DISTRIBUTE..KW_ELEM_TYPE, KW_ENABLE, KW_ESCAPED, KW_EXCLUSIVE, KW_EXPLAIN..KW_EXPORT, KW_EXTERNAL, KW_FETCH..KW_FLOAT, KW_FOR..KW_FORMATTED, KW_FULL, KW_FUNCTIONS..KW_GROUPING, KW_HOLD_DDLTIME..KW_IDXPROPERTIES, KW_IGNORE..KW_ITEMS, KW_KEYS..KW_LEFT, KW_LIKE..KW_LONG, KW_MAPJOIN..KW_MINUS, KW_MSCK..KW_NOSCAN, KW_NO_DROP, KW_OF..KW_OFFLINE, KW_OPTION, KW_ORCFILE..KW_OUTPUTFORMAT, KW_OVERWRITE, KW_PARTITION..KW_PLUS, KW_PRETTY..KW_RECORDWRITER, KW_REGEXP..KW_SCHEMAS, KW_SEMI..KW_STRING, KW_TABLE..KW_TABLES, KW_TBLPROPERTIES..KW_TEXTFILE, KW_TIMESTAMP..KW_TOUCH, KW_TRIGGER, KW_TRUNCATE..KW_UNARCHIVE, KW_UNDO..KW_UNION, KW_UNLOCK..KW_VALUE_TYPE, KW_VIEW, KW_WHILE, KW_WITH}" using multiple alternatives: 1, 2
          
          As a result, alternative(s) 2 were disabled for that input
          warning(200): FromClauseParser.g:237:16: 
          Decision can match input such as "Identifier LPAREN KW_STRUCT" using multiple alternatives: 1, 2
          
          As a result, alternative(s) 2 were disabled for that input
          warning(200): FromClauseParser.g:237:16: 
          Decision can match input such as "Identifier LPAREN KW_DATE" using multiple alternatives: 1, 2
          
          As a result, alternative(s) 2 were disabled for that input
          warning(200): FromClauseParser.g:237:16: 
          Decision can match input such as "Identifier LPAREN KW_NOT" using multiple alternatives: 1, 2
          
          As a result, alternative(s) 2 were disabled for that input
          warning(200): FromClauseParser.g:237:16: 
          Decision can match input such as "Identifier LPAREN {MINUS, PLUS, TILDE}" using multiple alternatives: 1, 2
          
          As a result, alternative(s) 2 were disabled for that input
          warning(200): FromClauseParser.g:237:16: 
          Decision can match input such as "Identifier LPAREN KW_MAP" using multiple alternatives: 1, 2
          
          As a result, alternative(s) 2 were disabled for that input
          warning(200): FromClauseParser.g:237:16: 
          Decision can match input such as "Identifier LPAREN KW_TRUE" using multiple alternatives: 1, 2
          
          As a result, alternative(s) 2 were disabled for that input
          warning(200): FromClauseParser.g:237:16: 
          Decision can match input such as "Identifier LPAREN KW_ARRAY" using multiple alternatives: 1, 2
          
          As a result, alternative(s) 2 were disabled for that input
          warning(200): IdentifiersParser.g:68:4: 
          Decision can match input such as "LPAREN KW_NOT CharSetName" using multiple alternatives: 1, 2
          
          As a result, alternative(s) 2 were disabled for that input
          warning(200): IdentifiersParser.g:68:4: 
          Decision can match input such as "LPAREN KW_CASE CharSetName" using multiple alternatives: 1, 2
          
          As a result, alternative(s) 2 were disabled for that input
          warning(200): IdentifiersParser.g:68:4: 
          Decision can match input such as "LPAREN LPAREN CharSetName" using multiple alternatives: 1, 2
          
          As a result, alternative(s) 2 were disabled for that input
          warning(200): IdentifiersParser.g:68:4: 
          Decision can match input such as "LPAREN KW_NOT Number" using multiple alternatives: 1, 2
          
          As a result, alternative(s) 2 were disabled for that input
          warning(200): IdentifiersParser.g:68:4: 
          Decision can match input such as "LPAREN KW_NOT KW_UNIONTYPE" using multiple alternatives: 1, 2
          
          As a result, alternative(s) 2 were disabled for that input
          warning(200): IdentifiersParser.g:68:4: 
          Decision can match input such as "LPAREN KW_NULL {KW_LIKE, KW_REGEXP, KW_RLIKE}" using multiple alternatives: 1, 2
          
          As a result, alternative(s) 2 were disabled for that input
          warning(200): IdentifiersParser.g:68:4: 
          Decision can match input such as "LPAREN KW_NOT KW_DATE" using multiple alternatives: 1, 2
          
          As a result, alternative(s) 2 were disabled for that input
          warning(200): IdentifiersParser.g:68:4: 
          Decision can match input such as "LPAREN KW_NOT KW_NOT" using multiple alternatives: 1, 2
          
          As a result, alternative(s) 2 were disabled for that input
          warning(200): IdentifiersParser.g:68:4: 
          Decision can match input such as "LPAREN KW_CASE KW_NOT" using multiple alternatives: 1, 2
          
          As a result, alternative(s) 2 were disabled for that input
          warning(200): IdentifiersParser.g:68:4: 
          Decision can match input such as "LPAREN LPAREN KW_NOT" using multiple alternatives: 1, 2
          
          As a result, alternative(s) 2 were disabled for that input
          warning(200): IdentifiersParser.g:68:4: 
          Decision can match input such as "LPAREN KW_CASE Identifier" using multiple alternatives: 1, 2
          
          As a result, alternative(s) 2 were disabled for that input
          warning(200): IdentifiersParser.g:68:4: 
          Decision can match input such as "LPAREN KW_NULL AMPERSAND" using multiple alternatives: 1, 2
          
          As a result, alternative(s) 2 were disabled for that input
          warning(200): IdentifiersParser.g:68:4: 
          Decision can match input such as "LPAREN LPAREN SmallintLiteral" using multiple alternatives: 1, 2
          
          As a result, alternative(s) 2 were disabled for that input
          warning(200): IdentifiersParser.g:68:4: 
          Decision can match input such as "LPAREN LPAREN TinyintLiteral" using multiple alternatives: 1, 2
          
          As a result, alternative(s) 2 were disabled for that input
          warning(200): IdentifiersParser.g:68:4: 
          Decision can match input such as "LPAREN LPAREN BigintLiteral" using multiple alternatives: 1, 2
          
          As a result, alternative(s) 2 were disabled for that input
          warning(200): IdentifiersParser.g:68:4: 
          Decision can match input such as "LPAREN KW_NOT KW_FALSE" using multiple alternatives: 1, 2
          
          As a result, alternative(s) 2 were disabled for that input
          warning(200): IdentifiersParser.g:68:4: 
          Decision can match input such as "LPAREN KW_NOT KW_STRUCT" using multiple alternatives: 1, 2
          
          As a result, alternative(s) 2 were disabled for that input
          warning(200): IdentifiersParser.g:68:4: 
          Decision can match input such as "LPAREN KW_NOT KW_TRUE" using multiple alternatives: 1, 2
          
          As a result, alternative(s) 2 were disabled for that input
          warning(200): IdentifiersParser.g:68:4: 
          Decision can match input such as "LPAREN KW_NOT KW_ARRAY" using multiple alternatives: 1, 2
          
          As a result, alternative(s) 2 were disabled for that input
          warning(200): IdentifiersParser.g:68:4: 
          Decision can match input such as "LPAREN LPAREN Number" using multiple alternatives: 1, 2
          
          As a result, alternative(s) 2 were disabled for that input
          warning(200): IdentifiersParser.g:68:4: 
          Decision can match input such as "LPAREN KW_NULL KW_IS" using multiple alternatives: 1, 2
          
          As a result, alternative(s) 2 were disabled for that input
          warning(200): IdentifiersParser.g:68:4: 
          Decision can match input such as "LPAREN KW_NOT StringLiteral" using multiple alternatives: 1, 2
          
          As a result, alternative(s) 2 were disabled for that input
          warning(200): IdentifiersParser.g:68:4: 
          Decision can match input such as "LPAREN KW_NOT {MINUS, PLUS, TILDE}" using multiple alternatives: 1, 2
          
          As a result, alternative(s) 2 were disabled for that input
          warning(200): IdentifiersParser.g:68:4: 
          Decision can match input such as "LPAREN KW_CASE {MINUS, PLUS, TILDE}" using multiple alternatives: 1, 2
          
          As a result, alternative(s) 2 were disabled for that input
          warning(200): IdentifiersParser.g:68:4: 
          Decision can match input such as "LPAREN LPAREN {MINUS, PLUS, TILDE}" using multiple alternatives: 1, 2
          
          As a result, alternative(s) 2 were disabled for that input
          warning(200): IdentifiersParser.g:68:4: 
          Decision can match input such as "LPAREN KW_NOT KW_MAP" using multiple alternatives: 1, 2
          
          As a result, alternative(s) 2 were disabled for that input
          warning(200): IdentifiersParser.g:68:4: 
          Decision can match input such as "LPAREN KW_CASE KW_MAP" using multiple alternatives: 1, 2
          
          As a result, alternative(s) 2 were disabled for that input
          warning(200): IdentifiersParser.g:68:4: 
          Decision can match input such as "LPAREN LPAREN KW_MAP" using multiple alternatives: 1, 2
          
          As a result, alternative(s) 2 were disabled for that input
          warning(200): IdentifiersParser.g:68:4: 
          Decision can match input such as "LPAREN KW_NULL LPAREN" using multiple alternatives: 1, 2
          
          As a result, alternative(s) 2 were disabled for that input
          warning(200): IdentifiersParser.g:68:4: 
          Decision can match input such as "LPAREN LPAREN DecimalLiteral" using multiple alternatives: 1, 2
          
          As a result, alternative(s) 2 were disabled for that input
          warning(200): IdentifiersParser.g:68:4: 
          Decision can match input such as "LPAREN KW_NULL RPAREN" using multiple alternatives: 1, 2
          
          As a result, alternative(s) 2 were disabled for that input
          warning(200): IdentifiersParser.g:68:4: 
          Decision can match input such as "LPAREN StringLiteral StringLiteral" using multiple alternatives: 1, 2
          
          As a result, alternative(s) 2 were disabled for that input
          warning(200): IdentifiersParser.g:68:4: 
          Decision can match input such as "LPAREN KW_NOT KW_IF" using multiple alternatives: 1, 2
          
          As a result, alternative(s) 2 were disabled for that input
          warning(200): IdentifiersParser.g:68:4: 
          Decision can match input such as "LPAREN KW_CASE KW_TRUE" using multiple alternatives: 1, 2
          
          As a result, alternative(s) 2 were disabled for that input
          warning(200): IdentifiersParser.g:68:4: 
          Decision can match input such as "LPAREN KW_CASE KW_IF" using multiple alternatives: 1, 2
          
          As a result, alternative(s) 2 were disabled for that input
          warning(200): IdentifiersParser.g:68:4: 
          Decision can match input such as "LPAREN LPAREN KW_IF" using multiple alternatives: 1, 2
          
          As a result, alternative(s) 2 were disabled for that input
          warning(200): IdentifiersParser.g:68:4: 
          Decision can match input such as "LPAREN KW_NULL BITWISEOR" using multiple alternatives: 1, 2
          
          As a result, alternative(s) 2 were disabled for that input
          warning(200): IdentifiersParser.g:68:4: 
          Decision can match input such as "LPAREN KW_NULL DOT" using multiple alternatives: 1, 2
          
          As a result, alternative(s) 2 were disabled for that input
          warning(200): IdentifiersParser.g:68:4: 
          Decision can match input such as "LPAREN LPAREN KW_EXISTS" using multiple alternatives: 1, 2
          
          As a result, alternative(s) 2 were disabled for that input
          warning(200): IdentifiersParser.g:68:4: 
          Decision can match input such as "LPAREN KW_CASE StringLiteral" using multiple alternatives: 1, 2
          
          As a result, alternative(s) 2 were disabled for that input
          warning(200): IdentifiersParser.g:68:4: 
          Decision can match input such as "LPAREN KW_CASE KW_FALSE" using multiple alternatives: 1, 2
          
          As a result, alternative(s) 2 were disabled for that input
          warning(200): IdentifiersParser.g:68:4: 
          Decision can match input such as "LPAREN KW_NOT Identifier" using multiple alternatives: 1, 2
          
          As a result, alternative(s) 2 were disabled for that input
          warning(200): IdentifiersParser.g:68:4: 
          Decision can match input such as "LPAREN KW_NULL {DIV..DIVIDE, MOD, STAR}" using multiple alternatives: 1, 2
          
          As a result, alternative(s) 2 were disabled for that input
          warning(200): IdentifiersParser.g:68:4: 
          Decision can match input such as "LPAREN KW_CASE {KW_ADD..KW_AFTER, KW_ALTER..KW_ANALYZE, KW_ARCHIVE, KW_AS..KW_CASCADE, KW_CHANGE, KW_CLUSTER..KW_COLLECTION, KW_COLUMNS..KW_CREATE, KW_CUBE, KW_CURSOR..KW_DATA, KW_DATABASES, KW_DATETIME..KW_DISABLE, KW_DISTRIBUTE..KW_ELEM_TYPE, KW_ENABLE, KW_ESCAPED, KW_EXCLUSIVE, KW_EXPLAIN..KW_EXPORT, KW_EXTERNAL, KW_FETCH..KW_FLOAT, KW_FOR..KW_FORMATTED, KW_FULL, KW_FUNCTIONS..KW_GROUPING, KW_HOLD_DDLTIME..KW_IDXPROPERTIES, KW_IGNORE..KW_ITEMS, KW_KEYS..KW_LEFT, KW_LIKE..KW_LONG, KW_MAPJOIN..KW_MINUS, KW_MSCK..KW_NOSCAN, KW_NO_DROP, KW_OF..KW_OFFLINE, KW_OPTION, KW_ORCFILE..KW_OUTPUTFORMAT, KW_OVERWRITE, KW_PARTITION..KW_PLUS, KW_PRETTY..KW_RECORDWRITER, KW_REGEXP..KW_SCHEMAS, KW_SEMI..KW_STRING, KW_TABLE..KW_TABLES, KW_TBLPROPERTIES..KW_TEXTFILE, KW_TIMESTAMP..KW_TOUCH, KW_TRIGGER, KW_TRUNCATE..KW_UNARCHIVE, KW_UNDO..KW_UNION, KW_UNLOCK..KW_VALUE_TYPE, KW_VIEW, KW_WHILE, KW_WITH}" using multiple alternatives: 1, 2
          
          As a result, alternative(s) 2 were disabled for that input
          warning(200): IdentifiersParser.g:68:4: 
          Decision can match input such as "LPAREN KW_NULL NOTEQUAL" using multiple alternatives: 1, 2
          
          As a result, alternative(s) 2 were disabled for that input
          warning(200): IdentifiersParser.g:68:4: 
          Decision can match input such as "LPAREN KW_NOT KW_EXISTS" using multiple alternatives: 1, 2
          
          As a result, alternative(s) 2 were disabled for that input
          warning(200): IdentifiersParser.g:68:4: 
          Decision can match input such as "LPAREN KW_NULL EQUAL_NS" using multiple alternatives: 1, 2
          
          As a result, alternative(s) 2 were disabled for that input
          warning(200): IdentifiersParser.g:68:4: 
          Decision can match input such as "LPAREN KW_EXISTS LPAREN" using multiple alternatives: 1, 2
          
          As a result, alternative(s) 2 were disabled for that input
          warning(200): IdentifiersParser.g:68:4: 
          Decision can match input such as "LPAREN KW_NULL LESSTHAN" using multiple alternatives: 1, 2
          
          As a result, alternative(s) 2 were disabled for that input
          warning(200): IdentifiersParser.g:68:4: 
          Decision can match input such as "LPAREN KW_NULL LESSTHANOREQUALTO" using multiple alternatives: 1, 2
          
          As a result, alternative(s) 2 were disabled for that input
          warning(200): IdentifiersParser.g:68:4: 
          Decision can match input such as "LPAREN LPAREN KW_ARRAY" using multiple alternatives: 1, 2
          
          As a result, alternative(s) 2 were disabled for that input
          warning(200): IdentifiersParser.g:68:4: 
          Decision can match input such as "LPAREN LPAREN KW_NULL" using multiple alternatives: 1, 2
          
          As a result, alternative(s) 2 were disabled for that input
          warning(200): IdentifiersParser.g:68:4: 
          Decision can match input such as "LPAREN LPAREN KW_UNIONTYPE" using multiple alternatives: 1, 2
          
          As a result, alternative(s) 2 were disabled for that input
          warning(200): IdentifiersParser.g:68:4: 
          Decision can match input such as "LPAREN KW_NULL KW_OR" using multiple alternatives: 1, 2
          
          As a result, alternative(s) 2 were disabled for that input
          warning(200): IdentifiersParser.g:68:4: 
          Decision can match input such as "LPAREN KW_DATE StringLiteral" using multiple alternatives: 1, 2
          
          As a result, alternative(s) 2 were disabled for that input
          warning(200): IdentifiersParser.g:68:4: 
          Decision can match input such as "LPAREN KW_CASE KW_DATE" using multiple alternatives: 1, 2
          
          As a result, alternative(s) 2 were disabled for that input
          warning(200): IdentifiersParser.g:68:4: 
          Decision can match input such as "LPAREN LPAREN KW_STRUCT" using multiple alternatives: 1, 2
          
          As a result, alternative(s) 2 were disabled for that input
          warning(200): IdentifiersParser.g:68:4: 
          Decision can match input such as "LPAREN KW_NULL EQUAL" using multiple alternatives: 1, 2
          
          As a result, alternative(s) 2 were disabled for that input
          warning(200): IdentifiersParser.g:68:4: 
          Decision can match input such as "LPAREN KW_CAST LPAREN" using multiple alternatives: 1, 2
          
          As a result, alternative(s) 2 were disabled for that input
          warning(200): IdentifiersParser.g:68:4: 
          Decision can match input such as "LPAREN LPAREN LPAREN" using multiple alternatives: 1, 2
          
          As a result, alternative(s) 2 were disabled for that input
          warning(200): IdentifiersParser.g:68:4: 
          Decision can match input such as "LPAREN KW_NOT LPAREN" using multiple alternatives: 1, 2
          
          As a result, alternative(s) 2 were disabled for that input
          warning(200): IdentifiersParser.g:68:4: 
          Decision can match input such as "LPAREN KW_CASE LPAREN" using multiple alternatives: 1, 2
          
          As a result, alternative(s) 2 were disabled for that input
          warning(200): IdentifiersParser.g:68:4: 
          Decision can match input such as "LPAREN KW_CASE KW_STRUCT" using multiple alternatives: 1, 2
          
          As a result, alternative(s) 2 were disabled for that input
          warning(200): IdentifiersParser.g:68:4: 
          Decision can match input such as "LPAREN KW_CASE KW_UNIONTYPE" using multiple alternatives: 1, 2
          
          As a result, alternative(s) 2 were disabled for that input
          warning(200): IdentifiersParser.g:68:4: 
          Decision can match input such as "LPAREN KW_NULL GREATERTHAN" using multiple alternatives: 1, 2
          
          As a result, alternative(s) 2 were disabled for that input
          warning(200): IdentifiersParser.g:68:4: 
          Decision can match input such as "LPAREN KW_CASE KW_ARRAY" using multiple alternatives: 1, 2
          
          As a result, alternative(s) 2 were disabled for that input
          warning(200): IdentifiersParser.g:68:4: 
          Decision can match input such as "LPAREN LPAREN KW_DATE" using multiple alternatives: 1, 2
          
          As a result, alternative(s) 2 were disabled for that input
          warning(200): IdentifiersParser.g:68:4: 
          Decision can match input such as "LPAREN KW_NULL GREATERTHANOREQUALTO" using multiple alternatives: 1, 2
          
          As a result, alternative(s) 2 were disabled for that input
          warning(200): IdentifiersParser.g:68:4: 
          Decision can match input such as "LPAREN KW_CASE KW_NULL" using multiple alternatives: 1, 2
          
          As a result, alternative(s) 2 were disabled for that input
          warning(200): IdentifiersParser.g:68:4: 
          Decision can match input such as "LPAREN KW_CASE KW_EXISTS" using multiple alternatives: 1, 2
          
          As a result, alternative(s) 2 were disabled for that input
          warning(200): IdentifiersParser.g:68:4: 
          Decision can match input such as "LPAREN KW_CASE TinyintLiteral" using multiple alternatives: 1, 2
          
          As a result, alternative(s) 2 were disabled for that input
          warning(200): IdentifiersParser.g:68:4: 
          Decision can match input such as "LPAREN KW_NULL KW_BETWEEN" using multiple alternatives: 1, 2
          
          As a result, alternative(s) 2 were disabled for that input
          warning(200): IdentifiersParser.g:68:4: 
          Decision can match input such as "LPAREN KW_CASE SmallintLiteral" using multiple alternatives: 1, 2
          
          As a result, alternative(s) 2 were disabled for that input
          warning(200): IdentifiersParser.g:68:4: 
          Decision can match input such as "LPAREN KW_NULL {MINUS, PLUS}" using multiple alternatives: 1, 2
          
          As a result, alternative(s) 2 were disabled for that input
          warning(200): IdentifiersParser.g:68:4: 
          Decision can match input such as "LPAREN KW_NULL KW_NOT" using multiple alternatives: 1, 2
          
          As a result, alternative(s) 2 were disabled for that input
          warning(200): IdentifiersParser.g:68:4: 
          Decision can match input such as "LPAREN KW_CASE BigintLiteral" using multiple alternatives: 1, 2
          
          As a result, alternative(s) 2 were disabled for that input
          warning(200): IdentifiersParser.g:68:4: 
          Decision can match input such as "LPAREN KW_NULL LSQUARE" using multiple alternatives: 1, 2
          
          As a result, alternative(s) 2 were disabled for that input
          warning(200): IdentifiersParser.g:68:4: 
          Decision can match input such as "LPAREN KW_NOT {KW_ADD..KW_AFTER, KW_ALTER..KW_ANALYZE, KW_ARCHIVE, KW_AS..KW_CASCADE, KW_CHANGE, KW_CLUSTER..KW_COLLECTION, KW_COLUMNS..KW_CREATE, KW_CUBE, KW_CURSOR..KW_DATA, KW_DATABASES, KW_DATETIME..KW_DISABLE, KW_DISTRIBUTE..KW_ELEM_TYPE, KW_ENABLE, KW_ESCAPED, KW_EXCLUSIVE, KW_EXPLAIN..KW_EXPORT, KW_EXTERNAL, KW_FETCH..KW_FLOAT, KW_FOR..KW_FORMATTED, KW_FULL, KW_FUNCTIONS..KW_GROUPING, KW_HOLD_DDLTIME..KW_IDXPROPERTIES, KW_IGNORE..KW_ITEMS, KW_KEYS..KW_LEFT, KW_LIKE..KW_LONG, KW_MAPJOIN..KW_MINUS, KW_MSCK..KW_NOSCAN, KW_NO_DROP, KW_OF..KW_OFFLINE, KW_OPTION, KW_ORCFILE..KW_OUTPUTFORMAT, KW_OVERWRITE, KW_PARTITION..KW_PLUS, KW_PRETTY..KW_RECORDWRITER, KW_REGEXP..KW_SCHEMAS, KW_SEMI..KW_STRING, KW_TABLE..KW_TABLES, KW_TBLPROPERTIES..KW_TEXTFILE, KW_TIMESTAMP..KW_TOUCH, KW_TRIGGER, KW_TRUNCATE..KW_UNARCHIVE, KW_UNDO..KW_UNION, KW_UNLOCK..KW_VALUE_TYPE, KW_VIEW, KW_WHILE, KW_WITH}" using multiple alternatives: 1, 2
          
          As a result, alternative(s) 2 were disabled for that input
          warning(200): IdentifiersParser.g:68:4: 
          Decision can match input such as "LPAREN LPAREN KW_FALSE" using multiple alternatives: 1, 2
          
          As a result, alternative(s) 2 were disabled for that input
          warning(200): IdentifiersParser.g:68:4: 
          Decision can match input such as "LPAREN KW_NOT KW_NULL" using multiple alternatives: 1, 2
          
          As a result, alternative(s) 2 were disabled for that input
          warning(200): IdentifiersParser.g:68:4: 
          Decision can match input such as "LPAREN LPAREN {KW_ADD..KW_AFTER, KW_ALTER..KW_ANALYZE, KW_ARCHIVE, KW_AS..KW_CASCADE, KW_CHANGE, KW_CLUSTER..KW_COLLECTION, KW_COLUMNS..KW_CREATE, KW_CUBE, KW_CURSOR..KW_DATA, KW_DATABASES, KW_DATETIME..KW_DISABLE, KW_DISTRIBUTE..KW_ELEM_TYPE, KW_ENABLE, KW_ESCAPED, KW_EXCLUSIVE, KW_EXPLAIN..KW_EXPORT, KW_EXTERNAL, KW_FETCH..KW_FLOAT, KW_FOR..KW_FORMATTED, KW_FULL, KW_FUNCTIONS..KW_GROUPING, KW_HOLD_DDLTIME..KW_IDXPROPERTIES, KW_IGNORE..KW_ITEMS, KW_KEYS..KW_LEFT, KW_LIKE..KW_LONG, KW_MAPJOIN..KW_MINUS, KW_MSCK..KW_NOSCAN, KW_NO_DROP, KW_OF..KW_OFFLINE, KW_OPTION, KW_ORCFILE..KW_OUTPUTFORMAT, KW_OVERWRITE, KW_PARTITION..KW_PLUS, KW_PRETTY..KW_RECORDWRITER, KW_REGEXP..KW_SCHEMAS, KW_SEMI..KW_STRING, KW_TABLE..KW_TABLES, KW_TBLPROPERTIES..KW_TEXTFILE, KW_TIMESTAMP..KW_TOUCH, KW_TRIGGER, KW_TRUNCATE..KW_UNARCHIVE, KW_UNDO..KW_UNION, KW_UNLOCK..KW_VALUE_TYPE, KW_VIEW, KW_WHILE, KW_WITH}" using multiple alternatives: 1, 2
          
          As a result, alternative(s) 2 were disabled for that input
          warning(200): IdentifiersParser.g:68:4: 
          Decision can match input such as "LPAREN LPAREN KW_TRUE" using multiple alternatives: 1, 2
          
          As a result, alternative(s) 2 were disabled for that input
          warning(200): IdentifiersParser.g:68:4: 
          Decision can match input such as "LPAREN KW_CASE Number" using multiple alternatives: 1, 2
          
          As a result, alternative(s) 2 were disabled for that input
          warning(200): IdentifiersParser.g:68:4: 
          Decision can match input such as "LPAREN KW_NULL BITWISEXOR" using multiple alternatives: 1, 2
          
          As a result, alternative(s) 2 were disabled for that input
          warning(200): IdentifiersParser.g:68:4: 
          Decision can match input such as "LPAREN KW_NOT SmallintLiteral" using multiple alternatives: 1, 2
          
          As a result, alternative(s) 2 were disabled for that input
          warning(200): IdentifiersParser.g:68:4: 
          Decision can match input such as "LPAREN CharSetName CharSetLiteral" using multiple alternatives: 1, 2
          
          As a result, alternative(s) 2 were disabled for that input
          warning(200): IdentifiersParser.g:68:4: 
          Decision can match input such as "LPAREN LPAREN KW_CAST" using multiple alternatives: 1, 2
          
          As a result, alternative(s) 2 were disabled for that input
          warning(200): IdentifiersParser.g:68:4: 
          Decision can match input such as "LPAREN KW_NOT KW_CAST" using multiple alternatives: 1, 2
          
          As a result, alternative(s) 2 were disabled for that input
          warning(200): IdentifiersParser.g:68:4: 
          Decision can match input such as "LPAREN KW_CASE KW_CAST" using multiple alternatives: 1, 2
          
          As a result, alternative(s) 2 were disabled for that input
          warning(200): IdentifiersParser.g:68:4: 
          Decision can match input such as "LPAREN KW_NOT BigintLiteral" using multiple alternatives: 1, 2
          
          As a result, alternative(s) 2 were disabled for that input
          warning(200): IdentifiersParser.g:68:4: 
          Decision can match input such as "LPAREN KW_NULL KW_AND" using multiple alternatives: 1, 2
          
          As a result, alternative(s) 2 were disabled for that input
          warning(200): IdentifiersParser.g:68:4: 
          Decision can match input such as "LPAREN LPAREN KW_CASE" using multiple alternatives: 1, 2
          
          As a result, alternative(s) 2 were disabled for that input
          warning(200): IdentifiersParser.g:68:4: 
          Decision can match input such as "LPAREN KW_NULL KW_IN" using multiple alternatives: 1, 2
          
          As a result, alternative(s) 2 were disabled for that input
          warning(200): IdentifiersParser.g:68:4: 
          Decision can match input such as "LPAREN KW_NOT KW_CASE" using multiple alternatives: 1, 2
          
          As a result, alternative(s) 2 were disabled for that input
          warning(200): IdentifiersParser.g:68:4: 
          Decision can match input such as "LPAREN KW_CASE KW_CASE" using multiple alternatives: 1, 2
          
          As a result, alternative(s) 2 were disabled for that input
          warning(200): IdentifiersParser.g:68:4: 
          Decision can match input such as "LPAREN KW_NOT DecimalLiteral" using multiple alternatives: 1, 2
          
          As a result, alternative(s) 2 were disabled for that input
          warning(200): IdentifiersParser.g:68:4: 
          Decision can match input such as "LPAREN KW_NOT TinyintLiteral" using multiple alternatives: 1, 2
          
          As a result, alternative(s) 2 were disabled for that input
          warning(200): IdentifiersParser.g:68:4: 
          Decision can match input such as "LPAREN KW_CASE KW_WHEN" using multiple alternatives: 1, 2
          
          As a result, alternative(s) 2 were disabled for that input
          warning(200): IdentifiersParser.g:68:4: 
          Decision can match input such as "LPAREN LPAREN StringLiteral" using multiple alternatives: 1, 2
          
          As a result, alternative(s) 2 were disabled for that input
          warning(200): IdentifiersParser.g:68:4: 
          Decision can match input such as "LPAREN LPAREN Identifier" using multiple alternatives: 1, 2
          
          As a result, alternative(s) 2 were disabled for that input
          warning(200): IdentifiersParser.g:68:4: 
          Decision can match input such as "LPAREN KW_CASE DecimalLiteral" using multiple alternatives: 1, 2
          
          As a result, alternative(s) 2 were disabled for that input
          warning(200): IdentifiersParser.g:108:5: 
          Decision can match input such as "KW_ORDER KW_BY LPAREN" using multiple alternatives: 1, 2
          
          As a result, alternative(s) 2 were disabled for that input
          warning(200): IdentifiersParser.g:121:5: 
          Decision can match input such as "KW_CLUSTER KW_BY LPAREN" using multiple alternatives: 1, 2
          
          As a result, alternative(s) 2 were disabled for that input
          warning(200): IdentifiersParser.g:133:5: 
          Decision can match input such as "KW_PARTITION KW_BY LPAREN" using multiple alternatives: 1, 2
          
          As a result, alternative(s) 2 were disabled for that input
          warning(200): IdentifiersParser.g:144:5: 
          Decision can match input such as "KW_DISTRIBUTE KW_BY LPAREN" using multiple alternatives: 1, 2
          
          As a result, alternative(s) 2 were disabled for that input
          warning(200): IdentifiersParser.g:155:5: 
          Decision can match input such as "KW_SORT KW_BY LPAREN" using multiple alternatives: 1, 2
          
          As a result, alternative(s) 2 were disabled for that input
          warning(200): IdentifiersParser.g:172:7: 
          Decision can match input such as "STAR" using multiple alternatives: 1, 2
          
          As a result, alternative(s) 2 were disabled for that input
          warning(200): IdentifiersParser.g:185:5: 
          Decision can match input such as "KW_UNIONTYPE" using multiple alternatives: 5, 6
          
          As a result, alternative(s) 6 were disabled for that input
          warning(200): IdentifiersParser.g:185:5: 
          Decision can match input such as "KW_STRUCT" using multiple alternatives: 4, 6
          
          As a result, alternative(s) 6 were disabled for that input
          warning(200): IdentifiersParser.g:185:5: 
          Decision can match input such as "KW_ARRAY" using multiple alternatives: 2, 6
          
          As a result, alternative(s) 6 were disabled for that input
          warning(200): IdentifiersParser.g:267:5: 
          Decision can match input such as "KW_TRUE" using multiple alternatives: 3, 8
          
          As a result, alternative(s) 8 were disabled for that input
          warning(200): IdentifiersParser.g:267:5: 
          Decision can match input such as "KW_DATE StringLiteral" using multiple alternatives: 2, 3
          
          As a result, alternative(s) 3 were disabled for that input
          warning(200): IdentifiersParser.g:267:5: 
          Decision can match input such as "KW_FALSE" using multiple alternatives: 3, 8
          
          As a result, alternative(s) 8 were disabled for that input
          warning(200): IdentifiersParser.g:267:5: 
          Decision can match input such as "KW_NULL" using multiple alternatives: 1, 8
          
          As a result, alternative(s) 8 were disabled for that input
          warning(200): IdentifiersParser.g:399:5: 
          Decision can match input such as "{KW_LIKE, KW_REGEXP, KW_RLIKE} KW_INSERT KW_OVERWRITE" using multiple alternatives: 2, 9
          
          As a result, alternative(s) 9 were disabled for that input
          warning(200): IdentifiersParser.g:399:5: 
          Decision can match input such as "{KW_LIKE, KW_REGEXP, KW_RLIKE} KW_CLUSTER KW_BY" using multiple alternatives: 2, 9
          
          As a result, alternative(s) 9 were disabled for that input
          warning(200): IdentifiersParser.g:399:5: 
          Decision can match input such as "{KW_LIKE, KW_REGEXP, KW_RLIKE} KW_INSERT KW_INTO" using multiple alternatives: 2, 9
          
          As a result, alternative(s) 9 were disabled for that input
          warning(200): IdentifiersParser.g:399:5: 
          Decision can match input such as "{KW_LIKE, KW_REGEXP, KW_RLIKE} KW_ORDER KW_BY" using multiple alternatives: 2, 9
          
          As a result, alternative(s) 9 were disabled for that input
          warning(200): IdentifiersParser.g:399:5: 
          Decision can match input such as "{KW_LIKE, KW_REGEXP, KW_RLIKE} KW_MAP LPAREN" using multiple alternatives: 2, 9
          
          As a result, alternative(s) 9 were disabled for that input
          warning(200): IdentifiersParser.g:399:5: 
          Decision can match input such as "{KW_LIKE, KW_REGEXP, KW_RLIKE} KW_SORT KW_BY" using multiple alternatives: 2, 9
          
          As a result, alternative(s) 9 were disabled for that input
          warning(200): IdentifiersParser.g:399:5: 
          Decision can match input such as "{KW_LIKE, KW_REGEXP, KW_RLIKE} KW_LATERAL KW_VIEW" using multiple alternatives: 2, 9
          
          As a result, alternative(s) 9 were disabled for that input
          warning(200): IdentifiersParser.g:399:5: 
          Decision can match input such as "KW_BETWEEN KW_MAP LPAREN" using multiple alternatives: 8, 9
          
          As a result, alternative(s) 9 were disabled for that input
          warning(200): IdentifiersParser.g:399:5: 
          Decision can match input such as "{KW_LIKE, KW_REGEXP, KW_RLIKE} KW_GROUP KW_BY" using multiple alternatives: 2, 9
          
          As a result, alternative(s) 9 were disabled for that input
          warning(200): IdentifiersParser.g:399:5: 
          Decision can match input such as "{KW_LIKE, KW_REGEXP, KW_RLIKE} KW_DISTRIBUTE KW_BY" using multiple alternatives: 2, 9
          
          As a result, alternative(s) 9 were disabled for that input
          warning(200): IdentifiersParser.g:524:5: 
          Decision can match input such as "{AMPERSAND..BITWISEXOR, DIV..DIVIDE, EQUAL..EQUAL_NS, GREATERTHAN..GREATERTHANOREQUALTO, KW_AND, KW_ARRAY, KW_BETWEEN..KW_BOOLEAN, KW_CASE, KW_DOUBLE, KW_FLOAT, KW_IF, KW_IN, KW_INT, KW_LIKE, KW_MAP, KW_NOT, KW_OR, KW_REGEXP, KW_RLIKE, KW_SMALLINT, KW_STRING..KW_STRUCT, KW_TINYINT, KW_UNIONTYPE, KW_WHEN, LESSTHAN..LESSTHANOREQUALTO, MINUS..NOTEQUAL, PLUS, STAR, TILDE}" using multiple alternatives: 1, 3
          
          As a result, alternative(s) 3 were disabled for that input
          [INFO] 
          [INFO] --- maven-resources-plugin:2.5:resources (default-resources) @ hive-exec ---
          [debug] execute contextualize
          [INFO] Using 'UTF-8' encoding to copy filtered resources.
          [INFO] Copying 1 resource
          [INFO] 
          [INFO] --- maven-antrun-plugin:1.7:run (define-classpath) @ hive-exec ---
          [INFO] Executing tasks
          
          main:
          [INFO] Executed tasks
          [INFO] 
          [INFO] --- maven-compiler-plugin:3.1:compile (default-compile) @ hive-exec ---
          [INFO] Compiling 1397 source files to /data/hive-ptest/working/apache-svn-trunk-source/ql/target/classes
          [WARNING] Note: Some input files use or override a deprecated API.
          [WARNING] Note: Recompile with -Xlint:deprecation for details.
          [WARNING] Note: Some input files use unchecked or unsafe operations.
          [WARNING] Note: Recompile with -Xlint:unchecked for details.
          [INFO] 
          [INFO] --- build-helper-maven-plugin:1.8:add-test-source (add-test-sources) @ hive-exec ---
          [INFO] Test Source directory: /data/hive-ptest/working/apache-svn-trunk-source/ql/target/generated-test-sources/java added.
          [INFO] 
          [INFO] --- maven-resources-plugin:2.5:testResources (default-testResources) @ hive-exec ---
          [debug] execute contextualize
          [INFO] Using 'UTF-8' encoding to copy filtered resources.
          [INFO] Copying 4 resources
          [INFO] 
          [INFO] --- maven-antrun-plugin:1.7:run (setup-test-dirs) @ hive-exec ---
          [INFO] Executing tasks
          
          main:
              [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/ql/target/tmp
              [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/ql/target/warehouse
              [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/ql/target/tmp/conf
               [copy] Copying 4 files to /data/hive-ptest/working/apache-svn-trunk-source/ql/target/tmp/conf
          [INFO] Executed tasks
          [INFO] 
          [INFO] --- maven-compiler-plugin:3.1:testCompile (default-testCompile) @ hive-exec ---
          [INFO] Compiling 143 source files to /data/hive-ptest/working/apache-svn-trunk-source/ql/target/test-classes
          [WARNING] Note: Some input files use or override a deprecated API.
          [WARNING] Note: Recompile with -Xlint:deprecation for details.
          [WARNING] Note: Some input files use unchecked or unsafe operations.
          [WARNING] Note: Recompile with -Xlint:unchecked for details.
          [INFO] 
          [INFO] --- maven-surefire-plugin:2.16:test (default-test) @ hive-exec ---
          [INFO] Tests are skipped.
          [INFO] 
          [INFO] --- maven-jar-plugin:2.2:jar (default-jar) @ hive-exec ---
          [INFO] Building jar: /data/hive-ptest/working/apache-svn-trunk-source/ql/target/hive-exec-0.13.0-SNAPSHOT.jar
          [INFO] 
          [INFO] --- maven-jar-plugin:2.2:test-jar (default) @ hive-exec ---
          [INFO] Building jar: /data/hive-ptest/working/apache-svn-trunk-source/ql/target/hive-exec-0.13.0-SNAPSHOT-tests.jar
          [INFO] 
          [INFO] --- maven-shade-plugin:2.1:shade (build-exec-bundle) @ hive-exec ---
          [INFO] Excluding org.apache.hive:hive-ant:jar:0.13.0-SNAPSHOT from the shaded jar.
          [INFO] Excluding org.apache.velocity:velocity:jar:1.5 from the shaded jar.
          [INFO] Excluding commons-collections:commons-collections:jar:3.1 from the shaded jar.
          [INFO] Including org.apache.hive:hive-common:jar:0.13.0-SNAPSHOT in the shaded jar.
          [INFO] Excluding commons-cli:commons-cli:jar:1.2 from the shaded jar.
          [INFO] Excluding org.apache.hive:hive-metastore:jar:0.13.0-SNAPSHOT from the shaded jar.
          [INFO] Excluding com.jolbox:bonecp:jar:0.7.1.RELEASE from the shaded jar.
          [INFO] Excluding org.apache.derby:derby:jar:10.4.2.0 from the shaded jar.
          [INFO] Excluding org.datanucleus:datanucleus-api-jdo:jar:3.2.1 from the shaded jar.
          [INFO] Excluding org.datanucleus:datanucleus-rdbms:jar:3.2.1 from the shaded jar.
          [INFO] Excluding javax.jdo:jdo-api:jar:3.0.1 from the shaded jar.
          [INFO] Excluding javax.transaction:jta:jar:1.1 from the shaded jar.
          [INFO] Including org.apache.hive:hive-serde:jar:0.13.0-SNAPSHOT in the shaded jar.
          [INFO] Including org.apache.hive:hive-shims:jar:0.13.0-SNAPSHOT in the shaded jar.
          [INFO] Including com.esotericsoftware.kryo:kryo:jar:2.22 in the shaded jar.
          [INFO] Excluding commons-codec:commons-codec:jar:1.4 from the shaded jar.
          [INFO] Excluding commons-httpclient:commons-httpclient:jar:3.0.1 from the shaded jar.
          [INFO] Excluding commons-io:commons-io:jar:2.4 from the shaded jar.
          [INFO] Including commons-lang:commons-lang:jar:2.4 in the shaded jar.
          [INFO] Excluding commons-logging:commons-logging:jar:1.1.3 from the shaded jar.
          [INFO] Including javolution:javolution:jar:5.5.1 in the shaded jar.
          [INFO] Excluding log4j:log4j:jar:1.2.16 from the shaded jar.
          [INFO] Excluding org.antlr:antlr-runtime:jar:3.4 from the shaded jar.
          [INFO] Excluding org.antlr:stringtemplate:jar:3.2.1 from the shaded jar.
          [INFO] Excluding antlr:antlr:jar:2.7.7 from the shaded jar.
          [INFO] Excluding org.antlr:ST4:jar:4.0.4 from the shaded jar.
          [INFO] Excluding org.apache.avro:avro:jar:1.7.5 from the shaded jar.
          [INFO] Excluding com.thoughtworks.paranamer:paranamer:jar:2.3 from the shaded jar.
          [INFO] Excluding org.xerial.snappy:snappy-java:jar:1.0.5 from the shaded jar.
          [INFO] Excluding org.apache.avro:avro-mapred:jar:1.7.5 from the shaded jar.
          [INFO] Excluding org.apache.avro:avro-ipc:jar:1.7.5 from the shaded jar.
          [INFO] Excluding io.netty:netty:jar:3.4.0.Final from the shaded jar.
          [INFO] Excluding org.mortbay.jetty:servlet-api:jar:2.5-20081211 from the shaded jar.
          [INFO] Excluding org.apache.avro:avro-ipc:jar:tests:1.7.5 from the shaded jar.
          [INFO] Excluding org.apache.ant:ant:jar:1.9.1 from the shaded jar.
          [INFO] Excluding org.apache.ant:ant-launcher:jar:1.9.1 from the shaded jar.
          [INFO] Excluding org.apache.commons:commons-compress:jar:1.4.1 from the shaded jar.
          [INFO] Excluding org.tukaani:xz:jar:1.0 from the shaded jar.
          [INFO] Excluding org.apache.thrift:libfb303:jar:0.9.0 from the shaded jar.
          [INFO] Including org.apache.thrift:libthrift:jar:0.9.0 in the shaded jar.
          [INFO] Excluding org.apache.httpcomponents:httpclient:jar:4.2.5 from the shaded jar.
          [INFO] Excluding org.apache.httpcomponents:httpcore:jar:4.2.4 from the shaded jar.
          [INFO] Excluding org.apache.zookeeper:zookeeper:jar:3.4.5 from the shaded jar.
          [INFO] Excluding jline:jline:jar:0.9.94 from the shaded jar.
          [INFO] Excluding org.codehaus.groovy:groovy-all:jar:2.1.6 from the shaded jar.
          [INFO] Including org.codehaus.jackson:jackson-core-asl:jar:1.9.2 in the shaded jar.
          [INFO] Including org.codehaus.jackson:jackson-mapper-asl:jar:1.9.2 in the shaded jar.
          [INFO] Excluding org.datanucleus:datanucleus-core:jar:3.2.2 from the shaded jar.
          [INFO] Including com.google.guava:guava:jar:11.0.2 in the shaded jar.
          [INFO] Excluding com.google.code.findbugs:jsr305:jar:1.3.9 from the shaded jar.
          [INFO] Including com.google.protobuf:protobuf-java:jar:2.5.0 in the shaded jar.
          [INFO] Including com.googlecode.javaewah:JavaEWAH:jar:0.3.2 in the shaded jar.
          [INFO] Including org.iq80.snappy:snappy:jar:0.2 in the shaded jar.
          [INFO] Including org.json:json:jar:20090211 in the shaded jar.
          [INFO] Excluding stax:stax-api:jar:1.0.1 from the shaded jar.
          [INFO] Excluding org.apache.hadoop:hadoop-core:jar:1.2.1 from the shaded jar.
          [INFO] Excluding xmlenc:xmlenc:jar:0.52 from the shaded jar.
          [INFO] Excluding com.sun.jersey:jersey-core:jar:1.14 from the shaded jar.
          [INFO] Excluding com.sun.jersey:jersey-json:jar:1.14 from the shaded jar.
          [INFO] Excluding org.codehaus.jettison:jettison:jar:1.1 from the shaded jar.
          [INFO] Excluding com.sun.xml.bind:jaxb-impl:jar:2.2.3-1 from the shaded jar.
          [INFO] Excluding javax.xml.bind:jaxb-api:jar:2.2.2 from the shaded jar.
          [INFO] Excluding javax.xml.stream:stax-api:jar:1.0-2 from the shaded jar.
          [INFO] Excluding javax.activation:activation:jar:1.1 from the shaded jar.
          [INFO] Excluding org.codehaus.jackson:jackson-jaxrs:jar:1.9.2 from the shaded jar.
          [INFO] Excluding org.codehaus.jackson:jackson-xc:jar:1.9.2 from the shaded jar.
          [INFO] Excluding com.sun.jersey:jersey-server:jar:1.14 from the shaded jar.
          [INFO] Excluding asm:asm:jar:3.1 from the shaded jar.
          [INFO] Excluding org.apache.commons:commons-math:jar:2.1 from the shaded jar.
          [INFO] Excluding commons-configuration:commons-configuration:jar:1.6 from the shaded jar.
          [INFO] Excluding commons-digester:commons-digester:jar:1.8 from the shaded jar.
          [INFO] Excluding commons-beanutils:commons-beanutils:jar:1.7.0 from the shaded jar.
          [INFO] Excluding commons-beanutils:commons-beanutils-core:jar:1.8.0 from the shaded jar.
          [INFO] Excluding commons-net:commons-net:jar:1.4.1 from the shaded jar.
          [INFO] Excluding org.mortbay.jetty:jetty:jar:6.1.26 from the shaded jar.
          [INFO] Excluding org.mortbay.jetty:jetty-util:jar:6.1.26 from the shaded jar.
          [INFO] Excluding tomcat:jasper-runtime:jar:5.5.12 from the shaded jar.
          [INFO] Excluding tomcat:jasper-compiler:jar:5.5.12 from the shaded jar.
          [INFO] Excluding org.mortbay.jetty:jsp-api-2.1:jar:6.1.14 from the shaded jar.
          [INFO] Excluding org.mortbay.jetty:servlet-api-2.5:jar:6.1.14 from the shaded jar.
          [INFO] Excluding org.mortbay.jetty:jsp-2.1:jar:6.1.14 from the shaded jar.
          [INFO] Excluding ant:ant:jar:1.6.5 from the shaded jar.
          [INFO] Excluding commons-el:commons-el:jar:1.0 from the shaded jar.
          [INFO] Excluding net.java.dev.jets3t:jets3t:jar:0.6.1 from the shaded jar.
          [INFO] Excluding hsqldb:hsqldb:jar:1.8.0.10 from the shaded jar.
          [INFO] Excluding oro:oro:jar:2.0.8 from the shaded jar.
          [INFO] Excluding org.eclipse.jdt:core:jar:3.1.1 from the shaded jar.
          [INFO] Excluding org.slf4j:slf4j-api:jar:1.7.5 from the shaded jar.
          [INFO] Excluding org.slf4j:slf4j-log4j12:jar:1.7.5 from the shaded jar.
          [INFO] Replacing original artifact with shaded artifact.
          [INFO] Replacing /data/hive-ptest/working/apache-svn-trunk-source/ql/target/hive-exec-0.13.0-SNAPSHOT.jar with /data/hive-ptest/working/apache-svn-trunk-source/ql/target/hive-exec-0.13.0-SNAPSHOT-shaded.jar
          [INFO] Dependency-reduced POM written at: /data/hive-ptest/working/apache-svn-trunk-source/ql/dependency-reduced-pom.xml
          [INFO] 
          [INFO] --- maven-install-plugin:2.4:install (default-install) @ hive-exec ---
          [INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/ql/target/hive-exec-0.13.0-SNAPSHOT.jar to /data/hive-ptest/working/maven/org/apache/hive/hive-exec/0.13.0-SNAPSHOT/hive-exec-0.13.0-SNAPSHOT.jar
          [INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/ql/dependency-reduced-pom.xml to /data/hive-ptest/working/maven/org/apache/hive/hive-exec/0.13.0-SNAPSHOT/hive-exec-0.13.0-SNAPSHOT.pom
          [INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/ql/target/hive-exec-0.13.0-SNAPSHOT-tests.jar to /data/hive-ptest/working/maven/org/apache/hive/hive-exec/0.13.0-SNAPSHOT/hive-exec-0.13.0-SNAPSHOT-tests.jar
          [INFO]                                                                         
          [INFO] ------------------------------------------------------------------------
          [INFO] Building Hive Service 0.13.0-SNAPSHOT
          [INFO] ------------------------------------------------------------------------
          [INFO] 
          [INFO] --- maven-clean-plugin:2.5:clean (default-clean) @ hive-service ---
          [INFO] Deleting /data/hive-ptest/working/apache-svn-trunk-source/service (includes = [datanucleus.log, derby.log], excludes = [])
          [INFO] 
          [INFO] --- build-helper-maven-plugin:1.8:add-source (add-source) @ hive-service ---
          [INFO] Source directory: /data/hive-ptest/working/apache-svn-trunk-source/service/src/model added.
          [INFO] Source directory: /data/hive-ptest/working/apache-svn-trunk-source/service/src/gen/thrift/gen-javabean added.
          [INFO] 
          [INFO] --- maven-resources-plugin:2.5:resources (default-resources) @ hive-service ---
          [debug] execute contextualize
          [INFO] Using 'UTF-8' encoding to copy filtered resources.
          [INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-trunk-source/service/src/main/resources
          [INFO] 
          [INFO] --- maven-antrun-plugin:1.7:run (define-classpath) @ hive-service ---
          [INFO] Executing tasks
          
          main:
          [INFO] Executed tasks
          [INFO] 
          [INFO] --- maven-compiler-plugin:3.1:compile (default-compile) @ hive-service ---
          [INFO] Compiling 153 source files to /data/hive-ptest/working/apache-svn-trunk-source/service/target/classes
          [WARNING] Note: /data/hive-ptest/working/apache-svn-trunk-source/service/src/java/org/apache/hive/service/cli/operation/SQLOperation.java uses or overrides a deprecated API.
          [WARNING] Note: Recompile with -Xlint:deprecation for details.
          [WARNING] Note: Some input files use unchecked or unsafe operations.
          [WARNING] Note: Recompile with -Xlint:unchecked for details.
          [INFO] 
          [INFO] --- maven-resources-plugin:2.5:testResources (default-testResources) @ hive-service ---
          [debug] execute contextualize
          [INFO] Using 'UTF-8' encoding to copy filtered resources.
          [INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-trunk-source/service/src/test/resources
          [INFO] 
          [INFO] --- maven-antrun-plugin:1.7:run (setup-test-dirs) @ hive-service ---
          [INFO] Executing tasks
          
          main:
              [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/service/target/tmp
              [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/service/target/warehouse
              [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/service/target/tmp/conf
               [copy] Copying 4 files to /data/hive-ptest/working/apache-svn-trunk-source/service/target/tmp/conf
          [INFO] Executed tasks
          [INFO] 
          [INFO] --- maven-compiler-plugin:3.1:testCompile (default-testCompile) @ hive-service ---
          [INFO] Compiling 7 source files to /data/hive-ptest/working/apache-svn-trunk-source/service/target/test-classes
          [INFO] 
          [INFO] --- maven-surefire-plugin:2.16:test (default-test) @ hive-service ---
          [INFO] Tests are skipped.
          [INFO] 
          [INFO] --- maven-jar-plugin:2.2:jar (default-jar) @ hive-service ---
          [INFO] Building jar: /data/hive-ptest/working/apache-svn-trunk-source/service/target/hive-service-0.13.0-SNAPSHOT.jar
          [INFO] 
          [INFO] --- maven-jar-plugin:2.2:test-jar (default) @ hive-service ---
          [INFO] Building jar: /data/hive-ptest/working/apache-svn-trunk-source/service/target/hive-service-0.13.0-SNAPSHOT-tests.jar
          [INFO] 
          [INFO] --- maven-install-plugin:2.4:install (default-install) @ hive-service ---
          [INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/service/target/hive-service-0.13.0-SNAPSHOT.jar to /data/hive-ptest/working/maven/org/apache/hive/hive-service/0.13.0-SNAPSHOT/hive-service-0.13.0-SNAPSHOT.jar
          [INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/service/pom.xml to /data/hive-ptest/working/maven/org/apache/hive/hive-service/0.13.0-SNAPSHOT/hive-service-0.13.0-SNAPSHOT.pom
          [INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/service/target/hive-service-0.13.0-SNAPSHOT-tests.jar to /data/hive-ptest/working/maven/org/apache/hive/hive-service/0.13.0-SNAPSHOT/hive-service-0.13.0-SNAPSHOT-tests.jar
          [INFO]                                                                         
          [INFO] ------------------------------------------------------------------------
          [INFO] Building Hive JDBC 0.13.0-SNAPSHOT
          [INFO] ------------------------------------------------------------------------
          [INFO] 
          [INFO] --- maven-clean-plugin:2.5:clean (default-clean) @ hive-jdbc ---
          [INFO] Deleting /data/hive-ptest/working/apache-svn-trunk-source/jdbc (includes = [datanucleus.log, derby.log], excludes = [])
          [INFO] 
          [INFO] --- maven-resources-plugin:2.5:resources (default-resources) @ hive-jdbc ---
          [debug] execute contextualize
          [INFO] Using 'UTF-8' encoding to copy filtered resources.
          [INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-trunk-source/jdbc/src/main/resources
          [INFO] 
          [INFO] --- maven-antrun-plugin:1.7:run (define-classpath) @ hive-jdbc ---
          [INFO] Executing tasks
          
          main:
          [INFO] Executed tasks
          [INFO] 
          [INFO] --- maven-compiler-plugin:3.1:compile (default-compile) @ hive-jdbc ---
          [INFO] Compiling 30 source files to /data/hive-ptest/working/apache-svn-trunk-source/jdbc/target/classes
          [WARNING] Note: Some input files use or override a deprecated API.
          [WARNING] Note: Recompile with -Xlint:deprecation for details.
          [WARNING] Note: Some input files use unchecked or unsafe operations.
          [WARNING] Note: Recompile with -Xlint:unchecked for details.
          [INFO] 
          [INFO] --- maven-resources-plugin:2.5:testResources (default-testResources) @ hive-jdbc ---
          [debug] execute contextualize
          [INFO] Using 'UTF-8' encoding to copy filtered resources.
          [INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-trunk-source/jdbc/src/test/resources
          [INFO] 
          [INFO] --- maven-antrun-plugin:1.7:run (setup-test-dirs) @ hive-jdbc ---
          [INFO] Executing tasks
          
          main:
              [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/jdbc/target/tmp
              [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/jdbc/target/warehouse
              [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/jdbc/target/tmp/conf
               [copy] Copying 4 files to /data/hive-ptest/working/apache-svn-trunk-source/jdbc/target/tmp/conf
          [INFO] Executed tasks
          [INFO] 
          [INFO] --- maven-compiler-plugin:3.1:testCompile (default-testCompile) @ hive-jdbc ---
          [INFO] No sources to compile
          [INFO] 
          [INFO] --- maven-surefire-plugin:2.16:test (default-test) @ hive-jdbc ---
          [INFO] Tests are skipped.
          [INFO] 
          [INFO] --- maven-jar-plugin:2.2:jar (default-jar) @ hive-jdbc ---
          [INFO] Building jar: /data/hive-ptest/working/apache-svn-trunk-source/jdbc/target/hive-jdbc-0.13.0-SNAPSHOT.jar
          [INFO] 
          [INFO] --- maven-install-plugin:2.4:install (default-install) @ hive-jdbc ---
          [INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/jdbc/target/hive-jdbc-0.13.0-SNAPSHOT.jar to /data/hive-ptest/working/maven/org/apache/hive/hive-jdbc/0.13.0-SNAPSHOT/hive-jdbc-0.13.0-SNAPSHOT.jar
          [INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/jdbc/pom.xml to /data/hive-ptest/working/maven/org/apache/hive/hive-jdbc/0.13.0-SNAPSHOT/hive-jdbc-0.13.0-SNAPSHOT.pom
          [INFO]                                                                         
          [INFO] ------------------------------------------------------------------------
          [INFO] Building Hive Beeline 0.13.0-SNAPSHOT
          [INFO] ------------------------------------------------------------------------
          [INFO] 
          [INFO] --- maven-clean-plugin:2.5:clean (default-clean) @ hive-beeline ---
          [INFO] Deleting /data/hive-ptest/working/apache-svn-trunk-source/beeline (includes = [datanucleus.log, derby.log], excludes = [])
          [INFO] 
          [INFO] --- maven-resources-plugin:2.5:resources (default-resources) @ hive-beeline ---
          [debug] execute contextualize
          [INFO] Using 'UTF-8' encoding to copy filtered resources.
          [INFO] Copying 2 resources
          [INFO] 
          [INFO] --- maven-antrun-plugin:1.7:run (define-classpath) @ hive-beeline ---
          [INFO] Executing tasks
          
          main:
          [INFO] Executed tasks
          [INFO] 
          [INFO] --- maven-compiler-plugin:3.1:compile (default-compile) @ hive-beeline ---
          [INFO] Compiling 31 source files to /data/hive-ptest/working/apache-svn-trunk-source/beeline/target/classes
          [WARNING] /data/hive-ptest/working/apache-svn-trunk-source/beeline/src/java/org/apache/hive/beeline/SunSignalHandler.java:[28,16] warning: sun.misc.Signal is Sun proprietary API and may be removed in a future release
          [WARNING] /data/hive-ptest/working/apache-svn-trunk-source/beeline/src/java/org/apache/hive/beeline/SunSignalHandler.java:[29,16] warning: sun.misc.SignalHandler is Sun proprietary API and may be removed in a future release
          [WARNING] /data/hive-ptest/working/apache-svn-trunk-source/beeline/src/java/org/apache/hive/beeline/SunSignalHandler.java:[31,64] warning: sun.misc.SignalHandler is Sun proprietary API and may be removed in a future release
          [WARNING] /data/hive-ptest/working/apache-svn-trunk-source/beeline/src/java/org/apache/hive/beeline/SunSignalHandler.java:[44,23] warning: sun.misc.Signal is Sun proprietary API and may be removed in a future release
          [WARNING] /data/hive-ptest/working/apache-svn-trunk-source/beeline/src/java/org/apache/hive/beeline/SunSignalHandler.java:[37,24] warning: sun.misc.Signal is Sun proprietary API and may be removed in a future release
          [WARNING] /data/hive-ptest/working/apache-svn-trunk-source/beeline/src/java/org/apache/hive/beeline/SunSignalHandler.java:[37,5] warning: sun.misc.Signal is Sun proprietary API and may be removed in a future release
          [WARNING] Note: Some input files use unchecked or unsafe operations.
          [WARNING] Note: Recompile with -Xlint:unchecked for details.
          [INFO] 
          [INFO] --- maven-resources-plugin:2.5:testResources (default-testResources) @ hive-beeline ---
          [debug] execute contextualize
          [INFO] Using 'UTF-8' encoding to copy filtered resources.
          [INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-trunk-source/beeline/src/test/resources
          [INFO] 
          [INFO] --- maven-antrun-plugin:1.7:run (setup-test-dirs) @ hive-beeline ---
          [INFO] Executing tasks
          
          main:
              [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/beeline/target/tmp
              [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/beeline/target/warehouse
              [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/beeline/target/tmp/conf
               [copy] Copying 4 files to /data/hive-ptest/working/apache-svn-trunk-source/beeline/target/tmp/conf
          [INFO] Executed tasks
          [INFO] 
          [INFO] --- maven-compiler-plugin:3.1:testCompile (default-testCompile) @ hive-beeline ---
          [INFO] Compiling 1 source file to /data/hive-ptest/working/apache-svn-trunk-source/beeline/target/test-classes
          [INFO] 
          [INFO] --- maven-surefire-plugin:2.16:test (default-test) @ hive-beeline ---
          [INFO] Tests are skipped.
          [INFO] 
          [INFO] --- maven-jar-plugin:2.2:jar (default-jar) @ hive-beeline ---
          [INFO] Building jar: /data/hive-ptest/working/apache-svn-trunk-source/beeline/target/hive-beeline-0.13.0-SNAPSHOT.jar
          [INFO] 
          [INFO] --- maven-install-plugin:2.4:install (default-install) @ hive-beeline ---
          [INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/beeline/target/hive-beeline-0.13.0-SNAPSHOT.jar to /data/hive-ptest/working/maven/org/apache/hive/hive-beeline/0.13.0-SNAPSHOT/hive-beeline-0.13.0-SNAPSHOT.jar
          [INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/beeline/pom.xml to /data/hive-ptest/working/maven/org/apache/hive/hive-beeline/0.13.0-SNAPSHOT/hive-beeline-0.13.0-SNAPSHOT.pom
          [INFO]                                                                         
          [INFO] ------------------------------------------------------------------------
          [INFO] Building Hive CLI 0.13.0-SNAPSHOT
          [INFO] ------------------------------------------------------------------------
          [INFO] 
          [INFO] --- maven-clean-plugin:2.5:clean (default-clean) @ hive-cli ---
          [INFO] Deleting /data/hive-ptest/working/apache-svn-trunk-source/cli (includes = [datanucleus.log, derby.log], excludes = [])
          [INFO] 
          [INFO] --- maven-resources-plugin:2.5:resources (default-resources) @ hive-cli ---
          [debug] execute contextualize
          [INFO] Using 'UTF-8' encoding to copy filtered resources.
          [INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-trunk-source/cli/src/main/resources
          [INFO] 
          [INFO] --- maven-antrun-plugin:1.7:run (define-classpath) @ hive-cli ---
          [INFO] Executing tasks
          
          main:
          [INFO] Executed tasks
          [INFO] 
          [INFO] --- maven-compiler-plugin:3.1:compile (default-compile) @ hive-cli ---
          [INFO] Compiling 4 source files to /data/hive-ptest/working/apache-svn-trunk-source/cli/target/classes
          [WARNING] /data/hive-ptest/working/apache-svn-trunk-source/cli/src/java/org/apache/hadoop/hive/cli/CliDriver.java:[74,16] warning: sun.misc.Signal is Sun proprietary API and may be removed in a future release
          [WARNING] /data/hive-ptest/working/apache-svn-trunk-source/cli/src/java/org/apache/hadoop/hive/cli/CliDriver.java:[75,16] warning: sun.misc.SignalHandler is Sun proprietary API and may be removed in a future release
          [WARNING] /data/hive-ptest/working/apache-svn-trunk-source/cli/src/java/org/apache/hadoop/hive/cli/CliDriver.java:[371,5] warning: sun.misc.SignalHandler is Sun proprietary API and may be removed in a future release
          [WARNING] /data/hive-ptest/working/apache-svn-trunk-source/cli/src/java/org/apache/hadoop/hive/cli/CliDriver.java:[372,5] warning: sun.misc.Signal is Sun proprietary API and may be removed in a future release
          [WARNING] /data/hive-ptest/working/apache-svn-trunk-source/cli/src/java/org/apache/hadoop/hive/cli/CliDriver.java:[377,27] warning: sun.misc.Signal is Sun proprietary API and may be removed in a future release
          [WARNING] /data/hive-ptest/working/apache-svn-trunk-source/cli/src/java/org/apache/hadoop/hive/cli/CliDriver.java:[378,52] warning: sun.misc.SignalHandler is Sun proprietary API and may be removed in a future release
          [WARNING] /data/hive-ptest/working/apache-svn-trunk-source/cli/src/java/org/apache/hadoop/hive/cli/CliDriver.java:[378,52] warning: sun.misc.SignalHandler is Sun proprietary API and may be removed in a future release
          [WARNING] /data/hive-ptest/working/apache-svn-trunk-source/cli/src/java/org/apache/hadoop/hive/cli/CliDriver.java:[383,28] warning: sun.misc.Signal is Sun proprietary API and may be removed in a future release
          [WARNING] /data/hive-ptest/working/apache-svn-trunk-source/cli/src/java/org/apache/hadoop/hive/cli/CliDriver.java:[378,19] warning: sun.misc.Signal is Sun proprietary API and may be removed in a future release
          [WARNING] /data/hive-ptest/working/apache-svn-trunk-source/cli/src/java/org/apache/hadoop/hive/cli/CliDriver.java:[439,9] warning: sun.misc.Signal is Sun proprietary API and may be removed in a future release
          [WARNING] Note: /data/hive-ptest/working/apache-svn-trunk-source/cli/src/java/org/apache/hadoop/hive/cli/RCFileCat.java uses or overrides a deprecated API.
          [WARNING] Note: Recompile with -Xlint:deprecation for details.
          [WARNING] Note: /data/hive-ptest/working/apache-svn-trunk-source/cli/src/java/org/apache/hadoop/hive/cli/CliDriver.java uses unchecked or unsafe operations.
          [WARNING] Note: Recompile with -Xlint:unchecked for details.
          [INFO] 
          [INFO] --- maven-resources-plugin:2.5:testResources (default-testResources) @ hive-cli ---
          [debug] execute contextualize
          [INFO] Using 'UTF-8' encoding to copy filtered resources.
          [INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-trunk-source/cli/src/test/resources
          [INFO] 
          [INFO] --- maven-antrun-plugin:1.7:run (setup-test-dirs) @ hive-cli ---
          [INFO] Executing tasks
          
          main:
              [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/cli/target/tmp
              [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/cli/target/warehouse
              [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/cli/target/tmp/conf
               [copy] Copying 4 files to /data/hive-ptest/working/apache-svn-trunk-source/cli/target/tmp/conf
          [INFO] Executed tasks
          [INFO] 
          [INFO] --- maven-compiler-plugin:3.1:testCompile (default-testCompile) @ hive-cli ---
          [INFO] Compiling 4 source files to /data/hive-ptest/working/apache-svn-trunk-source/cli/target/test-classes
          [WARNING] Note: /data/hive-ptest/working/apache-svn-trunk-source/cli/src/test/org/apache/hadoop/hive/cli/TestCliDriverMethods.java uses unchecked or unsafe operations.
          [WARNING] Note: Recompile with -Xlint:unchecked for details.
          [INFO] 
          [INFO] --- maven-surefire-plugin:2.16:test (default-test) @ hive-cli ---
          [INFO] Tests are skipped.
          [INFO] 
          [INFO] --- maven-jar-plugin:2.2:jar (default-jar) @ hive-cli ---
          [INFO] Building jar: /data/hive-ptest/working/apache-svn-trunk-source/cli/target/hive-cli-0.13.0-SNAPSHOT.jar
          [INFO] 
          [INFO] --- maven-install-plugin:2.4:install (default-install) @ hive-cli ---
          [INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/cli/target/hive-cli-0.13.0-SNAPSHOT.jar to /data/hive-ptest/working/maven/org/apache/hive/hive-cli/0.13.0-SNAPSHOT/hive-cli-0.13.0-SNAPSHOT.jar
          [INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/cli/pom.xml to /data/hive-ptest/working/maven/org/apache/hive/hive-cli/0.13.0-SNAPSHOT/hive-cli-0.13.0-SNAPSHOT.pom
          [INFO]                                                                         
          [INFO] ------------------------------------------------------------------------
          [INFO] Building Hive Contrib 0.13.0-SNAPSHOT
          [INFO] ------------------------------------------------------------------------
          [INFO] 
          [INFO] --- maven-clean-plugin:2.5:clean (default-clean) @ hive-contrib ---
          [INFO] Deleting /data/hive-ptest/working/apache-svn-trunk-source/contrib (includes = [datanucleus.log, derby.log], excludes = [])
          [INFO] 
          [INFO] --- maven-resources-plugin:2.5:resources (default-resources) @ hive-contrib ---
          [debug] execute contextualize
          [INFO] Using 'UTF-8' encoding to copy filtered resources.
          [INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-trunk-source/contrib/src/main/resources
          [INFO] 
          [INFO] --- maven-antrun-plugin:1.7:run (define-classpath) @ hive-contrib ---
          [INFO] Executing tasks
          
          main:
          [INFO] Executed tasks
          [INFO] 
          [INFO] --- maven-compiler-plugin:3.1:compile (default-compile) @ hive-contrib ---
          [INFO] Compiling 39 source files to /data/hive-ptest/working/apache-svn-trunk-source/contrib/target/classes
          [WARNING] Note: Some input files use or override a deprecated API.
          [WARNING] Note: Recompile with -Xlint:deprecation for details.
          [WARNING] Note: /data/hive-ptest/working/apache-svn-trunk-source/contrib/src/java/org/apache/hadoop/hive/contrib/udf/example/UDFExampleStructPrint.java uses unchecked or unsafe operations.
          [WARNING] Note: Recompile with -Xlint:unchecked for details.
          [INFO] 
          [INFO] --- maven-resources-plugin:2.5:testResources (default-testResources) @ hive-contrib ---
          [debug] execute contextualize
          [INFO] Using 'UTF-8' encoding to copy filtered resources.
          [INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-trunk-source/contrib/src/test/resources
          [INFO] 
          [INFO] --- maven-antrun-plugin:1.7:run (setup-test-dirs) @ hive-contrib ---
          [INFO] Executing tasks
          
          main:
              [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/contrib/target/tmp
              [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/contrib/target/warehouse
              [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/contrib/target/tmp/conf
               [copy] Copying 4 files to /data/hive-ptest/working/apache-svn-trunk-source/contrib/target/tmp/conf
          [INFO] Executed tasks
          [INFO] 
          [INFO] --- maven-compiler-plugin:3.1:testCompile (default-testCompile) @ hive-contrib ---
          [INFO] Compiling 2 source files to /data/hive-ptest/working/apache-svn-trunk-source/contrib/target/test-classes
          [WARNING] Note: /data/hive-ptest/working/apache-svn-trunk-source/contrib/src/test/org/apache/hadoop/hive/contrib/serde2/TestRegexSerDe.java uses or overrides a deprecated API.
          [WARNING] Note: Recompile with -Xlint:deprecation for details.
          [INFO] 
          [INFO] --- maven-surefire-plugin:2.16:test (default-test) @ hive-contrib ---
          [INFO] Tests are skipped.
          [INFO] 
          [INFO] --- maven-jar-plugin:2.2:jar (default-jar) @ hive-contrib ---
          [INFO] Building jar: /data/hive-ptest/working/apache-svn-trunk-source/contrib/target/hive-contrib-0.13.0-SNAPSHOT.jar
          [INFO] 
          [INFO] --- maven-install-plugin:2.4:install (default-install) @ hive-contrib ---
          [INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/contrib/target/hive-contrib-0.13.0-SNAPSHOT.jar to /data/hive-ptest/working/maven/org/apache/hive/hive-contrib/0.13.0-SNAPSHOT/hive-contrib-0.13.0-SNAPSHOT.jar
          [INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/contrib/pom.xml to /data/hive-ptest/working/maven/org/apache/hive/hive-contrib/0.13.0-SNAPSHOT/hive-contrib-0.13.0-SNAPSHOT.pom
          [INFO]                                                                         
          [INFO] ------------------------------------------------------------------------
          [INFO] Building Hive HBase Handler 0.13.0-SNAPSHOT
          [INFO] ------------------------------------------------------------------------
          [INFO] 
          [INFO] --- maven-clean-plugin:2.5:clean (default-clean) @ hive-hbase-handler ---
          [INFO] Deleting /data/hive-ptest/working/apache-svn-trunk-source/hbase-handler (includes = [datanucleus.log, derby.log], excludes = [])
          [INFO] 
          [INFO] --- maven-resources-plugin:2.5:resources (default-resources) @ hive-hbase-handler ---
          [debug] execute contextualize
          [INFO] Using 'UTF-8' encoding to copy filtered resources.
          [INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-trunk-source/hbase-handler/src/main/resources
          [INFO] 
          [INFO] --- maven-antrun-plugin:1.7:run (define-classpath) @ hive-hbase-handler ---
          [INFO] Executing tasks
          
          main:
          [INFO] Executed tasks
          [INFO] 
          [INFO] --- maven-compiler-plugin:3.1:compile (default-compile) @ hive-hbase-handler ---
          [INFO] Compiling 17 source files to /data/hive-ptest/working/apache-svn-trunk-source/hbase-handler/target/classes
          [WARNING] Note: Some input files use or override a deprecated API.
          [WARNING] Note: Recompile with -Xlint:deprecation for details.
          [INFO] 
          [INFO] --- maven-resources-plugin:2.5:testResources (default-testResources) @ hive-hbase-handler ---
          [debug] execute contextualize
          [INFO] Using 'UTF-8' encoding to copy filtered resources.
          [INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-trunk-source/hbase-handler/src/test/resources
          [INFO] 
          [INFO] --- maven-antrun-plugin:1.7:run (setup-test-dirs) @ hive-hbase-handler ---
          [INFO] Executing tasks
          
          main:
              [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/hbase-handler/target/tmp
              [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/hbase-handler/target/warehouse
              [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/hbase-handler/target/tmp/conf
               [copy] Copying 4 files to /data/hive-ptest/working/apache-svn-trunk-source/hbase-handler/target/tmp/conf
          [INFO] Executed tasks
          [INFO] 
          [INFO] --- maven-compiler-plugin:3.1:testCompile (default-testCompile) @ hive-hbase-handler ---
          [INFO] Compiling 3 source files to /data/hive-ptest/working/apache-svn-trunk-source/hbase-handler/target/test-classes
          [WARNING] Note: Some input files use or override a deprecated API.
          [WARNING] Note: Recompile with -Xlint:deprecation for details.
          [INFO] 
          [INFO] --- maven-surefire-plugin:2.16:test (default-test) @ hive-hbase-handler ---
          [INFO] Tests are skipped.
          [INFO] 
          [INFO] --- maven-jar-plugin:2.2:jar (default-jar) @ hive-hbase-handler ---
          [INFO] Building jar: /data/hive-ptest/working/apache-svn-trunk-source/hbase-handler/target/hive-hbase-handler-0.13.0-SNAPSHOT.jar
          [INFO] 
          [INFO] --- maven-install-plugin:2.4:install (default-install) @ hive-hbase-handler ---
          [INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/hbase-handler/target/hive-hbase-handler-0.13.0-SNAPSHOT.jar to /data/hive-ptest/working/maven/org/apache/hive/hive-hbase-handler/0.13.0-SNAPSHOT/hive-hbase-handler-0.13.0-SNAPSHOT.jar
          [INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/hbase-handler/pom.xml to /data/hive-ptest/working/maven/org/apache/hive/hive-hbase-handler/0.13.0-SNAPSHOT/hive-hbase-handler-0.13.0-SNAPSHOT.pom
          [INFO]                                                                         
          [INFO] ------------------------------------------------------------------------
          [INFO] Building Hive HCatalog 0.13.0-SNAPSHOT
          [INFO] ------------------------------------------------------------------------
          [INFO] 
          [INFO] --- maven-clean-plugin:2.5:clean (default-clean) @ hive-hcatalog ---
          [INFO] Deleting /data/hive-ptest/working/apache-svn-trunk-source/hcatalog (includes = [datanucleus.log, derby.log], excludes = [])
          [INFO] 
          [INFO] --- maven-antrun-plugin:1.7:run (define-classpath) @ hive-hcatalog ---
          [INFO] Executing tasks
          
          main:
          [INFO] Executed tasks
          [INFO] 
          [INFO] --- maven-antrun-plugin:1.7:run (setup-test-dirs) @ hive-hcatalog ---
          [INFO] Executing tasks
          
          main:
              [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/target/tmp
              [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/target/warehouse
              [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/target/tmp/conf
               [copy] Copying 4 files to /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/target/tmp/conf
          [INFO] Executed tasks
          [INFO] 
          [INFO] --- maven-install-plugin:2.4:install (default-install) @ hive-hcatalog ---
          [INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/pom.xml to /data/hive-ptest/working/maven/org/apache/hive/hcatalog/hive-hcatalog/0.13.0-SNAPSHOT/hive-hcatalog-0.13.0-SNAPSHOT.pom
          [INFO]                                                                         
          [INFO] ------------------------------------------------------------------------
          [INFO] Building Hive HCatalog Core 0.13.0-SNAPSHOT
          [INFO] ------------------------------------------------------------------------
          [INFO] 
          [INFO] --- maven-clean-plugin:2.5:clean (default-clean) @ hive-hcatalog-core ---
          [INFO] Deleting /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/core (includes = [datanucleus.log, derby.log], excludes = [])
          [INFO] 
          [INFO] --- maven-resources-plugin:2.5:resources (default-resources) @ hive-hcatalog-core ---
          [debug] execute contextualize
          [INFO] Using 'UTF-8' encoding to copy filtered resources.
          [INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/core/src/main/resources
          [INFO] 
          [INFO] --- maven-antrun-plugin:1.7:run (define-classpath) @ hive-hcatalog-core ---
          [INFO] Executing tasks
          
          main:
          [INFO] Executed tasks
          [INFO] 
          [INFO] --- maven-compiler-plugin:3.1:compile (default-compile) @ hive-hcatalog-core ---
          [INFO] Compiling 144 source files to /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/core/target/classes
          [WARNING] Note: Some input files use or override a deprecated API.
          [WARNING] Note: Recompile with -Xlint:deprecation for details.
          [WARNING] Note: Some input files use unchecked or unsafe operations.
          [WARNING] Note: Recompile with -Xlint:unchecked for details.
          [INFO] 
          [INFO] --- maven-resources-plugin:2.5:testResources (default-testResources) @ hive-hcatalog-core ---
          [debug] execute contextualize
          [INFO] Using 'UTF-8' encoding to copy filtered resources.
          [INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/core/src/test/resources
          [INFO] 
          [INFO] --- maven-antrun-plugin:1.7:run (setup-test-dirs) @ hive-hcatalog-core ---
          [INFO] Executing tasks
          
          main:
              [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/core/target/tmp
              [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/core/target/warehouse
              [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/core/target/tmp/conf
               [copy] Copying 4 files to /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/core/target/tmp/conf
          [INFO] Executed tasks
          [INFO] 
          [INFO] --- maven-compiler-plugin:3.1:testCompile (default-testCompile) @ hive-hcatalog-core ---
          [INFO] Compiling 67 source files to /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/core/target/test-classes
          [WARNING] Note: Some input files use or override a deprecated API.
          [WARNING] Note: Recompile with -Xlint:deprecation for details.
          [WARNING] Note: Some input files use unchecked or unsafe operations.
          [WARNING] Note: Recompile with -Xlint:unchecked for details.
          [INFO] 
          [INFO] --- maven-surefire-plugin:2.16:test (default-test) @ hive-hcatalog-core ---
          [INFO] Tests are skipped.
          [INFO] 
          [INFO] --- maven-jar-plugin:2.2:jar (default-jar) @ hive-hcatalog-core ---
          [INFO] Building jar: /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/core/target/hive-hcatalog-core-0.13.0-SNAPSHOT.jar
          [INFO] 
          [INFO] --- maven-jar-plugin:2.2:test-jar (default) @ hive-hcatalog-core ---
          [INFO] Building jar: /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/core/target/hive-hcatalog-core-0.13.0-SNAPSHOT-tests.jar
          [INFO] 
          [INFO] --- maven-install-plugin:2.4:install (default-install) @ hive-hcatalog-core ---
          [INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/core/target/hive-hcatalog-core-0.13.0-SNAPSHOT.jar to /data/hive-ptest/working/maven/org/apache/hive/hcatalog/hive-hcatalog-core/0.13.0-SNAPSHOT/hive-hcatalog-core-0.13.0-SNAPSHOT.jar
          [INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/core/pom.xml to /data/hive-ptest/working/maven/org/apache/hive/hcatalog/hive-hcatalog-core/0.13.0-SNAPSHOT/hive-hcatalog-core-0.13.0-SNAPSHOT.pom
          [INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/core/target/hive-hcatalog-core-0.13.0-SNAPSHOT-tests.jar to /data/hive-ptest/working/maven/org/apache/hive/hcatalog/hive-hcatalog-core/0.13.0-SNAPSHOT/hive-hcatalog-core-0.13.0-SNAPSHOT-tests.jar
          [INFO]                                                                         
          [INFO] ------------------------------------------------------------------------
          [INFO] Building Hive HCatalog Pig Adapter 0.13.0-SNAPSHOT
          [INFO] ------------------------------------------------------------------------
          [INFO] 
          [INFO] --- maven-clean-plugin:2.5:clean (default-clean) @ hive-hcatalog-pig-adapter ---
          [INFO] Deleting /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/hcatalog-pig-adapter (includes = [datanucleus.log, derby.log], excludes = [])
          [INFO] 
          [INFO] --- maven-resources-plugin:2.5:resources (default-resources) @ hive-hcatalog-pig-adapter ---
          [debug] execute contextualize
          [INFO] Using 'UTF-8' encoding to copy filtered resources.
          [INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/hcatalog-pig-adapter/src/main/resources
          [INFO] 
          [INFO] --- maven-antrun-plugin:1.7:run (define-classpath) @ hive-hcatalog-pig-adapter ---
          [INFO] Executing tasks
          
          main:
          [INFO] Executed tasks
          [INFO] 
          [INFO] --- maven-compiler-plugin:3.1:compile (default-compile) @ hive-hcatalog-pig-adapter ---
          [INFO] Compiling 10 source files to /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/hcatalog-pig-adapter/target/classes
          [WARNING] Note: Some input files use or override a deprecated API.
          [WARNING] Note: Recompile with -Xlint:deprecation for details.
          [WARNING] Note: Some input files use unchecked or unsafe operations.
          [WARNING] Note: Recompile with -Xlint:unchecked for details.
          [INFO] 
          [INFO] --- maven-resources-plugin:2.5:testResources (default-testResources) @ hive-hcatalog-pig-adapter ---
          [debug] execute contextualize
          [INFO] Using 'UTF-8' encoding to copy filtered resources.
          [INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/hcatalog-pig-adapter/src/test/resources
          [INFO] 
          [INFO] --- maven-antrun-plugin:1.7:run (setup-test-dirs) @ hive-hcatalog-pig-adapter ---
          [INFO] Executing tasks
          
          main:
              [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/hcatalog-pig-adapter/target/tmp
              [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/hcatalog-pig-adapter/target/warehouse
              [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/hcatalog-pig-adapter/target/tmp/conf
               [copy] Copying 4 files to /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/hcatalog-pig-adapter/target/tmp/conf
          [INFO] Executed tasks
          [INFO] 
          [INFO] --- maven-compiler-plugin:3.1:testCompile (default-testCompile) @ hive-hcatalog-pig-adapter ---
          [INFO] Compiling 26 source files to /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/hcatalog-pig-adapter/target/test-classes
          [WARNING] Note: Some input files use or override a deprecated API.
          [WARNING] Note: Recompile with -Xlint:deprecation for details.
          [INFO] 
          [INFO] --- maven-surefire-plugin:2.16:test (default-test) @ hive-hcatalog-pig-adapter ---
          [INFO] Tests are skipped.
          [INFO] 
          [INFO] --- maven-jar-plugin:2.2:jar (default-jar) @ hive-hcatalog-pig-adapter ---
          [INFO] Building jar: /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/hcatalog-pig-adapter/target/hive-hcatalog-pig-adapter-0.13.0-SNAPSHOT.jar
          [INFO] 
          [INFO] --- maven-install-plugin:2.4:install (default-install) @ hive-hcatalog-pig-adapter ---
          [INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/hcatalog-pig-adapter/target/hive-hcatalog-pig-adapter-0.13.0-SNAPSHOT.jar to /data/hive-ptest/working/maven/org/apache/hive/hcatalog/hive-hcatalog-pig-adapter/0.13.0-SNAPSHOT/hive-hcatalog-pig-adapter-0.13.0-SNAPSHOT.jar
          [INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/hcatalog-pig-adapter/pom.xml to /data/hive-ptest/working/maven/org/apache/hive/hcatalog/hive-hcatalog-pig-adapter/0.13.0-SNAPSHOT/hive-hcatalog-pig-adapter-0.13.0-SNAPSHOT.pom
          [INFO]                                                                         
          [INFO] ------------------------------------------------------------------------
          [INFO] Building Hive HCatalog Server Extensions 0.13.0-SNAPSHOT
          [INFO] ------------------------------------------------------------------------
          [INFO] 
          [INFO] --- maven-clean-plugin:2.5:clean (default-clean) @ hive-hcatalog-server-extensions ---
          [INFO] Deleting /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/server-extensions (includes = [datanucleus.log, derby.log], excludes = [])
          [INFO] 
          [INFO] --- maven-resources-plugin:2.5:resources (default-resources) @ hive-hcatalog-server-extensions ---
          [debug] execute contextualize
          [INFO] Using 'UTF-8' encoding to copy filtered resources.
          [INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/server-extensions/src/main/resources
          [INFO] 
          [INFO] --- maven-antrun-plugin:1.7:run (define-classpath) @ hive-hcatalog-server-extensions ---
          [INFO] Executing tasks
          
          main:
          [INFO] Executed tasks
          [INFO] 
          [INFO] --- maven-compiler-plugin:3.1:compile (default-compile) @ hive-hcatalog-server-extensions ---
          [INFO] Compiling 38 source files to /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/server-extensions/target/classes
          [WARNING] Note: Some input files use or override a deprecated API.
          [WARNING] Note: Recompile with -Xlint:deprecation for details.
          [WARNING] Note: Some input files use unchecked or unsafe operations.
          [WARNING] Note: Recompile with -Xlint:unchecked for details.
          [INFO] 
          [INFO] --- maven-resources-plugin:2.5:testResources (default-testResources) @ hive-hcatalog-server-extensions ---
          [debug] execute contextualize
          [INFO] Using 'UTF-8' encoding to copy filtered resources.
          [INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/server-extensions/src/test/resources
          [INFO] 
          [INFO] --- maven-antrun-plugin:1.7:run (setup-test-dirs) @ hive-hcatalog-server-extensions ---
          [INFO] Executing tasks
          
          main:
              [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/server-extensions/target/tmp
              [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/server-extensions/target/warehouse
              [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/server-extensions/target/tmp/conf
               [copy] Copying 4 files to /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/server-extensions/target/tmp/conf
          [INFO] Executed tasks
          [INFO] 
          [INFO] --- maven-compiler-plugin:3.1:testCompile (default-testCompile) @ hive-hcatalog-server-extensions ---
          [INFO] Compiling 4 source files to /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/server-extensions/target/test-classes
          [WARNING] Note: Some input files use or override a deprecated API.
          [WARNING] Note: Recompile with -Xlint:deprecation for details.
          [INFO] 
          [INFO] --- maven-surefire-plugin:2.16:test (default-test) @ hive-hcatalog-server-extensions ---
          [INFO] Tests are skipped.
          [INFO] 
          [INFO] --- maven-jar-plugin:2.2:jar (default-jar) @ hive-hcatalog-server-extensions ---
          [INFO] Building jar: /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/server-extensions/target/hive-hcatalog-server-extensions-0.13.0-SNAPSHOT.jar
          [INFO] 
          [INFO] --- maven-install-plugin:2.4:install (default-install) @ hive-hcatalog-server-extensions ---
          [INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/server-extensions/target/hive-hcatalog-server-extensions-0.13.0-SNAPSHOT.jar to /data/hive-ptest/working/maven/org/apache/hive/hcatalog/hive-hcatalog-server-extensions/0.13.0-SNAPSHOT/hive-hcatalog-server-extensions-0.13.0-SNAPSHOT.jar
          [INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/server-extensions/pom.xml to /data/hive-ptest/working/maven/org/apache/hive/hcatalog/hive-hcatalog-server-extensions/0.13.0-SNAPSHOT/hive-hcatalog-server-extensions-0.13.0-SNAPSHOT.pom
          [INFO]                                                                         
          [INFO] ------------------------------------------------------------------------
          [INFO] Building Hive HCatalog Webhcat Java Client 0.13.0-SNAPSHOT
          [INFO] ------------------------------------------------------------------------
          [INFO] 
          [INFO] --- maven-clean-plugin:2.5:clean (default-clean) @ hive-webhcat-java-client ---
          [INFO] Deleting /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/java-client (includes = [datanucleus.log, derby.log], excludes = [])
          [INFO] 
          [INFO] --- maven-resources-plugin:2.5:resources (default-resources) @ hive-webhcat-java-client ---
          [debug] execute contextualize
          [INFO] Using 'UTF-8' encoding to copy filtered resources.
          [INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/java-client/src/main/resources
          [INFO] 
          [INFO] --- maven-antrun-plugin:1.7:run (define-classpath) @ hive-webhcat-java-client ---
          [INFO] Executing tasks
          
          main:
          [INFO] Executed tasks
          [INFO] 
          [INFO] --- maven-compiler-plugin:3.1:compile (default-compile) @ hive-webhcat-java-client ---
          [INFO] Compiling 20 source files to /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/java-client/target/classes
          [WARNING] Note: Some input files use or override a deprecated API.
          [WARNING] Note: Recompile with -Xlint:deprecation for details.
          [INFO] 
          [INFO] --- maven-resources-plugin:2.5:testResources (default-testResources) @ hive-webhcat-java-client ---
          [debug] execute contextualize
          [INFO] Using 'UTF-8' encoding to copy filtered resources.
          [INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/java-client/src/test/resources
          [INFO] 
          [INFO] --- maven-antrun-plugin:1.7:run (setup-test-dirs) @ hive-webhcat-java-client ---
          [INFO] Executing tasks
          
          main:
              [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/java-client/target/tmp
              [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/java-client/target/warehouse
              [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/java-client/target/tmp/conf
               [copy] Copying 4 files to /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/java-client/target/tmp/conf
          [INFO] Executed tasks
          [INFO] 
          [INFO] --- maven-compiler-plugin:3.1:testCompile (default-testCompile) @ hive-webhcat-java-client ---
          [INFO] Compiling 2 source files to /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/java-client/target/test-classes
          [WARNING] Note: Some input files use or override a deprecated API.
          [WARNING] Note: Recompile with -Xlint:deprecation for details.
          [INFO] 
          [INFO] --- maven-surefire-plugin:2.16:test (default-test) @ hive-webhcat-java-client ---
          [INFO] Tests are skipped.
          [INFO] 
          [INFO] --- maven-jar-plugin:2.2:jar (default-jar) @ hive-webhcat-java-client ---
          [INFO] Building jar: /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/java-client/target/hive-webhcat-java-client-0.13.0-SNAPSHOT.jar
          [INFO] 
          [INFO] --- maven-install-plugin:2.4:install (default-install) @ hive-webhcat-java-client ---
          [INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/java-client/target/hive-webhcat-java-client-0.13.0-SNAPSHOT.jar to /data/hive-ptest/working/maven/org/apache/hive/hcatalog/hive-webhcat-java-client/0.13.0-SNAPSHOT/hive-webhcat-java-client-0.13.0-SNAPSHOT.jar
          [INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/java-client/pom.xml to /data/hive-ptest/working/maven/org/apache/hive/hcatalog/hive-webhcat-java-client/0.13.0-SNAPSHOT/hive-webhcat-java-client-0.13.0-SNAPSHOT.pom
          [INFO]                                                                         
          [INFO] ------------------------------------------------------------------------
          [INFO] Building Hive HCatalog Webhcat 0.13.0-SNAPSHOT
          [INFO] ------------------------------------------------------------------------
          [INFO] 
          [INFO] --- maven-clean-plugin:2.5:clean (default-clean) @ hive-webhcat ---
          [INFO] Deleting /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr (includes = [datanucleus.log, derby.log], excludes = [])
          [INFO] 
          [INFO] --- maven-resources-plugin:2.5:resources (default-resources) @ hive-webhcat ---
          [debug] execute contextualize
          [INFO] Using 'UTF-8' encoding to copy filtered resources.
          [INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/src/main/resources
          [INFO] 
          [INFO] --- maven-antrun-plugin:1.7:run (define-classpath) @ hive-webhcat ---
          [INFO] Executing tasks
          
          main:
          [INFO] Executed tasks
          [INFO] 
          [INFO] --- maven-compiler-plugin:3.1:compile (default-compile) @ hive-webhcat ---
          [INFO] Compiling 65 source files to /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/target/classes
          [WARNING] Note: Some input files use unchecked or unsafe operations.
          [WARNING] Note: Recompile with -Xlint:unchecked for details.
          [INFO] 
          [INFO] --- maven-javadoc-plugin:2.4:javadoc (resourcesdoc.xml) @ hive-webhcat ---
          [INFO] Setting property: classpath.resource.loader.class => 'org.codehaus.plexus.velocity.ContextClassLoaderResourceLoader'.
          [INFO] Setting property: velocimacro.messages.on => 'false'.
          [INFO] Setting property: resource.loader => 'classpath'.
          [INFO] Setting property: resource.manager.logwhenfound => 'false'.
          [INFO] ************************************************************** 
          [INFO] Starting Jakarta Velocity v1.4
          [INFO] RuntimeInstance initializing.
          [INFO] Default Properties File: org/apache/velocity/runtime/defaults/velocity.properties
          [INFO] Default ResourceManager initializing. (class org.apache.velocity.runtime.resource.ResourceManagerImpl)
          [INFO] Resource Loader Instantiated: org.codehaus.plexus.velocity.ContextClassLoaderResourceLoader
          [INFO] ClasspathResourceLoader : initialization starting.
          [INFO] ClasspathResourceLoader : initialization complete.
          [INFO] ResourceCache : initialized. (class org.apache.velocity.runtime.resource.ResourceCacheImpl)
          [INFO] Default ResourceManager initialization complete.
          [INFO] Loaded System Directive: org.apache.velocity.runtime.directive.Literal
          [INFO] Loaded System Directive: org.apache.velocity.runtime.directive.Macro
          [INFO] Loaded System Directive: org.apache.velocity.runtime.directive.Parse
          [INFO] Loaded System Directive: org.apache.velocity.runtime.directive.Include
          [INFO] Loaded System Directive: org.apache.velocity.runtime.directive.Foreach
          [INFO] Created: 20 parsers.
          [INFO] Velocimacro : initialization starting.
          [INFO] Velocimacro : adding VMs from VM library template : VM_global_library.vm
          [ERROR] ResourceManager : unable to find resource 'VM_global_library.vm' in any resource loader.
          [INFO] Velocimacro : error using  VM library template VM_global_library.vm : org.apache.velocity.exception.ResourceNotFoundException: Unable to find resource 'VM_global_library.vm'
          [INFO] Velocimacro :  VM library template macro registration complete.
          [INFO] Velocimacro : allowInline = true : VMs can be defined inline in templates
          [INFO] Velocimacro : allowInlineToOverride = false : VMs defined inline may NOT replace previous VM definitions
          [INFO] Velocimacro : allowInlineLocal = false : VMs defined inline will be  global in scope if allowed.
          [INFO] Velocimacro : initialization complete.
          [INFO] Velocity successfully started.
          Loading source files for package org.apache.hive.hcatalog.templeton...
          [parsing started /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/src/main/java/org/apache/hive/hcatalog/templeton/SimpleExceptionMapper.java]
          [parsing completed 27ms]
          [parsing started /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/src/main/java/org/apache/hive/hcatalog/templeton/JsonBuilder.java]
          [parsing completed 8ms]
          [parsing started /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/src/main/java/org/apache/hive/hcatalog/templeton/JobItemBean.java]
          [parsing completed 2ms]
          [parsing started /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/src/main/java/org/apache/hive/hcatalog/templeton/JarDelegator.java]
          [parsing completed 11ms]
          [parsing started /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/src/main/java/org/apache/hive/hcatalog/templeton/LauncherDelegator.java]
          [parsing completed 30ms]
          [parsing started /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/src/main/java/org/apache/hive/hcatalog/templeton/ExecServiceImpl.java]
          [parsing completed 19ms]
          [parsing started /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/src/main/java/org/apache/hive/hcatalog/templeton/DeleteDelegator.java]
          [parsing completed 4ms]
          [parsing started /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/src/main/java/org/apache/hive/hcatalog/templeton/ExecBean.java]
          [parsing completed 8ms]
          [parsing started /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/src/main/java/org/apache/hive/hcatalog/templeton/PigDelegator.java]
          [parsing completed 14ms]
          [parsing started /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/src/main/java/org/apache/hive/hcatalog/templeton/TempletonDelegator.java]
          [parsing completed 1ms]
          [parsing started /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/src/main/java/org/apache/hive/hcatalog/templeton/StreamingDelegator.java]
          [parsing completed 12ms]
          [parsing started /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/src/main/java/org/apache/hive/hcatalog/templeton/DatabaseDesc.java]
          [parsing completed 5ms]
          [parsing started /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/src/main/java/org/apache/hive/hcatalog/templeton/CompleteBean.java]
          [parsing completed 1ms]
          [parsing started /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/src/main/java/org/apache/hive/hcatalog/templeton/HiveDelegator.java]
          [parsing completed 20ms]
          [parsing started /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/src/main/java/org/apache/hive/hcatalog/templeton/Server.java]
          [parsing completed 143ms]
          [parsing started /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/src/main/java/org/apache/hive/hcatalog/templeton/BadParam.java]
          [parsing completed 1ms]
          [parsing started /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/src/main/java/org/apache/hive/hcatalog/templeton/ColumnDesc.java]
          [parsing completed 5ms]
          [parsing started /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/src/main/java/org/apache/hive/hcatalog/templeton/PartitionDesc.java]
          [parsing completed 1ms]
          [parsing started /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/src/main/java/org/apache/hive/hcatalog/templeton/CatchallExceptionMapper.java]
          [parsing completed 1ms]
          [parsing started /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/src/main/java/org/apache/hive/hcatalog/templeton/HcatDelegator.java]
          [parsing completed 43ms]
          [parsing started /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/src/main/java/org/apache/hive/hcatalog/templeton/StatusDelegator.java]
          [parsing completed 6ms]
          [parsing started /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/src/main/java/org/apache/hive/hcatalog/templeton/QueueStatusBean.java]
          [parsing completed 1ms]
          [parsing started /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/src/main/java/org/apache/hive/hcatalog/templeton/CompleteDelegator.java]
          [parsing completed 14ms]
          [parsing started /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/src/main/java/org/apache/hive/hcatalog/templeton/GroupPermissionsDesc.java]
          [parsing completed 5ms]
          [parsing started /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/src/main/java/org/apache/hive/hcatalog/templeton/AppConfig.java]
          [parsing completed 27ms]
          [parsing started /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/src/main/java/org/apache/hive/hcatalog/templeton/HcatException.java]
          [parsing completed 1ms]
          [parsing started /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/src/main/java/org/apache/hive/hcatalog/templeton/QueueException.java]
          [parsing completed 0ms]
          [parsing started /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/src/main/java/org/apache/hive/hcatalog/templeton/TableLikeDesc.java]
          [parsing completed 4ms]
          [parsing started /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/src/main/java/org/apache/hive/hcatalog/templeton/TableDesc.java]
          [parsing completed 12ms]
          [parsing started /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/src/main/java/org/apache/hive/hcatalog/templeton/WadlConfig.java]
          [parsing completed 0ms]
          [parsing started /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/src/main/java/org/apache/hive/hcatalog/templeton/BusyException.java]
          [parsing completed 1ms]
          [parsing started /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/src/main/java/org/apache/hive/hcatalog/templeton/ExecService.java]
          [parsing completed 4ms]
          [parsing started /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/src/main/java/org/apache/hive/hcatalog/templeton/NotAuthorizedException.java]
          [parsing completed 0ms]
          [parsing started /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/src/main/java/org/apache/hive/hcatalog/templeton/EnqueueBean.java]
          [parsing completed 0ms]
          [parsing started /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/src/main/java/org/apache/hive/hcatalog/templeton/SimpleWebException.java]
          [parsing completed 6ms]
          [parsing started /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/src/main/java/org/apache/hive/hcatalog/templeton/ListDelegator.java]
          [parsing completed 5ms]
          [parsing started /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/src/main/java/org/apache/hive/hcatalog/templeton/Main.java]
          [parsing completed 13ms]
          [parsing started /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/src/main/java/org/apache/hive/hcatalog/templeton/SecureProxySupport.java]
          [parsing completed 11ms]
          [parsing started /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/src/main/java/org/apache/hive/hcatalog/templeton/MaxByteArrayOutputStream.java]
          [parsing completed 1ms]
          [parsing started /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/src/main/java/org/apache/hive/hcatalog/templeton/ProxyUserSupport.java]
          [parsing completed 7ms]
          [parsing started /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/src/main/java/org/apache/hive/hcatalog/templeton/UgiFactory.java]
          [parsing completed 3ms]
          [parsing started /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/src/main/java/org/apache/hive/hcatalog/templeton/CallbackFailedException.java]
          [parsing completed 1ms]
          [parsing started /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/src/main/java/org/apache/hive/hcatalog/templeton/TablePropertyDesc.java]
          [parsing completed 0ms]
          Loading source files for package org.apache.hive.hcatalog.templeton.tool...
          [parsing started /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/src/main/java/org/apache/hive/hcatalog/templeton/tool/ZooKeeperStorage.java]
          [parsing completed 19ms]
          [parsing started /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/src/main/java/org/apache/hive/hcatalog/templeton/tool/NullRecordReader.java]
          [parsing completed 1ms]
          [parsing started /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/src/main/java/org/apache/hive/hcatalog/templeton/tool/PigJobIDParser.java]
          [parsing completed 1ms]
          [parsing started /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/src/main/java/org/apache/hive/hcatalog/templeton/tool/TempletonStorage.java]
          [parsing completed 1ms]
          [parsing started /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/src/main/java/org/apache/hive/hcatalog/templeton/tool/JobIDParser.java]
          [parsing completed 1ms]
          [parsing started /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/src/main/java/org/apache/hive/hcatalog/templeton/tool/JobState.java]
          [parsing completed 6ms]
          [parsing started /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/src/main/java/org/apache/hive/hcatalog/templeton/tool/TempletonUtils.java]
          [parsing completed 12ms]
          [parsing started /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/src/main/java/org/apache/hive/hcatalog/templeton/tool/HDFSStorage.java]
          [parsing completed 10ms]
          [parsing started /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/src/main/java/org/apache/hive/hcatalog/templeton/tool/DelegationTokenCache.java]
          [parsing completed 0ms]
          [parsing started /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/src/main/java/org/apache/hive/hcatalog/templeton/tool/JarJobIDParser.java]
          [parsing completed 1ms]
          [parsing started /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/src/main/java/org/apache/hive/hcatalog/templeton/tool/JobSubmissionConstants.java]
          [parsing completed 1ms]
          [parsing started /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/src/main/java/org/apache/hive/hcatalog/templeton/tool/ZooKeeperCleanup.java]
          [parsing completed 2ms]
          [parsing started /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/src/main/java/org/apache/hive/hcatalog/templeton/tool/JobStateTracker.java]
          [parsing completed 8ms]
          [parsing started /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/src/main/java/org/apache/hive/hcatalog/templeton/tool/NotFoundException.java]
          [parsing completed 0ms]
          [parsing started /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/src/main/java/org/apache/hive/hcatalog/templeton/tool/SingleInputFormat.java]
          [parsing completed 1ms]
          [parsing started /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/src/main/java/org/apache/hive/hcatalog/templeton/tool/LogRetriever.java]
          [parsing completed 7ms]
          [parsing started /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/src/main/java/org/apache/hive/hcatalog/templeton/tool/NullSplit.java]
          [parsing completed 1ms]
          [parsing started /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/src/main/java/org/apache/hive/hcatalog/templeton/tool/HDFSCleanup.java]
          [parsing completed 7ms]
          [parsing started /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/src/main/java/org/apache/hive/hcatalog/templeton/tool/TempletonControllerJob.java]
          [parsing completed 7ms]
          [parsing started /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/src/main/java/org/apache/hive/hcatalog/templeton/tool/HiveJobIDParser.java]
          [parsing completed 0ms]
          [parsing started /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/src/main/java/org/apache/hive/hcatalog/templeton/tool/LaunchMapper.java]
          [parsing completed 19ms]
          [parsing started /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/src/main/java/org/apache/hive/hcatalog/templeton/tool/TrivialExecService.java]
          [parsing completed 6ms]
          Constructing Javadoc information...
          [search path for source files: /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/src/main/java]
          [search path for class files: /usr/java/jdk1.6.0_34/jre/lib/resources.jar,/usr/java/jdk1.6.0_34/jre/lib/rt.jar,/usr/java/jdk1.6.0_34/jre/lib/sunrsasign.jar,/usr/java/jdk1.6.0_34/jre/lib/jsse.jar,/usr/java/jdk1.6.0_34/jre/lib/jce.jar,/usr/java/jdk1.6.0_34/jre/lib/charsets.jar,/usr/java/jdk1.6.0_34/jre/lib/modules/jdk.boot.jar,/usr/java/jdk1.6.0_34/jre/classes,/usr/java/jdk1.6.0_34/jre/lib/ext/localedata.jar,/usr/java/jdk1.6.0_34/jre/lib/ext/sunpkcs11.jar,/usr/java/jdk1.6.0_34/jre/lib/ext/sunjce_provider.jar,/usr/java/jdk1.6.0_34/jre/lib/ext/dnsns.jar,/data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/target/classes,/data/hive-ptest/working/maven/org/datanucleus/datanucleus-api-jdo/3.2.1/datanucleus-api-jdo-3.2.1.jar,/data/hive-ptest/working/maven/org/antlr/stringtemplate/3.2.1/stringtemplate-3.2.1.jar,/data/hive-ptest/working/maven/xml-apis/xml-apis/1.3.04/xml-apis-1.3.04.jar,/data/hive-ptest/working/maven/com/sun/jersey/jersey-json/1.14/jersey-json-1.14.jar,/data/hive-ptest/working/maven/org/apache/zookeeper/zookeeper/3.4.5/zookeeper-3.4.5.jar,/data/hive-ptest/working/maven/commons-net/commons-net/1.4.1/commons-net-1.4.1.jar,/data/hive-ptest/working/maven/javax/mail/mail/1.4.1/mail-1.4.1.jar,/data/hive-ptest/working/maven/javax/mail/mail/1.4.1/activation.jar,/data/hive-ptest/working/maven/org/datanucleus/datanucleus-rdbms/3.2.1/datanucleus-rdbms-3.2.1.jar,/data/hive-ptest/working/maven/commons-httpclient/commons-httpclient/3.0.1/commons-httpclient-3.0.1.jar,/data/hive-ptest/working/apache-svn-trunk-source/hcatalog/core/target/hive-hcatalog-core-0.13.0-SNAPSHOT.jar,/data/hive-ptest/working/maven/org/codehaus/jackson/jackson-core-asl/1.9.2/jackson-core-asl-1.9.2.jar,/data/hive-ptest/working/maven/org/apache/thrift/libthrift/0.9.0/libthrift-0.9.0.jar,/data/hive-ptest/working/maven/org/eclipse/jetty/aggregate/jetty-all-server/7.6.0.v20120127/jetty-all-server-7.6.0.v20120127.jar,/data/hive-ptest/working/maven/xerces/xercesImpl/2.9.1/xercesImpl-2.9.1.jar,/data/hive-ptest/working/maven/antlr/antlr/2.7.7/antlr-2.7.7.jar,/data/hive-ptest/working/maven/commons-configuration/commons-configuration/1.6/commons-configuration-1.6.jar,/data/hive-ptest/working/maven/org/apache/httpcomponents/httpclient/4.2.5/httpclient-4.2.5.jar,/data/hive-ptest/working/apache-svn-trunk-source/serde/target/hive-serde-0.13.0-SNAPSHOT.jar,/data/hive-ptest/working/maven/javax/servlet/servlet-api/2.5/servlet-api-2.5.jar,/data/hive-ptest/working/maven/org/apache/velocity/velocity/1.5/velocity-1.5.jar,/data/hive-ptest/working/maven/com/jolbox/bonecp/0.7.1.RELEASE/bonecp-0.7.1.RELEASE.jar,/data/hive-ptest/working/maven/javax/xml/stream/stax-api/1.0-2/stax-api-1.0-2.jar,/data/hive-ptest/working/maven/org/slf4j/slf4j-log4j12/1.7.5/slf4j-log4j12-1.7.5.jar,/data/hive-ptest/working/maven/javax/xml/bind/jaxb-api/2.2.2/jaxb-api-2.2.2.jar,/data/hive-ptest/working/maven/commons-lang/commons-lang/2.4/commons-lang-2.4.jar,/data/hive-ptest/working/maven/com/sun/jersey/jersey-core/1.14/jersey-core-1.14.jar,/data/hive-ptest/working/maven/org/apache/commons/commons-math/2.1/commons-math-2.1.jar,/data/hive-ptest/working/maven/org/apache/httpcomponents/httpcore/4.2.4/httpcore-4.2.4.jar,/data/hive-ptest/working/maven/org/xerial/snappy/snappy-java/1.0.5/snappy-java-1.0.5.jar,/data/hive-ptest/working/maven/commons-collections/commons-collections/3.2.1/commons-collections-3.2.1.jar,/data/hive-ptest/working/maven/org/antlr/ST4/4.0.4/ST4-4.0.4.jar,/data/hive-ptest/working/maven/org/apache/commons/commons-exec/1.1/commons-exec-1.1.jar,/data/hive-ptest/working/maven/com/google/guava/guava/11.0.2/guava-11.0.2.jar,/data/hive-ptest/working/maven/org/datanucleus/datanucleus-core/3.2.2/datanucleus-core-3.2.2.jar,/data/hive-ptest/working/maven/org/apache/hadoop/hadoop-core/1.2.1/hadoop-core-1.2.1.jar,/data/hive-ptest/working/maven/org/tukaani/xz/1.0/xz-1.0.jar,/data/hive-ptest/working/maven/org/mortbay/jetty/servlet-api-2.5/6.1.14/servlet-api-2.5-6.1.14.jar,/data/hive-ptest/working/maven/javax/activation/activation/1.1/activation-1.1.jar,/data/hive-ptest/working/maven/org/codehaus/jackson/jackson-jaxrs/1.9.2/jackson-jaxrs-1.9.2.jar,/data/hive-ptest/working/maven/stax/stax-api/1.0.1/stax-api-1.0.1.jar,/data/hive-ptest/working/maven/org/codehaus/jettison/jettison/1.1/jettison-1.1.jar,/data/hive-ptest/working/maven/org/apache/commons/commons-compress/1.4.1/commons-compress-1.4.1.jar,/data/hive-ptest/working/maven/org/antlr/antlr-runtime/3.4/antlr-runtime-3.4.jar,/data/hive-ptest/working/maven/com/sun/jersey/jersey-server/1.14/jersey-server-1.14.jar,/usr/java/jdk1.6.0_34/jre/../lib/tools.jar,/data/hive-ptest/working/maven/org/apache/ant/ant/1.9.1/ant-1.9.1.jar,/data/hive-ptest/working/maven/io/netty/netty/3.4.0.Final/netty-3.4.0.Final.jar,/data/hive-ptest/working/maven/org/slf4j/jul-to-slf4j/1.7.5/jul-to-slf4j-1.7.5.jar,/data/hive-ptest/working/maven/com/sun/jersey/contribs/wadl-resourcedoc-doclet/1.4/wadl-resourcedoc-doclet-1.4.jar,/data/hive-ptest/working/maven/org/mortbay/jetty/jetty/6.1.26/jetty-6.1.26.jar,/data/hive-ptest/working/maven/org/apache/avro/avro-mapred/1.7.5/avro-mapred-1.7.5.jar,/data/hive-ptest/working/maven/oro/oro/2.0.8/oro-2.0.8.jar,/data/hive-ptest/working/maven/org/eclipse/jdt/core/3.1.1/core-3.1.1.jar,/data/hive-ptest/working/maven/javax/jdo/jdo-api/3.0.1/jdo-api-3.0.1.jar,/data/hive-ptest/working/maven/javax/transaction/jta/1.1/jta-1.1.jar,/data/hive-ptest/working/maven/log4j/log4j/1.2.16/log4j-1.2.16.jar,/data/hive-ptest/working/apache-svn-trunk-source/common/target/hive-common-0.13.0-SNAPSHOT.jar,/data/hive-ptest/working/maven/org/apache/avro/avro-ipc/1.7.5/avro-ipc-1.7.5-tests.jar,/data/hive-ptest/working/apache-svn-trunk-source/ql/target/hive-exec-0.13.0-SNAPSHOT.jar,/data/hive-ptest/working/maven/org/apache/geronimo/specs/geronimo-annotation_1.0_spec/1.1.1/geronimo-annotation_1.0_spec-1.1.1.jar,/data/hive-ptest/working/maven/org/mortbay/jetty/servlet-api/2.5-20081211/servlet-api-2.5-20081211.jar,/data/hive-ptest/working/maven/org/apache/avro/avro-ipc/1.7.5/avro-ipc-1.7.5.jar,/data/hive-ptest/working/maven/net/java/dev/jets3t/jets3t/0.6.1/jets3t-0.6.1.jar,/data/hive-ptest/working/maven/com/sun/jersey/jersey-servlet/1.14/jersey-servlet-1.14.jar,/data/hive-ptest/working/maven/commons-logging/commons-logging/1.1.3/commons-logging-1.1.3.jar,/data/hive-ptest/working/maven/com/google/code/findbugs/jsr305/1.3.9/jsr305-1.3.9.jar,/data/hive-ptest/working/maven/xmlenc/xmlenc/0.52/xmlenc-0.52.jar,/data/hive-ptest/working/maven/org/mortbay/jetty/jsp-2.1/6.1.14/jsp-2.1-6.1.14.jar,/data/hive-ptest/working/maven/commons-el/commons-el/1.0/commons-el-1.0.jar,/data/hive-ptest/working/maven/jline/jline/0.9.94/jline-0.9.94.jar,/data/hive-ptest/working/maven/org/mortbay/jetty/jetty-util/6.1.26/jetty-util-6.1.26.jar,/data/hive-ptest/working/maven/org/slf4j/slf4j-api/1.7.5/slf4j-api-1.7.5.jar,/data/hive-ptest/working/maven/org/codehaus/jackson/jackson-xc/1.9.2/jackson-xc-1.9.2.jar,/data/hive-ptest/working/maven/commons-beanutils/commons-beanutils-core/1.8.0/commons-beanutils-core-1.8.0.jar,/data/hive-ptest/working/maven/ant/ant/1.6.5/ant-1.6.5.jar,/data/hive-ptest/working/apache-svn-trunk-source/cli/target/hive-cli-0.13.0-SNAPSHOT.jar,/data/hive-ptest/working/maven/org/apache/thrift/libfb303/0.9.0/libfb303-0.9.0.jar,/data/hive-ptest/working/maven/org/codehaus/groovy/groovy-all/2.1.6/groovy-all-2.1.6.jar,/data/hive-ptest/working/maven/org/apache/derby/derby/10.4.2.0/derby-10.4.2.0.jar,/data/hive-ptest/working/maven/org/apache/derby/derby/10.4.2.0/derbyLocale_cs.jar,/data/hive-ptest/working/maven/org/apache/derby/derby/10.4.2.0/derbyLocale_de_DE.jar,/data/hive-ptest/working/maven/org/apache/derby/derby/10.4.2.0/derbyLocale_es.jar,/data/hive-ptest/working/maven/org/apache/derby/derby/10.4.2.0/derbyLocale_fr.jar,/data/hive-ptest/working/maven/org/apache/derby/derby/10.4.2.0/derbyLocale_hu.jar,/data/hive-ptest/working/maven/org/apache/derby/derby/10.4.2.0/derbyLocale_it.jar,/data/hive-ptest/working/maven/org/apache/derby/derby/10.4.2.0/derbyLocale_ja_JP.jar,/data/hive-ptest/working/maven/org/apache/derby/derby/10.4.2.0/derbyLocale_ko_KR.jar,/data/hive-ptest/working/maven/org/apache/derby/derby/10.4.2.0/derbyLocale_pl.jar,/data/hive-ptest/working/maven/org/apache/derby/derby/10.4.2.0/derbyLocale_pt_BR.jar,/data/hive-ptest/working/maven/org/apache/derby/derby/10.4.2.0/derbyLocale_ru.jar,/data/hive-ptest/working/maven/org/apache/derby/derby/10.4.2.0/derbyLocale_zh_CN.jar,/data/hive-ptest/working/maven/org/apache/derby/derby/10.4.2.0/derbyLocale_zh_TW.jar,/data/hive-ptest/working/apache-svn-trunk-source/metastore/target/hive-metastore-0.13.0-SNAPSHOT.jar,/data/hive-ptest/working/maven/asm/asm-commons/3.1/asm-commons-3.1.jar,/data/hive-ptest/working/maven/com/sun/xml/bind/jaxb-impl/2.2.3-1/jaxb-impl-2.2.3-1.jar,/data/hive-ptest/working/maven/com/sun/xml/bind/jaxb-impl/2.2.3-1/jaxb-api.jar,/data/hive-ptest/working/maven/com/sun/xml/bind/jaxb-impl/2.2.3-1/activation.jar,/data/hive-ptest/working/maven/com/sun/xml/bind/jaxb-impl/2.2.3-1/jsr173_1.0_api.jar,/data/hive-ptest/working/maven/com/sun/xml/bind/jaxb-impl/2.2.3-1/jaxb1-impl.jar,/data/hive-ptest/working/apache-svn-trunk-source/service/target/hive-service-0.13.0-SNAPSHOT.jar,/data/hive-ptest/working/maven/commons-beanutils/commons-beanutils/1.7.0/commons-beanutils-1.7.0.jar,/data/hive-ptest/working/maven/org/apache/hadoop/hadoop-tools/1.2.1/hadoop-tools-1.2.1.jar,/data/hive-ptest/working/maven/asm/asm-tree/3.1/asm-tree-3.1.jar,/data/hive-ptest/working/maven/com/thoughtworks/paranamer/paranamer/2.2/paranamer-2.2.jar,/data/hive-ptest/working/maven/commons-io/commons-io/2.4/commons-io-2.4.jar,/data/hive-ptest/working/maven/org/codehaus/jackson/jackson-mapper-asl/1.9.2/jackson-mapper-asl-1.9.2.jar,/data/hive-ptest/working/maven/tomcat/jasper-runtime/5.5.12/jasper-runtime-5.5.12.jar,/data/hive-ptest/working/maven/org/apache/avro/avro/1.7.5/avro-1.7.5.jar,/data/hive-ptest/working/maven/commons-digester/commons-digester/1.8/commons-digester-1.8.jar,/data/hive-ptest/working/maven/org/apache/geronimo/specs/geronimo-jaspic_1.0_spec/1.0/geronimo-jaspic_1.0_spec-1.0.jar,/data/hive-ptest/working/maven/org/mortbay/jetty/jsp-api-2.1/6.1.14/jsp-api-2.1-6.1.14.jar,/data/hive-ptest/working/maven/hsqldb/hsqldb/1.8.0.10/hsqldb-1.8.0.10.jar,/data/hive-ptest/working/maven/commons-cli/commons-cli/1.2/commons-cli-1.2.jar,/data/hive-ptest/working/apache-svn-trunk-source/shims/assembly/target/hive-shims-0.13.0-SNAPSHOT.jar,/data/hive-ptest/working/maven/org/apache/geronimo/specs/geronimo-jta_1.1_spec/1.1.1/geronimo-jta_1.1_spec-1.1.1.jar,/data/hive-ptest/working/maven/org/apache/ant/ant-launcher/1.9.1/ant-launcher-1.9.1.jar,/data/hive-ptest/working/maven/tomcat/jasper-compiler/5.5.12/jasper-compiler-5.5.12.jar,/data/hive-ptest/working/apache-svn-trunk-source/ant/target/hive-ant-0.13.0-SNAPSHOT.jar,/data/hive-ptest/working/maven/asm/asm/3.1/asm-3.1.jar,/data/hive-ptest/working/maven/commons-codec/commons-codec/1.4/commons-codec-1.4.jar]
          [loading javax/ws/rs/core/Response.class(javax/ws/rs/core:Response.class)]
          [loading javax/ws/rs/ext/ExceptionMapper.class(javax/ws/rs/ext:ExceptionMapper.class)]
          [loading javax/ws/rs/ext/Provider.class(javax/ws/rs/ext:Provider.class)]
          [loading java/io/IOException.class(java/io:IOException.class)]
          [loading java/util/Map.class(java/util:Map.class)]
          [loading java/util/HashMap.class(java/util:HashMap.class)]
          [loading javax/ws/rs/core/MediaType.class(javax/ws/rs/core:MediaType.class)]
          [loading org/codehaus/jackson/map/ObjectMapper.class(org/codehaus/jackson/map:ObjectMapper.class)]
          [loading java/lang/Throwable.class(java/lang:Throwable.class)]
          [loading java/io/Serializable.class(java/io:Serializable.class)]
          [loading java/lang/Object.class(java/lang:Object.class)]
          [loading java/lang/String.class(java/lang:String.class)]
          [loading java/io/ByteArrayOutputStream.class(java/io:ByteArrayOutputStream.class)]
          [loading org/apache/hadoop/hive/ql/ErrorMsg.class(org/apache/hadoop/hive/ql:ErrorMsg.class)]
          [loading org/eclipse/jetty/http/HttpStatus.class(org/eclipse/jetty/http:HttpStatus.class)]
          [loading java/lang/Integer.class(java/lang:Integer.class)]
          [loading org/apache/hadoop/mapred/JobStatus.class(org/apache/hadoop/mapred:JobStatus.class)]
          [loading org/apache/hadoop/mapred/JobProfile.class(org/apache/hadoop/mapred:JobProfile.class)]
          [loading java/lang/Long.class(java/lang:Long.class)]
          [loading java/util/ArrayList.class(java/util:ArrayList.class)]
          [loading java/util/List.class(java/util:List.class)]
          [loading org/apache/commons/logging/Log.class(org/apache/commons/logging:Log.class)]
          [loading org/apache/commons/logging/LogFactory.class(org/apache/commons/logging:LogFactory.class)]
          [loading org/apache/hadoop/conf/Configuration.class(org/apache/hadoop/conf:Configuration.class)]
          [loading java/lang/Enum.class(java/lang:Enum.class)]
          [loading java/lang/Comparable.class(java/lang:Comparable.class)]
          [loading java/lang/Exception.class(java/lang:Exception.class)]
          [loading java/io/FileNotFoundException.class(java/io:FileNotFoundException.class)]
          [loading java/net/URISyntaxException.class(java/net:URISyntaxException.class)]
          [loading org/apache/commons/exec/ExecuteException.class(org/apache/commons/exec:ExecuteException.class)]
          [loading java/security/PrivilegedExceptionAction.class(java/security:PrivilegedExceptionAction.class)]
          [loading org/apache/hadoop/fs/Path.class(org/apache/hadoop/fs:Path.class)]
          [loading org/apache/hadoop/hive/conf/HiveConf.class(org/apache/hadoop/hive/conf:HiveConf.class)]
          [loading org/apache/hadoop/security/UserGroupInformation.class(org/apache/hadoop/security:UserGroupInformation.class)]
          [loading org/apache/hadoop/util/StringUtils.class(org/apache/hadoop/util:StringUtils.class)]
          [loading org/apache/hadoop/util/ToolRunner.class(org/apache/hadoop/util:ToolRunner.class)]
          [loading java/io/File.class(java/io:File.class)]
          [loading java/net/URL.class(java/net:URL.class)]
          [loading org/apache/hadoop/util/VersionInfo.class(org/apache/hadoop/util:VersionInfo.class)]
          [loading java/lang/Iterable.class(java/lang:Iterable.class)]
          [loading org/apache/hadoop/io/Writable.class(org/apache/hadoop/io:Writable.class)]
          [loading java/lang/InterruptedException.class(java/lang:InterruptedException.class)]
          [loading java/io/BufferedReader.class(java/io:BufferedReader.class)]
          [loading java/io/InputStream.class(java/io:InputStream.class)]
          [loading java/io/InputStreamReader.class(java/io:InputStreamReader.class)]
          [loading java/io/OutputStream.class(java/io:OutputStream.class)]
          [loading java/io/PrintWriter.class(java/io:PrintWriter.class)]
          [loading java/util/Map$Entry.class(java/util:Map$Entry.class)]
          [loading java/util/concurrent/Semaphore.class(java/util/concurrent:Semaphore.class)]
          [loading org/apache/commons/exec/CommandLine.class(org/apache/commons/exec:CommandLine.class)]
          [loading org/apache/commons/exec/DefaultExecutor.class(org/apache/commons/exec:DefaultExecutor.class)]
          [loading org/apache/commons/exec/ExecuteWatchdog.class(org/apache/commons/exec:ExecuteWatchdog.class)]
          [loading org/apache/commons/exec/PumpStreamHandler.class(org/apache/commons/exec:PumpStreamHandler.class)]
          [loading org/apache/hadoop/util/Shell.class(org/apache/hadoop/util:Shell.class)]
          [loading java/lang/Thread.class(java/lang:Thread.class)]
          [loading java/lang/Runnable.class(java/lang:Runnable.class)]
          [loading org/apache/hadoop/hive/shims/HadoopShims.class(org/apache/hadoop/hive/shims:HadoopShims.class)]
          [loading org/apache/hadoop/hive/shims/HadoopShims$WebHCatJTShim.class(org/apache/hadoop/hive/shims:HadoopShims$WebHCatJTShim.class)]
          [loading org/apache/hadoop/hive/shims/ShimLoader.class(org/apache/hadoop/hive/shims:ShimLoader.class)]
          [loading org/apache/hadoop/mapred/JobID.class(org/apache/hadoop/mapred:JobID.class)]
          [loading java/util/Arrays.class(java/util:Arrays.class)]
          [loading javax/xml/bind/annotation/XmlRootElement.class(javax/xml/bind/annotation:XmlRootElement.class)]
          [loading java/net/InetAddress.class(java/net:InetAddress.class)]
          [loading java/net/UnknownHostException.class(java/net:UnknownHostException.class)]
          [loading java/text/MessageFormat.class(java/text:MessageFormat.class)]
          [loading java/util/Collections.class(java/util:Collections.class)]
          [loading java/util/regex/Matcher.class(java/util/regex:Matcher.class)]
          [loading java/util/regex/Pattern.class(java/util/regex:Pattern.class)]
          [loading javax/servlet/http/HttpServletRequest.class(javax/servlet/http:HttpServletRequest.class)]
          [loading javax/ws/rs/DELETE.class(javax/ws/rs:DELETE.class)]
          [loading javax/ws/rs/FormParam.class(javax/ws/rs:FormParam.class)]
          [loading javax/ws/rs/GET.class(javax/ws/rs:GET.class)]
          [loading javax/ws/rs/POST.class(javax/ws/rs:POST.class)]
          [loading javax/ws/rs/PUT.class(javax/ws/rs:PUT.class)]
          [loading javax/ws/rs/Path.class(javax/ws/rs:Path.class)]
          [loading javax/ws/rs/PathParam.class(javax/ws/rs:PathParam.class)]
          [loading javax/ws/rs/Produces.class(javax/ws/rs:Produces.class)]
          [loading javax/ws/rs/QueryParam.class(javax/ws/rs:QueryParam.class)]
          [loading javax/ws/rs/core/Context.class(javax/ws/rs/core:Context.class)]
          [loading javax/ws/rs/core/SecurityContext.class(javax/ws/rs/core:SecurityContext.class)]
          [loading javax/ws/rs/core/UriInfo.class(javax/ws/rs/core:UriInfo.class)]
          [loading org/apache/hadoop/security/authentication/client/PseudoAuthenticator.class(org/apache/hadoop/security/authentication/client:PseudoAuthenticator.class)]
          [loading com/sun/jersey/api/NotFoundException.class(com/sun/jersey/api:NotFoundException.class)]
          [loading java/net/URI.class(java/net:URI.class)]
          [loading org/apache/commons/lang/StringUtils.class(org/apache/commons/lang:StringUtils.class)]
          [loading org/apache/hadoop/fs/FileStatus.class(org/apache/hadoop/fs:FileStatus.class)]
          [loading org/apache/hadoop/fs/FileSystem.class(org/apache/hadoop/fs:FileSystem.class)]
          [loading java/util/Date.class(java/util:Date.class)]
          [loading org/apache/hadoop/hive/common/classification/InterfaceAudience.class(org/apache/hadoop/hive/common/classification:InterfaceAudience.class)]
          [loading org/apache/hadoop/hive/metastore/HiveMetaStoreClient.class(org/apache/hadoop/hive/metastore:HiveMetaStoreClient.class)]
          [loading org/apache/hive/hcatalog/common/HCatUtil.class(org/apache/hive/hcatalog/common:HCatUtil.class)]
          [loading org/apache/hadoop/hive/common/classification/InterfaceAudience$Private.class(org/apache/hadoop/hive/common/classification:InterfaceAudience$Private.class)]
          [loading com/sun/jersey/api/wadl/config/WadlGeneratorConfig.class(com/sun/jersey/api/wadl/config:WadlGeneratorConfig.class)]
          [loading com/sun/jersey/api/wadl/config/WadlGeneratorDescription.class(com/sun/jersey/api/wadl/config:WadlGeneratorDescription.class)]
          [loading com/sun/jersey/server/wadl/generators/resourcedoc/WadlGeneratorResourceDocSupport.class(com/sun/jersey/server/wadl/generators/resourcedoc:WadlGeneratorResourceDocSupport.class)]
          [loading com/sun/jersey/api/core/PackagesResourceConfig.class(com/sun/jersey/api/core:PackagesResourceConfig.class)]
          [loading com/sun/jersey/spi/container/servlet/ServletContainer.class(com/sun/jersey/spi/container/servlet:ServletContainer.class)]
          [loading org/apache/hadoop/hive/common/classification/InterfaceStability.class(org/apache/hadoop/hive/common/classification:InterfaceStability.class)]
          [loading org/apache/hadoop/hdfs/web/AuthFilter.class(org/apache/hadoop/hdfs/web:AuthFilter.class)]
          [loading org/apache/hadoop/util/GenericOptionsParser.class(org/apache/hadoop/util:GenericOptionsParser.class)]
          [loading org/eclipse/jetty/rewrite/handler/RedirectPatternRule.class(org/eclipse/jetty/rewrite/handler:RedirectPatternRule.class)]
          [loading org/eclipse/jetty/rewrite/handler/RewriteHandler.class(org/eclipse/jetty/rewrite/handler:RewriteHandler.class)]
          [loading org/eclipse/jetty/server/Handler.class(org/eclipse/jetty/server:Handler.class)]
          [loading org/eclipse/jetty/server/Server.class(org/eclipse/jetty/server:Server.class)]
          [loading org/eclipse/jetty/server/handler/HandlerList.class(org/eclipse/jetty/server/handler:HandlerList.class)]
          [loading org/eclipse/jetty/servlet/FilterHolder.class(org/eclipse/jetty/servlet:FilterHolder.class)]
          [loading org/eclipse/jetty/servlet/FilterMapping.class(org/eclipse/jetty/servlet:FilterMapping.class)]
          [loading org/eclipse/jetty/servlet/ServletContextHandler.class(org/eclipse/jetty/servlet:ServletContextHandler.class)]
          [loading org/eclipse/jetty/servlet/ServletHolder.class(org/eclipse/jetty/servlet:ServletHolder.class)]
          [loading org/slf4j/bridge/SLF4JBridgeHandler.class(org/slf4j/bridge:SLF4JBridgeHandler.class)]
          [loading org/apache/hadoop/hive/common/classification/InterfaceAudience$LimitedPrivate.class(org/apache/hadoop/hive/common/classification:InterfaceAudience$LimitedPrivate.class)]
          [loading org/apache/hadoop/hive/common/classification/InterfaceStability$Unstable.class(org/apache/hadoop/hive/common/classification:InterfaceStability$Unstable.class)]
          [loading org/apache/hadoop/hive/metastore/api/MetaException.class(org/apache/hadoop/hive/metastore/api:MetaException.class)]
          [loading org/apache/hadoop/io/Text.class(org/apache/hadoop/io:Text.class)]
          [loading org/apache/hadoop/security/Credentials.class(org/apache/hadoop/security:Credentials.class)]
          [loading org/apache/hadoop/security/token/Token.class(org/apache/hadoop/security/token:Token.class)]
          [loading org/apache/thrift/TException.class(org/apache/thrift:TException.class)]
          [loading java/io/Closeable.class(java/io:Closeable.class)]
          [loading java/io/Flushable.class(java/io:Flushable.class)]
          [loading org/apache/hadoop/security/Groups.class(org/apache/hadoop/security:Groups.class)]
          [loading java/util/HashSet.class(java/util:HashSet.class)]
          [loading java/util/Set.class(java/util:Set.class)]
          [loading java/util/concurrent/ConcurrentHashMap.class(java/util/concurrent:ConcurrentHashMap.class)]
          [loading java/io/UnsupportedEncodingException.class(java/io:UnsupportedEncodingException.class)]
          [loading org/apache/zookeeper/CreateMode.class(org/apache/zookeeper:CreateMode.class)]
          [loading org/apache/zookeeper/KeeperException.class(org/apache/zookeeper:KeeperException.class)]
          [loading org/apache/zookeeper/WatchedEvent.class(org/apache/zookeeper:WatchedEvent.class)]
          [loading org/apache/zookeeper/Watcher.class(org/apache/zookeeper:Watcher.class)]
          [loading org/apache/zookeeper/ZooDefs.class(org/apache/zookeeper:ZooDefs.class)]
          [loading org/apache/zookeeper/ZooDefs$Ids.class(org/apache/zookeeper:ZooDefs$Ids.class)]
          [loading org/apache/zookeeper/ZooKeeper.class(org/apache/zookeeper:ZooKeeper.class)]
          [loading org/apache/hadoop/io/NullWritable.class(org/apache/hadoop/io:NullWritable.class)]
          [loading org/apache/hadoop/mapreduce/InputSplit.class(org/apache/hadoop/mapreduce:InputSplit.class)]
          [loading org/apache/hadoop/mapreduce/RecordReader.class(org/apache/hadoop/mapreduce:RecordReader.class)]
          [loading org/apache/hadoop/mapreduce/TaskAttemptContext.class(org/apache/hadoop/mapreduce:TaskAttemptContext.class)]
          [loading java/net/URLConnection.class(java/net:URLConnection.class)]
          [loading java/util/Collection.class(java/util:Collection.class)]
          [loading javax/ws/rs/core/UriBuilder.class(javax/ws/rs/core:UriBuilder.class)]
          [loading java/io/OutputStreamWriter.class(java/io:OutputStreamWriter.class)]
          [loading org/apache/hadoop/hive/common/classification/InterfaceStability$Evolving.class(org/apache/hadoop/hive/common/classification:InterfaceStability$Evolving.class)]
          [loading org/apache/zookeeper/data/Stat.class(org/apache/zookeeper/data:Stat.class)]
          [loading org/apache/hadoop/mapreduce/InputFormat.class(org/apache/hadoop/mapreduce:InputFormat.class)]
          [loading org/apache/hadoop/mapreduce/JobContext.class(org/apache/hadoop/mapreduce:JobContext.class)]
          [loading org/apache/hadoop/mapred/JobClient.class(org/apache/hadoop/mapred:JobClient.class)]
          [loading org/apache/hadoop/mapred/JobConf.class(org/apache/hadoop/mapred:JobConf.class)]
          [loading org/apache/hadoop/mapred/RunningJob.class(org/apache/hadoop/mapred:RunningJob.class)]
          [loading java/io/DataInput.class(java/io:DataInput.class)]
          [loading java/io/DataOutput.class(java/io:DataOutput.class)]
          [loading org/apache/hadoop/conf/Configured.class(org/apache/hadoop/conf:Configured.class)]
          [loading org/apache/hadoop/fs/permission/FsPermission.class(org/apache/hadoop/fs/permission:FsPermission.class)]
          [loading org/apache/hadoop/mapreduce/Job.class(org/apache/hadoop/mapreduce:Job.class)]
          [loading org/apache/hadoop/mapreduce/JobID.class(org/apache/hadoop/mapreduce:JobID.class)]
          [loading org/apache/hadoop/mapreduce/lib/output/NullOutputFormat.class(org/apache/hadoop/mapreduce/lib/output:NullOutputFormat.class)]
          [loading org/apache/hadoop/mapreduce/security/token/delegation/DelegationTokenIdentifier.class(org/apache/hadoop/mapreduce/security/token/delegation:DelegationTokenIdentifier.class)]
          [loading org/apache/hadoop/util/Tool.class(org/apache/hadoop/util:Tool.class)]
          [loading org/apache/hadoop/conf/Configurable.class(org/apache/hadoop/conf:Configurable.class)]
          [loading java/lang/ClassNotFoundException.class(java/lang:ClassNotFoundException.class)]
          [loading org/apache/hadoop/mapreduce/Mapper.class(org/apache/hadoop/mapreduce:Mapper.class)]
          [loading java/util/Iterator.class(java/util:Iterator.class)]
          [loading java/util/LinkedList.class(java/util:LinkedList.class)]
          [loading java/util/concurrent/ExecutorService.class(java/util/concurrent:ExecutorService.class)]
          [loading java/util/concurrent/Executors.class(java/util/concurrent:Executors.class)]
          [loading java/util/concurrent/TimeUnit.class(java/util/concurrent:TimeUnit.class)]
          [loading org/apache/hadoop/mapreduce/Mapper$Context.class(org/apache/hadoop/mapreduce:Mapper$Context.class)]
          [loading java/lang/Process.class(java/lang:Process.class)]
          [loading java/lang/StringBuilder.class(java/lang:StringBuilder.class)]
          [loading java/lang/ProcessBuilder.class(java/lang:ProcessBuilder.class)]
          [loading java/lang/annotation/Target.class(java/lang/annotation:Target.class)]
          [loading java/lang/annotation/ElementType.class(java/lang/annotation:ElementType.class)]
          [loading java/lang/annotation/Retention.class(java/lang/annotation:Retention.class)]
          [loading java/lang/annotation/RetentionPolicy.class(java/lang/annotation:RetentionPolicy.class)]
          [loading java/lang/annotation/Annotation.class(java/lang/annotation:Annotation.class)]
          [loading java/lang/SuppressWarnings.class(java/lang:SuppressWarnings.class)]
          [loading java/lang/Override.class(java/lang:Override.class)]
          [loading javax/ws/rs/HttpMethod.class(javax/ws/rs:HttpMethod.class)]
          [loading java/lang/Deprecated.class(java/lang:Deprecated.class)]
          [loading /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/target/classes/org/apache/hive/hcatalog/templeton/SecureProxySupport$3.class]
          [loading /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/target/classes/org/apache/hive/hcatalog/templeton/SecureProxySupport$1.class]
          [loading /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/target/classes/org/apache/hive/hcatalog/templeton/HcatDelegator$1.class]
          [loading /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/target/classes/org/apache/hive/hcatalog/templeton/LauncherDelegator$1.class]
          [loading /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/target/classes/org/apache/hive/hcatalog/templeton/SecureProxySupport$2.class]
          [loading /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/target/classes/org/apache/hive/hcatalog/templeton/HcatException$1.class]
          [loading /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/target/classes/org/apache/hive/hcatalog/templeton/tool/TempletonControllerJob$2.class]
          [loading /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/target/classes/org/apache/hive/hcatalog/templeton/tool/TempletonControllerJob$2$1.class]
          [loading /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/target/classes/org/apache/hive/hcatalog/templeton/tool/ZooKeeperStorage$1.class]
          [loading /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/target/classes/org/apache/hive/hcatalog/templeton/tool/TempletonControllerJob$1.class]
          [loading /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/target/classes/org/apache/hive/hcatalog/templeton/tool/LogRetriever$1.class]
          [loading /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/target/classes/org/apache/hive/hcatalog/templeton/tool/ZooKeeperStorage$2.class]
          [loading /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/target/classes/org/apache/hive/hcatalog/templeton/tool/TempletonUtils$1.class]
          [loading /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/target/classes/org/apache/hive/hcatalog/templeton/tool/HDFSStorage$1.class]
          [done in 6731 ms]
          [WARNING] Javadoc Warnings
          [WARNING] Nov 25, 2013 8:52:23 PM com.sun.jersey.wadl.resourcedoc.ResourceDoclet start
          [WARNING] INFO: Wrote /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/target/classes/resourcedoc.xml
          [INFO] 
          [INFO] --- maven-resources-plugin:2.5:testResources (default-testResources) @ hive-webhcat ---
          [debug] execute contextualize
          [INFO] Using 'UTF-8' encoding to copy filtered resources.
          [INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/src/test/resources
          [INFO] 
          [INFO] --- maven-antrun-plugin:1.7:run (setup-test-dirs) @ hive-webhcat ---
          [INFO] Executing tasks
          
          main:
              [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/target/tmp
              [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/target/warehouse
              [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/target/tmp/conf
               [copy] Copying 4 files to /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/target/tmp/conf
          [INFO] Executed tasks
          [INFO] 
          [INFO] --- maven-compiler-plugin:3.1:testCompile (default-testCompile) @ hive-webhcat ---
          [INFO] Compiling 9 source files to /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/target/test-classes
          [WARNING] Note: /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/src/test/java/org/apache/hive/hcatalog/templeton/TestDesc.java uses unchecked or unsafe operations.
          [WARNING] Note: Recompile with -Xlint:unchecked for details.
          [INFO] 
          [INFO] --- maven-surefire-plugin:2.16:test (default-test) @ hive-webhcat ---
          [INFO] Tests are skipped.
          [INFO] 
          [INFO] --- maven-jar-plugin:2.2:jar (default-jar) @ hive-webhcat ---
          [INFO] Building jar: /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/target/hive-webhcat-0.13.0-SNAPSHOT.jar
          [INFO] 
          [INFO] --- maven-install-plugin:2.4:install (default-install) @ hive-webhcat ---
          [INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/target/hive-webhcat-0.13.0-SNAPSHOT.jar to /data/hive-ptest/working/maven/org/apache/hive/hcatalog/hive-webhcat/0.13.0-SNAPSHOT/hive-webhcat-0.13.0-SNAPSHOT.jar
          [INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/pom.xml to /data/hive-ptest/working/maven/org/apache/hive/hcatalog/hive-webhcat/0.13.0-SNAPSHOT/hive-webhcat-0.13.0-SNAPSHOT.pom
          [INFO]                                                                         
          [INFO] ------------------------------------------------------------------------
          [INFO] Building Hive HCatalog HBase Storage Handler 0.13.0-SNAPSHOT
          [INFO] ------------------------------------------------------------------------
          [INFO] 
          [INFO] --- maven-clean-plugin:2.5:clean (default-clean) @ hive-hbase-storage-handler ---
          [INFO] Deleting /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/storage-handlers/hbase (includes = [datanucleus.log, derby.log], excludes = [])
          [INFO] 
          [INFO] --- build-helper-maven-plugin:1.8:add-source (add-source) @ hive-hbase-storage-handler ---
          [INFO] Source directory: /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/storage-handlers/hbase/src/gen-java added.
          [INFO] 
          [INFO] --- maven-resources-plugin:2.5:resources (default-resources) @ hive-hbase-storage-handler ---
          [debug] execute contextualize
          [INFO] Using 'UTF-8' encoding to copy filtered resources.
          [INFO] Copying 1 resource
          [INFO] 
          [INFO] --- maven-antrun-plugin:1.7:run (define-classpath) @ hive-hbase-storage-handler ---
          [INFO] Executing tasks
          
          main:
          [INFO] Executed tasks
          [INFO] 
          [INFO] --- maven-compiler-plugin:3.1:compile (default-compile) @ hive-hbase-storage-handler ---
          [INFO] Compiling 36 source files to /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/storage-handlers/hbase/target/classes
          [WARNING] Note: Some input files use or override a deprecated API.
          [WARNING] Note: Recompile with -Xlint:deprecation for details.
          [WARNING] Note: Some input files use unchecked or unsafe operations.
          [WARNING] Note: Recompile with -Xlint:unchecked for details.
          [INFO] 
          [INFO] --- maven-resources-plugin:2.5:testResources (default-testResources) @ hive-hbase-storage-handler ---
          [debug] execute contextualize
          [INFO] Using 'UTF-8' encoding to copy filtered resources.
          [INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/storage-handlers/hbase/src/test/resources
          [INFO] 
          [INFO] --- maven-antrun-plugin:1.7:run (setup-test-dirs) @ hive-hbase-storage-handler ---
          [INFO] Executing tasks
          
          main:
              [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/storage-handlers/hbase/target/tmp
              [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/storage-handlers/hbase/target/warehouse
              [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/storage-handlers/hbase/target/tmp/conf
               [copy] Copying 4 files to /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/storage-handlers/hbase/target/tmp/conf
          [INFO] Executed tasks
          [INFO] 
          [INFO] --- maven-compiler-plugin:3.1:testCompile (default-testCompile) @ hive-hbase-storage-handler ---
          [INFO] Compiling 21 source files to /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/storage-handlers/hbase/target/test-classes
          [WARNING] Note: Some input files use or override a deprecated API.
          [WARNING] Note: Recompile with -Xlint:deprecation for details.
          [INFO] 
          [INFO] --- maven-surefire-plugin:2.16:test (default-test) @ hive-hbase-storage-handler ---
          [INFO] Tests are skipped.
          [INFO] 
          [INFO] --- maven-jar-plugin:2.2:jar (default-jar) @ hive-hbase-storage-handler ---
          [INFO] Building jar: /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/storage-handlers/hbase/target/hive-hbase-storage-handler-0.13.0-SNAPSHOT.jar
          [INFO] 
          [INFO] --- maven-jar-plugin:2.2:test-jar (default) @ hive-hbase-storage-handler ---
          [INFO] Building jar: /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/storage-handlers/hbase/target/hive-hbase-storage-handler-0.13.0-SNAPSHOT-tests.jar
          [INFO] 
          [INFO] --- maven-install-plugin:2.4:install (default-install) @ hive-hbase-storage-handler ---
          [INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/storage-handlers/hbase/target/hive-hbase-storage-handler-0.13.0-SNAPSHOT.jar to /data/hive-ptest/working/maven/org/apache/hive/hcatalog/hive-hbase-storage-handler/0.13.0-SNAPSHOT/hive-hbase-storage-handler-0.13.0-SNAPSHOT.jar
          [INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/storage-handlers/hbase/pom.xml to /data/hive-ptest/working/maven/org/apache/hive/hcatalog/hive-hbase-storage-handler/0.13.0-SNAPSHOT/hive-hbase-storage-handler-0.13.0-SNAPSHOT.pom
          [INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/storage-handlers/hbase/target/hive-hbase-storage-handler-0.13.0-SNAPSHOT-tests.jar to /data/hive-ptest/working/maven/org/apache/hive/hcatalog/hive-hbase-storage-handler/0.13.0-SNAPSHOT/hive-hbase-storage-handler-0.13.0-SNAPSHOT-tests.jar
          [INFO]                                                                         
          [INFO] ------------------------------------------------------------------------
          [INFO] Building Hive HWI 0.13.0-SNAPSHOT
          [INFO] ------------------------------------------------------------------------
          [INFO] 
          [INFO] --- maven-clean-plugin:2.5:clean (default-clean) @ hive-hwi ---
          [INFO] Deleting /data/hive-ptest/working/apache-svn-trunk-source/hwi (includes = [datanucleus.log, derby.log], excludes = [])
          [INFO] 
          [INFO] --- maven-resources-plugin:2.5:resources (default-resources) @ hive-hwi ---
          [debug] execute contextualize
          [INFO] Using 'UTF-8' encoding to copy filtered resources.
          [INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-trunk-source/hwi/src/main/resources
          [INFO] 
          [INFO] --- maven-antrun-plugin:1.7:run (define-classpath) @ hive-hwi ---
          [INFO] Executing tasks
          
          main:
          [INFO] Executed tasks
          [INFO] 
          [INFO] --- maven-compiler-plugin:3.1:compile (default-compile) @ hive-hwi ---
          [INFO] Compiling 6 source files to /data/hive-ptest/working/apache-svn-trunk-source/hwi/target/classes
          [INFO] 
          [INFO] --- maven-resources-plugin:2.5:testResources (default-testResources) @ hive-hwi ---
          [debug] execute contextualize
          [INFO] Using 'UTF-8' encoding to copy filtered resources.
          [INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-trunk-source/hwi/src/test/resources
          [INFO] 
          [INFO] --- maven-antrun-plugin:1.7:run (setup-test-dirs) @ hive-hwi ---
          [INFO] Executing tasks
          
          main:
              [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/hwi/target/tmp
              [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/hwi/target/warehouse
              [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/hwi/target/tmp/conf
               [copy] Copying 4 files to /data/hive-ptest/working/apache-svn-trunk-source/hwi/target/tmp/conf
          [INFO] Executed tasks
          [INFO] 
          [INFO] --- maven-compiler-plugin:3.1:testCompile (default-testCompile) @ hive-hwi ---
          [INFO] Compiling 2 source files to /data/hive-ptest/working/apache-svn-trunk-source/hwi/target/test-classes
          [INFO] 
          [INFO] --- maven-surefire-plugin:2.16:test (default-test) @ hive-hwi ---
          [INFO] Tests are skipped.
          [INFO] 
          [INFO] --- maven-jar-plugin:2.2:jar (default-jar) @ hive-hwi ---
          [INFO] Building jar: /data/hive-ptest/working/apache-svn-trunk-source/hwi/target/hive-hwi-0.13.0-SNAPSHOT.jar
          [INFO] 
          [INFO] --- maven-install-plugin:2.4:install (default-install) @ hive-hwi ---
          [INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/hwi/target/hive-hwi-0.13.0-SNAPSHOT.jar to /data/hive-ptest/working/maven/org/apache/hive/hive-hwi/0.13.0-SNAPSHOT/hive-hwi-0.13.0-SNAPSHOT.jar
          [INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/hwi/pom.xml to /data/hive-ptest/working/maven/org/apache/hive/hive-hwi/0.13.0-SNAPSHOT/hive-hwi-0.13.0-SNAPSHOT.pom
          [INFO]                                                                         
          [INFO] ------------------------------------------------------------------------
          [INFO] Building Hive ODBC 0.13.0-SNAPSHOT
          [INFO] ------------------------------------------------------------------------
          [INFO] 
          [INFO] --- maven-clean-plugin:2.5:clean (default-clean) @ hive-odbc ---
          [INFO] Deleting /data/hive-ptest/working/apache-svn-trunk-source/odbc (includes = [datanucleus.log, derby.log], excludes = [])
          [INFO] 
          [INFO] --- maven-antrun-plugin:1.7:run (define-classpath) @ hive-odbc ---
          [INFO] Executing tasks
          
          main:
          [INFO] Executed tasks
          [INFO] 
          [INFO] --- maven-antrun-plugin:1.7:run (setup-test-dirs) @ hive-odbc ---
          [INFO] Executing tasks
          
          main:
              [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/odbc/target/tmp
              [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/odbc/target/warehouse
              [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/odbc/target/tmp/conf
               [copy] Copying 4 files to /data/hive-ptest/working/apache-svn-trunk-source/odbc/target/tmp/conf
          [INFO] Executed tasks
          [INFO] 
          [INFO] --- maven-install-plugin:2.4:install (default-install) @ hive-odbc ---
          [INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/odbc/pom.xml to /data/hive-ptest/working/maven/org/apache/hive/hive-odbc/0.13.0-SNAPSHOT/hive-odbc-0.13.0-SNAPSHOT.pom
          [INFO]                                                                         
          [INFO] ------------------------------------------------------------------------
          [INFO] Building Hive Shims Aggregator 0.13.0-SNAPSHOT
          [INFO] ------------------------------------------------------------------------
          [INFO] 
          [INFO] --- maven-clean-plugin:2.5:clean (default-clean) @ hive-shims-aggregator ---
          [INFO] Deleting /data/hive-ptest/working/apache-svn-trunk-source/shims (includes = [datanucleus.log, derby.log], excludes = [])
          [INFO] 
          [INFO] --- maven-antrun-plugin:1.7:run (define-classpath) @ hive-shims-aggregator ---
          [INFO] Executing tasks
          
          main:
          [INFO] Executed tasks
          [INFO] 
          [INFO] --- maven-antrun-plugin:1.7:run (setup-test-dirs) @ hive-shims-aggregator ---
          [INFO] Executing tasks
          
          main:
              [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/shims/target/tmp
              [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/shims/target/warehouse
              [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/shims/target/tmp/conf
               [copy] Copying 4 files to /data/hive-ptest/working/apache-svn-trunk-source/shims/target/tmp/conf
          [INFO] Executed tasks
          [INFO] 
          [INFO] --- maven-install-plugin:2.4:install (default-install) @ hive-shims-aggregator ---
          [INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/shims/pom.xml to /data/hive-ptest/working/maven/org/apache/hive/hive-shims-aggregator/0.13.0-SNAPSHOT/hive-shims-aggregator-0.13.0-SNAPSHOT.pom
          [INFO]                                                                         
          [INFO] ------------------------------------------------------------------------
          [INFO] Building Hive TestUtils 0.13.0-SNAPSHOT
          [INFO] ------------------------------------------------------------------------
          [INFO] 
          [INFO] --- maven-clean-plugin:2.5:clean (default-clean) @ hive-testutils ---
          [INFO] Deleting /data/hive-ptest/working/apache-svn-trunk-source/testutils (includes = [datanucleus.log, derby.log], excludes = [])
          [INFO] 
          [INFO] --- maven-resources-plugin:2.5:resources (default-resources) @ hive-testutils ---
          [debug] execute contextualize
          [INFO] Using 'UTF-8' encoding to copy filtered resources.
          [INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-trunk-source/testutils/src/main/resources
          [INFO] 
          [INFO] --- maven-antrun-plugin:1.7:run (define-classpath) @ hive-testutils ---
          [INFO] Executing tasks
          
          main:
          [INFO] Executed tasks
          [INFO] 
          [INFO] --- maven-compiler-plugin:3.1:compile (default-compile) @ hive-testutils ---
          [INFO] Compiling 2 source files to /data/hive-ptest/working/apache-svn-trunk-source/testutils/target/classes
          [INFO] 
          [INFO] --- maven-resources-plugin:2.5:testResources (default-testResources) @ hive-testutils ---
          [debug] execute contextualize
          [INFO] Using 'UTF-8' encoding to copy filtered resources.
          [INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-trunk-source/testutils/src/test/resources
          [INFO] 
          [INFO] --- maven-antrun-plugin:1.7:run (setup-test-dirs) @ hive-testutils ---
          [INFO] Executing tasks
          
          main:
              [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/testutils/target/tmp
              [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/testutils/target/warehouse
              [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/testutils/target/tmp/conf
               [copy] Copying 4 files to /data/hive-ptest/working/apache-svn-trunk-source/testutils/target/tmp/conf
          [INFO] Executed tasks
          [INFO] 
          [INFO] --- maven-compiler-plugin:3.1:testCompile (default-testCompile) @ hive-testutils ---
          [INFO] No sources to compile
          [INFO] 
          [INFO] --- maven-surefire-plugin:2.16:test (default-test) @ hive-testutils ---
          [INFO] Tests are skipped.
          [INFO] 
          [INFO] --- maven-jar-plugin:2.2:jar (default-jar) @ hive-testutils ---
          [INFO] Building jar: /data/hive-ptest/working/apache-svn-trunk-source/testutils/target/hive-testutils-0.13.0-SNAPSHOT.jar
          [INFO] 
          [INFO] --- maven-install-plugin:2.4:install (default-install) @ hive-testutils ---
          [INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/testutils/target/hive-testutils-0.13.0-SNAPSHOT.jar to /data/hive-ptest/working/maven/org/apache/hive/hive-testutils/0.13.0-SNAPSHOT/hive-testutils-0.13.0-SNAPSHOT.jar
          [INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/testutils/pom.xml to /data/hive-ptest/working/maven/org/apache/hive/hive-testutils/0.13.0-SNAPSHOT/hive-testutils-0.13.0-SNAPSHOT.pom
          [INFO]                                                                         
          [INFO] ------------------------------------------------------------------------
          [INFO] Building Hive Packaging 0.13.0-SNAPSHOT
          [INFO] ------------------------------------------------------------------------
          [INFO] 
          [INFO] --- maven-clean-plugin:2.5:clean (default-clean) @ hive-packaging ---
          [INFO] Deleting /data/hive-ptest/working/apache-svn-trunk-source/packaging (includes = [datanucleus.log, derby.log], excludes = [])
          [INFO] 
          [INFO] --- maven-antrun-plugin:1.7:run (define-classpath) @ hive-packaging ---
          [INFO] Executing tasks
          
          main:
          [INFO] Executed tasks
          [INFO] 
          [INFO] --- maven-antrun-plugin:1.7:run (setup-test-dirs) @ hive-packaging ---
          [INFO] Executing tasks
          
          main:
              [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/packaging/target/tmp
              [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/packaging/target/warehouse
              [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/packaging/target/tmp/conf
               [copy] Copying 4 files to /data/hive-ptest/working/apache-svn-trunk-source/packaging/target/tmp/conf
          [INFO] Executed tasks
          [INFO] 
          [INFO] --- maven-install-plugin:2.4:install (default-install) @ hive-packaging ---
          [INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/packaging/pom.xml to /data/hive-ptest/working/maven/org/apache/hive/hive-packaging/0.13.0-SNAPSHOT/hive-packaging-0.13.0-SNAPSHOT.pom
          [INFO] ------------------------------------------------------------------------
          [INFO] Reactor Summary:
          [INFO] 
          [INFO] Hive .............................................. SUCCESS [2.890s]
          [INFO] Hive Ant Utilities ................................ SUCCESS [7.580s]
          [INFO] Hive Shims Common ................................. SUCCESS [3.378s]
          [INFO] Hive Shims 0.20 ................................... SUCCESS [2.225s]
          [INFO] Hive Shims Secure Common .......................... SUCCESS [2.819s]
          [INFO] Hive Shims 0.20S .................................. SUCCESS [1.458s]
          [INFO] Hive Shims 0.23 ................................... SUCCESS [3.796s]
          [INFO] Hive Shims ........................................ SUCCESS [3.597s]
          [INFO] Hive Common ....................................... SUCCESS [14.500s]
          [INFO] Hive Serde ........................................ SUCCESS [11.538s]
          [INFO] Hive Metastore .................................... SUCCESS [25.097s]
          [INFO] Hive Query Language ............................... SUCCESS [50.810s]
          [INFO] Hive Service ...................................... SUCCESS [4.608s]
          [INFO] Hive JDBC ......................................... SUCCESS [1.894s]
          [INFO] Hive Beeline ...................................... SUCCESS [1.810s]
          [INFO] Hive CLI .......................................... SUCCESS [1.358s]
          [INFO] Hive Contrib ...................................... SUCCESS [1.080s]
          [INFO] Hive HBase Handler ................................ SUCCESS [2.622s]
          [INFO] Hive HCatalog ..................................... SUCCESS [0.498s]
          [INFO] Hive HCatalog Core ................................ SUCCESS [3.663s]
          [INFO] Hive HCatalog Pig Adapter ......................... SUCCESS [0.878s]
          [INFO] Hive HCatalog Server Extensions ................... SUCCESS [1.492s]
          [INFO] Hive HCatalog Webhcat Java Client ................. SUCCESS [1.224s]
          [INFO] Hive HCatalog Webhcat ............................. SUCCESS [8.919s]
          [INFO] Hive HCatalog HBase Storage Handler ............... SUCCESS [4.152s]
          [INFO] Hive HWI .......................................... SUCCESS [0.660s]
          [INFO] Hive ODBC ......................................... SUCCESS [0.136s]
          [INFO] Hive Shims Aggregator ............................. SUCCESS [0.208s]
          [INFO] Hive TestUtils .................................... SUCCESS [0.251s]
          [INFO] Hive Packaging .................................... SUCCESS [0.297s]
          [INFO] ------------------------------------------------------------------------
          [INFO] BUILD SUCCESS
          [INFO] ------------------------------------------------------------------------
          [INFO] Total time: 2:48.139s
          [INFO] Finished at: Mon Nov 25 20:52:30 EST 2013
          [INFO] Final Memory: 62M/372M
          [INFO] ------------------------------------------------------------------------
          + mvn -B test -Dmaven.repo.local=/data/hive-ptest/working/maven -Dtest=TestDummy
          [INFO] Scanning for projects...
          [INFO] ------------------------------------------------------------------------
          [INFO] Reactor Build Order:
          [INFO] 
          [INFO] Hive
          [INFO] Hive Ant Utilities
          [INFO] Hive Shims Common
          [INFO] Hive Shims 0.20
          [INFO] Hive Shims Secure Common
          [INFO] Hive Shims 0.20S
          [INFO] Hive Shims 0.23
          [INFO] Hive Shims
          [INFO] Hive Common
          [INFO] Hive Serde
          [INFO] Hive Metastore
          [INFO] Hive Query Language
          [INFO] Hive Service
          [INFO] Hive JDBC
          [INFO] Hive Beeline
          [INFO] Hive CLI
          [INFO] Hive Contrib
          [INFO] Hive HBase Handler
          [INFO] Hive HCatalog
          [INFO] Hive HCatalog Core
          [INFO] Hive HCatalog Pig Adapter
          [INFO] Hive HCatalog Server Extensions
          [INFO] Hive HCatalog Webhcat Java Client
          [INFO] Hive HCatalog Webhcat
          [INFO] Hive HCatalog HBase Storage Handler
          [INFO] Hive HWI
          [INFO] Hive ODBC
          [INFO] Hive Shims Aggregator
          [INFO] Hive TestUtils
          [INFO] Hive Packaging
          [INFO]                                                                         
          [INFO] ------------------------------------------------------------------------
          [INFO] Building Hive 0.13.0-SNAPSHOT
          [INFO] ------------------------------------------------------------------------
          [INFO] 
          [INFO] --- maven-antrun-plugin:1.7:run (define-classpath) @ hive ---
          [INFO] Executing tasks
          
          main:
          [INFO] Executed tasks
          [INFO] 
          [INFO] --- maven-antrun-plugin:1.7:run (setup-test-dirs) @ hive ---
          [INFO] Executing tasks
          
          main:
             [delete] Deleting directory /data/hive-ptest/working/apache-svn-trunk-source/target/tmp
             [delete] Deleting directory /data/hive-ptest/working/apache-svn-trunk-source/target/warehouse
              [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/target/tmp
              [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/target/warehouse
              [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/target/tmp/conf
               [copy] Copying 4 files to /data/hive-ptest/working/apache-svn-trunk-source/target/tmp/conf
          [INFO] Executed tasks
          [INFO]                                                                         
          [INFO] ------------------------------------------------------------------------
          [INFO] Building Hive Ant Utilities 0.13.0-SNAPSHOT
          [INFO] ------------------------------------------------------------------------
          [INFO] 
          [INFO] --- maven-resources-plugin:2.5:resources (default-resources) @ hive-ant ---
          [debug] execute contextualize
          [INFO] Using 'UTF-8' encoding to copy filtered resources.
          [INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-trunk-source/ant/src/main/resources
          [INFO] 
          [INFO] --- maven-antrun-plugin:1.7:run (define-classpath) @ hive-ant ---
          [INFO] Executing tasks
          
          main:
          [INFO] Executed tasks
          [INFO] 
          [INFO] --- maven-compiler-plugin:3.1:compile (default-compile) @ hive-ant ---
          [INFO] Nothing to compile - all classes are up to date
          [INFO] 
          [INFO] --- maven-resources-plugin:2.5:testResources (default-testResources) @ hive-ant ---
          [debug] execute contextualize
          [INFO] Using 'UTF-8' encoding to copy filtered resources.
          [INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-trunk-source/ant/src/test/resources
          [INFO] 
          [INFO] --- maven-antrun-plugin:1.7:run (setup-test-dirs) @ hive-ant ---
          [INFO] Executing tasks
          
          main:
             [delete] Deleting directory /data/hive-ptest/working/apache-svn-trunk-source/ant/target/tmp
             [delete] Deleting directory /data/hive-ptest/working/apache-svn-trunk-source/ant/target/warehouse
              [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/ant/target/tmp
              [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/ant/target/warehouse
              [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/ant/target/tmp/conf
               [copy] Copying 4 files to /data/hive-ptest/working/apache-svn-trunk-source/ant/target/tmp/conf
          [INFO] Executed tasks
          [INFO] 
          [INFO] --- maven-compiler-plugin:3.1:testCompile (default-testCompile) @ hive-ant ---
          [INFO] No sources to compile
          [INFO] 
          [INFO] --- maven-surefire-plugin:2.16:test (default-test) @ hive-ant ---
          [INFO] No tests to run.
          [INFO]                                                                         
          [INFO] ------------------------------------------------------------------------
          [INFO] Building Hive Shims Common 0.13.0-SNAPSHOT
          [INFO] ------------------------------------------------------------------------
          [INFO] 
          [INFO] --- maven-resources-plugin:2.5:resources (default-resources) @ hive-shims-common ---
          [debug] execute contextualize
          [INFO] Using 'UTF-8' encoding to copy filtered resources.
          [INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-trunk-source/shims/common/src/main/resources
          [INFO] 
          [INFO] --- maven-antrun-plugin:1.7:run (define-classpath) @ hive-shims-common ---
          [INFO] Executing tasks
          
          main:
          [INFO] Executed tasks
          [INFO] 
          [INFO] --- maven-compiler-plugin:3.1:compile (default-compile) @ hive-shims-common ---
          [INFO] Nothing to compile - all classes are up to date
          [INFO] 
          [INFO] --- maven-resources-plugin:2.5:testResources (default-testResources) @ hive-shims-common ---
          [debug] execute contextualize
          [INFO] Using 'UTF-8' encoding to copy filtered resources.
          [INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-trunk-source/shims/common/src/test/resources
          [INFO] 
          [INFO] --- maven-antrun-plugin:1.7:run (setup-test-dirs) @ hive-shims-common ---
          [INFO] Executing tasks
          
          main:
             [delete] Deleting directory /data/hive-ptest/working/apache-svn-trunk-source/shims/common/target/tmp
             [delete] Deleting directory /data/hive-ptest/working/apache-svn-trunk-source/shims/common/target/warehouse
              [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/shims/common/target/tmp
              [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/shims/common/target/warehouse
              [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/shims/common/target/tmp/conf
               [copy] Copying 4 files to /data/hive-ptest/working/apache-svn-trunk-source/shims/common/target/tmp/conf
          [INFO] Executed tasks
          [INFO] 
          [INFO] --- maven-compiler-plugin:3.1:testCompile (default-testCompile) @ hive-shims-common ---
          [INFO] No sources to compile
          [INFO] 
          [INFO] --- maven-surefire-plugin:2.16:test (default-test) @ hive-shims-common ---
          [INFO] No tests to run.
          [INFO]                                                                         
          [INFO] ------------------------------------------------------------------------
          [INFO] Building Hive Shims 0.20 0.13.0-SNAPSHOT
          [INFO] ------------------------------------------------------------------------
          [INFO] 
          [INFO] --- maven-resources-plugin:2.5:resources (default-resources) @ hive-shims-0.20 ---
          [debug] execute contextualize
          [INFO] Using 'UTF-8' encoding to copy filtered resources.
          [INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-trunk-source/shims/0.20/src/main/resources
          [INFO] 
          [INFO] --- maven-antrun-plugin:1.7:run (define-classpath) @ hive-shims-0.20 ---
          [INFO] Executing tasks
          
          main:
          [INFO] Executed tasks
          [INFO] 
          [INFO] --- maven-compiler-plugin:3.1:compile (default-compile) @ hive-shims-0.20 ---
          [INFO] Nothing to compile - all classes are up to date
          [INFO] 
          [INFO] --- maven-resources-plugin:2.5:testResources (default-testResources) @ hive-shims-0.20 ---
          [debug] execute contextualize
          [INFO] Using 'UTF-8' encoding to copy filtered resources.
          [INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-trunk-source/shims/0.20/src/test/resources
          [INFO] 
          [INFO] --- maven-antrun-plugin:1.7:run (setup-test-dirs) @ hive-shims-0.20 ---
          [INFO] Executing tasks
          
          main:
             [delete] Deleting directory /data/hive-ptest/working/apache-svn-trunk-source/shims/0.20/target/tmp
             [delete] Deleting directory /data/hive-ptest/working/apache-svn-trunk-source/shims/0.20/target/warehouse
              [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/shims/0.20/target/tmp
              [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/shims/0.20/target/warehouse
              [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/shims/0.20/target/tmp/conf
               [copy] Copying 4 files to /data/hive-ptest/working/apache-svn-trunk-source/shims/0.20/target/tmp/conf
          [INFO] Executed tasks
          [INFO] 
          [INFO] --- maven-compiler-plugin:3.1:testCompile (default-testCompile) @ hive-shims-0.20 ---
          [INFO] No sources to compile
          [INFO] 
          [INFO] --- maven-surefire-plugin:2.16:test (default-test) @ hive-shims-0.20 ---
          [INFO] No tests to run.
          [INFO]                                                                         
          [INFO] ------------------------------------------------------------------------
          [INFO] Building Hive Shims Secure Common 0.13.0-SNAPSHOT
          [INFO] ------------------------------------------------------------------------
          [INFO] 
          [INFO] --- maven-resources-plugin:2.5:resources (default-resources) @ hive-shims-common-secure ---
          [debug] execute contextualize
          [INFO] Using 'UTF-8' encoding to copy filtered resources.
          [INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-trunk-source/shims/common-secure/src/main/resources
          [INFO] 
          [INFO] --- maven-antrun-plugin:1.7:run (define-classpath) @ hive-shims-common-secure ---
          [INFO] Executing tasks
          
          main:
          [INFO] Executed tasks
          [INFO] 
          [INFO] --- maven-compiler-plugin:3.1:compile (default-compile) @ hive-shims-common-secure ---
          [INFO] Nothing to compile - all classes are up to date
          [INFO] 
          [INFO] --- maven-resources-plugin:2.5:testResources (default-testResources) @ hive-shims-common-secure ---
          [debug] execute contextualize
          [INFO] Using 'UTF-8' encoding to copy filtered resources.
          [INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-trunk-source/shims/common-secure/src/test/resources
          [INFO] 
          [INFO] --- maven-antrun-plugin:1.7:run (setup-test-dirs) @ hive-shims-common-secure ---
          [INFO] Executing tasks
          
          main:
             [delete] Deleting directory /data/hive-ptest/working/apache-svn-trunk-source/shims/common-secure/target/tmp
             [delete] Deleting directory /data/hive-ptest/working/apache-svn-trunk-source/shims/common-secure/target/warehouse
              [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/shims/common-secure/target/tmp
              [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/shims/common-secure/target/warehouse
              [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/shims/common-secure/target/tmp/conf
               [copy] Copying 4 files to /data/hive-ptest/working/apache-svn-trunk-source/shims/common-secure/target/tmp/conf
          [INFO] Executed tasks
          [INFO] 
          [INFO] --- maven-compiler-plugin:3.1:testCompile (default-testCompile) @ hive-shims-common-secure ---
          [INFO] No sources to compile
          [INFO] 
          [INFO] --- maven-surefire-plugin:2.16:test (default-test) @ hive-shims-common-secure ---
          [INFO] No tests to run.
          [INFO]                                                                         
          [INFO] ------------------------------------------------------------------------
          [INFO] Building Hive Shims 0.20S 0.13.0-SNAPSHOT
          [INFO] ------------------------------------------------------------------------
          [INFO] 
          [INFO] --- maven-resources-plugin:2.5:resources (default-resources) @ hive-shims-0.20S ---
          [debug] execute contextualize
          [INFO] Using 'UTF-8' encoding to copy filtered resources.
          [INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-trunk-source/shims/0.20S/src/main/resources
          [INFO] 
          [INFO] --- maven-antrun-plugin:1.7:run (define-classpath) @ hive-shims-0.20S ---
          [INFO] Executing tasks
          
          main:
          [INFO] Executed tasks
          [INFO] 
          [INFO] --- maven-compiler-plugin:3.1:compile (default-compile) @ hive-shims-0.20S ---
          [INFO] Nothing to compile - all classes are up to date
          [INFO] 
          [INFO] --- maven-resources-plugin:2.5:testResources (default-testResources) @ hive-shims-0.20S ---
          [debug] execute contextualize
          [INFO] Using 'UTF-8' encoding to copy filtered resources.
          [INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-trunk-source/shims/0.20S/src/test/resources
          [INFO] 
          [INFO] --- maven-antrun-plugin:1.7:run (setup-test-dirs) @ hive-shims-0.20S ---
          [INFO] Executing tasks
          
          main:
             [delete] Deleting directory /data/hive-ptest/working/apache-svn-trunk-source/shims/0.20S/target/tmp
             [delete] Deleting directory /data/hive-ptest/working/apache-svn-trunk-source/shims/0.20S/target/warehouse
              [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/shims/0.20S/target/tmp
              [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/shims/0.20S/target/warehouse
              [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/shims/0.20S/target/tmp/conf
               [copy] Copying 4 files to /data/hive-ptest/working/apache-svn-trunk-source/shims/0.20S/target/tmp/conf
          [INFO] Executed tasks
          [INFO] 
          [INFO] --- maven-compiler-plugin:3.1:testCompile (default-testCompile) @ hive-shims-0.20S ---
          [INFO] No sources to compile
          [INFO] 
          [INFO] --- maven-surefire-plugin:2.16:test (default-test) @ hive-shims-0.20S ---
          [INFO] No tests to run.
          [INFO]                                                                         
          [INFO] ------------------------------------------------------------------------
          [INFO] Building Hive Shims 0.23 0.13.0-SNAPSHOT
          [INFO] ------------------------------------------------------------------------
          [INFO] 
          [INFO] --- maven-resources-plugin:2.5:resources (default-resources) @ hive-shims-0.23 ---
          [debug] execute contextualize
          [INFO] Using 'UTF-8' encoding to copy filtered resources.
          [INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-trunk-source/shims/0.23/src/main/resources
          [INFO] 
          [INFO] --- maven-antrun-plugin:1.7:run (define-classpath) @ hive-shims-0.23 ---
          [INFO] Executing tasks
          
          main:
          [INFO] Executed tasks
          [INFO] 
          [INFO] --- maven-compiler-plugin:3.1:compile (default-compile) @ hive-shims-0.23 ---
          [INFO] Nothing to compile - all classes are up to date
          [INFO] 
          [INFO] --- maven-resources-plugin:2.5:testResources (default-testResources) @ hive-shims-0.23 ---
          [debug] execute contextualize
          [INFO] Using 'UTF-8' encoding to copy filtered resources.
          [INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-trunk-source/shims/0.23/src/test/resources
          [INFO] 
          [INFO] --- maven-antrun-plugin:1.7:run (setup-test-dirs) @ hive-shims-0.23 ---
          [INFO] Executing tasks
          
          main:
             [delete] Deleting directory /data/hive-ptest/working/apache-svn-trunk-source/shims/0.23/target/tmp
             [delete] Deleting directory /data/hive-ptest/working/apache-svn-trunk-source/shims/0.23/target/warehouse
              [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/shims/0.23/target/tmp
              [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/shims/0.23/target/warehouse
              [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/shims/0.23/target/tmp/conf
               [copy] Copying 4 files to /data/hive-ptest/working/apache-svn-trunk-source/shims/0.23/target/tmp/conf
          [INFO] Executed tasks
          [INFO] 
          [INFO] --- maven-compiler-plugin:3.1:testCompile (default-testCompile) @ hive-shims-0.23 ---
          [INFO] No sources to compile
          [INFO] 
          [INFO] --- maven-surefire-plugin:2.16:test (default-test) @ hive-shims-0.23 ---
          [INFO] No tests to run.
          [INFO]                                                                         
          [INFO] ------------------------------------------------------------------------
          [INFO] Building Hive Shims 0.13.0-SNAPSHOT
          [INFO] ------------------------------------------------------------------------
          [INFO] 
          [INFO] --- maven-resources-plugin:2.5:resources (default-resources) @ hive-shims ---
          [debug] execute contextualize
          [INFO] Using 'UTF-8' encoding to copy filtered resources.
          [INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-trunk-source/shims/assembly/src/main/resources
          [INFO] 
          [INFO] --- maven-antrun-plugin:1.7:run (define-classpath) @ hive-shims ---
          [INFO] Executing tasks
          
          main:
          [INFO] Executed tasks
          [INFO] 
          [INFO] --- maven-compiler-plugin:3.1:compile (default-compile) @ hive-shims ---
          [INFO] No sources to compile
          [INFO] 
          [INFO] --- maven-resources-plugin:2.5:testResources (default-testResources) @ hive-shims ---
          [debug] execute contextualize
          [INFO] Using 'UTF-8' encoding to copy filtered resources.
          [INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-trunk-source/shims/assembly/src/test/resources
          [INFO] 
          [INFO] --- maven-antrun-plugin:1.7:run (setup-test-dirs) @ hive-shims ---
          [INFO] Executing tasks
          
          main:
             [delete] Deleting directory /data/hive-ptest/working/apache-svn-trunk-source/shims/assembly/target/tmp
             [delete] Deleting directory /data/hive-ptest/working/apache-svn-trunk-source/shims/assembly/target/warehouse
              [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/shims/assembly/target/tmp
              [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/shims/assembly/target/warehouse
              [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/shims/assembly/target/tmp/conf
               [copy] Copying 4 files to /data/hive-ptest/working/apache-svn-trunk-source/shims/assembly/target/tmp/conf
          [INFO] Executed tasks
          [INFO] 
          [INFO] --- maven-compiler-plugin:3.1:testCompile (default-testCompile) @ hive-shims ---
          [INFO] No sources to compile
          [INFO] 
          [INFO] --- maven-surefire-plugin:2.16:test (default-test) @ hive-shims ---
          [INFO] No tests to run.
          [INFO]                                                                         
          [INFO] ------------------------------------------------------------------------
          [INFO] Building Hive Common 0.13.0-SNAPSHOT
          [INFO] ------------------------------------------------------------------------
          [INFO] 
          [INFO] --- maven-antrun-plugin:1.7:run (generate-version-annotation) @ hive-common ---
          [INFO] Executing tasks
          
          main:
          [INFO] Executed tasks
          [INFO] 
          [INFO] --- build-helper-maven-plugin:1.8:add-source (add-source) @ hive-common ---
          [INFO] Source directory: /data/hive-ptest/working/apache-svn-trunk-source/common/src/gen added.
          [INFO] 
          [INFO] --- maven-resources-plugin:2.5:resources (default-resources) @ hive-common ---
          [debug] execute contextualize
          [INFO] Using 'UTF-8' encoding to copy filtered resources.
          [INFO] Copying 1 resource
          [INFO] 
          [INFO] --- maven-antrun-plugin:1.7:run (define-classpath) @ hive-common ---
          [INFO] Executing tasks
          
          main:
          [INFO] Executed tasks
          [INFO] 
          [INFO] --- maven-compiler-plugin:3.1:compile (default-compile) @ hive-common ---
          [INFO] Compiling 1 source file to /data/hive-ptest/working/apache-svn-trunk-source/common/target/classes
          [INFO] 
          [INFO] --- maven-resources-plugin:2.5:testResources (default-testResources) @ hive-common ---
          [debug] execute contextualize
          [INFO] Using 'UTF-8' encoding to copy filtered resources.
          [INFO] Copying 4 resources
          [INFO] 
          [INFO] --- maven-antrun-plugin:1.7:run (setup-test-dirs) @ hive-common ---
          [INFO] Executing tasks
          
          main:
             [delete] Deleting directory /data/hive-ptest/working/apache-svn-trunk-source/common/target/tmp
             [delete] Deleting directory /data/hive-ptest/working/apache-svn-trunk-source/common/target/warehouse
              [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/common/target/tmp
              [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/common/target/warehouse
              [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/common/target/tmp/conf
               [copy] Copying 4 files to /data/hive-ptest/working/apache-svn-trunk-source/common/target/tmp/conf
          [INFO] Executed tasks
          [INFO] 
          [INFO] --- maven-compiler-plugin:3.1:testCompile (default-testCompile) @ hive-common ---
          [INFO] Nothing to compile - all classes are up to date
          [INFO] 
          [INFO] --- maven-surefire-plugin:2.16:test (default-test) @ hive-common ---
          [INFO]                                                                         
          [INFO] ------------------------------------------------------------------------
          [INFO] Building Hive Serde 0.13.0-SNAPSHOT
          [INFO] ------------------------------------------------------------------------
          [INFO] 
          [INFO] --- build-helper-maven-plugin:1.8:add-source (add-source) @ hive-serde ---
          [INFO] Source directory: /data/hive-ptest/working/apache-svn-trunk-source/serde/src/gen/protobuf/gen-java added.
          [INFO] Source directory: /data/hive-ptest/working/apache-svn-trunk-source/serde/src/gen/thrift/gen-javabean added.
          [INFO] 
          [INFO] --- maven-resources-plugin:2.5:resources (default-resources) @ hive-serde ---
          [debug] execute contextualize
          [INFO] Using 'UTF-8' encoding to copy filtered resources.
          [INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-trunk-source/serde/src/main/resources
          [INFO] 
          [INFO] --- maven-antrun-plugin:1.7:run (define-classpath) @ hive-serde ---
          [INFO] Executing tasks
          
          main:
          [INFO] Executed tasks
          [INFO] 
          [INFO] --- maven-compiler-plugin:3.1:compile (default-compile) @ hive-serde ---
          [INFO] Compiling 1 source file to /data/hive-ptest/working/apache-svn-trunk-source/serde/target/classes
          [INFO] 
          [INFO] --- maven-resources-plugin:2.5:testResources (default-testResources) @ hive-serde ---
          [debug] execute contextualize
          [INFO] Using 'UTF-8' encoding to copy filtered resources.
          [INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-trunk-source/serde/src/test/resources
          [INFO] 
          [INFO] --- maven-antrun-plugin:1.7:run (setup-test-dirs) @ hive-serde ---
          [INFO] Executing tasks
          
          main:
             [delete] Deleting directory /data/hive-ptest/working/apache-svn-trunk-source/serde/target/tmp
             [delete] Deleting directory /data/hive-ptest/working/apache-svn-trunk-source/serde/target/warehouse
              [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/serde/target/tmp
              [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/serde/target/warehouse
              [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/serde/target/tmp/conf
               [copy] Copying 4 files to /data/hive-ptest/working/apache-svn-trunk-source/serde/target/tmp/conf
          [INFO] Executed tasks
          [INFO] 
          [INFO] --- maven-compiler-plugin:3.1:testCompile (default-testCompile) @ hive-serde ---
          [INFO] Nothing to compile - all classes are up to date
          [INFO] 
          [INFO] --- maven-surefire-plugin:2.16:test (default-test) @ hive-serde ---
          [INFO]                                                                         
          [INFO] ------------------------------------------------------------------------
          [INFO] Building Hive Metastore 0.13.0-SNAPSHOT
          [INFO] ------------------------------------------------------------------------
          [INFO] 
          [INFO] --- build-helper-maven-plugin:1.8:add-source (add-source) @ hive-metastore ---
          [INFO] Source directory: /data/hive-ptest/working/apache-svn-trunk-source/metastore/src/model added.
          [INFO] Source directory: /data/hive-ptest/working/apache-svn-trunk-source/metastore/src/gen/thrift/gen-javabean added.
          [INFO] 
          [INFO] --- antlr3-maven-plugin:3.4:antlr (default) @ hive-metastore ---
          [INFO] ANTLR: Processing source directory /data/hive-ptest/working/apache-svn-trunk-source/metastore/src/java
          ANTLR Parser Generator  Version 3.4
          Grammar /data/hive-ptest/working/apache-svn-trunk-source/metastore/src/java/org/apache/hadoop/hive/metastore/parser/Filter.g is up to date - build skipped
          [INFO] 
          [INFO] --- maven-resources-plugin:2.5:resources (default-resources) @ hive-metastore ---
          [debug] execute contextualize
          [INFO] Using 'UTF-8' encoding to copy filtered resources.
          [INFO] Copying 1 resource
          [INFO] 
          [INFO] --- maven-antrun-plugin:1.7:run (define-classpath) @ hive-metastore ---
          [INFO] Executing tasks
          
          main:
          [INFO] Executed tasks
          [INFO] 
          [INFO] --- maven-compiler-plugin:3.1:compile (default-compile) @ hive-metastore ---
          [INFO] Compiling 1 source file to /data/hive-ptest/working/apache-svn-trunk-source/metastore/target/classes
          [INFO] 
          [INFO] --- datanucleus-maven-plugin:3.3.0-release:enhance (default) @ hive-metastore ---
          [INFO] DataNucleus Enhancer (version 3.2.2) for API "JDO" using JRE "1.6"
          DataNucleus Enhancer : Classpath
          >>  /data/hive-ptest/working/maven/org/datanucleus/datanucleus-maven-plugin/3.3.0-release/datanucleus-maven-plugin-3.3.0-release.jar
          >>  /data/hive-ptest/working/maven/org/datanucleus/datanucleus-core/3.2.2/datanucleus-core-3.2.2.jar
          >>  /data/hive-ptest/working/maven/org/codehaus/plexus/plexus-utils/3.0.8/plexus-utils-3.0.8.jar
          >>  /data/hive-ptest/working/maven/org/codehaus/plexus/plexus-component-annotations/1.5.5/plexus-component-annotations-1.5.5.jar
          >>  /data/hive-ptest/working/maven/org/sonatype/sisu/sisu-inject-bean/2.3.0/sisu-inject-bean-2.3.0.jar
          >>  /data/hive-ptest/working/maven/org/sonatype/sisu/sisu-guice/3.1.0/sisu-guice-3.1.0-no_aop.jar
          >>  /data/hive-ptest/working/maven/org/sonatype/sisu/sisu-guava/0.9.9/sisu-guava-0.9.9.jar
          >>  /data/hive-ptest/working/maven/org/apache/xbean/xbean-reflect/3.4/xbean-reflect-3.4.jar
          >>  /data/hive-ptest/working/maven/log4j/log4j/1.2.12/log4j-1.2.12.jar
          >>  /data/hive-ptest/working/maven/commons-logging/commons-logging-api/1.1/commons-logging-api-1.1.jar
          >>  /data/hive-ptest/working/maven/com/google/collections/google-collections/1.0/google-collections-1.0.jar
          >>  /data/hive-ptest/working/maven/junit/junit/3.8.2/junit-3.8.2.jar
          >>  /data/hive-ptest/working/apache-svn-trunk-source/metastore/target/classes
          >>  /data/hive-ptest/working/apache-svn-trunk-source/serde/target/classes
          >>  /data/hive-ptest/working/apache-svn-trunk-source/common/target/classes
          >>  /data/hive-ptest/working/maven/org/apache/commons/commons-compress/1.4.1/commons-compress-1.4.1.jar
          >>  /data/hive-ptest/working/maven/org/tukaani/xz/1.0/xz-1.0.jar
          >>  /data/hive-ptest/working/maven/commons-codec/commons-codec/1.4/commons-codec-1.4.jar
          >>  /data/hive-ptest/working/maven/org/apache/avro/avro/1.7.5/avro-1.7.5.jar
          >>  /data/hive-ptest/working/maven/org/codehaus/jackson/jackson-core-asl/1.9.2/jackson-core-asl-1.9.2.jar
          >>  /data/hive-ptest/working/maven/com/thoughtworks/paranamer/paranamer/2.3/paranamer-2.3.jar
          >>  /data/hive-ptest/working/maven/org/xerial/snappy/snappy-java/1.0.5/snappy-java-1.0.5.jar
          >>  /data/hive-ptest/working/apache-svn-trunk-source/shims/assembly/target/classes
          >>  /data/hive-ptest/working/maven/com/google/guava/guava/11.0.2/guava-11.0.2.jar
          >>  /data/hive-ptest/working/maven/com/google/code/findbugs/jsr305/1.3.9/jsr305-1.3.9.jar
          >>  /data/hive-ptest/working/maven/commons-cli/commons-cli/1.2/commons-cli-1.2.jar
          >>  /data/hive-ptest/working/maven/commons-lang/commons-lang/2.4/commons-lang-2.4.jar
          >>  /data/hive-ptest/working/maven/commons-logging/commons-logging/1.1.3/commons-logging-1.1.3.jar
          >>  /data/hive-ptest/working/maven/org/apache/derby/derby/10.4.2.0/derby-10.4.2.0.jar
          >>  /data/hive-ptest/working/maven/org/datanucleus/datanucleus-api-jdo/3.2.1/datanucleus-api-jdo-3.2.1.jar
          >>  /data/hive-ptest/working/maven/org/datanucleus/datanucleus-rdbms/3.2.1/datanucleus-rdbms-3.2.1.jar
          >>  /data/hive-ptest/working/maven/javax/jdo/jdo-api/3.0.1/jdo-api-3.0.1.jar
          >>  /data/hive-ptest/working/maven/javax/transaction/jta/1.1/jta-1.1.jar
          >>  /data/hive-ptest/working/maven/org/antlr/antlr-runtime/3.4/antlr-runtime-3.4.jar
          >>  /data/hive-ptest/working/maven/org/antlr/stringtemplate/3.2.1/stringtemplate-3.2.1.jar
          >>  /data/hive-ptest/working/maven/antlr/antlr/2.7.7/antlr-2.7.7.jar
          >>  /data/hive-ptest/working/maven/org/apache/thrift/libfb303/0.9.0/libfb303-0.9.0.jar
          >>  /data/hive-ptest/working/maven/org/apache/thrift/libthrift/0.9.0/libthrift-0.9.0.jar
          >>  /data/hive-ptest/working/maven/org/apache/httpcomponents/httpclient/4.2.5/httpclient-4.2.5.jar
          >>  /data/hive-ptest/working/maven/org/apache/httpcomponents/httpcore/4.2.4/httpcore-4.2.4.jar
          >>  /data/hive-ptest/working/maven/org/apache/hadoop/hadoop-core/1.2.1/hadoop-core-1.2.1.jar
          >>  /data/hive-ptest/working/maven/xmlenc/xmlenc/0.52/xmlenc-0.52.jar
          >>  /data/hive-ptest/working/maven/com/sun/jersey/jersey-core/1.14/jersey-core-1.14.jar
          >>  /data/hive-ptest/working/maven/com/sun/jersey/jersey-json/1.14/jersey-json-1.14.jar
          >>  /data/hive-ptest/working/maven/org/codehaus/jettison/jettison/1.1/jettison-1.1.jar
          >>  /data/hive-ptest/working/maven/stax/stax-api/1.0.1/stax-api-1.0.1.jar
          >>  /data/hive-ptest/working/maven/com/sun/xml/bind/jaxb-impl/2.2.3-1/jaxb-impl-2.2.3-1.jar
          >>  /data/hive-ptest/working/maven/javax/xml/bind/jaxb-api/2.2.2/jaxb-api-2.2.2.jar
          >>  /data/hive-ptest/working/maven/javax/xml/stream/stax-api/1.0-2/stax-api-1.0-2.jar
          >>  /data/hive-ptest/working/maven/javax/activation/activation/1.1/activation-1.1.jar
          >>  /data/hive-ptest/working/maven/org/codehaus/jackson/jackson-jaxrs/1.9.2/jackson-jaxrs-1.9.2.jar
          >>  /data/hive-ptest/working/maven/org/codehaus/jackson/jackson-xc/1.9.2/jackson-xc-1.9.2.jar
          >>  /data/hive-ptest/working/maven/com/sun/jersey/jersey-server/1.14/jersey-server-1.14.jar
          >>  /data/hive-ptest/working/maven/asm/asm/3.1/asm-3.1.jar
          >>  /data/hive-ptest/working/maven/commons-io/commons-io/2.4/commons-io-2.4.jar
          >>  /data/hive-ptest/working/maven/commons-httpclient/commons-httpclient/3.0.1/commons-httpclient-3.0.1.jar
          >>  /data/hive-ptest/working/maven/org/apache/commons/commons-math/2.1/commons-math-2.1.jar
          >>  /data/hive-ptest/working/maven/commons-configuration/commons-configuration/1.6/commons-configuration-1.6.jar
          >>  /data/hive-ptest/working/maven/commons-collections/commons-collections/3.2.1/commons-collections-3.2.1.jar
          >>  /data/hive-ptest/working/maven/commons-digester/commons-digester/1.8/commons-digester-1.8.jar
          >>  /data/hive-ptest/working/maven/commons-beanutils/commons-beanutils/1.7.0/commons-beanutils-1.7.0.jar
          >>  /data/hive-ptest/working/maven/commons-beanutils/commons-beanutils-core/1.8.0/commons-beanutils-core-1.8.0.jar
          >>  /data/hive-ptest/working/maven/commons-net/commons-net/1.4.1/commons-net-1.4.1.jar
          >>  /data/hive-ptest/working/maven/org/mortbay/jetty/jetty/6.1.26/jetty-6.1.26.jar
          >>  /data/hive-ptest/working/maven/org/mortbay/jetty/servlet-api/2.5-20081211/servlet-api-2.5-20081211.jar
          >>  /data/hive-ptest/working/maven/org/mortbay/jetty/jetty-util/6.1.26/jetty-util-6.1.26.jar
          >>  /data/hive-ptest/working/maven/tomcat/jasper-runtime/5.5.12/jasper-runtime-5.5.12.jar
          >>  /data/hive-ptest/working/maven/tomcat/jasper-compiler/5.5.12/jasper-compiler-5.5.12.jar
          >>  /data/hive-ptest/working/maven/org/mortbay/jetty/jsp-api-2.1/6.1.14/jsp-api-2.1-6.1.14.jar
          >>  /data/hive-ptest/working/maven/org/mortbay/jetty/servlet-api-2.5/6.1.14/servlet-api-2.5-6.1.14.jar
          >>  /data/hive-ptest/working/maven/org/mortbay/jetty/jsp-2.1/6.1.14/jsp-2.1-6.1.14.jar
          >>  /data/hive-ptest/working/maven/ant/ant/1.6.5/ant-1.6.5.jar
          >>  /data/hive-ptest/working/maven/commons-el/commons-el/1.0/commons-el-1.0.jar
          >>  /data/hive-ptest/working/maven/net/java/dev/jets3t/jets3t/0.6.1/jets3t-0.6.1.jar
          >>  /data/hive-ptest/working/maven/hsqldb/hsqldb/1.8.0.10/hsqldb-1.8.0.10.jar
          >>  /data/hive-ptest/working/maven/oro/oro/2.0.8/oro-2.0.8.jar
          >>  /data/hive-ptest/working/maven/org/eclipse/jdt/core/3.1.1/core-3.1.1.jar
          >>  /data/hive-ptest/working/maven/org/codehaus/jackson/jackson-mapper-asl/1.9.2/jackson-mapper-asl-1.9.2.jar
          >>  /data/hive-ptest/working/maven/org/slf4j/slf4j-api/1.7.5/slf4j-api-1.7.5.jar
          >>  /data/hive-ptest/working/maven/org/slf4j/slf4j-log4j12/1.7.5/slf4j-log4j12-1.7.5.jar
          >>  /data/hive-ptest/working/maven/log4j/log4j/1.2.16/log4j-1.2.16.jar
          ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MDatabase
          ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MFieldSchema
          ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MType
          ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MTable
          ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MSerDeInfo
          ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MOrder
          ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MColumnDescriptor
          ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MStringList
          ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MStorageDescriptor
          ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MPartition
          ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MIndex
          ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MRole
          ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MRoleMap
          ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MGlobalPrivilege
          ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MDBPrivilege
          ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MTablePrivilege
          ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MPartitionPrivilege
          ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MTableColumnPrivilege
          ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MPartitionColumnPrivilege
          ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MPartitionEvent
          ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MMasterKey
          ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MDelegationToken
          ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MTableColumnStatistics
          ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MPartitionColumnStatistics
          ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MVersionTable
          DataNucleus Enhancer completed with success for 25 classes. Timings : input=663 ms, enhance=374 ms, total=1037 ms. Consult the log for full details
          
          [INFO] 
          [INFO] --- maven-resources-plugin:2.5:testResources (default-testResources) @ hive-metastore ---
          [debug] execute contextualize
          [INFO] Using 'UTF-8' encoding to copy filtered resources.
          [INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-trunk-source/metastore/src/test/resources
          [INFO] 
          [INFO] --- maven-antrun-plugin:1.7:run (setup-test-dirs) @ hive-metastore ---
          [INFO] Executing tasks
          
          main:
             [delete] Deleting directory /data/hive-ptest/working/apache-svn-trunk-source/metastore/target/tmp
             [delete] Deleting directory /data/hive-ptest/working/apache-svn-trunk-source/metastore/target/warehouse
              [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/metastore/target/tmp
              [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/metastore/target/warehouse
              [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/metastore/target/tmp/conf
               [copy] Copying 4 files to /data/hive-ptest/working/apache-svn-trunk-source/metastore/target/tmp/conf
          [INFO] Executed tasks
          [INFO] 
          [INFO] --- maven-compiler-plugin:3.1:testCompile (default-testCompile) @ hive-metastore ---
          [INFO] Nothing to compile - all classes are up to date
          [INFO] 
          [INFO] --- maven-surefire-plugin:2.16:test (default-test) @ hive-metastore ---
          [INFO]                                                                         
          [INFO] ------------------------------------------------------------------------
          [INFO] Building Hive Query Language 0.13.0-SNAPSHOT
          [INFO] ------------------------------------------------------------------------
          [INFO] 
          [INFO] --- maven-antrun-plugin:1.7:run (generate-sources) @ hive-exec ---
          [INFO] Executing tasks
          
          main:
          Generating vector expression code
          Generating vector expression test code
          [INFO] Executed tasks
          [INFO] 
          [INFO] --- build-helper-maven-plugin:1.8:add-source (add-source) @ hive-exec ---
          [INFO] Source directory: /data/hive-ptest/working/apache-svn-trunk-source/ql/src/gen/protobuf/gen-java added.
          [INFO] Source directory: /data/hive-ptest/working/apache-svn-trunk-source/ql/src/gen/thrift/gen-javabean added.
          [INFO] Source directory: /data/hive-ptest/working/apache-svn-trunk-source/ql/target/generated-sources/java added.
          [INFO] 
          [INFO] --- antlr3-maven-plugin:3.4:antlr (default) @ hive-exec ---
          [INFO] ANTLR: Processing source directory /data/hive-ptest/working/apache-svn-trunk-source/ql/src/java
          ANTLR Parser Generator  Version 3.4
          Grammar /data/hive-ptest/working/apache-svn-trunk-source/ql/src/java/org/apache/hadoop/hive/ql/parse/HiveLexer.g is up to date - build skipped
          Grammar /data/hive-ptest/working/apache-svn-trunk-source/ql/src/java/org/apache/hadoop/hive/ql/parse/HiveParser.g is up to date - build skipped
          [INFO] 
          [INFO] --- maven-resources-plugin:2.5:resources (default-resources) @ hive-exec ---
          [debug] execute contextualize
          [INFO] Using 'UTF-8' encoding to copy filtered resources.
          [INFO] Copying 1 resource
          [INFO] 
          [INFO] --- maven-antrun-plugin:1.7:run (define-classpath) @ hive-exec ---
          [INFO] Executing tasks
          
          main:
          [INFO] Executed tasks
          [INFO] 
          [INFO] --- maven-compiler-plugin:3.1:compile (default-compile) @ hive-exec ---
          [INFO] Compiling 6 source files to /data/hive-ptest/working/apache-svn-trunk-source/ql/target/classes
          [INFO] 
          [INFO] --- build-helper-maven-plugin:1.8:add-test-source (add-test-sources) @ hive-exec ---
          [INFO] Test Source directory: /data/hive-ptest/working/apache-svn-trunk-source/ql/target/generated-test-sources/java added.
          [INFO] 
          [INFO] --- maven-resources-plugin:2.5:testResources (default-testResources) @ hive-exec ---
          [debug] execute contextualize
          [INFO] Using 'UTF-8' encoding to copy filtered resources.
          [INFO] Copying 4 resources
          [INFO] 
          [INFO] --- maven-antrun-plugin:1.7:run (setup-test-dirs) @ hive-exec ---
          [INFO] Executing tasks
          
          main:
             [delete] Deleting directory /data/hive-ptest/working/apache-svn-trunk-source/ql/target/tmp
             [delete] Deleting directory /data/hive-ptest/working/apache-svn-trunk-source/ql/target/warehouse
              [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/ql/target/tmp
              [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/ql/target/warehouse
              [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/ql/target/tmp/conf
               [copy] Copying 4 files to /data/hive-ptest/working/apache-svn-trunk-source/ql/target/tmp/conf
          [INFO] Executed tasks
          [INFO] 
          [INFO] --- maven-compiler-plugin:3.1:testCompile (default-testCompile) @ hive-exec ---
          [INFO] Compiling 4 source files to /data/hive-ptest/working/apache-svn-trunk-source/ql/target/test-classes
          [INFO] 
          [INFO] --- maven-surefire-plugin:2.16:test (default-test) @ hive-exec ---
          [INFO]                                                                         
          [INFO] ------------------------------------------------------------------------
          [INFO] Building Hive Service 0.13.0-SNAPSHOT
          [INFO] ------------------------------------------------------------------------
          [INFO] 
          [INFO] --- build-helper-maven-plugin:1.8:add-source (add-source) @ hive-service ---
          [INFO] Source directory: /data/hive-ptest/working/apache-svn-trunk-source/service/src/model added.
          [INFO] Source directory: /data/hive-ptest/working/apache-svn-trunk-source/service/src/gen/thrift/gen-javabean added.
          [INFO] 
          [INFO] --- maven-resources-plugin:2.5:resources (default-resources) @ hive-service ---
          [debug] execute contextualize
          [INFO] Using 'UTF-8' encoding to copy filtered resources.
          [INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-trunk-source/service/src/main/resources
          [INFO] 
          [INFO] --- maven-antrun-plugin:1.7:run (define-classpath) @ hive-service ---
          [INFO] Executing tasks
          
          main:
          [INFO] Executed tasks
          [INFO] 
          [INFO] --- maven-compiler-plugin:3.1:compile (default-compile) @ hive-service ---
          [INFO] Nothing to compile - all classes are up to date
          [INFO] 
          [INFO] --- maven-resources-plugin:2.5:testResources (default-testResources) @ hive-service ---
          [debug] execute contextualize
          [INFO] Using 'UTF-8' encoding to copy filtered resources.
          [INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-trunk-source/service/src/test/resources
          [INFO] 
          [INFO] --- maven-antrun-plugin:1.7:run (setup-test-dirs) @ hive-service ---
          [INFO] Executing tasks
          
          main:
             [delete] Deleting directory /data/hive-ptest/working/apache-svn-trunk-source/service/target/tmp
             [delete] Deleting directory /data/hive-ptest/working/apache-svn-trunk-source/service/target/warehouse
              [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/service/target/tmp
              [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/service/target/warehouse
              [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/service/target/tmp/conf
               [copy] Copying 4 files to /data/hive-ptest/working/apache-svn-trunk-source/service/target/tmp/conf
          [INFO] Executed tasks
          [INFO] 
          [INFO] --- maven-compiler-plugin:3.1:testCompile (default-testCompile) @ hive-service ---
          [INFO] Nothing to compile - all classes are up to date
          [INFO] 
          [INFO] --- maven-surefire-plugin:2.16:test (default-test) @ hive-service ---
          [INFO]                                                                         
          [INFO] ------------------------------------------------------------------------
          [INFO] Building Hive JDBC 0.13.0-SNAPSHOT
          [INFO] ------------------------------------------------------------------------
          [INFO] 
          [INFO] --- maven-resources-plugin:2.5:resources (default-resources) @ hive-jdbc ---
          [debug] execute contextualize
          [INFO] Using 'UTF-8' encoding to copy filtered resources.
          [INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-trunk-source/jdbc/src/main/resources
          [INFO] 
          [INFO] --- maven-antrun-plugin:1.7:run (define-classpath) @ hive-jdbc ---
          [INFO] Executing tasks
          
          main:
          [INFO] Executed tasks
          [INFO] 
          [INFO] --- maven-compiler-plugin:3.1:compile (default-compile) @ hive-jdbc ---
          [INFO] Nothing to compile - all classes are up to date
          [INFO] 
          [INFO] --- maven-resources-plugin:2.5:testResources (default-testResources) @ hive-jdbc ---
          [debug] execute contextualize
          [INFO] Using 'UTF-8' encoding to copy filtered resources.
          [INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-trunk-source/jdbc/src/test/resources
          [INFO] 
          [INFO] --- maven-antrun-plugin:1.7:run (setup-test-dirs) @ hive-jdbc ---
          [INFO] Executing tasks
          
          main:
             [delete] Deleting directory /data/hive-ptest/working/apache-svn-trunk-source/jdbc/target/tmp
             [delete] Deleting directory /data/hive-ptest/working/apache-svn-trunk-source/jdbc/target/warehouse
              [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/jdbc/target/tmp
              [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/jdbc/target/warehouse
              [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/jdbc/target/tmp/conf
               [copy] Copying 4 files to /data/hive-ptest/working/apache-svn-trunk-source/jdbc/target/tmp/conf
          [INFO] Executed tasks
          [INFO] 
          [INFO] --- maven-compiler-plugin:3.1:testCompile (default-testCompile) @ hive-jdbc ---
          [INFO] No sources to compile
          [INFO] 
          [INFO] --- maven-surefire-plugin:2.16:test (default-test) @ hive-jdbc ---
          [INFO] No tests to run.
          [INFO]                                                                         
          [INFO] ------------------------------------------------------------------------
          [INFO] Building Hive Beeline 0.13.0-SNAPSHOT
          [INFO] ------------------------------------------------------------------------
          [INFO] 
          [INFO] --- maven-resources-plugin:2.5:resources (default-resources) @ hive-beeline ---
          [debug] execute contextualize
          [INFO] Using 'UTF-8' encoding to copy filtered resources.
          [INFO] Copying 2 resources
          [INFO] 
          [INFO] --- maven-antrun-plugin:1.7:run (define-classpath) @ hive-beeline ---
          [INFO] Executing tasks
          
          main:
          [INFO] Executed tasks
          [INFO] 
          [INFO] --- maven-compiler-plugin:3.1:compile (default-compile) @ hive-beeline ---
          [INFO] Nothing to compile - all classes are up to date
          [INFO] 
          [INFO] --- maven-resources-plugin:2.5:testResources (default-testResources) @ hive-beeline ---
          [debug] execute contextualize
          [INFO] Using 'UTF-8' encoding to copy filtered resources.
          [INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-trunk-source/beeline/src/test/resources
          [INFO] 
          [INFO] --- maven-antrun-plugin:1.7:run (setup-test-dirs) @ hive-beeline ---
          [INFO] Executing tasks
          
          main:
             [delete] Deleting directory /data/hive-ptest/working/apache-svn-trunk-source/beeline/target/tmp
             [delete] Deleting directory /data/hive-ptest/working/apache-svn-trunk-source/beeline/target/warehouse
              [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/beeline/target/tmp
              [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/beeline/target/warehouse
              [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/beeline/target/tmp/conf
               [copy] Copying 4 files to /data/hive-ptest/working/apache-svn-trunk-source/beeline/target/tmp/conf
          [INFO] Executed tasks
          [INFO] 
          [INFO] --- maven-compiler-plugin:3.1:testCompile (default-testCompile) @ hive-beeline ---
          [INFO] Nothing to compile - all classes are up to date
          [INFO] 
          [INFO] --- maven-surefire-plugin:2.16:test (default-test) @ hive-beeline ---
          [INFO]                                                                         
          [INFO] ------------------------------------------------------------------------
          [INFO] Building Hive CLI 0.13.0-SNAPSHOT
          [INFO] ------------------------------------------------------------------------
          [INFO] 
          [INFO] --- maven-resources-plugin:2.5:resources (default-resources) @ hive-cli ---
          [debug] execute contextualize
          [INFO] Using 'UTF-8' encoding to copy filtered resources.
          [INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-trunk-source/cli/src/main/resources
          [INFO] 
          [INFO] --- maven-antrun-plugin:1.7:run (define-classpath) @ hive-cli ---
          [INFO] Executing tasks
          
          main:
          [INFO] Executed tasks
          [INFO] 
          [INFO] --- maven-compiler-plugin:3.1:compile (default-compile) @ hive-cli ---
          [INFO] Nothing to compile - all classes are up to date
          [INFO] 
          [INFO] --- maven-resources-plugin:2.5:testResources (default-testResources) @ hive-cli ---
          [debug] execute contextualize
          [INFO] Using 'UTF-8' encoding to copy filtered resources.
          [INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-trunk-source/cli/src/test/resources
          [INFO] 
          [INFO] --- maven-antrun-plugin:1.7:run (setup-test-dirs) @ hive-cli ---
          [INFO] Executing tasks
          
          main:
             [delete] Deleting directory /data/hive-ptest/working/apache-svn-trunk-source/cli/target/tmp
             [delete] Deleting directory /data/hive-ptest/working/apache-svn-trunk-source/cli/target/warehouse
              [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/cli/target/tmp
              [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/cli/target/warehouse
              [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/cli/target/tmp/conf
               [copy] Copying 4 files to /data/hive-ptest/working/apache-svn-trunk-source/cli/target/tmp/conf
          [INFO] Executed tasks
          [INFO] 
          [INFO] --- maven-compiler-plugin:3.1:testCompile (default-testCompile) @ hive-cli ---
          [INFO] Nothing to compile - all classes are up to date
          [INFO] 
          [INFO] --- maven-surefire-plugin:2.16:test (default-test) @ hive-cli ---
          [INFO]                                                                         
          [INFO] ------------------------------------------------------------------------
          [INFO] Building Hive Contrib 0.13.0-SNAPSHOT
          [INFO] ------------------------------------------------------------------------
          [INFO] 
          [INFO] --- maven-resources-plugin:2.5:resources (default-resources) @ hive-contrib ---
          [debug] execute contextualize
          [INFO] Using 'UTF-8' encoding to copy filtered resources.
          [INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-trunk-source/contrib/src/main/resources
          [INFO] 
          [INFO] --- maven-antrun-plugin:1.7:run (define-classpath) @ hive-contrib ---
          [INFO] Executing tasks
          
          main:
          [INFO] Executed tasks
          [INFO] 
          [INFO] --- maven-compiler-plugin:3.1:compile (default-compile) @ hive-contrib ---
          [INFO] Nothing to compile - all classes are up to date
          [INFO] 
          [INFO] --- maven-resources-plugin:2.5:testResources (default-testResources) @ hive-contrib ---
          [debug] execute contextualize
          [INFO] Using 'UTF-8' encoding to copy filtered resources.
          [INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-trunk-source/contrib/src/test/resources
          [INFO] 
          [INFO] --- maven-antrun-plugin:1.7:run (setup-test-dirs) @ hive-contrib ---
          [INFO] Executing tasks
          
          main:
             [delete] Deleting directory /data/hive-ptest/working/apache-svn-trunk-source/contrib/target/tmp
             [delete] Deleting directory /data/hive-ptest/working/apache-svn-trunk-source/contrib/target/warehouse
              [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/contrib/target/tmp
              [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/contrib/target/warehouse
              [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/contrib/target/tmp/conf
               [copy] Copying 4 files to /data/hive-ptest/working/apache-svn-trunk-source/contrib/target/tmp/conf
          [INFO] Executed tasks
          [INFO] 
          [INFO] --- maven-compiler-plugin:3.1:testCompile (default-testCompile) @ hive-contrib ---
          [INFO] Nothing to compile - all classes are up to date
          [INFO] 
          [INFO] --- maven-surefire-plugin:2.16:test (default-test) @ hive-contrib ---
          [INFO]                                                                         
          [INFO] ------------------------------------------------------------------------
          [INFO] Building Hive HBase Handler 0.13.0-SNAPSHOT
          [INFO] ------------------------------------------------------------------------
          [INFO] 
          [INFO] --- maven-resources-plugin:2.5:resources (default-resources) @ hive-hbase-handler ---
          [debug] execute contextualize
          [INFO] Using 'UTF-8' encoding to copy filtered resources.
          [INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-trunk-source/hbase-handler/src/main/resources
          [INFO] 
          [INFO] --- maven-antrun-plugin:1.7:run (define-classpath) @ hive-hbase-handler ---
          [INFO] Executing tasks
          
          main:
          [INFO] Executed tasks
          [INFO] 
          [INFO] --- maven-compiler-plugin:3.1:compile (default-compile) @ hive-hbase-handler ---
          [INFO] Compiling 1 source file to /data/hive-ptest/working/apache-svn-trunk-source/hbase-handler/target/classes
          [INFO] 
          [INFO] --- maven-resources-plugin:2.5:testResources (default-testResources) @ hive-hbase-handler ---
          [debug] execute contextualize
          [INFO] Using 'UTF-8' encoding to copy filtered resources.
          [INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-trunk-source/hbase-handler/src/test/resources
          [INFO] 
          [INFO] --- maven-antrun-plugin:1.7:run (setup-test-dirs) @ hive-hbase-handler ---
          [INFO] Executing tasks
          
          main:
             [delete] Deleting directory /data/hive-ptest/working/apache-svn-trunk-source/hbase-handler/target/tmp
             [delete] Deleting directory /data/hive-ptest/working/apache-svn-trunk-source/hbase-handler/target/warehouse
              [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/hbase-handler/target/tmp
              [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/hbase-handler/target/warehouse
              [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/hbase-handler/target/tmp/conf
               [copy] Copying 4 files to /data/hive-ptest/working/apache-svn-trunk-source/hbase-handler/target/tmp/conf
          [INFO] Executed tasks
          [INFO] 
          [INFO] --- maven-compiler-plugin:3.1:testCompile (default-testCompile) @ hive-hbase-handler ---
          [INFO] Nothing to compile - all classes are up to date
          [INFO] 
          [INFO] --- maven-surefire-plugin:2.16:test (default-test) @ hive-hbase-handler ---
          [INFO]                                                                         
          [INFO] ------------------------------------------------------------------------
          [INFO] Building Hive HCatalog 0.13.0-SNAPSHOT
          [INFO] ------------------------------------------------------------------------
          [INFO] 
          [INFO] --- maven-antrun-plugin:1.7:run (define-classpath) @ hive-hcatalog ---
          [INFO] Executing tasks
          
          main:
          [INFO] Executed tasks
          [INFO] 
          [INFO] --- maven-antrun-plugin:1.7:run (setup-test-dirs) @ hive-hcatalog ---
          [INFO] Executing tasks
          
          main:
             [delete] Deleting directory /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/target/tmp
             [delete] Deleting directory /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/target/warehouse
              [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/target/tmp
              [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/target/warehouse
              [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/target/tmp/conf
               [copy] Copying 4 files to /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/target/tmp/conf
          [INFO] Executed tasks
          [INFO]                                                                         
          [INFO] ------------------------------------------------------------------------
          [INFO] Building Hive HCatalog Core 0.13.0-SNAPSHOT
          [INFO] ------------------------------------------------------------------------
          [INFO] 
          [INFO] --- maven-resources-plugin:2.5:resources (default-resources) @ hive-hcatalog-core ---
          [debug] execute contextualize
          [INFO] Using 'UTF-8' encoding to copy filtered resources.
          [INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/core/src/main/resources
          [INFO] 
          [INFO] --- maven-antrun-plugin:1.7:run (define-classpath) @ hive-hcatalog-core ---
          [INFO] Executing tasks
          
          main:
          [INFO] Executed tasks
          [INFO] 
          [INFO] --- maven-compiler-plugin:3.1:compile (default-compile) @ hive-hcatalog-core ---
          [INFO] Nothing to compile - all classes are up to date
          [INFO] 
          [INFO] --- maven-resources-plugin:2.5:testResources (default-testResources) @ hive-hcatalog-core ---
          [debug] execute contextualize
          [INFO] Using 'UTF-8' encoding to copy filtered resources.
          [INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/core/src/test/resources
          [INFO] 
          [INFO] --- maven-antrun-plugin:1.7:run (setup-test-dirs) @ hive-hcatalog-core ---
          [INFO] Executing tasks
          
          main:
             [delete] Deleting directory /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/core/target/tmp
             [delete] Deleting directory /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/core/target/warehouse
              [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/core/target/tmp
              [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/core/target/warehouse
              [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/core/target/tmp/conf
               [copy] Copying 4 files to /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/core/target/tmp/conf
          [INFO] Executed tasks
          [INFO] 
          [INFO] --- maven-compiler-plugin:3.1:testCompile (default-testCompile) @ hive-hcatalog-core ---
          [INFO] Nothing to compile - all classes are up to date
          [INFO] 
          [INFO] --- maven-surefire-plugin:2.16:test (default-test) @ hive-hcatalog-core ---
          [INFO]                                                                         
          [INFO] ------------------------------------------------------------------------
          [INFO] Building Hive HCatalog Pig Adapter 0.13.0-SNAPSHOT
          [INFO] ------------------------------------------------------------------------
          [INFO] 
          [INFO] --- maven-resources-plugin:2.5:resources (default-resources) @ hive-hcatalog-pig-adapter ---
          [debug] execute contextualize
          [INFO] Using 'UTF-8' encoding to copy filtered resources.
          [INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/hcatalog-pig-adapter/src/main/resources
          [INFO] 
          [INFO] --- maven-antrun-plugin:1.7:run (define-classpath) @ hive-hcatalog-pig-adapter ---
          [INFO] Executing tasks
          
          main:
          [INFO] Executed tasks
          [INFO] 
          [INFO] --- maven-compiler-plugin:3.1:compile (default-compile) @ hive-hcatalog-pig-adapter ---
          [INFO] Nothing to compile - all classes are up to date
          [INFO] 
          [INFO] --- maven-resources-plugin:2.5:testResources (default-testResources) @ hive-hcatalog-pig-adapter ---
          [debug] execute contextualize
          [INFO] Using 'UTF-8' encoding to copy filtered resources.
          [INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/hcatalog-pig-adapter/src/test/resources
          [INFO] 
          [INFO] --- maven-antrun-plugin:1.7:run (setup-test-dirs) @ hive-hcatalog-pig-adapter ---
          [INFO] Executing tasks
          
          main:
             [delete] Deleting directory /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/hcatalog-pig-adapter/target/tmp
             [delete] Deleting directory /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/hcatalog-pig-adapter/target/warehouse
              [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/hcatalog-pig-adapter/target/tmp
              [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/hcatalog-pig-adapter/target/warehouse
              [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/hcatalog-pig-adapter/target/tmp/conf
               [copy] Copying 4 files to /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/hcatalog-pig-adapter/target/tmp/conf
          [INFO] Executed tasks
          [INFO] 
          [INFO] --- maven-compiler-plugin:3.1:testCompile (default-testCompile) @ hive-hcatalog-pig-adapter ---
          [INFO] Nothing to compile - all classes are up to date
          [INFO] 
          [INFO] --- maven-surefire-plugin:2.16:test (default-test) @ hive-hcatalog-pig-adapter ---
          [INFO]                                                                         
          [INFO] ------------------------------------------------------------------------
          [INFO] Building Hive HCatalog Server Extensions 0.13.0-SNAPSHOT
          [INFO] ------------------------------------------------------------------------
          [INFO] 
          [INFO] --- maven-resources-plugin:2.5:resources (default-resources) @ hive-hcatalog-server-extensions ---
          [debug] execute contextualize
          [INFO] Using 'UTF-8' encoding to copy filtered resources.
          [INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/server-extensions/src/main/resources
          [INFO] 
          [INFO] --- maven-antrun-plugin:1.7:run (define-classpath) @ hive-hcatalog-server-extensions ---
          [INFO] Executing tasks
          
          main:
          [INFO] Executed tasks
          [INFO] 
          [INFO] --- maven-compiler-plugin:3.1:compile (default-compile) @ hive-hcatalog-server-extensions ---
          [INFO] Nothing to compile - all classes are up to date
          [INFO] 
          [INFO] --- maven-resources-plugin:2.5:testResources (default-testResources) @ hive-hcatalog-server-extensions ---
          [debug] execute contextualize
          [INFO] Using 'UTF-8' encoding to copy filtered resources.
          [INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/server-extensions/src/test/resources
          [INFO] 
          [INFO] --- maven-antrun-plugin:1.7:run (setup-test-dirs) @ hive-hcatalog-server-extensions ---
          [INFO] Executing tasks
          
          main:
             [delete] Deleting directory /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/server-extensions/target/tmp
             [delete] Deleting directory /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/server-extensions/target/warehouse
              [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/server-extensions/target/tmp
              [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/server-extensions/target/warehouse
              [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/server-extensions/target/tmp/conf
               [copy] Copying 4 files to /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/server-extensions/target/tmp/conf
          [INFO] Executed tasks
          [INFO] 
          [INFO] --- maven-compiler-plugin:3.1:testCompile (default-testCompile) @ hive-hcatalog-server-extensions ---
          [INFO] Nothing to compile - all classes are up to date
          [INFO] 
          [INFO] --- maven-surefire-plugin:2.16:test (default-test) @ hive-hcatalog-server-extensions ---
          [INFO]                                                                         
          [INFO] ------------------------------------------------------------------------
          [INFO] Building Hive HCatalog Webhcat Java Client 0.13.0-SNAPSHOT
          [INFO] ------------------------------------------------------------------------
          [INFO] 
          [INFO] --- maven-resources-plugin:2.5:resources (default-resources) @ hive-webhcat-java-client ---
          [debug] execute contextualize
          [INFO] Using 'UTF-8' encoding to copy filtered resources.
          [INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/java-client/src/main/resources
          [INFO] 
          [INFO] --- maven-antrun-plugin:1.7:run (define-classpath) @ hive-webhcat-java-client ---
          [INFO] Executing tasks
          
          main:
          [INFO] Executed tasks
          [INFO] 
          [INFO] --- maven-compiler-plugin:3.1:compile (default-compile) @ hive-webhcat-java-client ---
          [INFO] Nothing to compile - all classes are up to date
          [INFO] 
          [INFO] --- maven-resources-plugin:2.5:testResources (default-testResources) @ hive-webhcat-java-client ---
          [debug] execute contextualize
          [INFO] Using 'UTF-8' encoding to copy filtered resources.
          [INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/java-client/src/test/resources
          [INFO] 
          [INFO] --- maven-antrun-plugin:1.7:run (setup-test-dirs) @ hive-webhcat-java-client ---
          [INFO] Executing tasks
          
          main:
             [delete] Deleting directory /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/java-client/target/tmp
             [delete] Deleting directory /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/java-client/target/warehouse
              [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/java-client/target/tmp
              [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/java-client/target/warehouse
              [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/java-client/target/tmp/conf
               [copy] Copying 4 files to /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/java-client/target/tmp/conf
          [INFO] Executed tasks
          [INFO] 
          [INFO] --- maven-compiler-plugin:3.1:testCompile (default-testCompile) @ hive-webhcat-java-client ---
          [INFO] Nothing to compile - all classes are up to date
          [INFO] 
          [INFO] --- maven-surefire-plugin:2.16:test (default-test) @ hive-webhcat-java-client ---
          [INFO]                                                                         
          [INFO] ------------------------------------------------------------------------
          [INFO] Building Hive HCatalog Webhcat 0.13.0-SNAPSHOT
          [INFO] ------------------------------------------------------------------------
          [INFO] 
          [INFO] --- maven-resources-plugin:2.5:resources (default-resources) @ hive-webhcat ---
          [debug] execute contextualize
          [INFO] Using 'UTF-8' encoding to copy filtered resources.
          [INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/src/main/resources
          [INFO] 
          [INFO] --- maven-antrun-plugin:1.7:run (define-classpath) @ hive-webhcat ---
          [INFO] Executing tasks
          
          main:
          [INFO] Executed tasks
          [INFO] 
          [INFO] --- maven-compiler-plugin:3.1:compile (default-compile) @ hive-webhcat ---
          [INFO] Nothing to compile - all classes are up to date
          [INFO] 
          [INFO] --- maven-javadoc-plugin:2.4:javadoc (resourcesdoc.xml) @ hive-webhcat ---
          [INFO] Setting property: classpath.resource.loader.class => 'org.codehaus.plexus.velocity.ContextClassLoaderResourceLoader'.
          [INFO] Setting property: velocimacro.messages.on => 'false'.
          [INFO] Setting property: resource.loader => 'classpath'.
          [INFO] Setting property: resource.manager.logwhenfound => 'false'.
          [INFO] ************************************************************** 
          [INFO] Starting Jakarta Velocity v1.4
          [INFO] RuntimeInstance initializing.
          [INFO] Default Properties File: org/apache/velocity/runtime/defaults/velocity.properties
          [INFO] Default ResourceManager initializing. (class org.apache.velocity.runtime.resource.ResourceManagerImpl)
          [INFO] Resource Loader Instantiated: org.codehaus.plexus.velocity.ContextClassLoaderResourceLoader
          [INFO] ClasspathResourceLoader : initialization starting.
          [INFO] ClasspathResourceLoader : initialization complete.
          [INFO] ResourceCache : initialized. (class org.apache.velocity.runtime.resource.ResourceCacheImpl)
          [INFO] Default ResourceManager initialization complete.
          [INFO] Loaded System Directive: org.apache.velocity.runtime.directive.Literal
          [INFO] Loaded System Directive: org.apache.velocity.runtime.directive.Macro
          [INFO] Loaded System Directive: org.apache.velocity.runtime.directive.Parse
          [INFO] Loaded System Directive: org.apache.velocity.runtime.directive.Include
          [INFO] Loaded System Directive: org.apache.velocity.runtime.directive.Foreach
          [INFO] Created: 20 parsers.
          [INFO] Velocimacro : initialization starting.
          [INFO] Velocimacro : adding VMs from VM library template : VM_global_library.vm
          [ERROR] ResourceManager : unable to find resource 'VM_global_library.vm' in any resource loader.
          [INFO] Velocimacro : error using  VM library template VM_global_library.vm : org.apache.velocity.exception.ResourceNotFoundException: Unable to find resource 'VM_global_library.vm'
          [INFO] Velocimacro :  VM library template macro registration complete.
          [INFO] Velocimacro : allowInline = true : VMs can be defined inline in templates
          [INFO] Velocimacro : allowInlineToOverride = false : VMs defined inline may NOT replace previous VM definitions
          [INFO] Velocimacro : allowInlineLocal = false : VMs defined inline will be  global in scope if allowed.
          [INFO] Velocimacro : initialization complete.
          [INFO] Velocity successfully started.
          Loading source files for package org.apache.hive.hcatalog.templeton...
          [parsing started /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/src/main/java/org/apache/hive/hcatalog/templeton/SimpleExceptionMapper.java]
          [parsing completed 26ms]
          [parsing started /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/src/main/java/org/apache/hive/hcatalog/templeton/JsonBuilder.java]
          [parsing completed 8ms]
          [parsing started /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/src/main/java/org/apache/hive/hcatalog/templeton/JobItemBean.java]
          [parsing completed 1ms]
          [parsing started /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/src/main/java/org/apache/hive/hcatalog/templeton/JarDelegator.java]
          [parsing completed 11ms]
          [parsing started /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/src/main/java/org/apache/hive/hcatalog/templeton/LauncherDelegator.java]
          [parsing completed 19ms]
          [parsing started /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/src/main/java/org/apache/hive/hcatalog/templeton/ExecServiceImpl.java]
          [parsing completed 22ms]
          [parsing started /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/src/main/java/org/apache/hive/hcatalog/templeton/DeleteDelegator.java]
          [parsing completed 7ms]
          [parsing started /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/src/main/java/org/apache/hive/hcatalog/templeton/ExecBean.java]
          [parsing completed 6ms]
          [parsing started /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/src/main/java/org/apache/hive/hcatalog/templeton/PigDelegator.java]
          [parsing completed 13ms]
          [parsing started /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/src/main/java/org/apache/hive/hcatalog/templeton/TempletonDelegator.java]
          [parsing completed 1ms]
          [parsing started /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/src/main/java/org/apache/hive/hcatalog/templeton/StreamingDelegator.java]
          [parsing completed 14ms]
          [parsing started /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/src/main/java/org/apache/hive/hcatalog/templeton/DatabaseDesc.java]
          [parsing completed 1ms]
          [parsing started /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/src/main/java/org/apache/hive/hcatalog/templeton/CompleteBean.java]
          [parsing completed 4ms]
          [parsing started /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/src/main/java/org/apache/hive/hcatalog/templeton/HiveDelegator.java]
          [parsing completed 19ms]
          [parsing started /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/src/main/java/org/apache/hive/hcatalog/templeton/Server.java]
          [parsing completed 124ms]
          [parsing started /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/src/main/java/org/apache/hive/hcatalog/templeton/BadParam.java]
          [parsing completed 1ms]
          [parsing started /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/src/main/java/org/apache/hive/hcatalog/templeton/ColumnDesc.java]
          [parsing completed 5ms]
          [parsing started /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/src/main/java/org/apache/hive/hcatalog/templeton/PartitionDesc.java]
          [parsing completed 0ms]
          [parsing started /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/src/main/java/org/apache/hive/hcatalog/templeton/CatchallExceptionMapper.java]
          [parsing completed 0ms]
          [parsing started /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/src/main/java/org/apache/hive/hcatalog/templeton/HcatDelegator.java]
          [parsing completed 44ms]
          [parsing started /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/src/main/java/org/apache/hive/hcatalog/templeton/StatusDelegator.java]
          [parsing completed 2ms]
          [parsing started /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/src/main/java/org/apache/hive/hcatalog/templeton/QueueStatusBean.java]
          [parsing completed 1ms]
          [parsing started /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/src/main/java/org/apache/hive/hcatalog/templeton/CompleteDelegator.java]
          [parsing completed 10ms]
          [parsing started /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/src/main/java/org/apache/hive/hcatalog/templeton/GroupPermissionsDesc.java]
          [parsing completed 1ms]
          [parsing started /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/src/main/java/org/apache/hive/hcatalog/templeton/AppConfig.java]
          [parsing completed 13ms]
          [parsing started /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/src/main/java/org/apache/hive/hcatalog/templeton/HcatException.java]
          [parsing completed 5ms]
          [parsing started /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/src/main/java/org/apache/hive/hcatalog/templeton/QueueException.java]
          [parsing completed 1ms]
          [parsing started /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/src/main/java/org/apache/hive/hcatalog/templeton/TableLikeDesc.java]
          [parsing completed 0ms]
          [parsing started /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/src/main/java/org/apache/hive/hcatalog/templeton/TableDesc.java]
          [parsing completed 12ms]
          [parsing started /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/src/main/java/org/apache/hive/hcatalog/templeton/WadlConfig.java]
          [parsing completed 0ms]
          [parsing started /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/src/main/java/org/apache/hive/hcatalog/templeton/BusyException.java]
          [parsing completed 1ms]
          [parsing started /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/src/main/java/org/apache/hive/hcatalog/templeton/ExecService.java]
          [parsing completed 1ms]
          [parsing started /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/src/main/java/org/apache/hive/hcatalog/templeton/NotAuthorizedException.java]
          [parsing completed 0ms]
          [parsing started /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/src/main/java/org/apache/hive/hcatalog/templeton/EnqueueBean.java]
          [parsing completed 1ms]
          [parsing started /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/src/main/java/org/apache/hive/hcatalog/templeton/SimpleWebException.java]
          [parsing completed 1ms]
          [parsing started /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/src/main/java/org/apache/hive/hcatalog/templeton/ListDelegator.java]
          [parsing completed 5ms]
          [parsing started /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/src/main/java/org/apache/hive/hcatalog/templeton/Main.java]
          [parsing completed 11ms]
          [parsing started /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/src/main/java/org/apache/hive/hcatalog/templeton/SecureProxySupport.java]
          [parsing completed 13ms]
          [parsing started /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/src/main/java/org/apache/hive/hcatalog/templeton/MaxByteArrayOutputStream.java]
          [parsing completed 1ms]
          [parsing started /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/src/main/java/org/apache/hive/hcatalog/templeton/ProxyUserSupport.java]
          [parsing completed 7ms]
          [parsing started /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/src/main/java/org/apache/hive/hcatalog/templeton/UgiFactory.java]
          [parsing completed 1ms]
          [parsing started /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/src/main/java/org/apache/hive/hcatalog/templeton/CallbackFailedException.java]
          [parsing completed 0ms]
          [parsing started /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/src/main/java/org/apache/hive/hcatalog/templeton/TablePropertyDesc.java]
          [parsing completed 0ms]
          Loading source files for package org.apache.hive.hcatalog.templeton.tool...
          [parsing started /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/src/main/java/org/apache/hive/hcatalog/templeton/tool/ZooKeeperStorage.java]
          [parsing completed 14ms]
          [parsing started /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/src/main/java/org/apache/hive/hcatalog/templeton/tool/NullRecordReader.java]
          [parsing completed 1ms]
          [parsing started /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/src/main/java/org/apache/hive/hcatalog/templeton/tool/PigJobIDParser.java]
          [parsing completed 0ms]
          [parsing started /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/src/main/java/org/apache/hive/hcatalog/templeton/tool/TempletonStorage.java]
          [parsing completed 4ms]
          [parsing started /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/src/main/java/org/apache/hive/hcatalog/templeton/tool/JobIDParser.java]
          [parsing completed 1ms]
          [parsing started /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/src/main/java/org/apache/hive/hcatalog/templeton/tool/JobState.java]
          [parsing completed 13ms]
          [parsing started /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/src/main/java/org/apache/hive/hcatalog/templeton/tool/TempletonUtils.java]
          [parsing completed 18ms]
          [parsing started /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/src/main/java/org/apache/hive/hcatalog/templeton/tool/HDFSStorage.java]
          [parsing completed 13ms]
          [parsing started /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/src/main/java/org/apache/hive/hcatalog/templeton/tool/DelegationTokenCache.java]
          [parsing completed 6ms]
          [parsing started /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/src/main/java/org/apache/hive/hcatalog/templeton/tool/JarJobIDParser.java]
          [parsing completed 1ms]
          [parsing started /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/src/main/java/org/apache/hive/hcatalog/templeton/tool/JobSubmissionConstants.java]
          [parsing completed 1ms]
          [parsing started /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/src/main/java/org/apache/hive/hcatalog/templeton/tool/ZooKeeperCleanup.java]
          [parsing completed 5ms]
          [parsing started /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/src/main/java/org/apache/hive/hcatalog/templeton/tool/JobStateTracker.java]
          [parsing completed 13ms]
          [parsing started /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/src/main/java/org/apache/hive/hcatalog/templeton/tool/NotFoundException.java]
          [parsing completed 0ms]
          [parsing started /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/src/main/java/org/apache/hive/hcatalog/templeton/tool/SingleInputFormat.java]
          [parsing completed 0ms]
          [parsing started /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/src/main/java/org/apache/hive/hcatalog/templeton/tool/LogRetriever.java]
          [parsing completed 10ms]
          [parsing started /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/src/main/java/org/apache/hive/hcatalog/templeton/tool/NullSplit.java]
          [parsing completed 2ms]
          [parsing started /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/src/main/java/org/apache/hive/hcatalog/templeton/tool/HDFSCleanup.java]
          [parsing completed 4ms]
          [parsing started /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/src/main/java/org/apache/hive/hcatalog/templeton/tool/TempletonControllerJob.java]
          [parsing completed 7ms]
          [parsing started /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/src/main/java/org/apache/hive/hcatalog/templeton/tool/HiveJobIDParser.java]
          [parsing completed 1ms]
          [parsing started /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/src/main/java/org/apache/hive/hcatalog/templeton/tool/LaunchMapper.java]
          [parsing completed 9ms]
          [parsing started /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/src/main/java/org/apache/hive/hcatalog/templeton/tool/TrivialExecService.java]
          [parsing completed 2ms]
          Constructing Javadoc information...
          [search path for source files: /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/src/main/java]
          [search path for class files: /usr/java/jdk1.6.0_34/jre/lib/resources.jar,/usr/java/jdk1.6.0_34/jre/lib/rt.jar,/usr/java/jdk1.6.0_34/jre/lib/sunrsasign.jar,/usr/java/jdk1.6.0_34/jre/lib/jsse.jar,/usr/java/jdk1.6.0_34/jre/lib/jce.jar,/usr/java/jdk1.6.0_34/jre/lib/charsets.jar,/usr/java/jdk1.6.0_34/jre/lib/modules/jdk.boot.jar,/usr/java/jdk1.6.0_34/jre/classes,/usr/java/jdk1.6.0_34/jre/lib/ext/localedata.jar,/usr/java/jdk1.6.0_34/jre/lib/ext/sunpkcs11.jar,/usr/java/jdk1.6.0_34/jre/lib/ext/sunjce_provider.jar,/usr/java/jdk1.6.0_34/jre/lib/ext/dnsns.jar,/data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/target/classes,/data/hive-ptest/working/maven/org/datanucleus/datanucleus-api-jdo/3.2.1/datanucleus-api-jdo-3.2.1.jar,/data/hive-ptest/working/maven/org/antlr/stringtemplate/3.2.1/stringtemplate-3.2.1.jar,/data/hive-ptest/working/maven/xml-apis/xml-apis/1.3.04/xml-apis-1.3.04.jar,/data/hive-ptest/working/maven/com/sun/jersey/jersey-json/1.14/jersey-json-1.14.jar,/data/hive-ptest/working/maven/org/apache/zookeeper/zookeeper/3.4.5/zookeeper-3.4.5.jar,/data/hive-ptest/working/maven/commons-net/commons-net/1.4.1/commons-net-1.4.1.jar,/data/hive-ptest/working/maven/javax/mail/mail/1.4.1/mail-1.4.1.jar,/data/hive-ptest/working/maven/javax/mail/mail/1.4.1/activation.jar,/data/hive-ptest/working/maven/org/datanucleus/datanucleus-rdbms/3.2.1/datanucleus-rdbms-3.2.1.jar,/data/hive-ptest/working/maven/commons-httpclient/commons-httpclient/3.0.1/commons-httpclient-3.0.1.jar,/data/hive-ptest/working/apache-svn-trunk-source/hcatalog/core/target/classes,/data/hive-ptest/working/maven/org/codehaus/jackson/jackson-core-asl/1.9.2/jackson-core-asl-1.9.2.jar,/data/hive-ptest/working/maven/org/apache/thrift/libthrift/0.9.0/libthrift-0.9.0.jar,/data/hive-ptest/working/maven/org/eclipse/jetty/aggregate/jetty-all-server/7.6.0.v20120127/jetty-all-server-7.6.0.v20120127.jar,/data/hive-ptest/working/maven/xerces/xercesImpl/2.9.1/xercesImpl-2.9.1.jar,/data/hive-ptest/working/maven/antlr/antlr/2.7.7/antlr-2.7.7.jar,/data/hive-ptest/working/maven/commons-configuration/commons-configuration/1.6/commons-configuration-1.6.jar,/data/hive-ptest/working/maven/com/googlecode/javaewah/JavaEWAH/0.3.2/JavaEWAH-0.3.2.jar,/data/hive-ptest/working/maven/org/apache/httpcomponents/httpclient/4.2.5/httpclient-4.2.5.jar,/data/hive-ptest/working/apache-svn-trunk-source/serde/target/classes,/data/hive-ptest/working/maven/javax/servlet/servlet-api/2.5/servlet-api-2.5.jar,/data/hive-ptest/working/maven/org/apache/velocity/velocity/1.5/velocity-1.5.jar,/data/hive-ptest/working/maven/com/jolbox/bonecp/0.7.1.RELEASE/bonecp-0.7.1.RELEASE.jar,/data/hive-ptest/working/maven/javax/xml/stream/stax-api/1.0-2/stax-api-1.0-2.jar,/data/hive-ptest/working/maven/org/slf4j/slf4j-log4j12/1.7.5/slf4j-log4j12-1.7.5.jar,/data/hive-ptest/working/maven/commons-lang/commons-lang/2.4/commons-lang-2.4.jar,/data/hive-ptest/working/maven/javax/xml/bind/jaxb-api/2.2.2/jaxb-api-2.2.2.jar,/data/hive-ptest/working/maven/com/sun/jersey/jersey-core/1.14/jersey-core-1.14.jar,/data/hive-ptest/working/maven/org/apache/commons/commons-math/2.1/commons-math-2.1.jar,/data/hive-ptest/working/maven/org/apache/httpcomponents/httpcore/4.2.4/httpcore-4.2.4.jar,/data/hive-ptest/working/maven/org/xerial/snappy/snappy-java/1.0.5/snappy-java-1.0.5.jar,/data/hive-ptest/working/maven/commons-collections/commons-collections/3.2.1/commons-collections-3.2.1.jar,/data/hive-ptest/working/maven/org/antlr/ST4/4.0.4/ST4-4.0.4.jar,/data/hive-ptest/working/maven/org/apache/commons/commons-exec/1.1/commons-exec-1.1.jar,/data/hive-ptest/working/maven/com/google/guava/guava/11.0.2/guava-11.0.2.jar,/data/hive-ptest/working/maven/org/datanucleus/datanucleus-core/3.2.2/datanucleus-core-3.2.2.jar,/data/hive-ptest/working/maven/org/apache/hadoop/hadoop-core/1.2.1/hadoop-core-1.2.1.jar,/data/hive-ptest/working/maven/org/tukaani/xz/1.0/xz-1.0.jar,/data/hive-ptest/working/maven/org/mortbay/jetty/servlet-api-2.5/6.1.14/servlet-api-2.5-6.1.14.jar,/data/hive-ptest/working/maven/javax/activation/activation/1.1/activation-1.1.jar,/data/hive-ptest/working/maven/org/codehaus/jackson/jackson-jaxrs/1.9.2/jackson-jaxrs-1.9.2.jar,/data/hive-ptest/working/maven/stax/stax-api/1.0.1/stax-api-1.0.1.jar,/data/hive-ptest/working/maven/org/codehaus/jettison/jettison/1.1/jettison-1.1.jar,/data/hive-ptest/working/maven/com/google/protobuf/protobuf-java/2.5.0/protobuf-java-2.5.0.jar,/data/hive-ptest/working/maven/org/apache/commons/commons-compress/1.4.1/commons-compress-1.4.1.jar,/data/hive-ptest/working/maven/org/antlr/antlr-runtime/3.4/antlr-runtime-3.4.jar,/data/hive-ptest/working/maven/com/sun/jersey/jersey-server/1.14/jersey-server-1.14.jar,/usr/java/jdk1.6.0_34/jre/../lib/tools.jar,/data/hive-ptest/working/maven/org/apache/ant/ant/1.9.1/ant-1.9.1.jar,/data/hive-ptest/working/maven/io/netty/netty/3.4.0.Final/netty-3.4.0.Final.jar,/data/hive-ptest/working/maven/org/slf4j/jul-to-slf4j/1.7.5/jul-to-slf4j-1.7.5.jar,/data/hive-ptest/working/maven/com/sun/jersey/contribs/wadl-resourcedoc-doclet/1.4/wadl-resourcedoc-doclet-1.4.jar,/data/hive-ptest/working/maven/org/mortbay/jetty/jetty/6.1.26/jetty-6.1.26.jar,/data/hive-ptest/working/maven/org/apache/avro/avro-mapred/1.7.5/avro-mapred-1.7.5.jar,/data/hive-ptest/working/maven/oro/oro/2.0.8/oro-2.0.8.jar,/data/hive-ptest/working/maven/org/eclipse/jdt/core/3.1.1/core-3.1.1.jar,/data/hive-ptest/working/maven/javax/jdo/jdo-api/3.0.1/jdo-api-3.0.1.jar,/data/hive-ptest/working/maven/javax/transaction/jta/1.1/jta-1.1.jar,/data/hive-ptest/working/maven/log4j/log4j/1.2.16/log4j-1.2.16.jar,/data/hive-ptest/working/apache-svn-trunk-source/common/target/classes,/data/hive-ptest/working/maven/org/apache/avro/avro-ipc/1.7.5/avro-ipc-1.7.5-tests.jar,/data/hive-ptest/working/apache-svn-trunk-source/ql/target/classes,/data/hive-ptest/working/maven/org/apache/geronimo/specs/geronimo-annotation_1.0_spec/1.1.1/geronimo-annotation_1.0_spec-1.1.1.jar,/data/hive-ptest/working/maven/org/mortbay/jetty/servlet-api/2.5-20081211/servlet-api-2.5-20081211.jar,/data/hive-ptest/working/maven/org/apache/avro/avro-ipc/1.7.5/avro-ipc-1.7.5.jar,/data/hive-ptest/working/maven/javolution/javolution/5.5.1/javolution-5.5.1.jar,/data/hive-ptest/working/maven/net/java/dev/jets3t/jets3t/0.6.1/jets3t-0.6.1.jar,/data/hive-ptest/working/maven/com/sun/jersey/jersey-servlet/1.14/jersey-servlet-1.14.jar,/data/hive-ptest/working/maven/commons-logging/commons-logging/1.1.3/commons-logging-1.1.3.jar,/data/hive-ptest/working/maven/com/google/code/findbugs/jsr305/1.3.9/jsr305-1.3.9.jar,/data/hive-ptest/working/maven/xmlenc/xmlenc/0.52/xmlenc-0.52.jar,/data/hive-ptest/working/maven/org/mortbay/jetty/jsp-2.1/6.1.14/jsp-2.1-6.1.14.jar,/data/hive-ptest/working/maven/commons-el/commons-el/1.0/commons-el-1.0.jar,/data/hive-ptest/working/maven/com/esotericsoftware/kryo/kryo/2.22/kryo-2.22.jar,/data/hive-ptest/working/maven/jline/jline/0.9.94/jline-0.9.94.jar,/data/hive-ptest/working/maven/org/mortbay/jetty/jetty-util/6.1.26/jetty-util-6.1.26.jar,/data/hive-ptest/working/maven/org/slf4j/slf4j-api/1.7.5/slf4j-api-1.7.5.jar,/data/hive-ptest/working/maven/org/codehaus/jackson/jackson-xc/1.9.2/jackson-xc-1.9.2.jar,/data/hive-ptest/working/maven/commons-beanutils/commons-beanutils-core/1.8.0/commons-beanutils-core-1.8.0.jar,/data/hive-ptest/working/maven/ant/ant/1.6.5/ant-1.6.5.jar,/data/hive-ptest/working/apache-svn-trunk-source/cli/target/classes,/data/hive-ptest/working/maven/org/apache/thrift/libfb303/0.9.0/libfb303-0.9.0.jar,/data/hive-ptest/working/maven/org/codehaus/groovy/groovy-all/2.1.6/groovy-all-2.1.6.jar,/data/hive-ptest/working/maven/org/apache/derby/derby/10.4.2.0/derby-10.4.2.0.jar,/data/hive-ptest/working/maven/org/apache/derby/derby/10.4.2.0/derbyLocale_cs.jar,/data/hive-ptest/working/maven/org/apache/derby/derby/10.4.2.0/derbyLocale_de_DE.jar,/data/hive-ptest/working/maven/org/apache/derby/derby/10.4.2.0/derbyLocale_es.jar,/data/hive-ptest/working/maven/org/apache/derby/derby/10.4.2.0/derbyLocale_fr.jar,/data/hive-ptest/working/maven/org/apache/derby/derby/10.4.2.0/derbyLocale_hu.jar,/data/hive-ptest/working/maven/org/apache/derby/derby/10.4.2.0/derbyLocale_it.jar,/data/hive-ptest/working/maven/org/apache/derby/derby/10.4.2.0/derbyLocale_ja_JP.jar,/data/hive-ptest/working/maven/org/apache/derby/derby/10.4.2.0/derbyLocale_ko_KR.jar,/data/hive-ptest/working/maven/org/apache/derby/derby/10.4.2.0/derbyLocale_pl.jar,/data/hive-ptest/working/maven/org/apache/derby/derby/10.4.2.0/derbyLocale_pt_BR.jar,/data/hive-ptest/working/maven/org/apache/derby/derby/10.4.2.0/derbyLocale_ru.jar,/data/hive-ptest/working/maven/org/apache/derby/derby/10.4.2.0/derbyLocale_zh_CN.jar,/data/hive-ptest/working/maven/org/apache/derby/derby/10.4.2.0/derbyLocale_zh_TW.jar,/data/hive-ptest/working/apache-svn-trunk-source/metastore/target/classes,/data/hive-ptest/working/maven/asm/asm-commons/3.1/asm-commons-3.1.jar,/data/hive-ptest/working/maven/org/iq80/snappy/snappy/0.2/snappy-0.2.jar,/data/hive-ptest/working/apache-svn-trunk-source/service/target/classes,/data/hive-ptest/working/maven/com/sun/xml/bind/jaxb-impl/2.2.3-1/jaxb-impl-2.2.3-1.jar,/data/hive-ptest/working/maven/com/sun/xml/bind/jaxb-impl/2.2.3-1/jaxb-api.jar,/data/hive-ptest/working/maven/com/sun/xml/bind/jaxb-impl/2.2.3-1/activation.jar,/data/hive-ptest/working/maven/com/sun/xml/bind/jaxb-impl/2.2.3-1/jsr173_1.0_api.jar,/data/hive-ptest/working/maven/com/sun/xml/bind/jaxb-impl/2.2.3-1/jaxb1-impl.jar,/data/hive-ptest/working/maven/commons-beanutils/commons-beanutils/1.7.0/commons-beanutils-1.7.0.jar,/data/hive-ptest/working/maven/org/apache/hadoop/hadoop-tools/1.2.1/hadoop-tools-1.2.1.jar,/data/hive-ptest/working/maven/asm/asm-tree/3.1/asm-tree-3.1.jar,/data/hive-ptest/working/maven/com/thoughtworks/paranamer/paranamer/2.2/paranamer-2.2.jar,/data/hive-ptest/working/maven/commons-io/commons-io/2.4/commons-io-2.4.jar,/data/hive-ptest/working/maven/org/codehaus/jackson/jackson-mapper-asl/1.9.2/jackson-mapper-asl-1.9.2.jar,/data/hive-ptest/working/maven/tomcat/jasper-runtime/5.5.12/jasper-runtime-5.5.12.jar,/data/hive-ptest/working/maven/org/apache/avro/avro/1.7.5/avro-1.7.5.jar,/data/hive-ptest/working/maven/commons-digester/commons-digester/1.8/commons-digester-1.8.jar,/data/hive-ptest/working/maven/org/apache/geronimo/specs/geronimo-jaspic_1.0_spec/1.0/geronimo-jaspic_1.0_spec-1.0.jar,/data/hive-ptest/working/maven/org/mortbay/jetty/jsp-api-2.1/6.1.14/jsp-api-2.1-6.1.14.jar,/data/hive-ptest/working/maven/hsqldb/hsqldb/1.8.0.10/hsqldb-1.8.0.10.jar,/data/hive-ptest/working/maven/commons-cli/commons-cli/1.2/commons-cli-1.2.jar,/data/hive-ptest/working/apache-svn-trunk-source/shims/assembly/target/classes,/data/hive-ptest/working/maven/org/apache/geronimo/specs/geronimo-jta_1.1_spec/1.1.1/geronimo-jta_1.1_spec-1.1.1.jar,/data/hive-ptest/working/maven/org/json/json/20090211/json-20090211.jar,/data/hive-ptest/working/maven/org/apache/ant/ant-launcher/1.9.1/ant-launcher-1.9.1.jar,/data/hive-ptest/working/maven/tomcat/jasper-compiler/5.5.12/jasper-compiler-5.5.12.jar,/data/hive-ptest/working/apache-svn-trunk-source/ant/target/classes,/data/hive-ptest/working/maven/asm/asm/3.1/asm-3.1.jar,/data/hive-ptest/working/maven/commons-codec/commons-codec/1.4/commons-codec-1.4.jar]
          [loading javax/ws/rs/core/Response.class(javax/ws/rs/core:Response.class)]
          [loading javax/ws/rs/ext/ExceptionMapper.class(javax/ws/rs/ext:ExceptionMapper.class)]
          [loading javax/ws/rs/ext/Provider.class(javax/ws/rs/ext:Provider.class)]
          [loading java/io/IOException.class(java/io:IOException.class)]
          [loading java/util/Map.class(java/util:Map.class)]
          [loading java/util/HashMap.class(java/util:HashMap.class)]
          [loading javax/ws/rs/core/MediaType.class(javax/ws/rs/core:MediaType.class)]
          [loading org/codehaus/jackson/map/ObjectMapper.class(org/codehaus/jackson/map:ObjectMapper.class)]
          [loading java/lang/Throwable.class(java/lang:Throwable.class)]
          [loading java/io/Serializable.class(java/io:Serializable.class)]
          [loading java/lang/Object.class(java/lang:Object.class)]
          [loading java/lang/String.class(java/lang:String.class)]
          [loading java/io/ByteArrayOutputStream.class(java/io:ByteArrayOutputStream.class)]
          [loading /data/hive-ptest/working/apache-svn-trunk-source/ql/target/classes/org/apache/hadoop/hive/ql/ErrorMsg.class]
          [loading org/eclipse/jetty/http/HttpStatus.class(org/eclipse/jetty/http:HttpStatus.class)]
          [loading java/lang/Integer.class(java/lang:Integer.class)]
          [loading org/apache/hadoop/mapred/JobStatus.class(org/apache/hadoop/mapred:JobStatus.class)]
          [loading org/apache/hadoop/mapred/JobProfile.class(org/apache/hadoop/mapred:JobProfile.class)]
          [loading java/lang/Long.class(java/lang:Long.class)]
          [loading java/util/ArrayList.class(java/util:ArrayList.class)]
          [loading java/util/List.class(java/util:List.class)]
          [loading org/apache/commons/logging/Log.class(org/apache/commons/logging:Log.class)]
          [loading org/apache/commons/logging/LogFactory.class(org/apache/commons/logging:LogFactory.class)]
          [loading org/apache/hadoop/conf/Configuration.class(org/apache/hadoop/conf:Configuration.class)]
          [loading java/lang/Enum.class(java/lang:Enum.class)]
          [loading java/lang/Comparable.class(java/lang:Comparable.class)]
          [loading java/lang/Exception.class(java/lang:Exception.class)]
          [loading java/io/FileNotFoundException.class(java/io:FileNotFoundException.class)]
          [loading java/net/URISyntaxException.class(java/net:URISyntaxException.class)]
          [loading org/apache/commons/exec/ExecuteException.class(org/apache/commons/exec:ExecuteException.class)]
          [loading java/security/PrivilegedExceptionAction.class(java/security:PrivilegedExceptionAction.class)]
          [loading org/apache/hadoop/fs/Path.class(org/apache/hadoop/fs:Path.class)]
          [loading /data/hive-ptest/working/apache-svn-trunk-source/common/target/classes/org/apache/hadoop/hive/conf/HiveConf.class]
          [loading org/apache/hadoop/security/UserGroupInformation.class(org/apache/hadoop/security:UserGroupInformation.class)]
          [loading org/apache/hadoop/util/StringUtils.class(org/apache/hadoop/util:StringUtils.class)]
          [loading org/apache/hadoop/util/ToolRunner.class(org/apache/hadoop/util:ToolRunner.class)]
          [loading java/io/File.class(java/io:File.class)]
          [loading java/net/URL.class(java/net:URL.class)]
          [loading org/apache/hadoop/util/VersionInfo.class(org/apache/hadoop/util:VersionInfo.class)]
          [loading java/lang/Iterable.class(java/lang:Iterable.class)]
          [loading org/apache/hadoop/io/Writable.class(org/apache/hadoop/io:Writable.class)]
          [loading java/lang/InterruptedException.class(java/lang:InterruptedException.class)]
          [loading java/io/BufferedReader.class(java/io:BufferedReader.class)]
          [loading java/io/InputStream.class(java/io:InputStream.class)]
          [loading java/io/InputStreamReader.class(java/io:InputStreamReader.class)]
          [loading java/io/OutputStream.class(java/io:OutputStream.class)]
          [loading java/io/PrintWriter.class(java/io:PrintWriter.class)]
          [loading java/util/Map$Entry.class(java/util:Map$Entry.class)]
          [loading java/util/concurrent/Semaphore.class(java/util/concurrent:Semaphore.class)]
          [loading org/apache/commons/exec/CommandLine.class(org/apache/commons/exec:CommandLine.class)]
          [loading org/apache/commons/exec/DefaultExecutor.class(org/apache/commons/exec:DefaultExecutor.class)]
          [loading org/apache/commons/exec/ExecuteWatchdog.class(org/apache/commons/exec:ExecuteWatchdog.class)]
          [loading org/apache/commons/exec/PumpStreamHandler.class(org/apache/commons/exec:PumpStreamHandler.class)]
          [loading org/apache/hadoop/util/Shell.class(org/apache/hadoop/util:Shell.class)]
          [loading java/lang/Thread.class(java/lang:Thread.class)]
          [loading java/lang/Runnable.class(java/lang:Runnable.class)]
          [loading org/apache/hadoop/mapred/JobID.class(org/apache/hadoop/mapred:JobID.class)]
          [loading java/util/Arrays.class(java/util:Arrays.class)]
          [loading javax/xml/bind/annotation/XmlRootElement.class(javax/xml/bind/annotation:XmlRootElement.class)]
          [loading java/net/InetAddress.class(java/net:InetAddress.class)]
          [loading java/net/UnknownHostException.class(java/net:UnknownHostException.class)]
          [loading java/text/MessageFormat.class(java/text:MessageFormat.class)]
          [loading java/util/Collections.class(java/util:Collections.class)]
          [loading java/util/regex/Matcher.class(java/util/regex:Matcher.class)]
          [loading java/util/regex/Pattern.class(java/util/regex:Pattern.class)]
          [loading javax/servlet/http/HttpServletRequest.class(javax/servlet/http:HttpServletRequest.class)]
          [loading javax/ws/rs/DELETE.class(javax/ws/rs:DELETE.class)]
          [loading javax/ws/rs/FormParam.class(javax/ws/rs:FormParam.class)]
          [loading javax/ws/rs/GET.class(javax/ws/rs:GET.class)]
          [loading javax/ws/rs/POST.class(javax/ws/rs:POST.class)]
          [loading javax/ws/rs/PUT.class(javax/ws/rs:PUT.class)]
          [loading javax/ws/rs/Path.class(javax/ws/rs:Path.class)]
          [loading javax/ws/rs/PathParam.class(javax/ws/rs:PathParam.class)]
          [loading javax/ws/rs/Produces.class(javax/ws/rs:Produces.class)]
          [loading javax/ws/rs/QueryParam.class(javax/ws/rs:QueryParam.class)]
          [loading javax/ws/rs/core/Context.class(javax/ws/rs/core:Context.class)]
          [loading javax/ws/rs/core/SecurityContext.class(javax/ws/rs/core:SecurityContext.class)]
          [loading javax/ws/rs/core/UriInfo.class(javax/ws/rs/core:UriInfo.class)]
          [loading org/apache/hadoop/security/authentication/client/PseudoAuthenticator.class(org/apache/hadoop/security/authentication/client:PseudoAuthenticator.class)]
          [loading com/sun/jersey/api/NotFoundException.class(com/sun/jersey/api:NotFoundException.class)]
          [loading java/net/URI.class(java/net:URI.class)]
          [loading org/apache/commons/lang/StringUtils.class(org/apache/commons/lang:StringUtils.class)]
          [loading org/apache/hadoop/fs/FileStatus.class(org/apache/hadoop/fs:FileStatus.class)]
          [loading org/apache/hadoop/fs/FileSystem.class(org/apache/hadoop/fs:FileSystem.class)]
          [loading java/util/Date.class(java/util:Date.class)]
          [loading /data/hive-ptest/working/apache-svn-trunk-source/common/target/classes/org/apache/hadoop/hive/common/classification/InterfaceAudience.class]
          [loading /data/hive-ptest/working/apache-svn-trunk-source/metastore/target/classes/org/apache/hadoop/hive/metastore/HiveMetaStoreClient.class]
          [loading /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/core/target/classes/org/apache/hive/hcatalog/common/HCatUtil.class]
          [loading /data/hive-ptest/working/apache-svn-trunk-source/common/target/classes/org/apache/hadoop/hive/common/classification/InterfaceAudience$Private.class]
          [loading com/sun/jersey/api/wadl/config/WadlGeneratorConfig.class(com/sun/jersey/api/wadl/config:WadlGeneratorConfig.class)]
          [loading com/sun/jersey/api/wadl/config/WadlGeneratorDescription.class(com/sun/jersey/api/wadl/config:WadlGeneratorDescription.class)]
          [loading com/sun/jersey/server/wadl/generators/resourcedoc/WadlGeneratorResourceDocSupport.class(com/sun/jersey/server/wadl/generators/resourcedoc:WadlGeneratorResourceDocSupport.class)]
          [loading com/sun/jersey/api/core/PackagesResourceConfig.class(com/sun/jersey/api/core:PackagesResourceConfig.class)]
          [loading com/sun/jersey/spi/container/servlet/ServletContainer.class(com/sun/jersey/spi/container/servlet:ServletContainer.class)]
          [loading /data/hive-ptest/working/apache-svn-trunk-source/common/target/classes/org/apache/hadoop/hive/common/classification/InterfaceStability.class]
          [loading org/apache/hadoop/hdfs/web/AuthFilter.class(org/apache/hadoop/hdfs/web:AuthFilter.class)]
          [loading org/apache/hadoop/util/GenericOptionsParser.class(org/apache/hadoop/util:GenericOptionsParser.class)]
          [loading org/eclipse/jetty/rewrite/handler/RedirectPatternRule.class(org/eclipse/jetty/rewrite/handler:RedirectPatternRule.class)]
          [loading org/eclipse/jetty/rewrite/handler/RewriteHandler.class(org/eclipse/jetty/rewrite/handler:RewriteHandler.class)]
          [loading org/eclipse/jetty/server/Handler.class(org/eclipse/jetty/server:Handler.class)]
          [loading org/eclipse/jetty/server/Server.class(org/eclipse/jetty/server:Server.class)]
          [loading org/eclipse/jetty/server/handler/HandlerList.class(org/eclipse/jetty/server/handler:HandlerList.class)]
          [loading org/eclipse/jetty/servlet/FilterHolder.class(org/eclipse/jetty/servlet:FilterHolder.class)]
          [loading org/eclipse/jetty/servlet/FilterMapping.class(org/eclipse/jetty/servlet:FilterMapping.class)]
          [loading org/eclipse/jetty/servlet/ServletContextHandler.class(org/eclipse/jetty/servlet:ServletContextHandler.class)]
          [loading org/eclipse/jetty/servlet/ServletHolder.class(org/eclipse/jetty/servlet:ServletHolder.class)]
          [loading org/slf4j/bridge/SLF4JBridgeHandler.class(org/slf4j/bridge:SLF4JBridgeHandler.class)]
          [loading /data/hive-ptest/working/apache-svn-trunk-source/common/target/classes/org/apache/hadoop/hive/common/classification/InterfaceAudience$LimitedPrivate.class]
          [loading /data/hive-ptest/working/apache-svn-trunk-source/common/target/classes/org/apache/hadoop/hive/common/classification/InterfaceStability$Unstable.class]
          [loading /data/hive-ptest/working/apache-svn-trunk-source/metastore/target/classes/org/apache/hadoop/hive/metastore/api/MetaException.class]
          [loading org/apache/hadoop/io/Text.class(org/apache/hadoop/io:Text.class)]
          [loading org/apache/hadoop/security/Credentials.class(org/apache/hadoop/security:Credentials.class)]
          [loading org/apache/hadoop/security/token/Token.class(org/apache/hadoop/security/token:Token.class)]
          [loading org/apache/thrift/TException.class(org/apache/thrift:TException.class)]
          [loading java/io/Closeable.class(java/io:Closeable.class)]
          [loading java/io/Flushable.class(java/io:Flushable.class)]
          [loading org/apache/hadoop/security/Groups.class(org/apache/hadoop/security:Groups.class)]
          [loading java/util/HashSet.class(java/util:HashSet.class)]
          [loading java/util/Set.class(java/util:Set.class)]
          [loading java/util/concurrent/ConcurrentHashMap.class(java/util/concurrent:ConcurrentHashMap.class)]
          [loading java/io/UnsupportedEncodingException.class(java/io:UnsupportedEncodingException.class)]
          [loading org/apache/zookeeper/CreateMode.class(org/apache/zookeeper:CreateMode.class)]
          [loading org/apache/zookeeper/KeeperException.class(org/apache/zookeeper:KeeperException.class)]
          [loading org/apache/zookeeper/WatchedEvent.class(org/apache/zookeeper:WatchedEvent.class)]
          [loading org/apache/zookeeper/Watcher.class(org/apache/zookeeper:Watcher.class)]
          [loading org/apache/zookeeper/ZooDefs.class(org/apache/zookeeper:ZooDefs.class)]
          [loading org/apache/zookeeper/ZooDefs$Ids.class(org/apache/zookeeper:ZooDefs$Ids.class)]
          [loading org/apache/zookeeper/ZooKeeper.class(org/apache/zookeeper:ZooKeeper.class)]
          [loading org/apache/hadoop/io/NullWritable.class(org/apache/hadoop/io:NullWritable.class)]
          [loading org/apache/hadoop/mapreduce/InputSplit.class(org/apache/hadoop/mapreduce:InputSplit.class)]
          [loading org/apache/hadoop/mapreduce/RecordReader.class(org/apache/hadoop/mapreduce:RecordReader.class)]
          [loading org/apache/hadoop/mapreduce/TaskAttemptContext.class(org/apache/hadoop/mapreduce:TaskAttemptContext.class)]
          [loading java/net/URLConnection.class(java/net:URLConnection.class)]
          [loading java/util/Collection.class(java/util:Collection.class)]
          [loading javax/ws/rs/core/UriBuilder.class(javax/ws/rs/core:UriBuilder.class)]
          [loading java/io/OutputStreamWriter.class(java/io:OutputStreamWriter.class)]
          [loading /data/hive-ptest/working/apache-svn-trunk-source/common/target/classes/org/apache/hadoop/hive/common/classification/InterfaceStability$Evolving.class]
          [loading org/apache/zookeeper/data/Stat.class(org/apache/zookeeper/data:Stat.class)]
          [loading org/apache/hadoop/mapreduce/InputFormat.class(org/apache/hadoop/mapreduce:InputFormat.class)]
          [loading org/apache/hadoop/mapreduce/JobContext.class(org/apache/hadoop/mapreduce:JobContext.class)]
          [loading org/apache/hadoop/mapred/JobClient.class(org/apache/hadoop/mapred:JobClient.class)]
          [loading org/apache/hadoop/mapred/JobConf.class(org/apache/hadoop/mapred:JobConf.class)]
          [loading org/apache/hadoop/mapred/RunningJob.class(org/apache/hadoop/mapred:RunningJob.class)]
          [loading java/io/DataInput.class(java/io:DataInput.class)]
          [loading java/io/DataOutput.class(java/io:DataOutput.class)]
          [loading org/apache/hadoop/conf/Configured.class(org/apache/hadoop/conf:Configured.class)]
          [loading org/apache/hadoop/fs/permission/FsPermission.class(org/apache/hadoop/fs/permission:FsPermission.class)]
          [loading org/apache/hadoop/mapreduce/Job.class(org/apache/hadoop/mapreduce:Job.class)]
          [loading org/apache/hadoop/mapreduce/JobID.class(org/apache/hadoop/mapreduce:JobID.class)]
          [loading org/apache/hadoop/mapreduce/lib/output/NullOutputFormat.class(org/apache/hadoop/mapreduce/lib/output:NullOutputFormat.class)]
          [loading org/apache/hadoop/mapreduce/security/token/delegation/DelegationTokenIdentifier.class(org/apache/hadoop/mapreduce/security/token/delegation:DelegationTokenIdentifier.class)]
          [loading org/apache/hadoop/util/Tool.class(org/apache/hadoop/util:Tool.class)]
          [loading org/apache/hadoop/conf/Configurable.class(org/apache/hadoop/conf:Configurable.class)]
          [loading java/lang/ClassNotFoundException.class(java/lang:ClassNotFoundException.class)]
          [loading org/apache/hadoop/mapreduce/Mapper.class(org/apache/hadoop/mapreduce:Mapper.class)]
          [loading java/util/Iterator.class(java/util:Iterator.class)]
          [loading java/util/LinkedList.class(java/util:LinkedList.class)]
          [loading java/util/concurrent/ExecutorService.class(java/util/concurrent:ExecutorService.class)]
          [loading java/util/concurrent/Executors.class(java/util/concurrent:Executors.class)]
          [loading java/util/concurrent/TimeUnit.class(java/util/concurrent:TimeUnit.class)]
          [loading org/apache/hadoop/mapreduce/Mapper$Context.class(org/apache/hadoop/mapreduce:Mapper$Context.class)]
          [loading java/lang/Process.class(java/lang:Process.class)]
          [loading java/lang/StringBuilder.class(java/lang:StringBuilder.class)]
          [loading java/lang/ProcessBuilder.class(java/lang:ProcessBuilder.class)]
          [loading java/lang/annotation/Target.class(java/lang/annotation:Target.class)]
          [loading java/lang/annotation/ElementType.class(java/lang/annotation:ElementType.class)]
          [loading java/lang/annotation/Retention.class(java/lang/annotation:Retention.class)]
          [loading java/lang/annotation/RetentionPolicy.class(java/lang/annotation:RetentionPolicy.class)]
          [loading java/lang/annotation/Annotation.class(java/lang/annotation:Annotation.class)]
          [loading java/lang/SuppressWarnings.class(java/lang:SuppressWarnings.class)]
          [loading java/lang/Override.class(java/lang:Override.class)]
          [loading javax/ws/rs/HttpMethod.class(javax/ws/rs:HttpMethod.class)]
          [loading java/lang/Deprecated.class(java/lang:Deprecated.class)]
          [loading /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/target/classes/org/apache/hive/hcatalog/templeton/SecureProxySupport$3.class]
          [loading /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/target/classes/org/apache/hive/hcatalog/templeton/SecureProxySupport$1.class]
          [loading /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/target/classes/org/apache/hive/hcatalog/templeton/HcatDelegator$1.class]
          [loading /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/target/classes/org/apache/hive/hcatalog/templeton/LauncherDelegator$1.class]
          [loading /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/target/classes/org/apache/hive/hcatalog/templeton/SecureProxySupport$2.class]
          [loading /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/target/classes/org/apache/hive/hcatalog/templeton/HcatException$1.class]
          [loading /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/target/classes/org/apache/hive/hcatalog/templeton/tool/TempletonControllerJob$2.class]
          [loading /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/target/classes/org/apache/hive/hcatalog/templeton/tool/TempletonControllerJob$2$1.class]
          [loading /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/target/classes/org/apache/hive/hcatalog/templeton/tool/ZooKeeperStorage$1.class]
          [loading /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/target/classes/org/apache/hive/hcatalog/templeton/tool/TempletonControllerJob$1.class]
          [loading /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/target/classes/org/apache/hive/hcatalog/templeton/tool/LogRetriever$1.class]
          [loading /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/target/classes/org/apache/hive/hcatalog/templeton/tool/ZooKeeperStorage$2.class]
          [loading /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/target/classes/org/apache/hive/hcatalog/templeton/tool/TempletonUtils$1.class]
          [loading /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/target/classes/org/apache/hive/hcatalog/templeton/tool/HDFSStorage$1.class]
          [done in 7781 ms]
          8 warnings
          [WARNING] Javadoc Warnings
          [WARNING] /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/src/main/java/org/apache/hive/hcatalog/templeton/DeleteDelegator.java:26: package org.apache.hadoop.hive.shims.HadoopShims does not exist
          [WARNING] import org.apache.hadoop.hive.shims.HadoopShims.WebHCatJTShim;
          [WARNING] ^
          [WARNING] /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/src/main/java/org/apache/hive/hcatalog/templeton/DeleteDelegator.java:27: package org.apache.hadoop.hive.shims does not exist
          [WARNING] import org.apache.hadoop.hive.shims.ShimLoader;
          [WARNING] ^
          [WARNING] /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/src/main/java/org/apache/hive/hcatalog/templeton/StatusDelegator.java:25: package org.apache.hadoop.hive.shims.HadoopShims does not exist
          [WARNING] import org.apache.hadoop.hive.shims.HadoopShims.WebHCatJTShim;
          [WARNING] ^
          [WARNING] /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/src/main/java/org/apache/hive/hcatalog/templeton/StatusDelegator.java:26: package org.apache.hadoop.hive.shims does not exist
          [WARNING] import org.apache.hadoop.hive.shims.ShimLoader;
          [WARNING] ^
          [WARNING] /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/src/main/java/org/apache/hive/hcatalog/templeton/StatusDelegator.java:72: cannot find symbol
          [WARNING] symbol  : class WebHCatJTShim
          [WARNING] location: class org.apache.hive.hcatalog.templeton.StatusDelegator
          [WARNING] static QueueStatusBean makeStatus(WebHCatJTShim tracker,
          [WARNING] ^
          [WARNING] /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/src/main/java/org/apache/hive/hcatalog/templeton/ListDelegator.java:25: package org.apache.hadoop.hive.shims.HadoopShims does not exist
          [WARNING] import org.apache.hadoop.hive.shims.HadoopShims.WebHCatJTShim;
          [WARNING] ^
          [WARNING] /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/src/main/java/org/apache/hive/hcatalog/templeton/ListDelegator.java:26: package org.apache.hadoop.hive.shims does not exist
          [WARNING] import org.apache.hadoop.hive.shims.ShimLoader;
          [WARNING] ^
          [WARNING] /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/src/main/java/org/apache/hive/hcatalog/templeton/tool/TempletonControllerJob.java:37: package org.apache.hadoop.hive.shims does not exist
          [WARNING] import org.apache.hadoop.hive.shims.ShimLoader;
          [WARNING] ^
          [WARNING] Nov 25, 2013 8:53:25 PM com.sun.jersey.wadl.resourcedoc.ResourceDoclet start
          [WARNING] INFO: Wrote /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/target/classes/resourcedoc.xml
          [INFO] 
          [INFO] --- maven-resources-plugin:2.5:testResources (default-testResources) @ hive-webhcat ---
          [debug] execute contextualize
          [INFO] Using 'UTF-8' encoding to copy filtered resources.
          [INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/src/test/resources
          [INFO] 
          [INFO] --- maven-antrun-plugin:1.7:run (setup-test-dirs) @ hive-webhcat ---
          [INFO] Executing tasks
          
          main:
             [delete] Deleting directory /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/target/tmp
             [delete] Deleting directory /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/target/warehouse
              [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/target/tmp
              [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/target/warehouse
              [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/target/tmp/conf
               [copy] Copying 4 files to /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/target/tmp/conf
          [INFO] Executed tasks
          [INFO] 
          [INFO] --- maven-compiler-plugin:3.1:testCompile (default-testCompile) @ hive-webhcat ---
          [INFO] Nothing to compile - all classes are up to date
          [INFO] 
          [INFO] --- maven-surefire-plugin:2.16:test (default-test) @ hive-webhcat ---
          [INFO]                                                                         
          [INFO] ------------------------------------------------------------------------
          [INFO] Building Hive HCatalog HBase Storage Handler 0.13.0-SNAPSHOT
          [INFO] ------------------------------------------------------------------------
          [INFO] 
          [INFO] --- build-helper-maven-plugin:1.8:add-source (add-source) @ hive-hbase-storage-handler ---
          [INFO] Source directory: /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/storage-handlers/hbase/src/gen-java added.
          [INFO] 
          [INFO] --- maven-resources-plugin:2.5:resources (default-resources) @ hive-hbase-storage-handler ---
          [debug] execute contextualize
          [INFO] Using 'UTF-8' encoding to copy filtered resources.
          [INFO] Copying 1 resource
          [INFO] 
          [INFO] --- maven-antrun-plugin:1.7:run (define-classpath) @ hive-hbase-storage-handler ---
          [INFO] Executing tasks
          
          main:
          [INFO] Executed tasks
          [INFO] 
          [INFO] --- maven-compiler-plugin:3.1:compile (default-compile) @ hive-hbase-storage-handler ---
          [INFO] Compiling 1 source file to /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/storage-handlers/hbase/target/classes
          [INFO] 
          [INFO] --- maven-resources-plugin:2.5:testResources (default-testResources) @ hive-hbase-storage-handler ---
          [debug] execute contextualize
          [INFO] Using 'UTF-8' encoding to copy filtered resources.
          [INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/storage-handlers/hbase/src/test/resources
          [INFO] 
          [INFO] --- maven-antrun-plugin:1.7:run (setup-test-dirs) @ hive-hbase-storage-handler ---
          [INFO] Executing tasks
          
          main:
             [delete] Deleting directory /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/storage-handlers/hbase/target/tmp
             [delete] Deleting directory /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/storage-handlers/hbase/target/warehouse
              [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/storage-handlers/hbase/target/tmp
              [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/storage-handlers/hbase/target/warehouse
              [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/storage-handlers/hbase/target/tmp/conf
               [copy] Copying 4 files to /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/storage-handlers/hbase/target/tmp/conf
          [INFO] Executed tasks
          [INFO] 
          [INFO] --- maven-compiler-plugin:3.1:testCompile (default-testCompile) @ hive-hbase-storage-handler ---
          [INFO] Nothing to compile - all classes are up to date
          [INFO] 
          [INFO] --- maven-surefire-plugin:2.16:test (default-test) @ hive-hbase-storage-handler ---
          [INFO]                                                                         
          [INFO] ------------------------------------------------------------------------
          [INFO] Building Hive HWI 0.13.0-SNAPSHOT
          [INFO] ------------------------------------------------------------------------
          [INFO] 
          [INFO] --- maven-resources-plugin:2.5:resources (default-resources) @ hive-hwi ---
          [debug] execute contextualize
          [INFO] Using 'UTF-8' encoding to copy filtered resources.
          [INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-trunk-source/hwi/src/main/resources
          [INFO] 
          [INFO] --- maven-antrun-plugin:1.7:run (define-classpath) @ hive-hwi ---
          [INFO] Executing tasks
          
          main:
          [INFO] Executed tasks
          [INFO] 
          [INFO] --- maven-compiler-plugin:3.1:compile (default-compile) @ hive-hwi ---
          [INFO] Nothing to compile - all classes are up to date
          [INFO] 
          [INFO] --- maven-resources-plugin:2.5:testResources (default-testResources) @ hive-hwi ---
          [debug] execute contextualize
          [INFO] Using 'UTF-8' encoding to copy filtered resources.
          [INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-trunk-source/hwi/src/test/resources
          [INFO] 
          [INFO] --- maven-antrun-plugin:1.7:run (setup-test-dirs) @ hive-hwi ---
          [INFO] Executing tasks
          
          main:
             [delete] Deleting directory /data/hive-ptest/working/apache-svn-trunk-source/hwi/target/tmp
             [delete] Deleting directory /data/hive-ptest/working/apache-svn-trunk-source/hwi/target/warehouse
              [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/hwi/target/tmp
              [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/hwi/target/warehouse
              [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/hwi/target/tmp/conf
               [copy] Copying 4 files to /data/hive-ptest/working/apache-svn-trunk-source/hwi/target/tmp/conf
          [INFO] Executed tasks
          [INFO] 
          [INFO] --- maven-compiler-plugin:3.1:testCompile (default-testCompile) @ hive-hwi ---
          [INFO] Nothing to compile - all classes are up to date
          [INFO] 
          [INFO] --- maven-surefire-plugin:2.16:test (default-test) @ hive-hwi ---
          [INFO]                                                                         
          [INFO] ------------------------------------------------------------------------
          [INFO] Building Hive ODBC 0.13.0-SNAPSHOT
          [INFO] ------------------------------------------------------------------------
          [INFO] 
          [INFO] --- maven-antrun-plugin:1.7:run (define-classpath) @ hive-odbc ---
          [INFO] Executing tasks
          
          main:
          [INFO] Executed tasks
          [INFO] 
          [INFO] --- maven-antrun-plugin:1.7:run (setup-test-dirs) @ hive-odbc ---
          [INFO] Executing tasks
          
          main:
             [delete] Deleting directory /data/hive-ptest/working/apache-svn-trunk-source/odbc/target/tmp
             [delete] Deleting directory /data/hive-ptest/working/apache-svn-trunk-source/odbc/target/warehouse
              [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/odbc/target/tmp
              [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/odbc/target/warehouse
              [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/odbc/target/tmp/conf
               [copy] Copying 4 files to /data/hive-ptest/working/apache-svn-trunk-source/odbc/target/tmp/conf
          [INFO] Executed tasks
          [INFO]                                                                         
          [INFO] ------------------------------------------------------------------------
          [INFO] Building Hive Shims Aggregator 0.13.0-SNAPSHOT
          [INFO] ------------------------------------------------------------------------
          [INFO] 
          [INFO] --- maven-antrun-plugin:1.7:run (define-classpath) @ hive-shims-aggregator ---
          [INFO] Executing tasks
          
          main:
          [INFO] Executed tasks
          [INFO] 
          [INFO] --- maven-antrun-plugin:1.7:run (setup-test-dirs) @ hive-shims-aggregator ---
          [INFO] Executing tasks
          
          main:
             [delete] Deleting directory /data/hive-ptest/working/apache-svn-trunk-source/shims/target/tmp
             [delete] Deleting directory /data/hive-ptest/working/apache-svn-trunk-source/shims/target/warehouse
              [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/shims/target/tmp
              [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/shims/target/warehouse
              [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/shims/target/tmp/conf
               [copy] Copying 4 files to /data/hive-ptest/working/apache-svn-trunk-source/shims/target/tmp/conf
          [INFO] Executed tasks
          [INFO]                                                                         
          [INFO] ------------------------------------------------------------------------
          [INFO] Building Hive TestUtils 0.13.0-SNAPSHOT
          [INFO] ------------------------------------------------------------------------
          [INFO] 
          [INFO] --- maven-resources-plugin:2.5:resources (default-resources) @ hive-testutils ---
          [debug] execute contextualize
          [INFO] Using 'UTF-8' encoding to copy filtered resources.
          [INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-trunk-source/testutils/src/main/resources
          [INFO] 
          [INFO] --- maven-antrun-plugin:1.7:run (define-classpath) @ hive-testutils ---
          [INFO] Executing tasks
          
          main:
          [INFO] Executed tasks
          [INFO] 
          [INFO] --- maven-compiler-plugin:3.1:compile (default-compile) @ hive-testutils ---
          [INFO] Nothing to compile - all classes are up to date
          [INFO] 
          [INFO] --- maven-resources-plugin:2.5:testResources (default-testResources) @ hive-testutils ---
          [debug] execute contextualize
          [INFO] Using 'UTF-8' encoding to copy filtered resources.
          [INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-trunk-source/testutils/src/test/resources
          [INFO] 
          [INFO] --- maven-antrun-plugin:1.7:run (setup-test-dirs) @ hive-testutils ---
          [INFO] Executing tasks
          
          main:
             [delete] Deleting directory /data/hive-ptest/working/apache-svn-trunk-source/testutils/target/tmp
             [delete] Deleting directory /data/hive-ptest/working/apache-svn-trunk-source/testutils/target/warehouse
              [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/testutils/target/tmp
              [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/testutils/target/warehouse
              [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/testutils/target/tmp/conf
               [copy] Copying 4 files to /data/hive-ptest/working/apache-svn-trunk-source/testutils/target/tmp/conf
          [INFO] Executed tasks
          [INFO] 
          [INFO] --- maven-compiler-plugin:3.1:testCompile (default-testCompile) @ hive-testutils ---
          [INFO] No sources to compile
          [INFO] 
          [INFO] --- maven-surefire-plugin:2.16:test (default-test) @ hive-testutils ---
          [INFO] No tests to run.
          [INFO]                                                                         
          [INFO] ------------------------------------------------------------------------
          [INFO] Building Hive Packaging 0.13.0-SNAPSHOT
          [INFO] ------------------------------------------------------------------------
          [INFO] 
          [INFO] --- maven-antrun-plugin:1.7:run (define-classpath) @ hive-packaging ---
          [INFO] Executing tasks
          
          main:
          [INFO] Executed tasks
          [INFO] 
          [INFO] --- maven-antrun-plugin:1.7:run (setup-test-dirs) @ hive-packaging ---
          [INFO] Executing tasks
          
          main:
             [delete] Deleting directory /data/hive-ptest/working/apache-svn-trunk-source/packaging/target/tmp
             [delete] Deleting directory /data/hive-ptest/working/apache-svn-trunk-source/packaging/target/warehouse
              [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/packaging/target/tmp
              [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/packaging/target/warehouse
              [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/packaging/target/tmp/conf
               [copy] Copying 4 files to /data/hive-ptest/working/apache-svn-trunk-source/packaging/target/tmp/conf
          [INFO] Executed tasks
          [INFO] ------------------------------------------------------------------------
          [INFO] Reactor Summary:
          [INFO] 
          [INFO] Hive .............................................. SUCCESS [2.037s]
          [INFO] Hive Ant Utilities ................................ SUCCESS [3.444s]
          [INFO] Hive Shims Common ................................. SUCCESS [1.184s]
          [INFO] Hive Shims 0.20 ................................... SUCCESS [1.285s]
          [INFO] Hive Shims Secure Common .......................... SUCCESS [0.847s]
          [INFO] Hive Shims 0.20S .................................. SUCCESS [0.374s]
          [INFO] Hive Shims 0.23 ................................... SUCCESS [0.942s]
          [INFO] Hive Shims ........................................ SUCCESS [0.202s]
          [INFO] Hive Common ....................................... SUCCESS [3.382s]
          [INFO] Hive Serde ........................................ SUCCESS [0.859s]
          [INFO] Hive Metastore .................................... SUCCESS [5.863s]
          [INFO] Hive Query Language ............................... SUCCESS [12.074s]
          [INFO] Hive Service ...................................... SUCCESS [0.366s]
          [INFO] Hive JDBC ......................................... SUCCESS [0.364s]
          [INFO] Hive Beeline ...................................... SUCCESS [0.445s]
          [INFO] Hive CLI .......................................... SUCCESS [0.706s]
          [INFO] Hive Contrib ...................................... SUCCESS [0.807s]
          [INFO] Hive HBase Handler ................................ SUCCESS [1.305s]
          [INFO] Hive HCatalog ..................................... SUCCESS [0.451s]
          [INFO] Hive HCatalog Core ................................ SUCCESS [0.870s]
          [INFO] Hive HCatalog Pig Adapter ......................... SUCCESS [0.697s]
          [INFO] Hive HCatalog Server Extensions ................... SUCCESS [0.816s]
          [INFO] Hive HCatalog Webhcat Java Client ................. SUCCESS [0.637s]
          [INFO] Hive HCatalog Webhcat ............................. SUCCESS [10.331s]
          [INFO] Hive HCatalog HBase Storage Handler ............... SUCCESS [0.605s]
          [INFO] Hive HWI .......................................... SUCCESS [0.441s]
          [INFO] Hive ODBC ......................................... SUCCESS [0.182s]
          [INFO] Hive Shims Aggregator ............................. SUCCESS [0.127s]
          [INFO] Hive TestUtils .................................... SUCCESS [0.127s]
          [INFO] Hive Packaging .................................... SUCCESS [0.490s]
          [INFO] ------------------------------------------------------------------------
          [INFO] BUILD SUCCESS
          [INFO] ------------------------------------------------------------------------
          [INFO] Total time: 54.864s
          [INFO] Finished at: Mon Nov 25 20:53:28 EST 2013
          [INFO] Final Memory: 38M/102M
          [INFO] ------------------------------------------------------------------------
          + cd itests
          + mvn -B clean install -DskipTests -Dmaven.repo.local=/data/hive-ptest/working/maven
          [INFO] Scanning for projects...
          [INFO] ------------------------------------------------------------------------
          [INFO] Reactor Build Order:
          [INFO] 
          [INFO] Hive Integration - Parent
          [INFO] Hive Integration - Custom Serde
          [INFO] Hive Integration - Testing Utilities
          [INFO] Hive Integration - Unit Tests
          [INFO] Hive Integration - HCatalog Unit Tests
          [INFO] Hive Integration - Test Serde
          [INFO] Hive Integration - QFile Tests
          [INFO]                                                                         
          [INFO] ------------------------------------------------------------------------
          [INFO] Building Hive Integration - Parent 0.13.0-SNAPSHOT
          [INFO] ------------------------------------------------------------------------
          [INFO] 
          [INFO] --- maven-clean-plugin:2.5:clean (default-clean) @ hive-it ---
          [INFO] Deleting /data/hive-ptest/working/apache-svn-trunk-source/itests (includes = [datanucleus.log, derby.log], excludes = [])
          [INFO] 
          [INFO] --- maven-antrun-plugin:1.7:run (define-classpath) @ hive-it ---
          [INFO] Executing tasks
          
          main:
          [INFO] Executed tasks
          [INFO] 
          [INFO] --- maven-antrun-plugin:1.7:run (setup-test-dirs) @ hive-it ---
          [INFO] Executing tasks
          
          main:
              [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/itests/target/tmp
              [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/itests/target/warehouse
              [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/itests/target/tmp/conf
               [copy] Copying 4 files to /data/hive-ptest/working/apache-svn-trunk-source/itests/target/tmp/conf
          [INFO] Executed tasks
          [INFO] 
          [INFO] --- maven-install-plugin:2.4:install (default-install) @ hive-it ---
          [INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/itests/pom.xml to /data/hive-ptest/working/maven/org/apache/hive/hive-it/0.13.0-SNAPSHOT/hive-it-0.13.0-SNAPSHOT.pom
          [INFO]                                                                         
          [INFO] ------------------------------------------------------------------------
          [INFO] Building Hive Integration - Custom Serde 0.13.0-SNAPSHOT
          [INFO] ------------------------------------------------------------------------
          [INFO] 
          [INFO] --- maven-clean-plugin:2.5:clean (default-clean) @ hive-it-custom-serde ---
          [INFO] Deleting /data/hive-ptest/working/apache-svn-trunk-source/itests/custom-serde (includes = [datanucleus.log, derby.log], excludes = [])
          [INFO] 
          [INFO] --- maven-resources-plugin:2.5:resources (default-resources) @ hive-it-custom-serde ---
          [debug] execute contextualize
          [INFO] Using 'UTF-8' encoding to copy filtered resources.
          [INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-trunk-source/itests/custom-serde/src/main/resources
          [INFO] 
          [INFO] --- maven-antrun-plugin:1.7:run (define-classpath) @ hive-it-custom-serde ---
          [INFO] Executing tasks
          
          main:
          [INFO] Executed tasks
          [INFO] 
          [INFO] --- maven-compiler-plugin:3.1:compile (default-compile) @ hive-it-custom-serde ---
          [INFO] Compiling 8 source files to /data/hive-ptest/working/apache-svn-trunk-source/itests/custom-serde/target/classes
          [INFO] 
          [INFO] --- maven-resources-plugin:2.5:testResources (default-testResources) @ hive-it-custom-serde ---
          [debug] execute contextualize
          [INFO] Using 'UTF-8' encoding to copy filtered resources.
          [INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-trunk-source/itests/custom-serde/src/test/resources
          [INFO] 
          [INFO] --- maven-antrun-plugin:1.7:run (setup-test-dirs) @ hive-it-custom-serde ---
          [INFO] Executing tasks
          
          main:
              [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/itests/custom-serde/target/tmp
              [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/itests/custom-serde/target/warehouse
              [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/itests/custom-serde/target/tmp/conf
               [copy] Copying 4 files to /data/hive-ptest/working/apache-svn-trunk-source/itests/custom-serde/target/tmp/conf
          [INFO] Executed tasks
          [INFO] 
          [INFO] --- maven-compiler-plugin:3.1:testCompile (default-testCompile) @ hive-it-custom-serde ---
          [INFO] No sources to compile
          [INFO] 
          [INFO] --- maven-surefire-plugin:2.16:test (default-test) @ hive-it-custom-serde ---
          [INFO] Tests are skipped.
          [INFO] 
          [INFO] --- maven-jar-plugin:2.2:jar (default-jar) @ hive-it-custom-serde ---
          [INFO] Building jar: /data/hive-ptest/working/apache-svn-trunk-source/itests/custom-serde/target/hive-it-custom-serde-0.13.0-SNAPSHOT.jar
          [INFO] 
          [INFO] --- maven-install-plugin:2.4:install (default-install) @ hive-it-custom-serde ---
          [INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/itests/custom-serde/target/hive-it-custom-serde-0.13.0-SNAPSHOT.jar to /data/hive-ptest/working/maven/org/apache/hive/hive-it-custom-serde/0.13.0-SNAPSHOT/hive-it-custom-serde-0.13.0-SNAPSHOT.jar
          [INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/itests/custom-serde/pom.xml to /data/hive-ptest/working/maven/org/apache/hive/hive-it-custom-serde/0.13.0-SNAPSHOT/hive-it-custom-serde-0.13.0-SNAPSHOT.pom
          [INFO]                                                                         
          [INFO] ------------------------------------------------------------------------
          [INFO] Building Hive Integration - Testing Utilities 0.13.0-SNAPSHOT
          [INFO] ------------------------------------------------------------------------
          [INFO] 
          [INFO] --- maven-clean-plugin:2.5:clean (default-clean) @ hive-it-util ---
          [INFO] Deleting /data/hive-ptest/working/apache-svn-trunk-source/itests/util (includes = [datanucleus.log, derby.log], excludes = [])
          [INFO] 
          [INFO] --- maven-resources-plugin:2.5:resources (default-resources) @ hive-it-util ---
          [debug] execute contextualize
          [INFO] Using 'UTF-8' encoding to copy filtered resources.
          [INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-trunk-source/itests/util/src/main/resources
          [INFO] 
          [INFO] --- maven-antrun-plugin:1.7:run (define-classpath) @ hive-it-util ---
          [INFO] Executing tasks
          
          main:
          [INFO] Executed tasks
          [INFO] 
          [INFO] --- maven-compiler-plugin:3.1:compile (default-compile) @ hive-it-util ---
          [INFO] Compiling 42 source files to /data/hive-ptest/working/apache-svn-trunk-source/itests/util/target/classes
          [INFO] -------------------------------------------------------------
          [WARNING] COMPILATION WARNING : 
          [INFO] -------------------------------------------------------------
          [WARNING] Note: Some input files use or override a deprecated API.
          [WARNING] Note: Recompile with -Xlint:deprecation for details.
          [INFO] 2 warnings 
          [INFO] -------------------------------------------------------------
          [INFO] -------------------------------------------------------------
          [ERROR] COMPILATION ERROR : 
          [INFO] -------------------------------------------------------------
          [ERROR] /data/hive-ptest/working/apache-svn-trunk-source/itests/util/src/main/java/org/apache/hadoop/hive/ql/hooks/OptrStatGroupByHook.java:[45,73] cannot find symbol
          symbol  : variable HIVEJOBPROGRESS
          location: class org.apache.hadoop.hive.conf.HiveConf.ConfVars
          [ERROR] /data/hive-ptest/working/apache-svn-trunk-source/itests/util/src/main/java/org/apache/hadoop/hive/ql/hooks/OptrStatGroupByHook.java:[57,38] cannot find symbol
          symbol  : method getCounters()
          location: class org.apache.hadoop.hive.ql.exec.Operator<capture#395 of ? extends org.apache.hadoop.hive.ql.plan.OperatorDesc>
          [INFO] 2 errors 
          [INFO] -------------------------------------------------------------
          [INFO] ------------------------------------------------------------------------
          [INFO] Reactor Summary:
          [INFO] 
          [INFO] Hive Integration - Parent ......................... SUCCESS [3.617s]
          [INFO] Hive Integration - Custom Serde ................... SUCCESS [8.790s]
          [INFO] Hive Integration - Testing Utilities .............. FAILURE [5.644s]
          [INFO] Hive Integration - Unit Tests ..................... SKIPPED
          [INFO] Hive Integration - HCatalog Unit Tests ............ SKIPPED
          [INFO] Hive Integration - Test Serde ..................... SKIPPED
          [INFO] Hive Integration - QFile Tests .................... SKIPPED
          [INFO] ------------------------------------------------------------------------
          [INFO] BUILD FAILURE
          [INFO] ------------------------------------------------------------------------
          [INFO] Total time: 19.687s
          [INFO] Finished at: Mon Nov 25 20:53:50 EST 2013
          [INFO] Final Memory: 25M/59M
          [INFO] ------------------------------------------------------------------------
          [ERROR] Failed to execute goal org.apache.maven.plugins:maven-compiler-plugin:3.1:compile (default-compile) on project hive-it-util: Compilation failure: Compilation failure:
          [ERROR] /data/hive-ptest/working/apache-svn-trunk-source/itests/util/src/main/java/org/apache/hadoop/hive/ql/hooks/OptrStatGroupByHook.java:[45,73] cannot find symbol
          [ERROR] symbol  : variable HIVEJOBPROGRESS
          [ERROR] location: class org.apache.hadoop.hive.conf.HiveConf.ConfVars
          [ERROR] /data/hive-ptest/working/apache-svn-trunk-source/itests/util/src/main/java/org/apache/hadoop/hive/ql/hooks/OptrStatGroupByHook.java:[57,38] cannot find symbol
          [ERROR] symbol  : method getCounters()
          [ERROR] location: class org.apache.hadoop.hive.ql.exec.Operator<capture#395 of ? extends org.apache.hadoop.hive.ql.plan.OperatorDesc>
          [ERROR] -> [Help 1]
          [ERROR] 
          [ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
          [ERROR] Re-run Maven using the -X switch to enable full debug logging.
          [ERROR] 
          [ERROR] For more information about the errors and possible solutions, please read the following articles:
          [ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
          [ERROR] 
          [ERROR] After correcting the problems, you can resume the build with the command
          [ERROR]   mvn <goals> -rf :hive-it-util
          + exit 1
          '
          

          This message is automatically generated.

          ATTACHMENT ID: 12615765

          Show
          Hive QA added a comment - Overall : -1 no tests executed Here are the results of testing the latest attachment: https://issues.apache.org/jira/secure/attachment/12615765/HIVE-5706.5.patch Test results: http://bigtop01.cloudera.org:8080/job/PreCommit-HIVE-Build/441/testReport Console output: http://bigtop01.cloudera.org:8080/job/PreCommit-HIVE-Build/441/console Messages: Executing org.apache.hive.ptest.execution.PrepPhase Tests failed with: NonZeroExitCodeException: Command 'bash /data/hive-ptest/working/scratch/source-prep.sh' failed with exit status 1 and output '+ [[ -n '' ]] + export 'ANT_OPTS=-Xmx1g -XX:MaxPermSize=256m -Dhttp.proxyHost=localhost -Dhttp.proxyPort=3128' + ANT_OPTS='-Xmx1g -XX:MaxPermSize=256m -Dhttp.proxyHost=localhost -Dhttp.proxyPort=3128' + export 'M2_OPTS=-Xmx1g -XX:MaxPermSize=256m -Dhttp.proxyHost=localhost -Dhttp.proxyPort=3128' + M2_OPTS='-Xmx1g -XX:MaxPermSize=256m -Dhttp.proxyHost=localhost -Dhttp.proxyPort=3128' + cd /data/hive-ptest/working/ + tee /data/hive-ptest/logs/PreCommit-HIVE-Build-441/source-prep.txt + [[ false == \t\r\u\e ]] + mkdir -p maven ivy + [[ svn = \s\v\n ]] + [[ -n '' ]] + [[ -d apache-svn-trunk-source ]] + [[ ! -d apache-svn-trunk-source/.svn ]] + [[ ! -d apache-svn-trunk-source ]] + cd apache-svn-trunk-source + svn revert -R . ++ awk '{print $2}' ++ egrep -v '^X|^Performing status on external' ++ svn status --no-ignore + rm -rf target datanucleus.log ant/target shims/target shims/0.20/target shims/assembly/target shims/0.20S/target shims/0.23/target shims/common/target shims/common-secure/target packaging/target hbase-handler/target testutils/target jdbc/target metastore/target itests/target itests/hcatalog-unit/target itests/test-serde/target itests/qtest/target itests/hive-unit/target itests/custom-serde/target itests/util/target hcatalog/target hcatalog/storage-handlers/hbase/target hcatalog/server-extensions/target hcatalog/core/target hcatalog/webhcat/svr/target hcatalog/webhcat/java-client/target hcatalog/hcatalog-pig-adapter/target hwi/target common/target common/src/gen service/target contrib/target serde/target beeline/target odbc/target cli/target ql/dependency-reduced-pom.xml ql/target + svn update U ql/src/test/queries/clientpositive/insert_into3.q U ql/src/test/results/compiler/plan/input_testxpath.q.xml U ql/src/test/results/compiler/plan/input_part1.q.xml U ql/src/test/results/compiler/plan/input1.q.xml U ql/src/test/results/compiler/plan/input2.q.xml U ql/src/test/results/compiler/plan/input3.q.xml U ql/src/test/results/compiler/plan/input4.q.xml U ql/src/test/results/compiler/plan/input5.q.xml U ql/src/test/results/compiler/plan/input6.q.xml U ql/src/test/results/compiler/plan/input_testxpath2.q.xml U ql/src/test/results/compiler/plan/input7.q.xml U ql/src/test/results/compiler/plan/input8.q.xml U ql/src/test/results/compiler/plan/input_testsequencefile.q.xml U ql/src/test/results/compiler/plan/input9.q.xml U ql/src/test/results/compiler/plan/udf1.q.xml U ql/src/test/results/compiler/plan/input20.q.xml U ql/src/test/results/compiler/plan/udf4.q.xml U ql/src/test/results/compiler/plan/sample1.q.xml U ql/src/test/results/compiler/plan/sample2.q.xml U ql/src/test/results/compiler/plan/udf6.q.xml U ql/src/test/results/compiler/plan/sample3.q.xml U ql/src/test/results/compiler/plan/sample4.q.xml U ql/src/test/results/compiler/plan/sample5.q.xml U ql/src/test/results/compiler/plan/sample6.q.xml U ql/src/test/results/compiler/plan/sample7.q.xml U ql/src/test/results/compiler/plan/groupby1.q.xml U 
ql/src/test/results/compiler/plan/groupby2.q.xml U ql/src/test/results/compiler/plan/udf_case.q.xml U ql/src/test/results/compiler/plan/groupby3.q.xml U ql/src/test/results/compiler/plan/subq.q.xml U ql/src/test/results/compiler/plan/cast1.q.xml U ql/src/test/results/compiler/plan/groupby4.q.xml U ql/src/test/results/compiler/plan/groupby5.q.xml U ql/src/test/results/compiler/plan/groupby6.q.xml U ql/src/test/results/compiler/plan/join1.q.xml U ql/src/test/results/compiler/plan/join2.q.xml U ql/src/test/results/compiler/plan/join3.q.xml U ql/src/test/results/compiler/plan/join4.q.xml U ql/src/test/results/compiler/plan/join5.q.xml U ql/src/test/results/compiler/plan/join6.q.xml U ql/src/test/results/compiler/plan/case_sensitivity.q.xml U ql/src/test/results/compiler/plan/join7.q.xml U ql/src/test/results/compiler/plan/join8.q.xml U ql/src/test/results/compiler/plan/union.q.xml U ql/src/test/results/compiler/plan/udf_when.q.xml U ql/src/test/results/clientpositive/insert_into3.q.out U ql/src/test/org/apache/hadoop/hive/ql/testutil/OperatorTestUtils.java U ql/src/test/org/apache/hadoop/hive/ql/exec/TestOperators.java U ql/src/test/org/apache/hadoop/hive/ql/exec/vector/TestVectorGroupByOperator.java U ql/src/java/org/apache/hadoop/hive/ql/QueryPlan.java U ql/src/java/org/apache/hadoop/hive/ql/parse/MapReduceCompiler.java U ql/src/java/org/apache/hadoop/hive/ql/parse/SemanticAnalyzer.java U ql/src/java/org/apache/hadoop/hive/ql/ErrorMsg.java A ql/src/java/org/apache/hadoop/hive/ql/metadata/HiveFatalException.java U ql/src/java/org/apache/hadoop/hive/ql/exec/DemuxOperator.java U ql/src/java/org/apache/hadoop/hive/ql/exec/mr/MapredLocalTask.java U ql/src/java/org/apache/hadoop/hive/ql/exec/mr/HadoopJobExecHelper.java U ql/src/java/org/apache/hadoop/hive/ql/exec/mr/HadoopJobExecHook.java U ql/src/java/org/apache/hadoop/hive/ql/exec/mr/ExecReducer.java U ql/src/java/org/apache/hadoop/hive/ql/exec/mr/ExecDriver.java U ql/src/java/org/apache/hadoop/hive/ql/exec/Operator.java U ql/src/java/org/apache/hadoop/hive/ql/exec/AbstractMapJoinOperator.java U ql/src/java/org/apache/hadoop/hive/ql/exec/ReduceSinkOperator.java U ql/src/java/org/apache/hadoop/hive/ql/exec/CommonJoinOperator.java U ql/src/java/org/apache/hadoop/hive/ql/exec/MuxOperator.java U ql/src/java/org/apache/hadoop/hive/ql/exec/GroupByOperator.java U ql/src/java/org/apache/hadoop/hive/ql/exec/FetchOperator.java U ql/src/java/org/apache/hadoop/hive/ql/exec/MapJoinOperator.java U ql/src/java/org/apache/hadoop/hive/ql/exec/SMBMapJoinOperator.java U ql/src/java/org/apache/hadoop/hive/ql/exec/vector/VectorFileSinkOperator.java U ql/src/java/org/apache/hadoop/hive/ql/exec/OperatorFactory.java U ql/src/java/org/apache/hadoop/hive/ql/exec/Utilities.java U ql/src/java/org/apache/hadoop/hive/ql/exec/FileSinkOperator.java U ql/src/java/org/apache/hadoop/hive/ql/io/rcfile/stats/PartialScanTask.java U ql/src/java/org/apache/hadoop/hive/ql/io/rcfile/merge/BlockMergeTask.java U ql/src/java/org/apache/hadoop/hive/ql/io/rcfile/truncate/ColumnTruncateTask.java U conf/hive-default.xml.template U data/conf/hive-site.xml U common/src/java/org/apache/hadoop/hive/conf/HiveConf.java Fetching external item into 'hcatalog/src/test/e2e/harness' Updated external to revision 1545504. Updated to revision 1545504. 
+ patchCommandPath=/data/hive-ptest/working/scratch/smart-apply-patch.sh + patchFilePath=/data/hive-ptest/working/scratch/build.patch + [[ -f /data/hive-ptest/working/scratch/build.patch ]] + chmod +x /data/hive-ptest/working/scratch/smart-apply-patch.sh + /data/hive-ptest/working/scratch/smart-apply-patch.sh /data/hive-ptest/working/scratch/build.patch Going to apply patch with: patch -p0 patching file ql/src/java/org/apache/hadoop/hive/ql/exec/FunctionRegistry.java patching file ql/src/java/org/apache/hadoop/hive/ql/exec/vector/VectorizationContext.java patching file ql/src/java/org/apache/hadoop/hive/ql/optimizer/physical/Vectorizer.java patching file ql/src/java/org/apache/hadoop/hive/ql/udf/UDFBaseNumericUnaryOp.java patching file ql/src/java/org/apache/hadoop/hive/ql/udf/UDFCeil.java patching file ql/src/java/org/apache/hadoop/hive/ql/udf/UDFFloor.java patching file ql/src/java/org/apache/hadoop/hive/ql/udf/UDFOPNegative.java patching file ql/src/java/org/apache/hadoop/hive/ql/udf/UDFOPPositive.java patching file ql/src/java/org/apache/hadoop/hive/ql/udf/UDFPower.java patching file ql/src/java/org/apache/hadoop/hive/ql/udf/generic/GenericUDFBaseUnary.java patching file ql/src/java/org/apache/hadoop/hive/ql/udf/generic/GenericUDFCeil.java patching file ql/src/java/org/apache/hadoop/hive/ql/udf/generic/GenericUDFFloor.java patching file ql/src/java/org/apache/hadoop/hive/ql/udf/generic/GenericUDFFloorCeilBase.java patching file ql/src/java/org/apache/hadoop/hive/ql/udf/generic/GenericUDFOPNegative.java patching file ql/src/java/org/apache/hadoop/hive/ql/udf/generic/GenericUDFOPPositive.java patching file ql/src/java/org/apache/hadoop/hive/ql/udf/generic/GenericUDFPower.java patching file ql/src/test/org/apache/hadoop/hive/ql/exec/vector/TestVectorizationContext.java patching file ql/src/test/org/apache/hadoop/hive/ql/udf/generic/TestGenericUDFCeil.java patching file ql/src/test/org/apache/hadoop/hive/ql/udf/generic/TestGenericUDFFloor.java patching file ql/src/test/org/apache/hadoop/hive/ql/udf/generic/TestGenericUDFOPNegative.java patching file ql/src/test/org/apache/hadoop/hive/ql/udf/generic/TestGenericUDFOPPositive.java patching file ql/src/test/org/apache/hadoop/hive/ql/udf/generic/TestGenericUDFPower.java patching file ql/src/test/results/clientpositive/decimal_udf.q.out patching file ql/src/test/results/clientpositive/literal_decimal.q.out patching file ql/src/test/results/clientpositive/udf4.q.out patching file ql/src/test/results/clientpositive/udf7.q.out patching file ql/src/test/results/clientpositive/vectorization_short_regress.q.out patching file ql/src/test/results/clientpositive/vectorized_math_funcs.q.out patching file ql/src/test/results/compiler/plan/udf4.q.xml Hunk #1 succeeded at 591 (offset -16 lines). Hunk #2 succeeded at 669 (offset -16 lines). Hunk #3 succeeded at 679 (offset -16 lines). Hunk #4 succeeded at 704 (offset -16 lines). Hunk #5 succeeded at 729 (offset -16 lines). Hunk #6 succeeded at 758 (offset -16 lines). Hunk #7 succeeded at 818 (offset -16 lines). Hunk #8 succeeded at 875 (offset -16 lines). Hunk #9 succeeded at 904 (offset -16 lines). Hunk #10 succeeded at 914 (offset -16 lines). Hunk #11 succeeded at 939 (offset -16 lines). Hunk #12 succeeded at 978 (offset -16 lines). Hunk #13 succeeded at 1048 (offset -16 lines). + [[ maven == \m\a\v\e\n ]] + rm -rf /data/hive-ptest/working/maven/org/apache/hive + mvn -B clean install -DskipTests -Dmaven.repo.local=/data/hive-ptest/working/maven [INFO] Scanning for projects... 
[INFO] ------------------------------------------------------------------------ [INFO] Reactor Build Order: [INFO] [INFO] Hive [INFO] Hive Ant Utilities [INFO] Hive Shims Common [INFO] Hive Shims 0.20 [INFO] Hive Shims Secure Common [INFO] Hive Shims 0.20S [INFO] Hive Shims 0.23 [INFO] Hive Shims [INFO] Hive Common [INFO] Hive Serde [INFO] Hive Metastore [INFO] Hive Query Language [INFO] Hive Service [INFO] Hive JDBC [INFO] Hive Beeline [INFO] Hive CLI [INFO] Hive Contrib [INFO] Hive HBase Handler [INFO] Hive HCatalog [INFO] Hive HCatalog Core [INFO] Hive HCatalog Pig Adapter [INFO] Hive HCatalog Server Extensions [INFO] Hive HCatalog Webhcat Java Client [INFO] Hive HCatalog Webhcat [INFO] Hive HCatalog HBase Storage Handler [INFO] Hive HWI [INFO] Hive ODBC [INFO] Hive Shims Aggregator [INFO] Hive TestUtils [INFO] Hive Packaging [INFO] [INFO] ------------------------------------------------------------------------ [INFO] Building Hive 0.13.0-SNAPSHOT [INFO] ------------------------------------------------------------------------ [INFO] [INFO] --- maven-clean-plugin:2.5:clean (default-clean) @ hive --- [INFO] Deleting /data/hive-ptest/working/apache-svn-trunk-source (includes = [datanucleus.log, derby.log], excludes = []) [INFO] [INFO] --- maven-antrun-plugin:1.7:run (define-classpath) @ hive --- [INFO] Executing tasks main: [INFO] Executed tasks [INFO] [INFO] --- maven-antrun-plugin:1.7:run (setup-test-dirs) @ hive --- [INFO] Executing tasks main: [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/target/tmp [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/target/warehouse [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/target/tmp/conf [copy] Copying 4 files to /data/hive-ptest/working/apache-svn-trunk-source/target/tmp/conf [INFO] Executed tasks [INFO] [INFO] --- maven-install-plugin:2.4:install (default-install) @ hive --- [INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/pom.xml to /data/hive-ptest/working/maven/org/apache/hive/hive/0.13.0-SNAPSHOT/hive-0.13.0-SNAPSHOT.pom [INFO] [INFO] ------------------------------------------------------------------------ [INFO] Building Hive Ant Utilities 0.13.0-SNAPSHOT [INFO] ------------------------------------------------------------------------ [INFO] [INFO] --- maven-clean-plugin:2.5:clean (default-clean) @ hive-ant --- [INFO] Deleting /data/hive-ptest/working/apache-svn-trunk-source/ant (includes = [datanucleus.log, derby.log], excludes = []) [INFO] [INFO] --- maven-resources-plugin:2.5:resources (default-resources) @ hive-ant --- [debug] execute contextualize [INFO] Using 'UTF-8' encoding to copy filtered resources. [INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-trunk-source/ant/src/main/resources [INFO] [INFO] --- maven-antrun-plugin:1.7:run (define-classpath) @ hive-ant --- [INFO] Executing tasks main: [INFO] Executed tasks [INFO] [INFO] --- maven-compiler-plugin:3.1:compile (default-compile) @ hive-ant --- [INFO] Compiling 5 source files to /data/hive-ptest/working/apache-svn-trunk-source/ant/target/classes [WARNING] Note: /data/hive-ptest/working/apache-svn-trunk-source/ant/src/org/apache/hadoop/hive/ant/QTestGenTask.java uses or overrides a deprecated API. [WARNING] Note: Recompile with -Xlint:deprecation for details. [WARNING] Note: /data/hive-ptest/working/apache-svn-trunk-source/ant/src/org/apache/hadoop/hive/ant/DistinctElementsClassPath.java uses unchecked or unsafe operations. 
[WARNING] Note: Recompile with -Xlint:unchecked for details. [INFO] [INFO] --- maven-resources-plugin:2.5:testResources (default-testResources) @ hive-ant --- [debug] execute contextualize [INFO] Using 'UTF-8' encoding to copy filtered resources. [INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-trunk-source/ant/src/test/resources [INFO] [INFO] --- maven-antrun-plugin:1.7:run (setup-test-dirs) @ hive-ant --- [INFO] Executing tasks main: [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/ant/target/tmp [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/ant/target/warehouse [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/ant/target/tmp/conf [copy] Copying 4 files to /data/hive-ptest/working/apache-svn-trunk-source/ant/target/tmp/conf [INFO] Executed tasks [INFO] [INFO] --- maven-compiler-plugin:3.1:testCompile (default-testCompile) @ hive-ant --- [INFO] No sources to compile [INFO] [INFO] --- maven-surefire-plugin:2.16:test (default-test) @ hive-ant --- [INFO] Tests are skipped. [INFO] [INFO] --- maven-jar-plugin:2.2:jar (default-jar) @ hive-ant --- [INFO] Building jar: /data/hive-ptest/working/apache-svn-trunk-source/ant/target/hive-ant-0.13.0-SNAPSHOT.jar [INFO] [INFO] --- maven-install-plugin:2.4:install (default-install) @ hive-ant --- [INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/ant/target/hive-ant-0.13.0-SNAPSHOT.jar to /data/hive-ptest/working/maven/org/apache/hive/hive-ant/0.13.0-SNAPSHOT/hive-ant-0.13.0-SNAPSHOT.jar [INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/ant/pom.xml to /data/hive-ptest/working/maven/org/apache/hive/hive-ant/0.13.0-SNAPSHOT/hive-ant-0.13.0-SNAPSHOT.pom [INFO] [INFO] ------------------------------------------------------------------------ [INFO] Building Hive Shims Common 0.13.0-SNAPSHOT [INFO] ------------------------------------------------------------------------ [INFO] [INFO] --- maven-clean-plugin:2.5:clean (default-clean) @ hive-shims-common --- [INFO] Deleting /data/hive-ptest/working/apache-svn-trunk-source/shims/common (includes = [datanucleus.log, derby.log], excludes = []) [INFO] [INFO] --- maven-resources-plugin:2.5:resources (default-resources) @ hive-shims-common --- [debug] execute contextualize [INFO] Using 'UTF-8' encoding to copy filtered resources. [INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-trunk-source/shims/common/src/main/resources [INFO] [INFO] --- maven-antrun-plugin:1.7:run (define-classpath) @ hive-shims-common --- [INFO] Executing tasks main: [INFO] Executed tasks [INFO] [INFO] --- maven-compiler-plugin:3.1:compile (default-compile) @ hive-shims-common --- [INFO] Compiling 15 source files to /data/hive-ptest/working/apache-svn-trunk-source/shims/common/target/classes [WARNING] Note: Some input files use or override a deprecated API. [WARNING] Note: Recompile with -Xlint:deprecation for details. [INFO] [INFO] --- maven-resources-plugin:2.5:testResources (default-testResources) @ hive-shims-common --- [debug] execute contextualize [INFO] Using 'UTF-8' encoding to copy filtered resources. 
[INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-trunk-source/shims/common/src/test/resources [INFO] [INFO] --- maven-antrun-plugin:1.7:run (setup-test-dirs) @ hive-shims-common --- [INFO] Executing tasks main: [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/shims/common/target/tmp [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/shims/common/target/warehouse [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/shims/common/target/tmp/conf [copy] Copying 4 files to /data/hive-ptest/working/apache-svn-trunk-source/shims/common/target/tmp/conf [INFO] Executed tasks [INFO] [INFO] --- maven-compiler-plugin:3.1:testCompile (default-testCompile) @ hive-shims-common --- [INFO] No sources to compile [INFO] [INFO] --- maven-surefire-plugin:2.16:test (default-test) @ hive-shims-common --- [INFO] Tests are skipped. [INFO] [INFO] --- maven-jar-plugin:2.2:jar (default-jar) @ hive-shims-common --- [INFO] Building jar: /data/hive-ptest/working/apache-svn-trunk-source/shims/common/target/hive-shims-common-0.13.0-SNAPSHOT.jar [INFO] [INFO] --- maven-install-plugin:2.4:install (default-install) @ hive-shims-common --- [INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/shims/common/target/hive-shims-common-0.13.0-SNAPSHOT.jar to /data/hive-ptest/working/maven/org/apache/hive/shims/hive-shims-common/0.13.0-SNAPSHOT/hive-shims-common-0.13.0-SNAPSHOT.jar [INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/shims/common/pom.xml to /data/hive-ptest/working/maven/org/apache/hive/shims/hive-shims-common/0.13.0-SNAPSHOT/hive-shims-common-0.13.0-SNAPSHOT.pom [INFO] [INFO] ------------------------------------------------------------------------ [INFO] Building Hive Shims 0.20 0.13.0-SNAPSHOT [INFO] ------------------------------------------------------------------------ [INFO] [INFO] --- maven-clean-plugin:2.5:clean (default-clean) @ hive-shims-0.20 --- [INFO] Deleting /data/hive-ptest/working/apache-svn-trunk-source/shims/0.20 (includes = [datanucleus.log, derby.log], excludes = []) [INFO] [INFO] --- maven-resources-plugin:2.5:resources (default-resources) @ hive-shims-0.20 --- [debug] execute contextualize [INFO] Using 'UTF-8' encoding to copy filtered resources. [INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-trunk-source/shims/0.20/src/main/resources [INFO] [INFO] --- maven-antrun-plugin:1.7:run (define-classpath) @ hive-shims-0.20 --- [INFO] Executing tasks main: [INFO] Executed tasks [INFO] [INFO] --- maven-compiler-plugin:3.1:compile (default-compile) @ hive-shims-0.20 --- [INFO] Compiling 2 source files to /data/hive-ptest/working/apache-svn-trunk-source/shims/0.20/target/classes [WARNING] Note: /data/hive-ptest/working/apache-svn-trunk-source/shims/0.20/src/main/java/org/apache/hadoop/hive/shims/Hadoop20Shims.java uses or overrides a deprecated API. [WARNING] Note: Recompile with -Xlint:deprecation for details. [WARNING] Note: /data/hive-ptest/working/apache-svn-trunk-source/shims/0.20/src/main/java/org/apache/hadoop/hive/shims/Hadoop20Shims.java uses unchecked or unsafe operations. [WARNING] Note: Recompile with -Xlint:unchecked for details. [INFO] [INFO] --- maven-resources-plugin:2.5:testResources (default-testResources) @ hive-shims-0.20 --- [debug] execute contextualize [INFO] Using 'UTF-8' encoding to copy filtered resources. 
[INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-trunk-source/shims/0.20/src/test/resources [INFO] [INFO] --- maven-antrun-plugin:1.7:run (setup-test-dirs) @ hive-shims-0.20 --- [INFO] Executing tasks main: [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/shims/0.20/target/tmp [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/shims/0.20/target/warehouse [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/shims/0.20/target/tmp/conf [copy] Copying 4 files to /data/hive-ptest/working/apache-svn-trunk-source/shims/0.20/target/tmp/conf [INFO] Executed tasks [INFO] [INFO] --- maven-compiler-plugin:3.1:testCompile (default-testCompile) @ hive-shims-0.20 --- [INFO] No sources to compile [INFO] [INFO] --- maven-surefire-plugin:2.16:test (default-test) @ hive-shims-0.20 --- [INFO] Tests are skipped. [INFO] [INFO] --- maven-jar-plugin:2.2:jar (default-jar) @ hive-shims-0.20 --- [INFO] Building jar: /data/hive-ptest/working/apache-svn-trunk-source/shims/0.20/target/hive-shims-0.20-0.13.0-SNAPSHOT.jar [INFO] [INFO] --- maven-install-plugin:2.4:install (default-install) @ hive-shims-0.20 --- [INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/shims/0.20/target/hive-shims-0.20-0.13.0-SNAPSHOT.jar to /data/hive-ptest/working/maven/org/apache/hive/shims/hive-shims-0.20/0.13.0-SNAPSHOT/hive-shims-0.20-0.13.0-SNAPSHOT.jar [INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/shims/0.20/pom.xml to /data/hive-ptest/working/maven/org/apache/hive/shims/hive-shims-0.20/0.13.0-SNAPSHOT/hive-shims-0.20-0.13.0-SNAPSHOT.pom [INFO] [INFO] ------------------------------------------------------------------------ [INFO] Building Hive Shims Secure Common 0.13.0-SNAPSHOT [INFO] ------------------------------------------------------------------------ [INFO] [INFO] --- maven-clean-plugin:2.5:clean (default-clean) @ hive-shims-common-secure --- [INFO] Deleting /data/hive-ptest/working/apache-svn-trunk-source/shims/common-secure (includes = [datanucleus.log, derby.log], excludes = []) [INFO] [INFO] --- maven-resources-plugin:2.5:resources (default-resources) @ hive-shims-common-secure --- [debug] execute contextualize [INFO] Using 'UTF-8' encoding to copy filtered resources. [INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-trunk-source/shims/common-secure/src/main/resources [INFO] [INFO] --- maven-antrun-plugin:1.7:run (define-classpath) @ hive-shims-common-secure --- [INFO] Executing tasks main: [INFO] Executed tasks [INFO] [INFO] --- maven-compiler-plugin:3.1:compile (default-compile) @ hive-shims-common-secure --- [INFO] Compiling 12 source files to /data/hive-ptest/working/apache-svn-trunk-source/shims/common-secure/target/classes [WARNING] Note: /data/hive-ptest/working/apache-svn-trunk-source/shims/common-secure/src/main/java/org/apache/hadoop/hive/shims/HadoopShimsSecure.java uses or overrides a deprecated API. [WARNING] Note: Recompile with -Xlint:deprecation for details. [WARNING] Note: Some input files use unchecked or unsafe operations. [WARNING] Note: Recompile with -Xlint:unchecked for details. [INFO] [INFO] --- maven-resources-plugin:2.5:testResources (default-testResources) @ hive-shims-common-secure --- [debug] execute contextualize [INFO] Using 'UTF-8' encoding to copy filtered resources. 
[INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-trunk-source/shims/common-secure/src/test/resources [INFO] [INFO] --- maven-antrun-plugin:1.7:run (setup-test-dirs) @ hive-shims-common-secure --- [INFO] Executing tasks main: [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/shims/common-secure/target/tmp [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/shims/common-secure/target/warehouse [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/shims/common-secure/target/tmp/conf [copy] Copying 4 files to /data/hive-ptest/working/apache-svn-trunk-source/shims/common-secure/target/tmp/conf [INFO] Executed tasks [INFO] [INFO] --- maven-compiler-plugin:3.1:testCompile (default-testCompile) @ hive-shims-common-secure --- [INFO] No sources to compile [INFO] [INFO] --- maven-surefire-plugin:2.16:test (default-test) @ hive-shims-common-secure --- [INFO] Tests are skipped. [INFO] [INFO] --- maven-jar-plugin:2.2:jar (default-jar) @ hive-shims-common-secure --- [INFO] Building jar: /data/hive-ptest/working/apache-svn-trunk-source/shims/common-secure/target/hive-shims-common-secure-0.13.0-SNAPSHOT.jar [INFO] [INFO] --- maven-install-plugin:2.4:install (default-install) @ hive-shims-common-secure --- [INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/shims/common-secure/target/hive-shims-common-secure-0.13.0-SNAPSHOT.jar to /data/hive-ptest/working/maven/org/apache/hive/shims/hive-shims-common-secure/0.13.0-SNAPSHOT/hive-shims-common-secure-0.13.0-SNAPSHOT.jar [INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/shims/common-secure/pom.xml to /data/hive-ptest/working/maven/org/apache/hive/shims/hive-shims-common-secure/0.13.0-SNAPSHOT/hive-shims-common-secure-0.13.0-SNAPSHOT.pom [INFO] [INFO] ------------------------------------------------------------------------ [INFO] Building Hive Shims 0.20S 0.13.0-SNAPSHOT [INFO] ------------------------------------------------------------------------ [INFO] [INFO] --- maven-clean-plugin:2.5:clean (default-clean) @ hive-shims-0.20S --- [INFO] Deleting /data/hive-ptest/working/apache-svn-trunk-source/shims/0.20S (includes = [datanucleus.log, derby.log], excludes = []) [INFO] [INFO] --- maven-resources-plugin:2.5:resources (default-resources) @ hive-shims-0.20S --- [debug] execute contextualize [INFO] Using 'UTF-8' encoding to copy filtered resources. [INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-trunk-source/shims/0.20S/src/main/resources [INFO] [INFO] --- maven-antrun-plugin:1.7:run (define-classpath) @ hive-shims-0.20S --- [INFO] Executing tasks main: [INFO] Executed tasks [INFO] [INFO] --- maven-compiler-plugin:3.1:compile (default-compile) @ hive-shims-0.20S --- [INFO] Compiling 3 source files to /data/hive-ptest/working/apache-svn-trunk-source/shims/0.20S/target/classes [WARNING] Note: /data/hive-ptest/working/apache-svn-trunk-source/shims/0.20S/src/main/java/org/apache/hadoop/hive/shims/Hadoop20SShims.java uses or overrides a deprecated API. [WARNING] Note: Recompile with -Xlint:deprecation for details. [INFO] [INFO] --- maven-resources-plugin:2.5:testResources (default-testResources) @ hive-shims-0.20S --- [debug] execute contextualize [INFO] Using 'UTF-8' encoding to copy filtered resources. 
[INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-trunk-source/shims/0.20S/src/test/resources [INFO] [INFO] --- maven-antrun-plugin:1.7:run (setup-test-dirs) @ hive-shims-0.20S --- [INFO] Executing tasks main: [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/shims/0.20S/target/tmp [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/shims/0.20S/target/warehouse [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/shims/0.20S/target/tmp/conf [copy] Copying 4 files to /data/hive-ptest/working/apache-svn-trunk-source/shims/0.20S/target/tmp/conf [INFO] Executed tasks [INFO] [INFO] --- maven-compiler-plugin:3.1:testCompile (default-testCompile) @ hive-shims-0.20S --- [INFO] No sources to compile [INFO] [INFO] --- maven-surefire-plugin:2.16:test (default-test) @ hive-shims-0.20S --- [INFO] Tests are skipped. [INFO] [INFO] --- maven-jar-plugin:2.2:jar (default-jar) @ hive-shims-0.20S --- [INFO] Building jar: /data/hive-ptest/working/apache-svn-trunk-source/shims/0.20S/target/hive-shims-0.20S-0.13.0-SNAPSHOT.jar [INFO] [INFO] --- maven-install-plugin:2.4:install (default-install) @ hive-shims-0.20S --- [INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/shims/0.20S/target/hive-shims-0.20S-0.13.0-SNAPSHOT.jar to /data/hive-ptest/working/maven/org/apache/hive/shims/hive-shims-0.20S/0.13.0-SNAPSHOT/hive-shims-0.20S-0.13.0-SNAPSHOT.jar [INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/shims/0.20S/pom.xml to /data/hive-ptest/working/maven/org/apache/hive/shims/hive-shims-0.20S/0.13.0-SNAPSHOT/hive-shims-0.20S-0.13.0-SNAPSHOT.pom [INFO] [INFO] ------------------------------------------------------------------------ [INFO] Building Hive Shims 0.23 0.13.0-SNAPSHOT [INFO] ------------------------------------------------------------------------ [INFO] [INFO] --- maven-clean-plugin:2.5:clean (default-clean) @ hive-shims-0.23 --- [INFO] Deleting /data/hive-ptest/working/apache-svn-trunk-source/shims/0.23 (includes = [datanucleus.log, derby.log], excludes = []) [INFO] [INFO] --- maven-resources-plugin:2.5:resources (default-resources) @ hive-shims-0.23 --- [debug] execute contextualize [INFO] Using 'UTF-8' encoding to copy filtered resources. [INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-trunk-source/shims/0.23/src/main/resources [INFO] [INFO] --- maven-antrun-plugin:1.7:run (define-classpath) @ hive-shims-0.23 --- [INFO] Executing tasks main: [INFO] Executed tasks [INFO] [INFO] --- maven-compiler-plugin:3.1:compile (default-compile) @ hive-shims-0.23 --- [INFO] Compiling 3 source files to /data/hive-ptest/working/apache-svn-trunk-source/shims/0.23/target/classes [WARNING] Note: /data/hive-ptest/working/apache-svn-trunk-source/shims/0.23/src/main/java/org/apache/hadoop/hive/shims/Hadoop23Shims.java uses or overrides a deprecated API. [WARNING] Note: Recompile with -Xlint:deprecation for details. [INFO] [INFO] --- maven-resources-plugin:2.5:testResources (default-testResources) @ hive-shims-0.23 --- [debug] execute contextualize [INFO] Using 'UTF-8' encoding to copy filtered resources. 
[INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-trunk-source/shims/0.23/src/test/resources [INFO] [INFO] --- maven-antrun-plugin:1.7:run (setup-test-dirs) @ hive-shims-0.23 --- [INFO] Executing tasks main: [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/shims/0.23/target/tmp [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/shims/0.23/target/warehouse [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/shims/0.23/target/tmp/conf [copy] Copying 4 files to /data/hive-ptest/working/apache-svn-trunk-source/shims/0.23/target/tmp/conf [INFO] Executed tasks [INFO] [INFO] --- maven-compiler-plugin:3.1:testCompile (default-testCompile) @ hive-shims-0.23 --- [INFO] No sources to compile [INFO] [INFO] --- maven-surefire-plugin:2.16:test (default-test) @ hive-shims-0.23 --- [INFO] Tests are skipped. [INFO] [INFO] --- maven-jar-plugin:2.2:jar (default-jar) @ hive-shims-0.23 --- [INFO] Building jar: /data/hive-ptest/working/apache-svn-trunk-source/shims/0.23/target/hive-shims-0.23-0.13.0-SNAPSHOT.jar [INFO] [INFO] --- maven-install-plugin:2.4:install (default-install) @ hive-shims-0.23 --- [INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/shims/0.23/target/hive-shims-0.23-0.13.0-SNAPSHOT.jar to /data/hive-ptest/working/maven/org/apache/hive/shims/hive-shims-0.23/0.13.0-SNAPSHOT/hive-shims-0.23-0.13.0-SNAPSHOT.jar [INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/shims/0.23/pom.xml to /data/hive-ptest/working/maven/org/apache/hive/shims/hive-shims-0.23/0.13.0-SNAPSHOT/hive-shims-0.23-0.13.0-SNAPSHOT.pom [INFO] [INFO] ------------------------------------------------------------------------ [INFO] Building Hive Shims 0.13.0-SNAPSHOT [INFO] ------------------------------------------------------------------------ [INFO] [INFO] --- maven-clean-plugin:2.5:clean (default-clean) @ hive-shims --- [INFO] Deleting /data/hive-ptest/working/apache-svn-trunk-source/shims/assembly (includes = [datanucleus.log, derby.log], excludes = []) [INFO] [INFO] --- maven-resources-plugin:2.5:resources (default-resources) @ hive-shims --- [debug] execute contextualize [INFO] Using 'UTF-8' encoding to copy filtered resources. [INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-trunk-source/shims/assembly/src/main/resources [INFO] [INFO] --- maven-antrun-plugin:1.7:run (define-classpath) @ hive-shims --- [INFO] Executing tasks main: [INFO] Executed tasks [INFO] [INFO] --- maven-compiler-plugin:3.1:compile (default-compile) @ hive-shims --- [INFO] No sources to compile [INFO] [INFO] --- maven-resources-plugin:2.5:testResources (default-testResources) @ hive-shims --- [debug] execute contextualize [INFO] Using 'UTF-8' encoding to copy filtered resources. 
[INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-trunk-source/shims/assembly/src/test/resources [INFO] [INFO] --- maven-antrun-plugin:1.7:run (setup-test-dirs) @ hive-shims --- [INFO] Executing tasks main: [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/shims/assembly/target/tmp [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/shims/assembly/target/warehouse [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/shims/assembly/target/tmp/conf [copy] Copying 4 files to /data/hive-ptest/working/apache-svn-trunk-source/shims/assembly/target/tmp/conf [INFO] Executed tasks [INFO] [INFO] --- maven-compiler-plugin:3.1:testCompile (default-testCompile) @ hive-shims --- [INFO] No sources to compile [INFO] [INFO] --- maven-surefire-plugin:2.16:test (default-test) @ hive-shims --- [INFO] Tests are skipped. [INFO] [INFO] --- maven-jar-plugin:2.2:jar (default-jar) @ hive-shims --- [WARNING] JAR will be empty - no content was marked for inclusion! [INFO] Building jar: /data/hive-ptest/working/apache-svn-trunk-source/shims/assembly/target/hive-shims-0.13.0-SNAPSHOT.jar [INFO] [INFO] --- maven-assembly-plugin:2.3:single (uberjar) @ hive-shims --- [INFO] Reading assembly descriptor: src/assemble/uberjar.xml [WARNING] Artifact: org.apache.hive:hive-shims:jar:0.13.0-SNAPSHOT references the same file as the assembly destination file. Moving it to a temporary location for inclusion. [INFO] META-INF/MANIFEST.MF already added, skipping [INFO] META-INF/MANIFEST.MF already added, skipping [INFO] META-INF/MANIFEST.MF already added, skipping [INFO] META-INF/MANIFEST.MF already added, skipping [INFO] META-INF/MANIFEST.MF already added, skipping [INFO] Building jar: /data/hive-ptest/working/apache-svn-trunk-source/shims/assembly/target/hive-shims-0.13.0-SNAPSHOT.jar [INFO] META-INF/MANIFEST.MF already added, skipping [INFO] META-INF/MANIFEST.MF already added, skipping [INFO] META-INF/MANIFEST.MF already added, skipping [INFO] META-INF/MANIFEST.MF already added, skipping [INFO] META-INF/MANIFEST.MF already added, skipping [WARNING] Configuration options: 'appendAssemblyId' is set to false, and 'classifier' is missing. Instead of attaching the assembly file: /data/hive-ptest/working/apache-svn-trunk-source/shims/assembly/target/hive-shims-0.13.0-SNAPSHOT.jar, it will become the file for main project artifact. NOTE: If multiple descriptors or descriptor-formats are provided for this project, the value of this file will be non-deterministic! 
[WARNING] Replacing pre-existing project main-artifact file: /data/hive-ptest/working/apache-svn-trunk-source/shims/assembly/target/archive-tmp/hive-shims-0.13.0-SNAPSHOT.jar with assembly file: /data/hive-ptest/working/apache-svn-trunk-source/shims/assembly/target/hive-shims-0.13.0-SNAPSHOT.jar [INFO] [INFO] --- maven-install-plugin:2.4:install (default-install) @ hive-shims --- [INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/shims/assembly/target/hive-shims-0.13.0-SNAPSHOT.jar to /data/hive-ptest/working/maven/org/apache/hive/hive-shims/0.13.0-SNAPSHOT/hive-shims-0.13.0-SNAPSHOT.jar [INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/shims/assembly/pom.xml to /data/hive-ptest/working/maven/org/apache/hive/hive-shims/0.13.0-SNAPSHOT/hive-shims-0.13.0-SNAPSHOT.pom [INFO] [INFO] ------------------------------------------------------------------------ [INFO] Building Hive Common 0.13.0-SNAPSHOT [INFO] ------------------------------------------------------------------------ [INFO] [INFO] --- maven-clean-plugin:2.5:clean (default-clean) @ hive-common --- [INFO] Deleting /data/hive-ptest/working/apache-svn-trunk-source/common (includes = [datanucleus.log, derby.log], excludes = []) [INFO] [INFO] --- maven-antrun-plugin:1.7:run (generate-version-annotation) @ hive-common --- [INFO] Executing tasks main: [INFO] Executed tasks [INFO] [INFO] --- build-helper-maven-plugin:1.8:add-source (add-source) @ hive-common --- [INFO] Source directory: /data/hive-ptest/working/apache-svn-trunk-source/common/src/gen added. [INFO] [INFO] --- maven-resources-plugin:2.5:resources (default-resources) @ hive-common --- [debug] execute contextualize [INFO] Using 'UTF-8' encoding to copy filtered resources. [INFO] Copying 1 resource [INFO] [INFO] --- maven-antrun-plugin:1.7:run (define-classpath) @ hive-common --- [INFO] Executing tasks main: [INFO] Executed tasks [INFO] [INFO] --- maven-compiler-plugin:3.1:compile (default-compile) @ hive-common --- [INFO] Compiling 31 source files to /data/hive-ptest/working/apache-svn-trunk-source/common/target/classes [WARNING] Note: /data/hive-ptest/working/apache-svn-trunk-source/common/src/java/org/apache/hadoop/hive/common/ObjectPair.java uses unchecked or unsafe operations. [WARNING] Note: Recompile with -Xlint:unchecked for details. [INFO] [INFO] --- maven-resources-plugin:2.5:testResources (default-testResources) @ hive-common --- [debug] execute contextualize [INFO] Using 'UTF-8' encoding to copy filtered resources. [INFO] Copying 4 resources [INFO] [INFO] --- maven-antrun-plugin:1.7:run (setup-test-dirs) @ hive-common --- [INFO] Executing tasks main: [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/common/target/tmp [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/common/target/warehouse [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/common/target/tmp/conf [copy] Copying 4 files to /data/hive-ptest/working/apache-svn-trunk-source/common/target/tmp/conf [INFO] Executed tasks [INFO] [INFO] --- maven-compiler-plugin:3.1:testCompile (default-testCompile) @ hive-common --- [INFO] Compiling 8 source files to /data/hive-ptest/working/apache-svn-trunk-source/common/target/test-classes [INFO] [INFO] --- maven-surefire-plugin:2.16:test (default-test) @ hive-common --- [INFO] Tests are skipped. 
[INFO] [INFO] --- maven-jar-plugin:2.2:jar (default-jar) @ hive-common --- [INFO] Building jar: /data/hive-ptest/working/apache-svn-trunk-source/common/target/hive-common-0.13.0-SNAPSHOT.jar [INFO] [INFO] --- maven-install-plugin:2.4:install (default-install) @ hive-common --- [INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/common/target/hive-common-0.13.0-SNAPSHOT.jar to /data/hive-ptest/working/maven/org/apache/hive/hive-common/0.13.0-SNAPSHOT/hive-common-0.13.0-SNAPSHOT.jar [INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/common/pom.xml to /data/hive-ptest/working/maven/org/apache/hive/hive-common/0.13.0-SNAPSHOT/hive-common-0.13.0-SNAPSHOT.pom [INFO] [INFO] ------------------------------------------------------------------------ [INFO] Building Hive Serde 0.13.0-SNAPSHOT [INFO] ------------------------------------------------------------------------ [INFO] [INFO] --- maven-clean-plugin:2.5:clean (default-clean) @ hive-serde --- [INFO] Deleting /data/hive-ptest/working/apache-svn-trunk-source/serde (includes = [datanucleus.log, derby.log], excludes = []) [INFO] [INFO] --- build-helper-maven-plugin:1.8:add-source (add-source) @ hive-serde --- [INFO] Source directory: /data/hive-ptest/working/apache-svn-trunk-source/serde/src/gen/protobuf/gen-java added. [INFO] Source directory: /data/hive-ptest/working/apache-svn-trunk-source/serde/src/gen/thrift/gen-javabean added. [INFO] [INFO] --- maven-resources-plugin:2.5:resources (default-resources) @ hive-serde --- [debug] execute contextualize [INFO] Using 'UTF-8' encoding to copy filtered resources. [INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-trunk-source/serde/src/main/resources [INFO] [INFO] --- maven-antrun-plugin:1.7:run (define-classpath) @ hive-serde --- [INFO] Executing tasks main: [INFO] Executed tasks [INFO] [INFO] --- maven-compiler-plugin:3.1:compile (default-compile) @ hive-serde --- [INFO] Compiling 351 source files to /data/hive-ptest/working/apache-svn-trunk-source/serde/target/classes [WARNING] Note: Some input files use or override a deprecated API. [WARNING] Note: Recompile with -Xlint:deprecation for details. [WARNING] Note: Some input files use unchecked or unsafe operations. [WARNING] Note: Recompile with -Xlint:unchecked for details. [INFO] [INFO] --- maven-resources-plugin:2.5:testResources (default-testResources) @ hive-serde --- [debug] execute contextualize [INFO] Using 'UTF-8' encoding to copy filtered resources. [INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-trunk-source/serde/src/test/resources [INFO] [INFO] --- maven-antrun-plugin:1.7:run (setup-test-dirs) @ hive-serde --- [INFO] Executing tasks main: [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/serde/target/tmp [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/serde/target/warehouse [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/serde/target/tmp/conf [copy] Copying 4 files to /data/hive-ptest/working/apache-svn-trunk-source/serde/target/tmp/conf [INFO] Executed tasks [INFO] [INFO] --- maven-compiler-plugin:3.1:testCompile (default-testCompile) @ hive-serde --- [INFO] Compiling 41 source files to /data/hive-ptest/working/apache-svn-trunk-source/serde/target/test-classes [WARNING] Note: Some input files use or override a deprecated API. [WARNING] Note: Recompile with -Xlint:deprecation for details. [WARNING] Note: Some input files use unchecked or unsafe operations. 
[WARNING] Note: Recompile with -Xlint:unchecked for details.
[INFO]
[INFO] --- maven-surefire-plugin:2.16:test (default-test) @ hive-serde ---
[INFO] Tests are skipped.
[INFO]
[INFO] --- maven-jar-plugin:2.2:jar (default-jar) @ hive-serde ---
[INFO] Building jar: /data/hive-ptest/working/apache-svn-trunk-source/serde/target/hive-serde-0.13.0-SNAPSHOT.jar
[INFO]
[INFO] --- maven-install-plugin:2.4:install (default-install) @ hive-serde ---
[INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/serde/target/hive-serde-0.13.0-SNAPSHOT.jar to /data/hive-ptest/working/maven/org/apache/hive/hive-serde/0.13.0-SNAPSHOT/hive-serde-0.13.0-SNAPSHOT.jar
[INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/serde/pom.xml to /data/hive-ptest/working/maven/org/apache/hive/hive-serde/0.13.0-SNAPSHOT/hive-serde-0.13.0-SNAPSHOT.pom
[INFO]
[INFO] ------------------------------------------------------------------------
[INFO] Building Hive Metastore 0.13.0-SNAPSHOT
[INFO] ------------------------------------------------------------------------
[INFO]
[INFO] --- maven-clean-plugin:2.5:clean (default-clean) @ hive-metastore ---
[INFO] Deleting /data/hive-ptest/working/apache-svn-trunk-source/metastore (includes = [datanucleus.log, derby.log], excludes = [])
[INFO]
[INFO] --- build-helper-maven-plugin:1.8:add-source (add-source) @ hive-metastore ---
[INFO] Source directory: /data/hive-ptest/working/apache-svn-trunk-source/metastore/src/model added.
[INFO] Source directory: /data/hive-ptest/working/apache-svn-trunk-source/metastore/src/gen/thrift/gen-javabean added.
[INFO]
[INFO] --- antlr3-maven-plugin:3.4:antlr (default) @ hive-metastore ---
[INFO] ANTLR: Processing source directory /data/hive-ptest/working/apache-svn-trunk-source/metastore/src/java
ANTLR Parser Generator Version 3.4
org/apache/hadoop/hive/metastore/parser/Filter.g
[INFO]
[INFO] --- maven-resources-plugin:2.5:resources (default-resources) @ hive-metastore ---
[debug] execute contextualize
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] Copying 1 resource
[INFO]
[INFO] --- maven-antrun-plugin:1.7:run (define-classpath) @ hive-metastore ---
[INFO] Executing tasks
main:
[INFO] Executed tasks
[INFO]
[INFO] --- maven-compiler-plugin:3.1:compile (default-compile) @ hive-metastore ---
[INFO] Compiling 132 source files to /data/hive-ptest/working/apache-svn-trunk-source/metastore/target/classes
[WARNING] Note: Some input files use or override a deprecated API.
[WARNING] Note: Recompile with -Xlint:deprecation for details.
[WARNING] Note: Some input files use unchecked or unsafe operations.
[WARNING] Note: Recompile with -Xlint:unchecked for details.
[INFO]
[INFO] --- datanucleus-maven-plugin:3.3.0-release:enhance (default) @ hive-metastore ---
[INFO] DataNucleus Enhancer (version 3.2.2) for API "JDO" using JRE "1.6"
DataNucleus Enhancer : Classpath
>> /data/hive-ptest/working/maven/org/datanucleus/datanucleus-maven-plugin/3.3.0-release/datanucleus-maven-plugin-3.3.0-release.jar
>> /data/hive-ptest/working/maven/org/datanucleus/datanucleus-core/3.2.2/datanucleus-core-3.2.2.jar
>> /data/hive-ptest/working/maven/org/codehaus/plexus/plexus-utils/3.0.8/plexus-utils-3.0.8.jar
>> /data/hive-ptest/working/maven/org/codehaus/plexus/plexus-component-annotations/1.5.5/plexus-component-annotations-1.5.5.jar
>> /data/hive-ptest/working/maven/org/sonatype/sisu/sisu-inject-bean/2.3.0/sisu-inject-bean-2.3.0.jar
>> /data/hive-ptest/working/maven/org/sonatype/sisu/sisu-guice/3.1.0/sisu-guice-3.1.0-no_aop.jar
>> /data/hive-ptest/working/maven/org/sonatype/sisu/sisu-guava/0.9.9/sisu-guava-0.9.9.jar
>> /data/hive-ptest/working/maven/org/apache/xbean/xbean-reflect/3.4/xbean-reflect-3.4.jar
>> /data/hive-ptest/working/maven/log4j/log4j/1.2.12/log4j-1.2.12.jar
>> /data/hive-ptest/working/maven/commons-logging/commons-logging-api/1.1/commons-logging-api-1.1.jar
>> /data/hive-ptest/working/maven/com/google/collections/google-collections/1.0/google-collections-1.0.jar
>> /data/hive-ptest/working/maven/junit/junit/3.8.2/junit-3.8.2.jar
>> /data/hive-ptest/working/apache-svn-trunk-source/metastore/target/classes
>> /data/hive-ptest/working/apache-svn-trunk-source/serde/target/hive-serde-0.13.0-SNAPSHOT.jar
>> /data/hive-ptest/working/apache-svn-trunk-source/common/target/hive-common-0.13.0-SNAPSHOT.jar
>> /data/hive-ptest/working/maven/org/apache/commons/commons-compress/1.4.1/commons-compress-1.4.1.jar
>> /data/hive-ptest/working/maven/org/tukaani/xz/1.0/xz-1.0.jar
>> /data/hive-ptest/working/maven/commons-codec/commons-codec/1.4/commons-codec-1.4.jar
>> /data/hive-ptest/working/maven/org/apache/avro/avro/1.7.5/avro-1.7.5.jar
>> /data/hive-ptest/working/maven/org/codehaus/jackson/jackson-core-asl/1.9.2/jackson-core-asl-1.9.2.jar
>> /data/hive-ptest/working/maven/com/thoughtworks/paranamer/paranamer/2.3/paranamer-2.3.jar
>> /data/hive-ptest/working/maven/org/xerial/snappy/snappy-java/1.0.5/snappy-java-1.0.5.jar
>> /data/hive-ptest/working/apache-svn-trunk-source/shims/assembly/target/hive-shims-0.13.0-SNAPSHOT.jar
>> /data/hive-ptest/working/maven/com/google/guava/guava/11.0.2/guava-11.0.2.jar
>> /data/hive-ptest/working/maven/com/google/code/findbugs/jsr305/1.3.9/jsr305-1.3.9.jar
>> /data/hive-ptest/working/maven/commons-cli/commons-cli/1.2/commons-cli-1.2.jar
>> /data/hive-ptest/working/maven/commons-lang/commons-lang/2.4/commons-lang-2.4.jar
>> /data/hive-ptest/working/maven/commons-logging/commons-logging/1.1.3/commons-logging-1.1.3.jar
>> /data/hive-ptest/working/maven/org/apache/derby/derby/10.4.2.0/derby-10.4.2.0.jar
>> /data/hive-ptest/working/maven/org/datanucleus/datanucleus-api-jdo/3.2.1/datanucleus-api-jdo-3.2.1.jar
>> /data/hive-ptest/working/maven/org/datanucleus/datanucleus-rdbms/3.2.1/datanucleus-rdbms-3.2.1.jar
>> /data/hive-ptest/working/maven/javax/jdo/jdo-api/3.0.1/jdo-api-3.0.1.jar
>> /data/hive-ptest/working/maven/javax/transaction/jta/1.1/jta-1.1.jar
>> /data/hive-ptest/working/maven/org/antlr/antlr-runtime/3.4/antlr-runtime-3.4.jar
>> /data/hive-ptest/working/maven/org/antlr/stringtemplate/3.2.1/stringtemplate-3.2.1.jar
>> /data/hive-ptest/working/maven/antlr/antlr/2.7.7/antlr-2.7.7.jar
>> /data/hive-ptest/working/maven/org/apache/thrift/libfb303/0.9.0/libfb303-0.9.0.jar
>> /data/hive-ptest/working/maven/org/apache/thrift/libthrift/0.9.0/libthrift-0.9.0.jar
>> /data/hive-ptest/working/maven/org/apache/httpcomponents/httpclient/4.2.5/httpclient-4.2.5.jar
>> /data/hive-ptest/working/maven/org/apache/httpcomponents/httpcore/4.2.4/httpcore-4.2.4.jar
>> /data/hive-ptest/working/maven/org/apache/hadoop/hadoop-core/1.2.1/hadoop-core-1.2.1.jar
>> /data/hive-ptest/working/maven/xmlenc/xmlenc/0.52/xmlenc-0.52.jar
>> /data/hive-ptest/working/maven/com/sun/jersey/jersey-core/1.14/jersey-core-1.14.jar
>> /data/hive-ptest/working/maven/com/sun/jersey/jersey-json/1.14/jersey-json-1.14.jar
>> /data/hive-ptest/working/maven/org/codehaus/jettison/jettison/1.1/jettison-1.1.jar
>> /data/hive-ptest/working/maven/stax/stax-api/1.0.1/stax-api-1.0.1.jar
>> /data/hive-ptest/working/maven/com/sun/xml/bind/jaxb-impl/2.2.3-1/jaxb-impl-2.2.3-1.jar
>> /data/hive-ptest/working/maven/javax/xml/bind/jaxb-api/2.2.2/jaxb-api-2.2.2.jar
>> /data/hive-ptest/working/maven/javax/xml/stream/stax-api/1.0-2/stax-api-1.0-2.jar
>> /data/hive-ptest/working/maven/javax/activation/activation/1.1/activation-1.1.jar
>> /data/hive-ptest/working/maven/org/codehaus/jackson/jackson-jaxrs/1.9.2/jackson-jaxrs-1.9.2.jar
>> /data/hive-ptest/working/maven/org/codehaus/jackson/jackson-xc/1.9.2/jackson-xc-1.9.2.jar
>> /data/hive-ptest/working/maven/com/sun/jersey/jersey-server/1.14/jersey-server-1.14.jar
>> /data/hive-ptest/working/maven/asm/asm/3.1/asm-3.1.jar
>> /data/hive-ptest/working/maven/commons-io/commons-io/2.4/commons-io-2.4.jar
>> /data/hive-ptest/working/maven/commons-httpclient/commons-httpclient/3.0.1/commons-httpclient-3.0.1.jar
>> /data/hive-ptest/working/maven/org/apache/commons/commons-math/2.1/commons-math-2.1.jar
>> /data/hive-ptest/working/maven/commons-configuration/commons-configuration/1.6/commons-configuration-1.6.jar
>> /data/hive-ptest/working/maven/commons-collections/commons-collections/3.2.1/commons-collections-3.2.1.jar
>> /data/hive-ptest/working/maven/commons-digester/commons-digester/1.8/commons-digester-1.8.jar
>> /data/hive-ptest/working/maven/commons-beanutils/commons-beanutils/1.7.0/commons-beanutils-1.7.0.jar
>> /data/hive-ptest/working/maven/commons-beanutils/commons-beanutils-core/1.8.0/commons-beanutils-core-1.8.0.jar
>> /data/hive-ptest/working/maven/commons-net/commons-net/1.4.1/commons-net-1.4.1.jar
>> /data/hive-ptest/working/maven/org/mortbay/jetty/jetty/6.1.26/jetty-6.1.26.jar
>> /data/hive-ptest/working/maven/org/mortbay/jetty/servlet-api/2.5-20081211/servlet-api-2.5-20081211.jar
>> /data/hive-ptest/working/maven/org/mortbay/jetty/jetty-util/6.1.26/jetty-util-6.1.26.jar
>> /data/hive-ptest/working/maven/tomcat/jasper-runtime/5.5.12/jasper-runtime-5.5.12.jar
>> /data/hive-ptest/working/maven/tomcat/jasper-compiler/5.5.12/jasper-compiler-5.5.12.jar
>> /data/hive-ptest/working/maven/org/mortbay/jetty/jsp-api-2.1/6.1.14/jsp-api-2.1-6.1.14.jar
>> /data/hive-ptest/working/maven/org/mortbay/jetty/servlet-api-2.5/6.1.14/servlet-api-2.5-6.1.14.jar
>> /data/hive-ptest/working/maven/org/mortbay/jetty/jsp-2.1/6.1.14/jsp-2.1-6.1.14.jar
>> /data/hive-ptest/working/maven/ant/ant/1.6.5/ant-1.6.5.jar
>> /data/hive-ptest/working/maven/commons-el/commons-el/1.0/commons-el-1.0.jar
>> /data/hive-ptest/working/maven/net/java/dev/jets3t/jets3t/0.6.1/jets3t-0.6.1.jar
>> /data/hive-ptest/working/maven/hsqldb/hsqldb/1.8.0.10/hsqldb-1.8.0.10.jar
>> /data/hive-ptest/working/maven/oro/oro/2.0.8/oro-2.0.8.jar
>> /data/hive-ptest/working/maven/org/eclipse/jdt/core/3.1.1/core-3.1.1.jar
>> /data/hive-ptest/working/maven/org/codehaus/jackson/jackson-mapper-asl/1.9.2/jackson-mapper-asl-1.9.2.jar
>> /data/hive-ptest/working/maven/org/slf4j/slf4j-api/1.7.5/slf4j-api-1.7.5.jar
>> /data/hive-ptest/working/maven/org/slf4j/slf4j-log4j12/1.7.5/slf4j-log4j12-1.7.5.jar
>> /data/hive-ptest/working/maven/log4j/log4j/1.2.16/log4j-1.2.16.jar
ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MDatabase
ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MFieldSchema
ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MType
ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MTable
ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MSerDeInfo
ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MOrder
ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MColumnDescriptor
ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MStringList
ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MStorageDescriptor
ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MPartition
ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MIndex
ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MRole
ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MRoleMap
ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MGlobalPrivilege
ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MDBPrivilege
ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MTablePrivilege
ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MPartitionPrivilege
ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MTableColumnPrivilege
ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MPartitionColumnPrivilege
ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MPartitionEvent
ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MMasterKey
ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MDelegationToken
ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MTableColumnStatistics
ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MPartitionColumnStatistics
ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MVersionTable
DataNucleus Enhancer completed with success for 25 classes. Timings : input=607 ms, enhance=944 ms, total=1551 ms. Consult the log for full details
[INFO]
[INFO] --- maven-resources-plugin:2.5:testResources (default-testResources) @ hive-metastore ---
[debug] execute contextualize
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-trunk-source/metastore/src/test/resources
[INFO]
[INFO] --- maven-antrun-plugin:1.7:run (setup-test-dirs) @ hive-metastore ---
[INFO] Executing tasks
main:
[mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/metastore/target/tmp
[mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/metastore/target/warehouse
[mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/metastore/target/tmp/conf
[copy] Copying 4 files to /data/hive-ptest/working/apache-svn-trunk-source/metastore/target/tmp/conf
[INFO] Executed tasks
[INFO]
[INFO] --- maven-compiler-plugin:3.1:testCompile (default-testCompile) @ hive-metastore ---
[INFO] Compiling 10 source files to /data/hive-ptest/working/apache-svn-trunk-source/metastore/target/test-classes
[INFO]
[INFO] --- maven-surefire-plugin:2.16:test (default-test) @ hive-metastore ---
[INFO] Tests are skipped.
[INFO]
[INFO] --- maven-jar-plugin:2.2:jar (default-jar) @ hive-metastore ---
[INFO] Building jar: /data/hive-ptest/working/apache-svn-trunk-source/metastore/target/hive-metastore-0.13.0-SNAPSHOT.jar
[INFO]
[INFO] --- maven-jar-plugin:2.2:test-jar (default) @ hive-metastore ---
[INFO] Building jar: /data/hive-ptest/working/apache-svn-trunk-source/metastore/target/hive-metastore-0.13.0-SNAPSHOT-tests.jar
[INFO]
[INFO] --- maven-install-plugin:2.4:install (default-install) @ hive-metastore ---
[INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/metastore/target/hive-metastore-0.13.0-SNAPSHOT.jar to /data/hive-ptest/working/maven/org/apache/hive/hive-metastore/0.13.0-SNAPSHOT/hive-metastore-0.13.0-SNAPSHOT.jar
[INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/metastore/pom.xml to /data/hive-ptest/working/maven/org/apache/hive/hive-metastore/0.13.0-SNAPSHOT/hive-metastore-0.13.0-SNAPSHOT.pom
[INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/metastore/target/hive-metastore-0.13.0-SNAPSHOT-tests.jar to /data/hive-ptest/working/maven/org/apache/hive/hive-metastore/0.13.0-SNAPSHOT/hive-metastore-0.13.0-SNAPSHOT-tests.jar
[INFO]
[INFO] ------------------------------------------------------------------------
[INFO] Building Hive Query Language 0.13.0-SNAPSHOT
[INFO] ------------------------------------------------------------------------
[INFO]
[INFO] --- maven-clean-plugin:2.5:clean (default-clean) @ hive-exec ---
[INFO] Deleting /data/hive-ptest/working/apache-svn-trunk-source/ql (includes = [datanucleus.log, derby.log], excludes = [])
[INFO]
[INFO] --- maven-antrun-plugin:1.7:run (generate-sources) @ hive-exec ---
[INFO] Executing tasks
main:
[mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/ql/target/generated-sources/java/org/apache/hadoop/hive/ql/exec/vector/expressions/gen
[mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/ql/target/generated-sources/java/org/apache/hadoop/hive/ql/exec/vector/expressions/aggregates/gen
[mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/ql/target/generated-test-sources/java/org/apache/hadoop/hive/ql/exec/vector/expressions/gen
Generating vector expression code
Generating vector expression test code
[INFO] Executed tasks
[INFO]
[INFO] --- build-helper-maven-plugin:1.8:add-source (add-source) @ hive-exec ---
[INFO] Source directory: /data/hive-ptest/working/apache-svn-trunk-source/ql/src/gen/protobuf/gen-java added.
[INFO] Source directory: /data/hive-ptest/working/apache-svn-trunk-source/ql/src/gen/thrift/gen-javabean added. [INFO] Source directory: /data/hive-ptest/working/apache-svn-trunk-source/ql/target/generated-sources/java added. [INFO] [INFO] --- antlr3-maven-plugin:3.4:antlr (default) @ hive-exec --- [INFO] ANTLR: Processing source directory /data/hive-ptest/working/apache-svn-trunk-source/ql/src/java ANTLR Parser Generator Version 3.4 org/apache/hadoop/hive/ql/parse/HiveLexer.g org/apache/hadoop/hive/ql/parse/HiveParser.g warning(200): org/apache/hadoop/hive/ql/parse/HiveParser.g:872:5: Decision can match input such as "Identifier KW_RENAME KW_TO" using multiple alternatives: 1, 10 As a result, alternative(s) 10 were disabled for that input warning(200): org/apache/hadoop/hive/ql/parse/HiveParser.g:1177:5: Decision can match input such as "KW_TEXTFILE" using multiple alternatives: 2, 6 As a result, alternative(s) 6 were disabled for that input warning(200): org/apache/hadoop/hive/ql/parse/HiveParser.g:1177:5: Decision can match input such as "KW_SEQUENCEFILE" using multiple alternatives: 1, 6 As a result, alternative(s) 6 were disabled for that input warning(200): org/apache/hadoop/hive/ql/parse/HiveParser.g:1177:5: Decision can match input such as "KW_ORCFILE" using multiple alternatives: 4, 6 As a result, alternative(s) 6 were disabled for that input warning(200): org/apache/hadoop/hive/ql/parse/HiveParser.g:1177:5: Decision can match input such as "KW_RCFILE" using multiple alternatives: 3, 6 As a result, alternative(s) 6 were disabled for that input warning(200): org/apache/hadoop/hive/ql/parse/HiveParser.g:1190:23: Decision can match input such as "KW_ELEM_TYPE" using multiple alternatives: 1, 4 As a result, alternative(s) 4 were disabled for that input warning(200): org/apache/hadoop/hive/ql/parse/HiveParser.g:1190:23: Decision can match input such as "KW_KEY_TYPE" using multiple alternatives: 2, 4 As a result, alternative(s) 4 were disabled for that input warning(200): org/apache/hadoop/hive/ql/parse/HiveParser.g:1190:23: Decision can match input such as "KW_VALUE_TYPE" using multiple alternatives: 3, 4 As a result, alternative(s) 4 were disabled for that input warning(200): org/apache/hadoop/hive/ql/parse/HiveParser.g:1197:23: Decision can match input such as "KW_ELEM_TYPE" using multiple alternatives: 1, 4 As a result, alternative(s) 4 were disabled for that input warning(200): org/apache/hadoop/hive/ql/parse/HiveParser.g:1197:23: Decision can match input such as "KW_VALUE_TYPE" using multiple alternatives: 3, 4 As a result, alternative(s) 4 were disabled for that input warning(200): org/apache/hadoop/hive/ql/parse/HiveParser.g:1197:23: Decision can match input such as "KW_KEY_TYPE" using multiple alternatives: 2, 4 As a result, alternative(s) 4 were disabled for that input warning(200): org/apache/hadoop/hive/ql/parse/HiveParser.g:1215:29: Decision can match input such as "KW_PRETTY {KW_ADD..KW_AFTER, KW_ALTER..KW_ANALYZE, KW_ARCHIVE..KW_CASCADE, KW_CHANGE, KW_CLUSTER..KW_COLLECTION, KW_COLUMNS..KW_CREATE, KW_CUBE, KW_CURSOR..KW_DATA, KW_DATABASES..KW_DISABLE, KW_DISTRIBUTE..KW_ELEM_TYPE, KW_ENABLE, KW_ESCAPED, KW_EXCLUSIVE..KW_EXPORT, KW_EXTERNAL..KW_FLOAT, KW_FOR..KW_FORMATTED, KW_FULL, KW_FUNCTIONS..KW_GROUPING, KW_HOLD_DDLTIME..KW_IDXPROPERTIES, KW_IGNORE..KW_ITEMS, KW_KEYS..KW_LEFT, KW_LIKE..KW_LONG, KW_MAPJOIN..KW_MINUS, KW_MSCK..KW_NOSCAN, KW_NO_DROP..KW_OFFLINE, KW_OPTION, KW_ORCFILE..KW_OUTPUTFORMAT, KW_OVERWRITE, KW_PARTITIONED..KW_PLUS, KW_PRETTY..KW_RECORDWRITER, 
KW_REGEXP..KW_SCHEMAS, KW_SEMI..KW_TABLES, KW_TBLPROPERTIES..KW_TEXTFILE, KW_TIMESTAMP..KW_TOUCH, KW_TRIGGER..KW_UNARCHIVE, KW_UNDO..KW_UNIONTYPE, KW_UNLOCK..KW_VALUE_TYPE, KW_VIEW, KW_WHILE, KW_WITH}" using multiple alternatives: 3, 4 As a result, alternative(s) 4 were disabled for that input warning(200): org/apache/hadoop/hive/ql/parse/HiveParser.g:1215:29: Decision can match input such as "KW_FORMATTED {KW_ADD..KW_AFTER, KW_ALTER..KW_ANALYZE, KW_ARCHIVE..KW_CASCADE, KW_CHANGE, KW_CLUSTER..KW_COLLECTION, KW_COLUMNS..KW_CREATE, KW_CUBE, KW_CURSOR..KW_DATA, KW_DATABASES..KW_DISABLE, KW_DISTRIBUTE..KW_ELEM_TYPE, KW_ENABLE, KW_ESCAPED, KW_EXCLUSIVE..KW_EXPORT, KW_EXTERNAL..KW_FLOAT, KW_FOR..KW_FORMATTED, KW_FULL, KW_FUNCTIONS..KW_GROUPING, KW_HOLD_DDLTIME..KW_IDXPROPERTIES, KW_IGNORE..KW_ITEMS, KW_KEYS..KW_LEFT, KW_LIKE..KW_LONG, KW_MAPJOIN..KW_MINUS, KW_MSCK..KW_NOSCAN, KW_NO_DROP..KW_OFFLINE, KW_OPTION, KW_ORCFILE..KW_OUTPUTFORMAT, KW_OVERWRITE, KW_PARTITIONED..KW_PLUS, KW_PRETTY..KW_RECORDWRITER, KW_REGEXP..KW_SCHEMAS, KW_SEMI..KW_TABLES, KW_TBLPROPERTIES..KW_TEXTFILE, KW_TIMESTAMP..KW_TOUCH, KW_TRIGGER..KW_UNARCHIVE, KW_UNDO..KW_UNIONTYPE, KW_UNLOCK..KW_VALUE_TYPE, KW_VIEW, KW_WHILE, KW_WITH}" using multiple alternatives: 1, 4 As a result, alternative(s) 4 were disabled for that input warning(200): org/apache/hadoop/hive/ql/parse/HiveParser.g:1215:29: Decision can match input such as "KW_PRETTY Identifier" using multiple alternatives: 3, 4 As a result, alternative(s) 4 were disabled for that input warning(200): org/apache/hadoop/hive/ql/parse/HiveParser.g:1215:29: Decision can match input such as "KW_FORMATTED Identifier" using multiple alternatives: 1, 4 As a result, alternative(s) 4 were disabled for that input warning(200): org/apache/hadoop/hive/ql/parse/HiveParser.g:1215:29: Decision can match input such as "KW_PRETTY KW_PARTITION" using multiple alternatives: 3, 4 As a result, alternative(s) 4 were disabled for that input warning(200): org/apache/hadoop/hive/ql/parse/HiveParser.g:1215:29: Decision can match input such as "KW_FORMATTED KW_PARTITION" using multiple alternatives: 1, 4 As a result, alternative(s) 4 were disabled for that input warning(200): org/apache/hadoop/hive/ql/parse/HiveParser.g:1486:116: Decision can match input such as "KW_STORED KW_AS KW_DIRECTORIES" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): org/apache/hadoop/hive/ql/parse/HiveParser.g:1609:5: Decision can match input such as "KW_STORED KW_AS KW_RCFILE" using multiple alternatives: 3, 7 As a result, alternative(s) 7 were disabled for that input warning(200): org/apache/hadoop/hive/ql/parse/HiveParser.g:1609:5: Decision can match input such as "KW_STORED KW_AS KW_SEQUENCEFILE" using multiple alternatives: 1, 7 As a result, alternative(s) 7 were disabled for that input warning(200): org/apache/hadoop/hive/ql/parse/HiveParser.g:1609:5: Decision can match input such as "KW_STORED KW_AS KW_ORCFILE" using multiple alternatives: 4, 7 As a result, alternative(s) 7 were disabled for that input warning(200): org/apache/hadoop/hive/ql/parse/HiveParser.g:1609:5: Decision can match input such as "KW_STORED KW_AS KW_TEXTFILE" using multiple alternatives: 2, 7 As a result, alternative(s) 7 were disabled for that input warning(200): org/apache/hadoop/hive/ql/parse/HiveParser.g:1609:5: Decision can match input such as "KW_STORED KW_AS KW_INPUTFORMAT" using multiple alternatives: 5, 7 As a result, alternative(s) 7 were disabled for that input warning(200): 
SelectClauseParser.g:149:5: Decision can match input such as "KW_NULL DOT {KW_ADD..KW_AFTER, KW_ALTER..KW_ANALYZE, KW_ARCHIVE..KW_CASCADE, KW_CHANGE, KW_CLUSTER..KW_COLLECTION, KW_COLUMNS..KW_CREATE, KW_CUBE, KW_CURSOR..KW_DATA, KW_DATABASES..KW_DISABLE, KW_DISTRIBUTE..KW_ELEM_TYPE, KW_ENABLE, KW_ESCAPED, KW_EXCLUSIVE..KW_EXPORT, KW_EXTERNAL..KW_FLOAT, KW_FOR..KW_FORMATTED, KW_FULL, KW_FUNCTIONS..KW_GROUPING, KW_HOLD_DDLTIME..KW_IDXPROPERTIES, KW_IGNORE..KW_ITEMS, KW_KEYS..KW_LEFT, KW_LIKE..KW_LONG, KW_MAPJOIN..KW_MINUS, KW_MSCK..KW_NOSCAN, KW_NO_DROP..KW_OFFLINE, KW_OPTION, KW_ORCFILE..KW_OUTPUTFORMAT, KW_OVERWRITE, KW_PARTITION..KW_PLUS, KW_PRETTY..KW_RECORDWRITER, KW_REGEXP..KW_SCHEMAS, KW_SEMI..KW_TABLES, KW_TBLPROPERTIES..KW_TEXTFILE, KW_TIMESTAMP..KW_TOUCH, KW_TRIGGER..KW_UNARCHIVE, KW_UNDO..KW_UNIONTYPE, KW_UNLOCK..KW_VALUE_TYPE, KW_VIEW, KW_WHILE, KW_WITH}" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): SelectClauseParser.g:149:5: Decision can match input such as "KW_NULL DOT Identifier" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): FromClauseParser.g:127:2: Decision can match input such as "KW_LATERAL KW_VIEW KW_OUTER" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): FromClauseParser.g:179:25: Decision can match input such as "LPAREN StringLiteral EQUAL" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): FromClauseParser.g:179:25: Decision can match input such as "LPAREN StringLiteral COMMA" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): FromClauseParser.g:179:25: Decision can match input such as "LPAREN StringLiteral RPAREN" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): FromClauseParser.g:179:68: Decision can match input such as "Identifier LPAREN BigintLiteral" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): FromClauseParser.g:179:68: Decision can match input such as "Identifier LPAREN KW_CAST" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): FromClauseParser.g:179:68: Decision can match input such as "Identifier LPAREN KW_IF" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): FromClauseParser.g:179:68: Decision can match input such as "Identifier LPAREN CharSetName" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): FromClauseParser.g:179:68: Decision can match input such as "Identifier LPAREN LPAREN" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): FromClauseParser.g:179:68: Decision can match input such as "Identifier LPAREN KW_EXISTS" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): FromClauseParser.g:179:68: Decision can match input such as "Identifier LPAREN KW_CASE" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): FromClauseParser.g:179:68: Decision can match input such as "Identifier LPAREN TinyintLiteral" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): 
FromClauseParser.g:179:68: Decision can match input such as "Identifier LPAREN KW_NULL" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): FromClauseParser.g:179:68: Decision can match input such as "Identifier LPAREN SmallintLiteral" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): FromClauseParser.g:179:68: Decision can match input such as "Identifier LPAREN KW_FALSE" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): FromClauseParser.g:179:68: Decision can match input such as "Identifier LPAREN KW_UNIONTYPE" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): FromClauseParser.g:179:68: Decision can match input such as "Identifier LPAREN Number" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): FromClauseParser.g:179:68: Decision can match input such as "Identifier LPAREN Identifier" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): FromClauseParser.g:179:68: Decision can match input such as "Identifier LPAREN StringLiteral" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): FromClauseParser.g:179:68: Decision can match input such as "Identifier LPAREN DecimalLiteral" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): FromClauseParser.g:179:68: Decision can match input such as "Identifier LPAREN {KW_ADD..KW_AFTER, KW_ALTER..KW_ANALYZE, KW_ARCHIVE, KW_AS..KW_CASCADE, KW_CHANGE, KW_CLUSTER..KW_COLLECTION, KW_COLUMNS..KW_CREATE, KW_CUBE, KW_CURSOR..KW_DATA, KW_DATABASES, KW_DATETIME..KW_DISABLE, KW_DISTRIBUTE..KW_ELEM_TYPE, KW_ENABLE, KW_ESCAPED, KW_EXCLUSIVE, KW_EXPLAIN..KW_EXPORT, KW_EXTERNAL, KW_FETCH..KW_FLOAT, KW_FOR..KW_FORMATTED, KW_FULL, KW_FUNCTIONS..KW_GROUPING, KW_HOLD_DDLTIME..KW_IDXPROPERTIES, KW_IGNORE..KW_ITEMS, KW_KEYS..KW_LEFT, KW_LIKE..KW_LONG, KW_MAPJOIN..KW_MINUS, KW_MSCK..KW_NOSCAN, KW_NO_DROP, KW_OF..KW_OFFLINE, KW_OPTION, KW_ORCFILE..KW_OUTPUTFORMAT, KW_OVERWRITE, KW_PARTITION..KW_PLUS, KW_PRETTY..KW_RECORDWRITER, KW_REGEXP..KW_SCHEMAS, KW_SEMI..KW_STRING, KW_TABLE..KW_TABLES, KW_TBLPROPERTIES..KW_TEXTFILE, KW_TIMESTAMP..KW_TOUCH, KW_TRIGGER, KW_TRUNCATE..KW_UNARCHIVE, KW_UNDO..KW_UNION, KW_UNLOCK..KW_VALUE_TYPE, KW_VIEW, KW_WHILE, KW_WITH}" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): FromClauseParser.g:179:68: Decision can match input such as "Identifier LPAREN KW_STRUCT" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): FromClauseParser.g:179:68: Decision can match input such as "Identifier LPAREN KW_DATE" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): FromClauseParser.g:179:68: Decision can match input such as "Identifier LPAREN KW_NOT" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): FromClauseParser.g:179:68: Decision can match input such as "Identifier LPAREN {MINUS, PLUS, TILDE}" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): FromClauseParser.g:179:68: Decision can match input such as "Identifier LPAREN KW_MAP" using multiple 
alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): FromClauseParser.g:179:68: Decision can match input such as "Identifier LPAREN KW_TRUE" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): FromClauseParser.g:179:68: Decision can match input such as "Identifier LPAREN KW_ARRAY" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): FromClauseParser.g:237:16: Decision can match input such as "Identifier LPAREN BigintLiteral" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): FromClauseParser.g:237:16: Decision can match input such as "Identifier LPAREN KW_CAST" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): FromClauseParser.g:237:16: Decision can match input such as "Identifier LPAREN KW_IF" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): FromClauseParser.g:237:16: Decision can match input such as "Identifier LPAREN CharSetName" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): FromClauseParser.g:237:16: Decision can match input such as "Identifier LPAREN LPAREN" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): FromClauseParser.g:237:16: Decision can match input such as "Identifier LPAREN KW_EXISTS" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): FromClauseParser.g:237:16: Decision can match input such as "Identifier LPAREN KW_CASE" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): FromClauseParser.g:237:16: Decision can match input such as "Identifier LPAREN TinyintLiteral" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): FromClauseParser.g:237:16: Decision can match input such as "Identifier LPAREN KW_NULL" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): FromClauseParser.g:237:16: Decision can match input such as "Identifier LPAREN SmallintLiteral" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): FromClauseParser.g:237:16: Decision can match input such as "Identifier LPAREN KW_FALSE" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): FromClauseParser.g:237:16: Decision can match input such as "Identifier LPAREN KW_UNIONTYPE" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): FromClauseParser.g:237:16: Decision can match input such as "Identifier LPAREN Number" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): FromClauseParser.g:237:16: Decision can match input such as "Identifier LPAREN Identifier" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): FromClauseParser.g:237:16: Decision can match input such as "Identifier LPAREN StringLiteral" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): FromClauseParser.g:237:16: Decision can match input such as "Identifier LPAREN DecimalLiteral" using 
multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): FromClauseParser.g:237:16: Decision can match input such as "Identifier LPAREN {KW_ADD..KW_AFTER, KW_ALTER..KW_ANALYZE, KW_ARCHIVE, KW_AS..KW_CASCADE, KW_CHANGE, KW_CLUSTER..KW_COLLECTION, KW_COLUMNS..KW_CREATE, KW_CUBE, KW_CURSOR..KW_DATA, KW_DATABASES, KW_DATETIME..KW_DISABLE, KW_DISTRIBUTE..KW_ELEM_TYPE, KW_ENABLE, KW_ESCAPED, KW_EXCLUSIVE, KW_EXPLAIN..KW_EXPORT, KW_EXTERNAL, KW_FETCH..KW_FLOAT, KW_FOR..KW_FORMATTED, KW_FULL, KW_FUNCTIONS..KW_GROUPING, KW_HOLD_DDLTIME..KW_IDXPROPERTIES, KW_IGNORE..KW_ITEMS, KW_KEYS..KW_LEFT, KW_LIKE..KW_LONG, KW_MAPJOIN..KW_MINUS, KW_MSCK..KW_NOSCAN, KW_NO_DROP, KW_OF..KW_OFFLINE, KW_OPTION, KW_ORCFILE..KW_OUTPUTFORMAT, KW_OVERWRITE, KW_PARTITION..KW_PLUS, KW_PRETTY..KW_RECORDWRITER, KW_REGEXP..KW_SCHEMAS, KW_SEMI..KW_STRING, KW_TABLE..KW_TABLES, KW_TBLPROPERTIES..KW_TEXTFILE, KW_TIMESTAMP..KW_TOUCH, KW_TRIGGER, KW_TRUNCATE..KW_UNARCHIVE, KW_UNDO..KW_UNION, KW_UNLOCK..KW_VALUE_TYPE, KW_VIEW, KW_WHILE, KW_WITH}" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): FromClauseParser.g:237:16: Decision can match input such as "Identifier LPAREN KW_STRUCT" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): FromClauseParser.g:237:16: Decision can match input such as "Identifier LPAREN KW_DATE" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): FromClauseParser.g:237:16: Decision can match input such as "Identifier LPAREN KW_NOT" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): FromClauseParser.g:237:16: Decision can match input such as "Identifier LPAREN {MINUS, PLUS, TILDE}" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): FromClauseParser.g:237:16: Decision can match input such as "Identifier LPAREN KW_MAP" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): FromClauseParser.g:237:16: Decision can match input such as "Identifier LPAREN KW_TRUE" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): FromClauseParser.g:237:16: Decision can match input such as "Identifier LPAREN KW_ARRAY" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): IdentifiersParser.g:68:4: Decision can match input such as "LPAREN KW_NOT CharSetName" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): IdentifiersParser.g:68:4: Decision can match input such as "LPAREN KW_CASE CharSetName" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): IdentifiersParser.g:68:4: Decision can match input such as "LPAREN LPAREN CharSetName" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): IdentifiersParser.g:68:4: Decision can match input such as "LPAREN KW_NOT Number" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): IdentifiersParser.g:68:4: Decision can match input such as "LPAREN KW_NOT KW_UNIONTYPE" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): IdentifiersParser.g:68:4: Decision can 
match input such as "LPAREN KW_NULL {KW_LIKE, KW_REGEXP, KW_RLIKE}" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): IdentifiersParser.g:68:4: Decision can match input such as "LPAREN KW_NOT KW_DATE" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): IdentifiersParser.g:68:4: Decision can match input such as "LPAREN KW_NOT KW_NOT" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): IdentifiersParser.g:68:4: Decision can match input such as "LPAREN KW_CASE KW_NOT" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): IdentifiersParser.g:68:4: Decision can match input such as "LPAREN LPAREN KW_NOT" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): IdentifiersParser.g:68:4: Decision can match input such as "LPAREN KW_CASE Identifier" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): IdentifiersParser.g:68:4: Decision can match input such as "LPAREN KW_NULL AMPERSAND" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): IdentifiersParser.g:68:4: Decision can match input such as "LPAREN LPAREN SmallintLiteral" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): IdentifiersParser.g:68:4: Decision can match input such as "LPAREN LPAREN TinyintLiteral" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): IdentifiersParser.g:68:4: Decision can match input such as "LPAREN LPAREN BigintLiteral" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): IdentifiersParser.g:68:4: Decision can match input such as "LPAREN KW_NOT KW_FALSE" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): IdentifiersParser.g:68:4: Decision can match input such as "LPAREN KW_NOT KW_STRUCT" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): IdentifiersParser.g:68:4: Decision can match input such as "LPAREN KW_NOT KW_TRUE" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): IdentifiersParser.g:68:4: Decision can match input such as "LPAREN KW_NOT KW_ARRAY" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): IdentifiersParser.g:68:4: Decision can match input such as "LPAREN LPAREN Number" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): IdentifiersParser.g:68:4: Decision can match input such as "LPAREN KW_NULL KW_IS" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): IdentifiersParser.g:68:4: Decision can match input such as "LPAREN KW_NOT StringLiteral" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): IdentifiersParser.g:68:4: Decision can match input such as "LPAREN KW_NOT {MINUS, PLUS, TILDE}" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): IdentifiersParser.g:68:4: Decision can match input such as "LPAREN KW_CASE {MINUS, PLUS, TILDE}" using 
multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): IdentifiersParser.g:68:4: Decision can match input such as "LPAREN LPAREN {MINUS, PLUS, TILDE}" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): IdentifiersParser.g:68:4: Decision can match input such as "LPAREN KW_NOT KW_MAP" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): IdentifiersParser.g:68:4: Decision can match input such as "LPAREN KW_CASE KW_MAP" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): IdentifiersParser.g:68:4: Decision can match input such as "LPAREN LPAREN KW_MAP" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): IdentifiersParser.g:68:4: Decision can match input such as "LPAREN KW_NULL LPAREN" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): IdentifiersParser.g:68:4: Decision can match input such as "LPAREN LPAREN DecimalLiteral" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): IdentifiersParser.g:68:4: Decision can match input such as "LPAREN KW_NULL RPAREN" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): IdentifiersParser.g:68:4: Decision can match input such as "LPAREN StringLiteral StringLiteral" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): IdentifiersParser.g:68:4: Decision can match input such as "LPAREN KW_NOT KW_IF" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): IdentifiersParser.g:68:4: Decision can match input such as "LPAREN KW_CASE KW_TRUE" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): IdentifiersParser.g:68:4: Decision can match input such as "LPAREN KW_CASE KW_IF" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): IdentifiersParser.g:68:4: Decision can match input such as "LPAREN LPAREN KW_IF" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): IdentifiersParser.g:68:4: Decision can match input such as "LPAREN KW_NULL BITWISEOR" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): IdentifiersParser.g:68:4: Decision can match input such as "LPAREN KW_NULL DOT" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): IdentifiersParser.g:68:4: Decision can match input such as "LPAREN LPAREN KW_EXISTS" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): IdentifiersParser.g:68:4: Decision can match input such as "LPAREN KW_CASE StringLiteral" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): IdentifiersParser.g:68:4: Decision can match input such as "LPAREN KW_CASE KW_FALSE" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): IdentifiersParser.g:68:4: Decision can match input such as "LPAREN KW_NOT Identifier" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input 
warning(200): IdentifiersParser.g:68:4: Decision can match input such as "LPAREN KW_NULL {DIV..DIVIDE, MOD, STAR}" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): IdentifiersParser.g:68:4: Decision can match input such as "LPAREN KW_CASE {KW_ADD..KW_AFTER, KW_ALTER..KW_ANALYZE, KW_ARCHIVE, KW_AS..KW_CASCADE, KW_CHANGE, KW_CLUSTER..KW_COLLECTION, KW_COLUMNS..KW_CREATE, KW_CUBE, KW_CURSOR..KW_DATA, KW_DATABASES, KW_DATETIME..KW_DISABLE, KW_DISTRIBUTE..KW_ELEM_TYPE, KW_ENABLE, KW_ESCAPED, KW_EXCLUSIVE, KW_EXPLAIN..KW_EXPORT, KW_EXTERNAL, KW_FETCH..KW_FLOAT, KW_FOR..KW_FORMATTED, KW_FULL, KW_FUNCTIONS..KW_GROUPING, KW_HOLD_DDLTIME..KW_IDXPROPERTIES, KW_IGNORE..KW_ITEMS, KW_KEYS..KW_LEFT, KW_LIKE..KW_LONG, KW_MAPJOIN..KW_MINUS, KW_MSCK..KW_NOSCAN, KW_NO_DROP, KW_OF..KW_OFFLINE, KW_OPTION, KW_ORCFILE..KW_OUTPUTFORMAT, KW_OVERWRITE, KW_PARTITION..KW_PLUS, KW_PRETTY..KW_RECORDWRITER, KW_REGEXP..KW_SCHEMAS, KW_SEMI..KW_STRING, KW_TABLE..KW_TABLES, KW_TBLPROPERTIES..KW_TEXTFILE, KW_TIMESTAMP..KW_TOUCH, KW_TRIGGER, KW_TRUNCATE..KW_UNARCHIVE, KW_UNDO..KW_UNION, KW_UNLOCK..KW_VALUE_TYPE, KW_VIEW, KW_WHILE, KW_WITH}" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): IdentifiersParser.g:68:4: Decision can match input such as "LPAREN KW_NULL NOTEQUAL" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): IdentifiersParser.g:68:4: Decision can match input such as "LPAREN KW_NOT KW_EXISTS" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): IdentifiersParser.g:68:4: Decision can match input such as "LPAREN KW_NULL EQUAL_NS" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): IdentifiersParser.g:68:4: Decision can match input such as "LPAREN KW_EXISTS LPAREN" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): IdentifiersParser.g:68:4: Decision can match input such as "LPAREN KW_NULL LESSTHAN" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): IdentifiersParser.g:68:4: Decision can match input such as "LPAREN KW_NULL LESSTHANOREQUALTO" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): IdentifiersParser.g:68:4: Decision can match input such as "LPAREN LPAREN KW_ARRAY" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): IdentifiersParser.g:68:4: Decision can match input such as "LPAREN LPAREN KW_NULL" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): IdentifiersParser.g:68:4: Decision can match input such as "LPAREN LPAREN KW_UNIONTYPE" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): IdentifiersParser.g:68:4: Decision can match input such as "LPAREN KW_NULL KW_OR" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): IdentifiersParser.g:68:4: Decision can match input such as "LPAREN KW_DATE StringLiteral" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): IdentifiersParser.g:68:4: Decision can match input such as "LPAREN KW_CASE KW_DATE" using multiple alternatives: 1, 2 As a result, 
alternative(s) 2 were disabled for that input warning(200): IdentifiersParser.g:68:4: Decision can match input such as "LPAREN LPAREN KW_STRUCT" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): IdentifiersParser.g:68:4: Decision can match input such as "LPAREN KW_NULL EQUAL" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): IdentifiersParser.g:68:4: Decision can match input such as "LPAREN KW_CAST LPAREN" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): IdentifiersParser.g:68:4: Decision can match input such as "LPAREN LPAREN LPAREN" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): IdentifiersParser.g:68:4: Decision can match input such as "LPAREN KW_NOT LPAREN" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): IdentifiersParser.g:68:4: Decision can match input such as "LPAREN KW_CASE LPAREN" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): IdentifiersParser.g:68:4: Decision can match input such as "LPAREN KW_CASE KW_STRUCT" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): IdentifiersParser.g:68:4: Decision can match input such as "LPAREN KW_CASE KW_UNIONTYPE" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): IdentifiersParser.g:68:4: Decision can match input such as "LPAREN KW_NULL GREATERTHAN" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): IdentifiersParser.g:68:4: Decision can match input such as "LPAREN KW_CASE KW_ARRAY" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): IdentifiersParser.g:68:4: Decision can match input such as "LPAREN LPAREN KW_DATE" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): IdentifiersParser.g:68:4: Decision can match input such as "LPAREN KW_NULL GREATERTHANOREQUALTO" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): IdentifiersParser.g:68:4: Decision can match input such as "LPAREN KW_CASE KW_NULL" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): IdentifiersParser.g:68:4: Decision can match input such as "LPAREN KW_CASE KW_EXISTS" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): IdentifiersParser.g:68:4: Decision can match input such as "LPAREN KW_CASE TinyintLiteral" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): IdentifiersParser.g:68:4: Decision can match input such as "LPAREN KW_NULL KW_BETWEEN" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): IdentifiersParser.g:68:4: Decision can match input such as "LPAREN KW_CASE SmallintLiteral" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): IdentifiersParser.g:68:4: Decision can match input such as "LPAREN KW_NULL {MINUS, PLUS}" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): 
[Hive QA pre-commit console output (condensed): ANTLR warning(200) messages for IdentifiersParser.g reporting grammar decisions that can match the same input via multiple alternatives, with the later alternatives disabled for those inputs; followed by the Maven build log, which compiles and installs the 0.13.0-SNAPSHOT artifacts for hive-exec (including the shaded, dependency-reduced hive-exec jar), hive-service, hive-jdbc, hive-beeline, hive-cli, hive-contrib, hive-hbase-handler, and the hive-hcatalog modules (core, pig-adapter, server-extensions, webhcat-java-client, webhcat). Surefire tests are skipped during the build phase; compiler output is limited to deprecation/unchecked notes and sun.misc.Signal proprietary-API warnings in Beeline and the CLI. The log continues with javadoc generation for the WebHCat server (Velocity startup and per-file parsing messages).]
/data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/src/main/java/org/apache/hive/hcatalog/templeton/HcatDelegator.java] [parsing completed 43ms] [parsing started /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/src/main/java/org/apache/hive/hcatalog/templeton/StatusDelegator.java] [parsing completed 6ms] [parsing started /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/src/main/java/org/apache/hive/hcatalog/templeton/QueueStatusBean.java] [parsing completed 1ms] [parsing started /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/src/main/java/org/apache/hive/hcatalog/templeton/CompleteDelegator.java] [parsing completed 14ms] [parsing started /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/src/main/java/org/apache/hive/hcatalog/templeton/GroupPermissionsDesc.java] [parsing completed 5ms] [parsing started /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/src/main/java/org/apache/hive/hcatalog/templeton/AppConfig.java] [parsing completed 27ms] [parsing started /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/src/main/java/org/apache/hive/hcatalog/templeton/HcatException.java] [parsing completed 1ms] [parsing started /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/src/main/java/org/apache/hive/hcatalog/templeton/QueueException.java] [parsing completed 0ms] [parsing started /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/src/main/java/org/apache/hive/hcatalog/templeton/TableLikeDesc.java] [parsing completed 4ms] [parsing started /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/src/main/java/org/apache/hive/hcatalog/templeton/TableDesc.java] [parsing completed 12ms] [parsing started /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/src/main/java/org/apache/hive/hcatalog/templeton/WadlConfig.java] [parsing completed 0ms] [parsing started /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/src/main/java/org/apache/hive/hcatalog/templeton/BusyException.java] [parsing completed 1ms] [parsing started /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/src/main/java/org/apache/hive/hcatalog/templeton/ExecService.java] [parsing completed 4ms] [parsing started /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/src/main/java/org/apache/hive/hcatalog/templeton/NotAuthorizedException.java] [parsing completed 0ms] [parsing started /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/src/main/java/org/apache/hive/hcatalog/templeton/EnqueueBean.java] [parsing completed 0ms] [parsing started /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/src/main/java/org/apache/hive/hcatalog/templeton/SimpleWebException.java] [parsing completed 6ms] [parsing started /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/src/main/java/org/apache/hive/hcatalog/templeton/ListDelegator.java] [parsing completed 5ms] [parsing started /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/src/main/java/org/apache/hive/hcatalog/templeton/Main.java] [parsing completed 13ms] [parsing started /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/src/main/java/org/apache/hive/hcatalog/templeton/SecureProxySupport.java] [parsing completed 11ms] [parsing started 
/data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/src/main/java/org/apache/hive/hcatalog/templeton/MaxByteArrayOutputStream.java] [parsing completed 1ms] [parsing started /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/src/main/java/org/apache/hive/hcatalog/templeton/ProxyUserSupport.java] [parsing completed 7ms] [parsing started /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/src/main/java/org/apache/hive/hcatalog/templeton/UgiFactory.java] [parsing completed 3ms] [parsing started /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/src/main/java/org/apache/hive/hcatalog/templeton/CallbackFailedException.java] [parsing completed 1ms] [parsing started /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/src/main/java/org/apache/hive/hcatalog/templeton/TablePropertyDesc.java] [parsing completed 0ms] Loading source files for package org.apache.hive.hcatalog.templeton.tool... [parsing started /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/src/main/java/org/apache/hive/hcatalog/templeton/tool/ZooKeeperStorage.java] [parsing completed 19ms] [parsing started /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/src/main/java/org/apache/hive/hcatalog/templeton/tool/NullRecordReader.java] [parsing completed 1ms] [parsing started /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/src/main/java/org/apache/hive/hcatalog/templeton/tool/PigJobIDParser.java] [parsing completed 1ms] [parsing started /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/src/main/java/org/apache/hive/hcatalog/templeton/tool/TempletonStorage.java] [parsing completed 1ms] [parsing started /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/src/main/java/org/apache/hive/hcatalog/templeton/tool/JobIDParser.java] [parsing completed 1ms] [parsing started /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/src/main/java/org/apache/hive/hcatalog/templeton/tool/JobState.java] [parsing completed 6ms] [parsing started /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/src/main/java/org/apache/hive/hcatalog/templeton/tool/TempletonUtils.java] [parsing completed 12ms] [parsing started /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/src/main/java/org/apache/hive/hcatalog/templeton/tool/HDFSStorage.java] [parsing completed 10ms] [parsing started /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/src/main/java/org/apache/hive/hcatalog/templeton/tool/DelegationTokenCache.java] [parsing completed 0ms] [parsing started /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/src/main/java/org/apache/hive/hcatalog/templeton/tool/JarJobIDParser.java] [parsing completed 1ms] [parsing started /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/src/main/java/org/apache/hive/hcatalog/templeton/tool/JobSubmissionConstants.java] [parsing completed 1ms] [parsing started /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/src/main/java/org/apache/hive/hcatalog/templeton/tool/ZooKeeperCleanup.java] [parsing completed 2ms] [parsing started /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/src/main/java/org/apache/hive/hcatalog/templeton/tool/JobStateTracker.java] [parsing completed 8ms] [parsing started 
/data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/src/main/java/org/apache/hive/hcatalog/templeton/tool/NotFoundException.java] [parsing completed 0ms] [parsing started /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/src/main/java/org/apache/hive/hcatalog/templeton/tool/SingleInputFormat.java] [parsing completed 1ms] [parsing started /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/src/main/java/org/apache/hive/hcatalog/templeton/tool/LogRetriever.java] [parsing completed 7ms] [parsing started /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/src/main/java/org/apache/hive/hcatalog/templeton/tool/NullSplit.java] [parsing completed 1ms] [parsing started /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/src/main/java/org/apache/hive/hcatalog/templeton/tool/HDFSCleanup.java] [parsing completed 7ms] [parsing started /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/src/main/java/org/apache/hive/hcatalog/templeton/tool/TempletonControllerJob.java] [parsing completed 7ms] [parsing started /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/src/main/java/org/apache/hive/hcatalog/templeton/tool/HiveJobIDParser.java] [parsing completed 0ms] [parsing started /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/src/main/java/org/apache/hive/hcatalog/templeton/tool/LaunchMapper.java] [parsing completed 19ms] [parsing started /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/src/main/java/org/apache/hive/hcatalog/templeton/tool/TrivialExecService.java] [parsing completed 6ms] Constructing Javadoc information... [search path for source files: /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/src/main/java] [search path for class files: 
/usr/java/jdk1.6.0_34/jre/lib/resources.jar,/usr/java/jdk1.6.0_34/jre/lib/rt.jar,/usr/java/jdk1.6.0_34/jre/lib/sunrsasign.jar,/usr/java/jdk1.6.0_34/jre/lib/jsse.jar,/usr/java/jdk1.6.0_34/jre/lib/jce.jar,/usr/java/jdk1.6.0_34/jre/lib/charsets.jar,/usr/java/jdk1.6.0_34/jre/lib/modules/jdk.boot.jar,/usr/java/jdk1.6.0_34/jre/classes,/usr/java/jdk1.6.0_34/jre/lib/ext/localedata.jar,/usr/java/jdk1.6.0_34/jre/lib/ext/sunpkcs11.jar,/usr/java/jdk1.6.0_34/jre/lib/ext/sunjce_provider.jar,/usr/java/jdk1.6.0_34/jre/lib/ext/dnsns.jar,/data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/target/classes,/data/hive-ptest/working/maven/org/datanucleus/datanucleus-api-jdo/3.2.1/datanucleus-api-jdo-3.2.1.jar,/data/hive-ptest/working/maven/org/antlr/stringtemplate/3.2.1/stringtemplate-3.2.1.jar,/data/hive-ptest/working/maven/xml-apis/xml-apis/1.3.04/xml-apis-1.3.04.jar,/data/hive-ptest/working/maven/com/sun/jersey/jersey-json/1.14/jersey-json-1.14.jar,/data/hive-ptest/working/maven/org/apache/zookeeper/zookeeper/3.4.5/zookeeper-3.4.5.jar,/data/hive-ptest/working/maven/commons-net/commons-net/1.4.1/commons-net-1.4.1.jar,/data/hive-ptest/working/maven/javax/mail/mail/1.4.1/mail-1.4.1.jar,/data/hive-ptest/working/maven/javax/mail/mail/1.4.1/activation.jar,/data/hive-ptest/working/maven/org/datanucleus/datanucleus-rdbms/3.2.1/datanucleus-rdbms-3.2.1.jar,/data/hive-ptest/working/maven/commons-httpclient/commons-httpclient/3.0.1/commons-httpclient-3.0.1.jar,/data/hive-ptest/working/apache-svn-trunk-source/hcatalog/core/target/hive-hcatalog-core-0.13.0-SNAPSHOT.jar,/data/hive-ptest/working/maven/org/codehaus/jackson/jackson-core-asl/1.9.2/jackson-core-asl-1.9.2.jar,/data/hive-ptest/working/maven/org/apache/thrift/libthrift/0.9.0/libthrift-0.9.0.jar,/data/hive-ptest/working/maven/org/eclipse/jetty/aggregate/jetty-all-server/7.6.0.v20120127/jetty-all-server-7.6.0.v20120127.jar,/data/hive-ptest/working/maven/xerces/xercesImpl/2.9.1/xercesImpl-2.9.1.jar,/data/hive-ptest/working/maven/antlr/antlr/2.7.7/antlr-2.7.7.jar,/data/hive-ptest/working/maven/commons-configuration/commons-configuration/1.6/commons-configuration-1.6.jar,/data/hive-ptest/working/maven/org/apache/httpcomponents/httpclient/4.2.5/httpclient-4.2.5.jar,/data/hive-ptest/working/apache-svn-trunk-source/serde/target/hive-serde-0.13.0-SNAPSHOT.jar,/data/hive-ptest/working/maven/javax/servlet/servlet-api/2.5/servlet-api-2.5.jar,/data/hive-ptest/working/maven/org/apache/velocity/velocity/1.5/velocity-1.5.jar,/data/hive-ptest/working/maven/com/jolbox/bonecp/0.7.1.RELEASE/bonecp-0.7.1.RELEASE.jar,/data/hive-ptest/working/maven/javax/xml/stream/stax-api/1.0-2/stax-api-1.0-2.jar,/data/hive-ptest/working/maven/org/slf4j/slf4j-log4j12/1.7.5/slf4j-log4j12-1.7.5.jar,/data/hive-ptest/working/maven/javax/xml/bind/jaxb-api/2.2.2/jaxb-api-2.2.2.jar,/data/hive-ptest/working/maven/commons-lang/commons-lang/2.4/commons-lang-2.4.jar,/data/hive-ptest/working/maven/com/sun/jersey/jersey-core/1.14/jersey-core-1.14.jar,/data/hive-ptest/working/maven/org/apache/commons/commons-math/2.1/commons-math-2.1.jar,/data/hive-ptest/working/maven/org/apache/httpcomponents/httpcore/4.2.4/httpcore-4.2.4.jar,/data/hive-ptest/working/maven/org/xerial/snappy/snappy-java/1.0.5/snappy-java-1.0.5.jar,/data/hive-ptest/working/maven/commons-collections/commons-collections/3.2.1/commons-collections-3.2.1.jar,/data/hive-ptest/working/maven/org/antlr/ST4/4.0.4/ST4-4.0.4.jar,/data/hive-ptest/working/maven/org/apache/commons/commons-exec/1.1/commons-exec-1.1.jar,/data/hive-ptest/working/maven
/com/google/guava/guava/11.0.2/guava-11.0.2.jar,/data/hive-ptest/working/maven/org/datanucleus/datanucleus-core/3.2.2/datanucleus-core-3.2.2.jar,/data/hive-ptest/working/maven/org/apache/hadoop/hadoop-core/1.2.1/hadoop-core-1.2.1.jar,/data/hive-ptest/working/maven/org/tukaani/xz/1.0/xz-1.0.jar,/data/hive-ptest/working/maven/org/mortbay/jetty/servlet-api-2.5/6.1.14/servlet-api-2.5-6.1.14.jar,/data/hive-ptest/working/maven/javax/activation/activation/1.1/activation-1.1.jar,/data/hive-ptest/working/maven/org/codehaus/jackson/jackson-jaxrs/1.9.2/jackson-jaxrs-1.9.2.jar,/data/hive-ptest/working/maven/stax/stax-api/1.0.1/stax-api-1.0.1.jar,/data/hive-ptest/working/maven/org/codehaus/jettison/jettison/1.1/jettison-1.1.jar,/data/hive-ptest/working/maven/org/apache/commons/commons-compress/1.4.1/commons-compress-1.4.1.jar,/data/hive-ptest/working/maven/org/antlr/antlr-runtime/3.4/antlr-runtime-3.4.jar,/data/hive-ptest/working/maven/com/sun/jersey/jersey-server/1.14/jersey-server-1.14.jar,/usr/java/jdk1.6.0_34/jre/../lib/tools.jar,/data/hive-ptest/working/maven/org/apache/ant/ant/1.9.1/ant-1.9.1.jar,/data/hive-ptest/working/maven/io/netty/netty/3.4.0.Final/netty-3.4.0.Final.jar,/data/hive-ptest/working/maven/org/slf4j/jul-to-slf4j/1.7.5/jul-to-slf4j-1.7.5.jar,/data/hive-ptest/working/maven/com/sun/jersey/contribs/wadl-resourcedoc-doclet/1.4/wadl-resourcedoc-doclet-1.4.jar,/data/hive-ptest/working/maven/org/mortbay/jetty/jetty/6.1.26/jetty-6.1.26.jar,/data/hive-ptest/working/maven/org/apache/avro/avro-mapred/1.7.5/avro-mapred-1.7.5.jar,/data/hive-ptest/working/maven/oro/oro/2.0.8/oro-2.0.8.jar,/data/hive-ptest/working/maven/org/eclipse/jdt/core/3.1.1/core-3.1.1.jar,/data/hive-ptest/working/maven/javax/jdo/jdo-api/3.0.1/jdo-api-3.0.1.jar,/data/hive-ptest/working/maven/javax/transaction/jta/1.1/jta-1.1.jar,/data/hive-ptest/working/maven/log4j/log4j/1.2.16/log4j-1.2.16.jar,/data/hive-ptest/working/apache-svn-trunk-source/common/target/hive-common-0.13.0-SNAPSHOT.jar,/data/hive-ptest/working/maven/org/apache/avro/avro-ipc/1.7.5/avro-ipc-1.7.5-tests.jar,/data/hive-ptest/working/apache-svn-trunk-source/ql/target/hive-exec-0.13.0-SNAPSHOT.jar,/data/hive-ptest/working/maven/org/apache/geronimo/specs/geronimo-annotation_1.0_spec/1.1.1/geronimo-annotation_1.0_spec-1.1.1.jar,/data/hive-ptest/working/maven/org/mortbay/jetty/servlet-api/2.5-20081211/servlet-api-2.5-20081211.jar,/data/hive-ptest/working/maven/org/apache/avro/avro-ipc/1.7.5/avro-ipc-1.7.5.jar,/data/hive-ptest/working/maven/net/java/dev/jets3t/jets3t/0.6.1/jets3t-0.6.1.jar,/data/hive-ptest/working/maven/com/sun/jersey/jersey-servlet/1.14/jersey-servlet-1.14.jar,/data/hive-ptest/working/maven/commons-logging/commons-logging/1.1.3/commons-logging-1.1.3.jar,/data/hive-ptest/working/maven/com/google/code/findbugs/jsr305/1.3.9/jsr305-1.3.9.jar,/data/hive-ptest/working/maven/xmlenc/xmlenc/0.52/xmlenc-0.52.jar,/data/hive-ptest/working/maven/org/mortbay/jetty/jsp-2.1/6.1.14/jsp-2.1-6.1.14.jar,/data/hive-ptest/working/maven/commons-el/commons-el/1.0/commons-el-1.0.jar,/data/hive-ptest/working/maven/jline/jline/0.9.94/jline-0.9.94.jar,/data/hive-ptest/working/maven/org/mortbay/jetty/jetty-util/6.1.26/jetty-util-6.1.26.jar,/data/hive-ptest/working/maven/org/slf4j/slf4j-api/1.7.5/slf4j-api-1.7.5.jar,/data/hive-ptest/working/maven/org/codehaus/jackson/jackson-xc/1.9.2/jackson-xc-1.9.2.jar,/data/hive-ptest/working/maven/commons-beanutils/commons-beanutils-core/1.8.0/commons-beanutils-core-1.8.0.jar,/data/hive-ptest/working/maven/ant/ant/1.6.5/ant-1.6.5.jar,/data/h
ive-ptest/working/apache-svn-trunk-source/cli/target/hive-cli-0.13.0-SNAPSHOT.jar,/data/hive-ptest/working/maven/org/apache/thrift/libfb303/0.9.0/libfb303-0.9.0.jar,/data/hive-ptest/working/maven/org/codehaus/groovy/groovy-all/2.1.6/groovy-all-2.1.6.jar,/data/hive-ptest/working/maven/org/apache/derby/derby/10.4.2.0/derby-10.4.2.0.jar,/data/hive-ptest/working/maven/org/apache/derby/derby/10.4.2.0/derbyLocale_cs.jar,/data/hive-ptest/working/maven/org/apache/derby/derby/10.4.2.0/derbyLocale_de_DE.jar,/data/hive-ptest/working/maven/org/apache/derby/derby/10.4.2.0/derbyLocale_es.jar,/data/hive-ptest/working/maven/org/apache/derby/derby/10.4.2.0/derbyLocale_fr.jar,/data/hive-ptest/working/maven/org/apache/derby/derby/10.4.2.0/derbyLocale_hu.jar,/data/hive-ptest/working/maven/org/apache/derby/derby/10.4.2.0/derbyLocale_it.jar,/data/hive-ptest/working/maven/org/apache/derby/derby/10.4.2.0/derbyLocale_ja_JP.jar,/data/hive-ptest/working/maven/org/apache/derby/derby/10.4.2.0/derbyLocale_ko_KR.jar,/data/hive-ptest/working/maven/org/apache/derby/derby/10.4.2.0/derbyLocale_pl.jar,/data/hive-ptest/working/maven/org/apache/derby/derby/10.4.2.0/derbyLocale_pt_BR.jar,/data/hive-ptest/working/maven/org/apache/derby/derby/10.4.2.0/derbyLocale_ru.jar,/data/hive-ptest/working/maven/org/apache/derby/derby/10.4.2.0/derbyLocale_zh_CN.jar,/data/hive-ptest/working/maven/org/apache/derby/derby/10.4.2.0/derbyLocale_zh_TW.jar,/data/hive-ptest/working/apache-svn-trunk-source/metastore/target/hive-metastore-0.13.0-SNAPSHOT.jar,/data/hive-ptest/working/maven/asm/asm-commons/3.1/asm-commons-3.1.jar,/data/hive-ptest/working/maven/com/sun/xml/bind/jaxb-impl/2.2.3-1/jaxb-impl-2.2.3-1.jar,/data/hive-ptest/working/maven/com/sun/xml/bind/jaxb-impl/2.2.3-1/jaxb-api.jar,/data/hive-ptest/working/maven/com/sun/xml/bind/jaxb-impl/2.2.3-1/activation.jar,/data/hive-ptest/working/maven/com/sun/xml/bind/jaxb-impl/2.2.3-1/jsr173_1.0_api.jar,/data/hive-ptest/working/maven/com/sun/xml/bind/jaxb-impl/2.2.3-1/jaxb1-impl.jar,/data/hive-ptest/working/apache-svn-trunk-source/service/target/hive-service-0.13.0-SNAPSHOT.jar,/data/hive-ptest/working/maven/commons-beanutils/commons-beanutils/1.7.0/commons-beanutils-1.7.0.jar,/data/hive-ptest/working/maven/org/apache/hadoop/hadoop-tools/1.2.1/hadoop-tools-1.2.1.jar,/data/hive-ptest/working/maven/asm/asm-tree/3.1/asm-tree-3.1.jar,/data/hive-ptest/working/maven/com/thoughtworks/paranamer/paranamer/2.2/paranamer-2.2.jar,/data/hive-ptest/working/maven/commons-io/commons-io/2.4/commons-io-2.4.jar,/data/hive-ptest/working/maven/org/codehaus/jackson/jackson-mapper-asl/1.9.2/jackson-mapper-asl-1.9.2.jar,/data/hive-ptest/working/maven/tomcat/jasper-runtime/5.5.12/jasper-runtime-5.5.12.jar,/data/hive-ptest/working/maven/org/apache/avro/avro/1.7.5/avro-1.7.5.jar,/data/hive-ptest/working/maven/commons-digester/commons-digester/1.8/commons-digester-1.8.jar,/data/hive-ptest/working/maven/org/apache/geronimo/specs/geronimo-jaspic_1.0_spec/1.0/geronimo-jaspic_1.0_spec-1.0.jar,/data/hive-ptest/working/maven/org/mortbay/jetty/jsp-api-2.1/6.1.14/jsp-api-2.1-6.1.14.jar,/data/hive-ptest/working/maven/hsqldb/hsqldb/1.8.0.10/hsqldb-1.8.0.10.jar,/data/hive-ptest/working/maven/commons-cli/commons-cli/1.2/commons-cli-1.2.jar,/data/hive-ptest/working/apache-svn-trunk-source/shims/assembly/target/hive-shims-0.13.0-SNAPSHOT.jar,/data/hive-ptest/working/maven/org/apache/geronimo/specs/geronimo-jta_1.1_spec/1.1.1/geronimo-jta_1.1_spec-1.1.1.jar,/data/hive-ptest/working/maven/org/apache/ant/ant-launcher/1.9.1/ant-launcher-1.9.1.jar,
/data/hive-ptest/working/maven/tomcat/jasper-compiler/5.5.12/jasper-compiler-5.5.12.jar,/data/hive-ptest/working/apache-svn-trunk-source/ant/target/hive-ant-0.13.0-SNAPSHOT.jar,/data/hive-ptest/working/maven/asm/asm/3.1/asm-3.1.jar,/data/hive-ptest/working/maven/commons-codec/commons-codec/1.4/commons-codec-1.4.jar] [loading javax/ws/rs/core/Response.class(javax/ws/rs/core:Response.class)] [loading javax/ws/rs/ext/ExceptionMapper.class(javax/ws/rs/ext:ExceptionMapper.class)] [loading javax/ws/rs/ext/Provider.class(javax/ws/rs/ext:Provider.class)] [loading java/io/IOException.class(java/io:IOException.class)] [loading java/util/Map.class(java/util:Map.class)] [loading java/util/HashMap.class(java/util:HashMap.class)] [loading javax/ws/rs/core/MediaType.class(javax/ws/rs/core:MediaType.class)] [loading org/codehaus/jackson/map/ObjectMapper.class(org/codehaus/jackson/map:ObjectMapper.class)] [loading java/lang/Throwable.class(java/lang:Throwable.class)] [loading java/io/Serializable.class(java/io:Serializable.class)] [loading java/lang/Object.class(java/lang:Object.class)] [loading java/lang/String.class(java/lang:String.class)] [loading java/io/ByteArrayOutputStream.class(java/io:ByteArrayOutputStream.class)] [loading org/apache/hadoop/hive/ql/ErrorMsg.class(org/apache/hadoop/hive/ql:ErrorMsg.class)] [loading org/eclipse/jetty/http/HttpStatus.class(org/eclipse/jetty/http:HttpStatus.class)] [loading java/lang/Integer.class(java/lang:Integer.class)] [loading org/apache/hadoop/mapred/JobStatus.class(org/apache/hadoop/mapred:JobStatus.class)] [loading org/apache/hadoop/mapred/JobProfile.class(org/apache/hadoop/mapred:JobProfile.class)] [loading java/lang/Long.class(java/lang:Long.class)] [loading java/util/ArrayList.class(java/util:ArrayList.class)] [loading java/util/List.class(java/util:List.class)] [loading org/apache/commons/logging/Log.class(org/apache/commons/logging:Log.class)] [loading org/apache/commons/logging/LogFactory.class(org/apache/commons/logging:LogFactory.class)] [loading org/apache/hadoop/conf/Configuration.class(org/apache/hadoop/conf:Configuration.class)] [loading java/lang/Enum.class(java/lang:Enum.class)] [loading java/lang/Comparable.class(java/lang:Comparable.class)] [loading java/lang/Exception.class(java/lang:Exception.class)] [loading java/io/FileNotFoundException.class(java/io:FileNotFoundException.class)] [loading java/net/URISyntaxException.class(java/net:URISyntaxException.class)] [loading org/apache/commons/exec/ExecuteException.class(org/apache/commons/exec:ExecuteException.class)] [loading java/security/PrivilegedExceptionAction.class(java/security:PrivilegedExceptionAction.class)] [loading org/apache/hadoop/fs/Path.class(org/apache/hadoop/fs:Path.class)] [loading org/apache/hadoop/hive/conf/HiveConf.class(org/apache/hadoop/hive/conf:HiveConf.class)] [loading org/apache/hadoop/security/UserGroupInformation.class(org/apache/hadoop/security:UserGroupInformation.class)] [loading org/apache/hadoop/util/StringUtils.class(org/apache/hadoop/util:StringUtils.class)] [loading org/apache/hadoop/util/ToolRunner.class(org/apache/hadoop/util:ToolRunner.class)] [loading java/io/File.class(java/io:File.class)] [loading java/net/URL.class(java/net:URL.class)] [loading org/apache/hadoop/util/VersionInfo.class(org/apache/hadoop/util:VersionInfo.class)] [loading java/lang/Iterable.class(java/lang:Iterable.class)] [loading org/apache/hadoop/io/Writable.class(org/apache/hadoop/io:Writable.class)] [loading java/lang/InterruptedException.class(java/lang:InterruptedException.class)] 
[loading java/io/BufferedReader.class(java/io:BufferedReader.class)] [loading java/io/InputStream.class(java/io:InputStream.class)] [loading java/io/InputStreamReader.class(java/io:InputStreamReader.class)] [loading java/io/OutputStream.class(java/io:OutputStream.class)] [loading java/io/PrintWriter.class(java/io:PrintWriter.class)] [loading java/util/Map$Entry.class(java/util:Map$Entry.class)] [loading java/util/concurrent/Semaphore.class(java/util/concurrent:Semaphore.class)] [loading org/apache/commons/exec/CommandLine.class(org/apache/commons/exec:CommandLine.class)] [loading org/apache/commons/exec/DefaultExecutor.class(org/apache/commons/exec:DefaultExecutor.class)] [loading org/apache/commons/exec/ExecuteWatchdog.class(org/apache/commons/exec:ExecuteWatchdog.class)] [loading org/apache/commons/exec/PumpStreamHandler.class(org/apache/commons/exec:PumpStreamHandler.class)] [loading org/apache/hadoop/util/Shell.class(org/apache/hadoop/util:Shell.class)] [loading java/lang/Thread.class(java/lang:Thread.class)] [loading java/lang/Runnable.class(java/lang:Runnable.class)] [loading org/apache/hadoop/hive/shims/HadoopShims.class(org/apache/hadoop/hive/shims:HadoopShims.class)] [loading org/apache/hadoop/hive/shims/HadoopShims$WebHCatJTShim.class(org/apache/hadoop/hive/shims:HadoopShims$WebHCatJTShim.class)] [loading org/apache/hadoop/hive/shims/ShimLoader.class(org/apache/hadoop/hive/shims:ShimLoader.class)] [loading org/apache/hadoop/mapred/JobID.class(org/apache/hadoop/mapred:JobID.class)] [loading java/util/Arrays.class(java/util:Arrays.class)] [loading javax/xml/bind/annotation/XmlRootElement.class(javax/xml/bind/annotation:XmlRootElement.class)] [loading java/net/InetAddress.class(java/net:InetAddress.class)] [loading java/net/UnknownHostException.class(java/net:UnknownHostException.class)] [loading java/text/MessageFormat.class(java/text:MessageFormat.class)] [loading java/util/Collections.class(java/util:Collections.class)] [loading java/util/regex/Matcher.class(java/util/regex:Matcher.class)] [loading java/util/regex/Pattern.class(java/util/regex:Pattern.class)] [loading javax/servlet/http/HttpServletRequest.class(javax/servlet/http:HttpServletRequest.class)] [loading javax/ws/rs/DELETE.class(javax/ws/rs:DELETE.class)] [loading javax/ws/rs/FormParam.class(javax/ws/rs:FormParam.class)] [loading javax/ws/rs/GET.class(javax/ws/rs:GET.class)] [loading javax/ws/rs/POST.class(javax/ws/rs:POST.class)] [loading javax/ws/rs/PUT.class(javax/ws/rs:PUT.class)] [loading javax/ws/rs/Path.class(javax/ws/rs:Path.class)] [loading javax/ws/rs/PathParam.class(javax/ws/rs:PathParam.class)] [loading javax/ws/rs/Produces.class(javax/ws/rs:Produces.class)] [loading javax/ws/rs/QueryParam.class(javax/ws/rs:QueryParam.class)] [loading javax/ws/rs/core/Context.class(javax/ws/rs/core:Context.class)] [loading javax/ws/rs/core/SecurityContext.class(javax/ws/rs/core:SecurityContext.class)] [loading javax/ws/rs/core/UriInfo.class(javax/ws/rs/core:UriInfo.class)] [loading org/apache/hadoop/security/authentication/client/PseudoAuthenticator.class(org/apache/hadoop/security/authentication/client:PseudoAuthenticator.class)] [loading com/sun/jersey/api/NotFoundException.class(com/sun/jersey/api:NotFoundException.class)] [loading java/net/URI.class(java/net:URI.class)] [loading org/apache/commons/lang/StringUtils.class(org/apache/commons/lang:StringUtils.class)] [loading org/apache/hadoop/fs/FileStatus.class(org/apache/hadoop/fs:FileStatus.class)] [loading 
org/apache/hadoop/fs/FileSystem.class(org/apache/hadoop/fs:FileSystem.class)] [loading java/util/Date.class(java/util:Date.class)] [loading org/apache/hadoop/hive/common/classification/InterfaceAudience.class(org/apache/hadoop/hive/common/classification:InterfaceAudience.class)] [loading org/apache/hadoop/hive/metastore/HiveMetaStoreClient.class(org/apache/hadoop/hive/metastore:HiveMetaStoreClient.class)] [loading org/apache/hive/hcatalog/common/HCatUtil.class(org/apache/hive/hcatalog/common:HCatUtil.class)] [loading org/apache/hadoop/hive/common/classification/InterfaceAudience$Private.class(org/apache/hadoop/hive/common/classification:InterfaceAudience$Private.class)] [loading com/sun/jersey/api/wadl/config/WadlGeneratorConfig.class(com/sun/jersey/api/wadl/config:WadlGeneratorConfig.class)] [loading com/sun/jersey/api/wadl/config/WadlGeneratorDescription.class(com/sun/jersey/api/wadl/config:WadlGeneratorDescription.class)] [loading com/sun/jersey/server/wadl/generators/resourcedoc/WadlGeneratorResourceDocSupport.class(com/sun/jersey/server/wadl/generators/resourcedoc:WadlGeneratorResourceDocSupport.class)] [loading com/sun/jersey/api/core/PackagesResourceConfig.class(com/sun/jersey/api/core:PackagesResourceConfig.class)] [loading com/sun/jersey/spi/container/servlet/ServletContainer.class(com/sun/jersey/spi/container/servlet:ServletContainer.class)] [loading org/apache/hadoop/hive/common/classification/InterfaceStability.class(org/apache/hadoop/hive/common/classification:InterfaceStability.class)] [loading org/apache/hadoop/hdfs/web/AuthFilter.class(org/apache/hadoop/hdfs/web:AuthFilter.class)] [loading org/apache/hadoop/util/GenericOptionsParser.class(org/apache/hadoop/util:GenericOptionsParser.class)] [loading org/eclipse/jetty/rewrite/handler/RedirectPatternRule.class(org/eclipse/jetty/rewrite/handler:RedirectPatternRule.class)] [loading org/eclipse/jetty/rewrite/handler/RewriteHandler.class(org/eclipse/jetty/rewrite/handler:RewriteHandler.class)] [loading org/eclipse/jetty/server/Handler.class(org/eclipse/jetty/server:Handler.class)] [loading org/eclipse/jetty/server/Server.class(org/eclipse/jetty/server:Server.class)] [loading org/eclipse/jetty/server/handler/HandlerList.class(org/eclipse/jetty/server/handler:HandlerList.class)] [loading org/eclipse/jetty/servlet/FilterHolder.class(org/eclipse/jetty/servlet:FilterHolder.class)] [loading org/eclipse/jetty/servlet/FilterMapping.class(org/eclipse/jetty/servlet:FilterMapping.class)] [loading org/eclipse/jetty/servlet/ServletContextHandler.class(org/eclipse/jetty/servlet:ServletContextHandler.class)] [loading org/eclipse/jetty/servlet/ServletHolder.class(org/eclipse/jetty/servlet:ServletHolder.class)] [loading org/slf4j/bridge/SLF4JBridgeHandler.class(org/slf4j/bridge:SLF4JBridgeHandler.class)] [loading org/apache/hadoop/hive/common/classification/InterfaceAudience$LimitedPrivate.class(org/apache/hadoop/hive/common/classification:InterfaceAudience$LimitedPrivate.class)] [loading org/apache/hadoop/hive/common/classification/InterfaceStability$Unstable.class(org/apache/hadoop/hive/common/classification:InterfaceStability$Unstable.class)] [loading org/apache/hadoop/hive/metastore/api/MetaException.class(org/apache/hadoop/hive/metastore/api:MetaException.class)] [loading org/apache/hadoop/io/Text.class(org/apache/hadoop/io:Text.class)] [loading org/apache/hadoop/security/Credentials.class(org/apache/hadoop/security:Credentials.class)] [loading org/apache/hadoop/security/token/Token.class(org/apache/hadoop/security/token:Token.class)] 
[loading org/apache/thrift/TException.class(org/apache/thrift:TException.class)] [loading java/io/Closeable.class(java/io:Closeable.class)] [loading java/io/Flushable.class(java/io:Flushable.class)] [loading org/apache/hadoop/security/Groups.class(org/apache/hadoop/security:Groups.class)] [loading java/util/HashSet.class(java/util:HashSet.class)] [loading java/util/Set.class(java/util:Set.class)] [loading java/util/concurrent/ConcurrentHashMap.class(java/util/concurrent:ConcurrentHashMap.class)] [loading java/io/UnsupportedEncodingException.class(java/io:UnsupportedEncodingException.class)] [loading org/apache/zookeeper/CreateMode.class(org/apache/zookeeper:CreateMode.class)] [loading org/apache/zookeeper/KeeperException.class(org/apache/zookeeper:KeeperException.class)] [loading org/apache/zookeeper/WatchedEvent.class(org/apache/zookeeper:WatchedEvent.class)] [loading org/apache/zookeeper/Watcher.class(org/apache/zookeeper:Watcher.class)] [loading org/apache/zookeeper/ZooDefs.class(org/apache/zookeeper:ZooDefs.class)] [loading org/apache/zookeeper/ZooDefs$Ids.class(org/apache/zookeeper:ZooDefs$Ids.class)] [loading org/apache/zookeeper/ZooKeeper.class(org/apache/zookeeper:ZooKeeper.class)] [loading org/apache/hadoop/io/NullWritable.class(org/apache/hadoop/io:NullWritable.class)] [loading org/apache/hadoop/mapreduce/InputSplit.class(org/apache/hadoop/mapreduce:InputSplit.class)] [loading org/apache/hadoop/mapreduce/RecordReader.class(org/apache/hadoop/mapreduce:RecordReader.class)] [loading org/apache/hadoop/mapreduce/TaskAttemptContext.class(org/apache/hadoop/mapreduce:TaskAttemptContext.class)] [loading java/net/URLConnection.class(java/net:URLConnection.class)] [loading java/util/Collection.class(java/util:Collection.class)] [loading javax/ws/rs/core/UriBuilder.class(javax/ws/rs/core:UriBuilder.class)] [loading java/io/OutputStreamWriter.class(java/io:OutputStreamWriter.class)] [loading org/apache/hadoop/hive/common/classification/InterfaceStability$Evolving.class(org/apache/hadoop/hive/common/classification:InterfaceStability$Evolving.class)] [loading org/apache/zookeeper/data/Stat.class(org/apache/zookeeper/data:Stat.class)] [loading org/apache/hadoop/mapreduce/InputFormat.class(org/apache/hadoop/mapreduce:InputFormat.class)] [loading org/apache/hadoop/mapreduce/JobContext.class(org/apache/hadoop/mapreduce:JobContext.class)] [loading org/apache/hadoop/mapred/JobClient.class(org/apache/hadoop/mapred:JobClient.class)] [loading org/apache/hadoop/mapred/JobConf.class(org/apache/hadoop/mapred:JobConf.class)] [loading org/apache/hadoop/mapred/RunningJob.class(org/apache/hadoop/mapred:RunningJob.class)] [loading java/io/DataInput.class(java/io:DataInput.class)] [loading java/io/DataOutput.class(java/io:DataOutput.class)] [loading org/apache/hadoop/conf/Configured.class(org/apache/hadoop/conf:Configured.class)] [loading org/apache/hadoop/fs/permission/FsPermission.class(org/apache/hadoop/fs/permission:FsPermission.class)] [loading org/apache/hadoop/mapreduce/Job.class(org/apache/hadoop/mapreduce:Job.class)] [loading org/apache/hadoop/mapreduce/JobID.class(org/apache/hadoop/mapreduce:JobID.class)] [loading org/apache/hadoop/mapreduce/lib/output/NullOutputFormat.class(org/apache/hadoop/mapreduce/lib/output:NullOutputFormat.class)] [loading org/apache/hadoop/mapreduce/security/token/delegation/DelegationTokenIdentifier.class(org/apache/hadoop/mapreduce/security/token/delegation:DelegationTokenIdentifier.class)] [loading org/apache/hadoop/util/Tool.class(org/apache/hadoop/util:Tool.class)] 
[loading org/apache/hadoop/conf/Configurable.class(org/apache/hadoop/conf:Configurable.class)] [loading java/lang/ClassNotFoundException.class(java/lang:ClassNotFoundException.class)] [loading org/apache/hadoop/mapreduce/Mapper.class(org/apache/hadoop/mapreduce:Mapper.class)] [loading java/util/Iterator.class(java/util:Iterator.class)] [loading java/util/LinkedList.class(java/util:LinkedList.class)] [loading java/util/concurrent/ExecutorService.class(java/util/concurrent:ExecutorService.class)] [loading java/util/concurrent/Executors.class(java/util/concurrent:Executors.class)] [loading java/util/concurrent/TimeUnit.class(java/util/concurrent:TimeUnit.class)] [loading org/apache/hadoop/mapreduce/Mapper$Context.class(org/apache/hadoop/mapreduce:Mapper$Context.class)] [loading java/lang/Process.class(java/lang:Process.class)] [loading java/lang/StringBuilder.class(java/lang:StringBuilder.class)] [loading java/lang/ProcessBuilder.class(java/lang:ProcessBuilder.class)] [loading java/lang/annotation/Target.class(java/lang/annotation:Target.class)] [loading java/lang/annotation/ElementType.class(java/lang/annotation:ElementType.class)] [loading java/lang/annotation/Retention.class(java/lang/annotation:Retention.class)] [loading java/lang/annotation/RetentionPolicy.class(java/lang/annotation:RetentionPolicy.class)] [loading java/lang/annotation/Annotation.class(java/lang/annotation:Annotation.class)] [loading java/lang/SuppressWarnings.class(java/lang:SuppressWarnings.class)] [loading java/lang/Override.class(java/lang:Override.class)] [loading javax/ws/rs/HttpMethod.class(javax/ws/rs:HttpMethod.class)] [loading java/lang/Deprecated.class(java/lang:Deprecated.class)] [loading /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/target/classes/org/apache/hive/hcatalog/templeton/SecureProxySupport$3.class] [loading /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/target/classes/org/apache/hive/hcatalog/templeton/SecureProxySupport$1.class] [loading /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/target/classes/org/apache/hive/hcatalog/templeton/HcatDelegator$1.class] [loading /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/target/classes/org/apache/hive/hcatalog/templeton/LauncherDelegator$1.class] [loading /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/target/classes/org/apache/hive/hcatalog/templeton/SecureProxySupport$2.class] [loading /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/target/classes/org/apache/hive/hcatalog/templeton/HcatException$1.class] [loading /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/target/classes/org/apache/hive/hcatalog/templeton/tool/TempletonControllerJob$2.class] [loading /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/target/classes/org/apache/hive/hcatalog/templeton/tool/TempletonControllerJob$2$1.class] [loading /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/target/classes/org/apache/hive/hcatalog/templeton/tool/ZooKeeperStorage$1.class] [loading /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/target/classes/org/apache/hive/hcatalog/templeton/tool/TempletonControllerJob$1.class] [loading /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/target/classes/org/apache/hive/hcatalog/templeton/tool/LogRetriever$1.class] [loading 
/data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/target/classes/org/apache/hive/hcatalog/templeton/tool/ZooKeeperStorage$2.class] [loading /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/target/classes/org/apache/hive/hcatalog/templeton/tool/TempletonUtils$1.class] [loading /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/target/classes/org/apache/hive/hcatalog/templeton/tool/HDFSStorage$1.class] [done in 6731 ms] [WARNING] Javadoc Warnings [WARNING] Nov 25, 2013 8:52:23 PM com.sun.jersey.wadl.resourcedoc.ResourceDoclet start [WARNING] INFO: Wrote /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/target/classes/resourcedoc.xml [INFO] [INFO] --- maven-resources-plugin:2.5:testResources (default-testResources) @ hive-webhcat --- [debug] execute contextualize [INFO] Using 'UTF-8' encoding to copy filtered resources. [INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/src/test/resources [INFO] [INFO] --- maven-antrun-plugin:1.7:run (setup-test-dirs) @ hive-webhcat --- [INFO] Executing tasks main: [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/target/tmp [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/target/warehouse [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/target/tmp/conf [copy] Copying 4 files to /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/target/tmp/conf [INFO] Executed tasks [INFO] [INFO] --- maven-compiler-plugin:3.1:testCompile (default-testCompile) @ hive-webhcat --- [INFO] Compiling 9 source files to /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/target/test-classes [WARNING] Note: /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/src/test/java/org/apache/hive/hcatalog/templeton/TestDesc.java uses unchecked or unsafe operations. [WARNING] Note: Recompile with -Xlint:unchecked for details. [INFO] [INFO] --- maven-surefire-plugin:2.16:test (default-test) @ hive-webhcat --- [INFO] Tests are skipped. 
[INFO] [INFO] --- maven-jar-plugin:2.2:jar (default-jar) @ hive-webhcat --- [INFO] Building jar: /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/target/hive-webhcat-0.13.0-SNAPSHOT.jar [INFO] [INFO] --- maven-install-plugin:2.4:install (default-install) @ hive-webhcat --- [INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/target/hive-webhcat-0.13.0-SNAPSHOT.jar to /data/hive-ptest/working/maven/org/apache/hive/hcatalog/hive-webhcat/0.13.0-SNAPSHOT/hive-webhcat-0.13.0-SNAPSHOT.jar [INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/pom.xml to /data/hive-ptest/working/maven/org/apache/hive/hcatalog/hive-webhcat/0.13.0-SNAPSHOT/hive-webhcat-0.13.0-SNAPSHOT.pom [INFO] [INFO] ------------------------------------------------------------------------ [INFO] Building Hive HCatalog HBase Storage Handler 0.13.0-SNAPSHOT [INFO] ------------------------------------------------------------------------ [INFO] [INFO] --- maven-clean-plugin:2.5:clean (default-clean) @ hive-hbase-storage-handler --- [INFO] Deleting /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/storage-handlers/hbase (includes = [datanucleus.log, derby.log], excludes = []) [INFO] [INFO] --- build-helper-maven-plugin:1.8:add-source (add-source) @ hive-hbase-storage-handler --- [INFO] Source directory: /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/storage-handlers/hbase/src/gen-java added. [INFO] [INFO] --- maven-resources-plugin:2.5:resources (default-resources) @ hive-hbase-storage-handler --- [debug] execute contextualize [INFO] Using 'UTF-8' encoding to copy filtered resources. [INFO] Copying 1 resource [INFO] [INFO] --- maven-antrun-plugin:1.7:run (define-classpath) @ hive-hbase-storage-handler --- [INFO] Executing tasks main: [INFO] Executed tasks [INFO] [INFO] --- maven-compiler-plugin:3.1:compile (default-compile) @ hive-hbase-storage-handler --- [INFO] Compiling 36 source files to /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/storage-handlers/hbase/target/classes [WARNING] Note: Some input files use or override a deprecated API. [WARNING] Note: Recompile with -Xlint:deprecation for details. [WARNING] Note: Some input files use unchecked or unsafe operations. [WARNING] Note: Recompile with -Xlint:unchecked for details. [INFO] [INFO] --- maven-resources-plugin:2.5:testResources (default-testResources) @ hive-hbase-storage-handler --- [debug] execute contextualize [INFO] Using 'UTF-8' encoding to copy filtered resources. 
[INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/storage-handlers/hbase/src/test/resources [INFO] [INFO] --- maven-antrun-plugin:1.7:run (setup-test-dirs) @ hive-hbase-storage-handler --- [INFO] Executing tasks main: [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/storage-handlers/hbase/target/tmp [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/storage-handlers/hbase/target/warehouse [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/storage-handlers/hbase/target/tmp/conf [copy] Copying 4 files to /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/storage-handlers/hbase/target/tmp/conf [INFO] Executed tasks [INFO] [INFO] --- maven-compiler-plugin:3.1:testCompile (default-testCompile) @ hive-hbase-storage-handler --- [INFO] Compiling 21 source files to /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/storage-handlers/hbase/target/test-classes [WARNING] Note: Some input files use or override a deprecated API. [WARNING] Note: Recompile with -Xlint:deprecation for details. [INFO] [INFO] --- maven-surefire-plugin:2.16:test (default-test) @ hive-hbase-storage-handler --- [INFO] Tests are skipped. [INFO] [INFO] --- maven-jar-plugin:2.2:jar (default-jar) @ hive-hbase-storage-handler --- [INFO] Building jar: /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/storage-handlers/hbase/target/hive-hbase-storage-handler-0.13.0-SNAPSHOT.jar [INFO] [INFO] --- maven-jar-plugin:2.2:test-jar (default) @ hive-hbase-storage-handler --- [INFO] Building jar: /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/storage-handlers/hbase/target/hive-hbase-storage-handler-0.13.0-SNAPSHOT-tests.jar [INFO] [INFO] --- maven-install-plugin:2.4:install (default-install) @ hive-hbase-storage-handler --- [INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/storage-handlers/hbase/target/hive-hbase-storage-handler-0.13.0-SNAPSHOT.jar to /data/hive-ptest/working/maven/org/apache/hive/hcatalog/hive-hbase-storage-handler/0.13.0-SNAPSHOT/hive-hbase-storage-handler-0.13.0-SNAPSHOT.jar [INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/storage-handlers/hbase/pom.xml to /data/hive-ptest/working/maven/org/apache/hive/hcatalog/hive-hbase-storage-handler/0.13.0-SNAPSHOT/hive-hbase-storage-handler-0.13.0-SNAPSHOT.pom [INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/storage-handlers/hbase/target/hive-hbase-storage-handler-0.13.0-SNAPSHOT-tests.jar to /data/hive-ptest/working/maven/org/apache/hive/hcatalog/hive-hbase-storage-handler/0.13.0-SNAPSHOT/hive-hbase-storage-handler-0.13.0-SNAPSHOT-tests.jar [INFO] [INFO] ------------------------------------------------------------------------ [INFO] Building Hive HWI 0.13.0-SNAPSHOT [INFO] ------------------------------------------------------------------------ [INFO] [INFO] --- maven-clean-plugin:2.5:clean (default-clean) @ hive-hwi --- [INFO] Deleting /data/hive-ptest/working/apache-svn-trunk-source/hwi (includes = [datanucleus.log, derby.log], excludes = []) [INFO] [INFO] --- maven-resources-plugin:2.5:resources (default-resources) @ hive-hwi --- [debug] execute contextualize [INFO] Using 'UTF-8' encoding to copy filtered resources. 
[... the remaining modules (Hive HWI, Hive ODBC, Hive Shims Aggregator, Hive TestUtils, Hive Packaging) are built and installed into /data/hive-ptest/working/maven with the same resources / compile / setup-test-dirs / jar / install steps; test execution is skipped ...]
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO]
[INFO] Hive .............................................. SUCCESS [2.890s]
[INFO] Hive Ant Utilities ................................ SUCCESS [7.580s]
[INFO] Hive Shims Common ................................. SUCCESS [3.378s]
[INFO] Hive Shims 0.20 ................................... SUCCESS [2.225s]
[INFO] Hive Shims Secure Common .......................... SUCCESS [2.819s]
[INFO] Hive Shims 0.20S .................................. SUCCESS [1.458s]
[INFO] Hive Shims 0.23 ................................... SUCCESS [3.796s]
[INFO] Hive Shims ........................................ SUCCESS [3.597s]
[INFO] Hive Common ....................................... SUCCESS [14.500s]
[INFO] Hive Serde ........................................ SUCCESS [11.538s]
[INFO] Hive Metastore .................................... SUCCESS [25.097s]
[INFO] Hive Query Language ............................... SUCCESS [50.810s]
[INFO] Hive Service ...................................... SUCCESS [4.608s]
[INFO] Hive JDBC ......................................... SUCCESS [1.894s]
[INFO] Hive Beeline ...................................... SUCCESS [1.810s]
[INFO] Hive CLI .......................................... SUCCESS [1.358s]
[INFO] Hive Contrib ...................................... SUCCESS [1.080s]
[INFO] Hive HBase Handler ................................ SUCCESS [2.622s]
[INFO] Hive HCatalog ..................................... SUCCESS [0.498s]
[INFO] Hive HCatalog Core ................................ SUCCESS [3.663s]
[INFO] Hive HCatalog Pig Adapter ......................... SUCCESS [0.878s]
[INFO] Hive HCatalog Server Extensions ................... SUCCESS [1.492s]
[INFO] Hive HCatalog Webhcat Java Client ................. SUCCESS [1.224s]
[INFO] Hive HCatalog Webhcat ............................. SUCCESS [8.919s]
[INFO] Hive HCatalog HBase Storage Handler ............... SUCCESS [4.152s]
[INFO] Hive HWI .......................................... SUCCESS [0.660s]
[INFO] Hive ODBC ......................................... SUCCESS [0.136s]
[INFO] Hive Shims Aggregator ............................. SUCCESS [0.208s]
[INFO] Hive TestUtils .................................... SUCCESS [0.251s]
[INFO] Hive Packaging .................................... SUCCESS [0.297s]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD SUCCESS
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 2:48.139s
[INFO] Finished at: Mon Nov 25 20:52:30 EST 2013
[INFO] Final Memory: 62M/372M
[INFO] ------------------------------------------------------------------------
+ mvn -B test -Dmaven.repo.local=/data/hive-ptest/working/maven -Dtest=TestDummy
[INFO] Scanning for projects...
[... the reactor build order lists the same 30 modules; the per-module output of the test run repeats one pattern for every module: resources copied, "Nothing to compile - all classes are up to date" (a few modules recompile a handful of changed source files), target/tmp, target/warehouse and target/tmp/conf recreated, and Surefire reporting no tests to run. The Metastore module additionally skips the up-to-date ANTLR grammar Filter.g and runs the DataNucleus enhancer over its 25 model classes; the Query Language module regenerates vector expression code and skips the up-to-date HiveLexer.g and HiveParser.g grammars ...]
[INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/java-client/src/test/resources [INFO] [INFO] --- maven-antrun-plugin:1.7:run (setup-test-dirs) @ hive-webhcat-java-client --- [INFO] Executing tasks main: [delete] Deleting directory /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/java-client/target/tmp [delete] Deleting directory /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/java-client/target/warehouse [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/java-client/target/tmp [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/java-client/target/warehouse [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/java-client/target/tmp/conf [copy] Copying 4 files to /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/java-client/target/tmp/conf [INFO] Executed tasks [INFO] [INFO] --- maven-compiler-plugin:3.1:testCompile (default-testCompile) @ hive-webhcat-java-client --- [INFO] Nothing to compile - all classes are up to date [INFO] [INFO] --- maven-surefire-plugin:2.16:test (default-test) @ hive-webhcat-java-client --- [INFO] [INFO] ------------------------------------------------------------------------ [INFO] Building Hive HCatalog Webhcat 0.13.0-SNAPSHOT [INFO] ------------------------------------------------------------------------ [INFO] [INFO] --- maven-resources-plugin:2.5:resources (default-resources) @ hive-webhcat --- [debug] execute contextualize [INFO] Using 'UTF-8' encoding to copy filtered resources. [INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/src/main/resources [INFO] [INFO] --- maven-antrun-plugin:1.7:run (define-classpath) @ hive-webhcat --- [INFO] Executing tasks main: [INFO] Executed tasks [INFO] [INFO] --- maven-compiler-plugin:3.1:compile (default-compile) @ hive-webhcat --- [INFO] Nothing to compile - all classes are up to date [INFO] [INFO] --- maven-javadoc-plugin:2.4:javadoc (resourcesdoc.xml) @ hive-webhcat --- [INFO] Setting property: classpath.resource.loader.class => 'org.codehaus.plexus.velocity.ContextClassLoaderResourceLoader'. [INFO] Setting property: velocimacro.messages.on => 'false'. [INFO] Setting property: resource.loader => 'classpath'. [INFO] Setting property: resource.manager.logwhenfound => 'false'. [INFO] ************************************************************** [INFO] Starting Jakarta Velocity v1.4 [INFO] RuntimeInstance initializing. [INFO] Default Properties File: org/apache/velocity/runtime/defaults/velocity.properties [INFO] Default ResourceManager initializing. (class org.apache.velocity.runtime.resource.ResourceManagerImpl) [INFO] Resource Loader Instantiated: org.codehaus.plexus.velocity.ContextClassLoaderResourceLoader [INFO] ClasspathResourceLoader : initialization starting. [INFO] ClasspathResourceLoader : initialization complete. [INFO] ResourceCache : initialized. (class org.apache.velocity.runtime.resource.ResourceCacheImpl) [INFO] Default ResourceManager initialization complete. 
[done in 7781 ms] 8 warnings
[WARNING] Javadoc Warnings
[WARNING] /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/src/main/java/org/apache/hive/hcatalog/templeton/DeleteDelegator.java:26: package org.apache.hadoop.hive.shims.HadoopShims does not exist
[WARNING] import org.apache.hadoop.hive.shims.HadoopShims.WebHCatJTShim;
[WARNING] ^
[WARNING] /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/src/main/java/org/apache/hive/hcatalog/templeton/DeleteDelegator.java:27: package org.apache.hadoop.hive.shims does not exist
[WARNING] import org.apache.hadoop.hive.shims.ShimLoader;
[WARNING] ^
[WARNING] /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/src/main/java/org/apache/hive/hcatalog/templeton/StatusDelegator.java:25: package org.apache.hadoop.hive.shims.HadoopShims does not exist
[WARNING] import org.apache.hadoop.hive.shims.HadoopShims.WebHCatJTShim;
[WARNING] ^
[WARNING] /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/src/main/java/org/apache/hive/hcatalog/templeton/StatusDelegator.java:26: package org.apache.hadoop.hive.shims does not exist
[WARNING] import org.apache.hadoop.hive.shims.ShimLoader;
[WARNING] ^
[WARNING] /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/src/main/java/org/apache/hive/hcatalog/templeton/StatusDelegator.java:72: cannot find symbol
[WARNING] symbol  : class WebHCatJTShim
[WARNING] location: class org.apache.hive.hcatalog.templeton.StatusDelegator
[WARNING] static QueueStatusBean makeStatus(WebHCatJTShim tracker,
[WARNING] ^
[WARNING] /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/src/main/java/org/apache/hive/hcatalog/templeton/ListDelegator.java:25: package org.apache.hadoop.hive.shims.HadoopShims does not exist
[WARNING] import org.apache.hadoop.hive.shims.HadoopShims.WebHCatJTShim;
[WARNING] ^
[WARNING] /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/src/main/java/org/apache/hive/hcatalog/templeton/ListDelegator.java:26: package org.apache.hadoop.hive.shims does not exist
[WARNING] import org.apache.hadoop.hive.shims.ShimLoader;
[WARNING] ^
[WARNING] /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/src/main/java/org/apache/hive/hcatalog/templeton/tool/TempletonControllerJob.java:37: package org.apache.hadoop.hive.shims does not exist
[WARNING] import org.apache.hadoop.hive.shims.ShimLoader;
[WARNING] ^
[WARNING] Nov 25, 2013 8:53:25 PM com.sun.jersey.wadl.resourcedoc.ResourceDoclet start
[WARNING] INFO: Wrote /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/target/classes/resourcedoc.xml
[INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-trunk-source/testutils/src/main/resources [INFO] [INFO] --- maven-antrun-plugin:1.7:run (define-classpath) @ hive-testutils --- [INFO] Executing tasks main: [INFO] Executed tasks [INFO] [INFO] --- maven-compiler-plugin:3.1:compile (default-compile) @ hive-testutils --- [INFO] Nothing to compile - all classes are up to date [INFO] [INFO] --- maven-resources-plugin:2.5:testResources (default-testResources) @ hive-testutils --- [debug] execute contextualize [INFO] Using 'UTF-8' encoding to copy filtered resources. [INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-trunk-source/testutils/src/test/resources [INFO] [INFO] --- maven-antrun-plugin:1.7:run (setup-test-dirs) @ hive-testutils --- [INFO] Executing tasks main: [delete] Deleting directory /data/hive-ptest/working/apache-svn-trunk-source/testutils/target/tmp [delete] Deleting directory /data/hive-ptest/working/apache-svn-trunk-source/testutils/target/warehouse [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/testutils/target/tmp [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/testutils/target/warehouse [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/testutils/target/tmp/conf [copy] Copying 4 files to /data/hive-ptest/working/apache-svn-trunk-source/testutils/target/tmp/conf [INFO] Executed tasks [INFO] [INFO] --- maven-compiler-plugin:3.1:testCompile (default-testCompile) @ hive-testutils --- [INFO] No sources to compile [INFO] [INFO] --- maven-surefire-plugin:2.16:test (default-test) @ hive-testutils --- [INFO] No tests to run. [INFO] [INFO] ------------------------------------------------------------------------ [INFO] Building Hive Packaging 0.13.0-SNAPSHOT [INFO] ------------------------------------------------------------------------ [INFO] [INFO] --- maven-antrun-plugin:1.7:run (define-classpath) @ hive-packaging --- [INFO] Executing tasks main: [INFO] Executed tasks [INFO] [INFO] --- maven-antrun-plugin:1.7:run (setup-test-dirs) @ hive-packaging --- [INFO] Executing tasks main: [delete] Deleting directory /data/hive-ptest/working/apache-svn-trunk-source/packaging/target/tmp [delete] Deleting directory /data/hive-ptest/working/apache-svn-trunk-source/packaging/target/warehouse [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/packaging/target/tmp [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/packaging/target/warehouse [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/packaging/target/tmp/conf [copy] Copying 4 files to /data/hive-ptest/working/apache-svn-trunk-source/packaging/target/tmp/conf [INFO] Executed tasks [INFO] ------------------------------------------------------------------------ [INFO] Reactor Summary: [INFO] [INFO] Hive .............................................. SUCCESS [2.037s] [INFO] Hive Ant Utilities ................................ SUCCESS [3.444s] [INFO] Hive Shims Common ................................. SUCCESS [1.184s] [INFO] Hive Shims 0.20 ................................... SUCCESS [1.285s] [INFO] Hive Shims Secure Common .......................... SUCCESS [0.847s] [INFO] Hive Shims 0.20S .................................. SUCCESS [0.374s] [INFO] Hive Shims 0.23 ................................... SUCCESS [0.942s] [INFO] Hive Shims ........................................ SUCCESS [0.202s] [INFO] Hive Common ....................................... 
SUCCESS [3.382s] [INFO] Hive Serde ........................................ SUCCESS [0.859s] [INFO] Hive Metastore .................................... SUCCESS [5.863s] [INFO] Hive Query Language ............................... SUCCESS [12.074s] [INFO] Hive Service ...................................... SUCCESS [0.366s] [INFO] Hive JDBC ......................................... SUCCESS [0.364s] [INFO] Hive Beeline ...................................... SUCCESS [0.445s] [INFO] Hive CLI .......................................... SUCCESS [0.706s] [INFO] Hive Contrib ...................................... SUCCESS [0.807s] [INFO] Hive HBase Handler ................................ SUCCESS [1.305s] [INFO] Hive HCatalog ..................................... SUCCESS [0.451s] [INFO] Hive HCatalog Core ................................ SUCCESS [0.870s] [INFO] Hive HCatalog Pig Adapter ......................... SUCCESS [0.697s] [INFO] Hive HCatalog Server Extensions ................... SUCCESS [0.816s] [INFO] Hive HCatalog Webhcat Java Client ................. SUCCESS [0.637s] [INFO] Hive HCatalog Webhcat ............................. SUCCESS [10.331s] [INFO] Hive HCatalog HBase Storage Handler ............... SUCCESS [0.605s] [INFO] Hive HWI .......................................... SUCCESS [0.441s] [INFO] Hive ODBC ......................................... SUCCESS [0.182s] [INFO] Hive Shims Aggregator ............................. SUCCESS [0.127s] [INFO] Hive TestUtils .................................... SUCCESS [0.127s] [INFO] Hive Packaging .................................... SUCCESS [0.490s] [INFO] ------------------------------------------------------------------------ [INFO] BUILD SUCCESS [INFO] ------------------------------------------------------------------------ [INFO] Total time: 54.864s [INFO] Finished at: Mon Nov 25 20:53:28 EST 2013 [INFO] Final Memory: 38M/102M [INFO] ------------------------------------------------------------------------ + cd itests + mvn -B clean install -DskipTests -Dmaven.repo.local=/data/hive-ptest/working/maven [INFO] Scanning for projects... 
[INFO] ------------------------------------------------------------------------ [INFO] Reactor Build Order: [INFO] [INFO] Hive Integration - Parent [INFO] Hive Integration - Custom Serde [INFO] Hive Integration - Testing Utilities [INFO] Hive Integration - Unit Tests [INFO] Hive Integration - HCatalog Unit Tests [INFO] Hive Integration - Test Serde [INFO] Hive Integration - QFile Tests [INFO] [INFO] ------------------------------------------------------------------------ [INFO] Building Hive Integration - Parent 0.13.0-SNAPSHOT [INFO] ------------------------------------------------------------------------ [INFO] [INFO] --- maven-clean-plugin:2.5:clean (default-clean) @ hive-it --- [INFO] Deleting /data/hive-ptest/working/apache-svn-trunk-source/itests (includes = [datanucleus.log, derby.log], excludes = []) [INFO] [INFO] --- maven-antrun-plugin:1.7:run (define-classpath) @ hive-it --- [INFO] Executing tasks main: [INFO] Executed tasks [INFO] [INFO] --- maven-antrun-plugin:1.7:run (setup-test-dirs) @ hive-it --- [INFO] Executing tasks main: [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/itests/target/tmp [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/itests/target/warehouse [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/itests/target/tmp/conf [copy] Copying 4 files to /data/hive-ptest/working/apache-svn-trunk-source/itests/target/tmp/conf [INFO] Executed tasks [INFO] [INFO] --- maven-install-plugin:2.4:install (default-install) @ hive-it --- [INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/itests/pom.xml to /data/hive-ptest/working/maven/org/apache/hive/hive-it/0.13.0-SNAPSHOT/hive-it-0.13.0-SNAPSHOT.pom [INFO] [INFO] ------------------------------------------------------------------------ [INFO] Building Hive Integration - Custom Serde 0.13.0-SNAPSHOT [INFO] ------------------------------------------------------------------------ [INFO] [INFO] --- maven-clean-plugin:2.5:clean (default-clean) @ hive-it-custom-serde --- [INFO] Deleting /data/hive-ptest/working/apache-svn-trunk-source/itests/custom-serde (includes = [datanucleus.log, derby.log], excludes = []) [INFO] [INFO] --- maven-resources-plugin:2.5:resources (default-resources) @ hive-it-custom-serde --- [debug] execute contextualize [INFO] Using 'UTF-8' encoding to copy filtered resources. [INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-trunk-source/itests/custom-serde/src/main/resources [INFO] [INFO] --- maven-antrun-plugin:1.7:run (define-classpath) @ hive-it-custom-serde --- [INFO] Executing tasks main: [INFO] Executed tasks [INFO] [INFO] --- maven-compiler-plugin:3.1:compile (default-compile) @ hive-it-custom-serde --- [INFO] Compiling 8 source files to /data/hive-ptest/working/apache-svn-trunk-source/itests/custom-serde/target/classes [INFO] [INFO] --- maven-resources-plugin:2.5:testResources (default-testResources) @ hive-it-custom-serde --- [debug] execute contextualize [INFO] Using 'UTF-8' encoding to copy filtered resources. 
[INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-trunk-source/itests/custom-serde/src/test/resources [INFO] [INFO] --- maven-antrun-plugin:1.7:run (setup-test-dirs) @ hive-it-custom-serde --- [INFO] Executing tasks main: [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/itests/custom-serde/target/tmp [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/itests/custom-serde/target/warehouse [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/itests/custom-serde/target/tmp/conf [copy] Copying 4 files to /data/hive-ptest/working/apache-svn-trunk-source/itests/custom-serde/target/tmp/conf [INFO] Executed tasks [INFO] [INFO] --- maven-compiler-plugin:3.1:testCompile (default-testCompile) @ hive-it-custom-serde --- [INFO] No sources to compile [INFO] [INFO] --- maven-surefire-plugin:2.16:test (default-test) @ hive-it-custom-serde --- [INFO] Tests are skipped. [INFO] [INFO] --- maven-jar-plugin:2.2:jar (default-jar) @ hive-it-custom-serde --- [INFO] Building jar: /data/hive-ptest/working/apache-svn-trunk-source/itests/custom-serde/target/hive-it-custom-serde-0.13.0-SNAPSHOT.jar [INFO] [INFO] --- maven-install-plugin:2.4:install (default-install) @ hive-it-custom-serde --- [INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/itests/custom-serde/target/hive-it-custom-serde-0.13.0-SNAPSHOT.jar to /data/hive-ptest/working/maven/org/apache/hive/hive-it-custom-serde/0.13.0-SNAPSHOT/hive-it-custom-serde-0.13.0-SNAPSHOT.jar [INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/itests/custom-serde/pom.xml to /data/hive-ptest/working/maven/org/apache/hive/hive-it-custom-serde/0.13.0-SNAPSHOT/hive-it-custom-serde-0.13.0-SNAPSHOT.pom [INFO] [INFO] ------------------------------------------------------------------------ [INFO] Building Hive Integration - Testing Utilities 0.13.0-SNAPSHOT [INFO] ------------------------------------------------------------------------ [INFO] [INFO] --- maven-clean-plugin:2.5:clean (default-clean) @ hive-it-util --- [INFO] Deleting /data/hive-ptest/working/apache-svn-trunk-source/itests/util (includes = [datanucleus.log, derby.log], excludes = []) [INFO] [INFO] --- maven-resources-plugin:2.5:resources (default-resources) @ hive-it-util --- [debug] execute contextualize [INFO] Using 'UTF-8' encoding to copy filtered resources. [INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-trunk-source/itests/util/src/main/resources [INFO] [INFO] --- maven-antrun-plugin:1.7:run (define-classpath) @ hive-it-util --- [INFO] Executing tasks main: [INFO] Executed tasks [INFO] [INFO] --- maven-compiler-plugin:3.1:compile (default-compile) @ hive-it-util --- [INFO] Compiling 42 source files to /data/hive-ptest/working/apache-svn-trunk-source/itests/util/target/classes [INFO] ------------------------------------------------------------- [WARNING] COMPILATION WARNING : [INFO] ------------------------------------------------------------- [WARNING] Note: Some input files use or override a deprecated API. [WARNING] Note: Recompile with -Xlint:deprecation for details. 
[INFO] 2 warnings [INFO] ------------------------------------------------------------- [INFO] ------------------------------------------------------------- [ERROR] COMPILATION ERROR : [INFO] ------------------------------------------------------------- [ERROR] /data/hive-ptest/working/apache-svn-trunk-source/itests/util/src/main/java/org/apache/hadoop/hive/ql/hooks/OptrStatGroupByHook.java:[45,73] cannot find symbol symbol : variable HIVEJOBPROGRESS location: class org.apache.hadoop.hive.conf.HiveConf.ConfVars [ERROR] /data/hive-ptest/working/apache-svn-trunk-source/itests/util/src/main/java/org/apache/hadoop/hive/ql/hooks/OptrStatGroupByHook.java:[57,38] cannot find symbol symbol : method getCounters() location: class org.apache.hadoop.hive.ql.exec.Operator<capture#395 of ? extends org.apache.hadoop.hive.ql.plan.OperatorDesc> [INFO] 2 errors [INFO] ------------------------------------------------------------- [INFO] ------------------------------------------------------------------------ [INFO] Reactor Summary: [INFO] [INFO] Hive Integration - Parent ......................... SUCCESS [3.617s] [INFO] Hive Integration - Custom Serde ................... SUCCESS [8.790s] [INFO] Hive Integration - Testing Utilities .............. FAILURE [5.644s] [INFO] Hive Integration - Unit Tests ..................... SKIPPED [INFO] Hive Integration - HCatalog Unit Tests ............ SKIPPED [INFO] Hive Integration - Test Serde ..................... SKIPPED [INFO] Hive Integration - QFile Tests .................... SKIPPED [INFO] ------------------------------------------------------------------------ [INFO] BUILD FAILURE [INFO] ------------------------------------------------------------------------ [INFO] Total time: 19.687s [INFO] Finished at: Mon Nov 25 20:53:50 EST 2013 [INFO] Final Memory: 25M/59M [INFO] ------------------------------------------------------------------------ [ERROR] Failed to execute goal org.apache.maven.plugins:maven-compiler-plugin:3.1:compile (default-compile) on project hive-it-util: Compilation failure: Compilation failure: [ERROR] /data/hive-ptest/working/apache-svn-trunk-source/itests/util/src/main/java/org/apache/hadoop/hive/ql/hooks/OptrStatGroupByHook.java:[45,73] cannot find symbol [ERROR] symbol : variable HIVEJOBPROGRESS [ERROR] location: class org.apache.hadoop.hive.conf.HiveConf.ConfVars [ERROR] /data/hive-ptest/working/apache-svn-trunk-source/itests/util/src/main/java/org/apache/hadoop/hive/ql/hooks/OptrStatGroupByHook.java:[57,38] cannot find symbol [ERROR] symbol : method getCounters() [ERROR] location: class org.apache.hadoop.hive.ql.exec.Operator<capture#395 of ? extends org.apache.hadoop.hive.ql.plan.OperatorDesc> [ERROR] -> [Help 1] [ERROR] [ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch. [ERROR] Re-run Maven using the -X switch to enable full debug logging. [ERROR] [ERROR] For more information about the errors and possible solutions, please read the following articles: [ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException [ERROR] [ERROR] After correcting the problems, you can resume the build with the command [ERROR] mvn <goals> -rf :hive-it-util + exit 1 ' This message is automatically generated. ATTACHMENT ID: 12615765
          Hide
          Xuefu Zhang added a comment -

          The above failure was due to a bad patch #4, which was missing some of the changes. Patch #5 fixes it.

          Show
          Xuefu Zhang added a comment - The above failure was due to a bad patch #4, which was missing some of the changes. Patch #5 fixes it.
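          The patches in question touch UDFCeil, UDFFloor, UDFOPNegative, UDFOPPositive and UDFPower (see the patched-file list in the QA run below), along with FunctionRegistry and the vectorization classes, moving these numeric UDFs onto the GenericUDF API. The sketch below is a minimal, hypothetical example of that API's shape, not code from the attached patch; the class name GenericUDFExampleFloor and its package are made up for illustration.

          // Hypothetical sketch only -- NOT code from the attached patch. It shows the
          // GenericUDF shape: type resolution happens once in initialize() against
          // ObjectInspectors, and evaluate() works on DeferredObjects instead of relying
          // on reflection over overloaded evaluate() methods as the old UDF classes do.
          package org.example.hive.udf;

          import org.apache.hadoop.hive.ql.exec.UDFArgumentException;
          import org.apache.hadoop.hive.ql.metadata.HiveException;
          import org.apache.hadoop.hive.ql.udf.generic.GenericUDF;
          import org.apache.hadoop.hive.serde2.objectinspector.ObjectInspector;
          import org.apache.hadoop.hive.serde2.objectinspector.PrimitiveObjectInspector;
          import org.apache.hadoop.hive.serde2.objectinspector.primitive.PrimitiveObjectInspectorFactory;
          import org.apache.hadoop.hive.serde2.objectinspector.primitive.PrimitiveObjectInspectorUtils;
          import org.apache.hadoop.io.LongWritable;

          public class GenericUDFExampleFloor extends GenericUDF {

            private transient PrimitiveObjectInspector inputOI;
            private final LongWritable result = new LongWritable();

            @Override
            public ObjectInspector initialize(ObjectInspector[] arguments) throws UDFArgumentException {
              // Argument checking and output-type selection run once, at query compile time.
              if (arguments.length != 1) {
                throw new UDFArgumentException("floor() requires exactly one argument");
              }
              if (arguments[0].getCategory() != ObjectInspector.Category.PRIMITIVE) {
                throw new UDFArgumentException("floor() takes a primitive numeric argument");
              }
              inputOI = (PrimitiveObjectInspector) arguments[0];
              return PrimitiveObjectInspectorFactory.writableLongObjectInspector;
            }

            @Override
            public Object evaluate(DeferredObject[] arguments) throws HiveException {
              // Per-row evaluation: read the argument through its ObjectInspector and
              // return a reused writable, as generic UDFs conventionally do.
              Object value = arguments[0].get();
              if (value == null) {
                return null;
              }
              double d = PrimitiveObjectInspectorUtils.getDouble(value, inputOI);
              result.set((long) Math.floor(d));
              return result;
            }

            @Override
            public String getDisplayString(String[] children) {
              return "floor(" + children[0] + ")";
            }
          }

          In this pattern, argument validation and output-type selection happen once in initialize(), while evaluate() handles each row through ObjectInspectors; that is what lets a single class cover the numeric types for which the old reflection-based UDFs needed separate evaluate() overloads. The exact class names and behaviour in the attached patch may differ.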
          Hide
          Hive QA added a comment -

          Overall: -1 no tests executed

          Here are the results of testing the latest attachment:
          https://issues.apache.org/jira/secure/attachment/12615729/HIVE-5706.4.patch

          Test results: http://bigtop01.cloudera.org:8080/job/PreCommit-HIVE-Build/440/testReport
          Console output: http://bigtop01.cloudera.org:8080/job/PreCommit-HIVE-Build/440/console

          Messages:

          Executing org.apache.hive.ptest.execution.PrepPhase
          Tests failed with: NonZeroExitCodeException: Command 'bash /data/hive-ptest/working/scratch/source-prep.sh' failed with exit status 1 and output '+ [[ -n '' ]]
          + export 'ANT_OPTS=-Xmx1g -XX:MaxPermSize=256m -Dhttp.proxyHost=localhost -Dhttp.proxyPort=3128'
          + ANT_OPTS='-Xmx1g -XX:MaxPermSize=256m -Dhttp.proxyHost=localhost -Dhttp.proxyPort=3128'
          + export 'M2_OPTS=-Xmx1g -XX:MaxPermSize=256m -Dhttp.proxyHost=localhost -Dhttp.proxyPort=3128'
          + M2_OPTS='-Xmx1g -XX:MaxPermSize=256m -Dhttp.proxyHost=localhost -Dhttp.proxyPort=3128'
          + cd /data/hive-ptest/working/
          + tee /data/hive-ptest/logs/PreCommit-HIVE-Build-440/source-prep.txt
          + [[ false == \t\r\u\e ]]
          + mkdir -p maven ivy
          + [[ svn = \s\v\n ]]
          + [[ -n '' ]]
          + [[ -d apache-svn-trunk-source ]]
          + [[ ! -d apache-svn-trunk-source/.svn ]]
          + [[ ! -d apache-svn-trunk-source ]]
          + cd apache-svn-trunk-source
          + svn revert -R .
          Reverted 'packaging/pom.xml'
          Reverted 'itests/hcatalog-unit/pom.xml'
          Reverted 'hcatalog/storage-handlers/hbase/pom.xml'
          ++ egrep -v '^X|^Performing status on external'
          ++ awk '{print $2}'
          ++ svn status --no-ignore
          + rm -rf target datanucleus.log ant/target shims/target shims/0.20/target shims/assembly/target shims/0.20S/target shims/0.23/target shims/common/target shims/common-secure/target packaging/target hbase-handler/target testutils/target jdbc/target metastore/target itests/target itests/hcatalog-unit/target itests/test-serde/target itests/qtest/target itests/hive-unit/target itests/custom-serde/target itests/util/target hcatalog/target hcatalog/storage-handlers/hbase/target hcatalog/server-extensions/target hcatalog/core/target hcatalog/webhcat/svr/target hcatalog/webhcat/java-client/target hcatalog/hcatalog-pig-adapter/target hwi/target common/target common/src/gen service/target contrib/target serde/target beeline/target odbc/target cli/target ql/dependency-reduced-pom.xml ql/target
          + svn update
          U    ql/src/test/queries/clientpositive/annotate_stats_groupby.q
          U    ql/src/test/results/clientpositive/transform_ppr2.q.out
          U    ql/src/test/results/clientpositive/pcr.q.out
          U    ql/src/test/results/clientpositive/auto_sortmerge_join_11.q.out
          U    ql/src/test/results/clientpositive/bucket_map_join_2.q.out
          U    ql/src/test/results/clientpositive/sample7.q.out
          U    ql/src/test/results/clientpositive/rand_partitionpruner2.q.out
          U    ql/src/test/results/clientpositive/bucketcontext_6.q.out
          U    ql/src/test/results/clientpositive/bucketmapjoin_negative.q.out
          U    ql/src/test/results/clientpositive/udf_explode.q.out
          U    ql/src/test/results/clientpositive/sample2.q.out
          U    ql/src/test/results/clientpositive/router_join_ppr.q.out
          U    ql/src/test/results/clientpositive/bucketcontext_1.q.out
          U    ql/src/test/results/clientpositive/annotate_stats_filter.q.out
          U    ql/src/test/results/clientpositive/stats11.q.out
          U    ql/src/test/results/clientpositive/smb_mapjoin_15.q.out
          U    ql/src/test/results/clientpositive/input23.q.out
          U    ql/src/test/results/clientpositive/bucketmapjoin1.q.out
          U    ql/src/test/results/clientpositive/join_map_ppr.q.out
          U    ql/src/test/results/clientpositive/bucketmapjoin10.q.out
          U    ql/src/test/results/clientpositive/join33.q.out
          U    ql/src/test/results/clientpositive/groupby_sort_skew_1.q.out
          U    ql/src/test/results/clientpositive/annotate_stats_limit.q.out
          U    ql/src/test/results/clientpositive/louter_join_ppr.q.out
          U    ql/src/test/results/clientpositive/annotate_stats_groupby.q.out
          U    ql/src/test/results/clientpositive/sort_merge_join_desc_7.q.out
          U    ql/src/test/results/clientpositive/sample9.q.out
          U    ql/src/test/results/clientpositive/union_ppr.q.out
          U    ql/src/test/results/clientpositive/bucketcontext_8.q.out
          U    ql/src/test/results/clientpositive/udtf_explode.q.out
          U    ql/src/test/results/clientpositive/auto_join_reordering_values.q.out
          U    ql/src/test/results/clientpositive/sample4.q.out
          U    ql/src/test/results/clientpositive/push_or.q.out
          U    ql/src/test/results/clientpositive/bucketmapjoin8.q.out
          U    ql/src/test/results/clientpositive/bucketcontext_3.q.out
          U    ql/src/test/results/clientpositive/join17.q.out
          U    ql/src/test/results/clientpositive/bucketmapjoin3.q.out
          U    ql/src/test/results/clientpositive/annotate_stats_select.q.out
          U    ql/src/test/results/clientpositive/join26.q.out
          U    ql/src/test/results/clientpositive/bucketmapjoin12.q.out
          U    ql/src/test/results/clientpositive/join35.q.out
          U    ql/src/test/results/clientpositive/groupby_ppr.q.out
          U    ql/src/test/results/clientpositive/input_part7.q.out
          U    ql/src/test/results/clientpositive/sample10.q.out
          U    ql/src/test/results/clientpositive/outer_join_ppr.q.out
          U    ql/src/test/results/clientpositive/input_part2.q.out
          U    ql/src/test/results/clientpositive/transform_ppr1.q.out
          U    ql/src/test/results/clientpositive/regexp_extract.q.out
          U    ql/src/test/results/clientpositive/join32_lessSize.q.out
          U    ql/src/test/results/clientpositive/bucketmapjoin_negative3.q.out
          U    ql/src/test/results/clientpositive/annotate_stats_join.q.out
          U    ql/src/test/results/clientpositive/groupby_map_ppr_multi_distinct.q.out
          U    ql/src/test/results/clientpositive/ppd_union_view.q.out
          U    ql/src/test/results/clientpositive/bucket_map_join_1.q.out
          U    ql/src/test/results/clientpositive/sample6.q.out
          U    ql/src/test/results/clientpositive/rand_partitionpruner1.q.out
          U    ql/src/test/results/clientpositive/bucketcontext_5.q.out
          U    ql/src/test/results/clientpositive/sample1.q.out
          U    ql/src/test/results/clientpositive/bucketmapjoin5.q.out
          U    ql/src/test/results/clientpositive/filter_join_breaktask.q.out
          U    ql/src/test/results/clientpositive/ppr_allchildsarenull.q.out
          U    ql/src/test/results/clientpositive/input_part9.q.out
          U    ql/src/test/results/clientpositive/join32.q.out
          U    ql/src/test/results/clientpositive/annotate_stats_part.q.out
          U    ql/src/test/results/clientpositive/load_dyn_part8.q.out
          U    ql/src/test/results/clientpositive/join_filters_overlap.q.out
          U    ql/src/test/results/clientpositive/union22.q.out
          U    ql/src/test/results/clientpositive/groupby_sort_6.q.out
          U    ql/src/test/results/clientpositive/auto_sortmerge_join_12.q.out
          U    ql/src/test/results/clientpositive/groupby_sort_1.q.out
          U    ql/src/test/results/clientpositive/groupby_map_ppr.q.out
          U    ql/src/test/results/clientpositive/ppd_vc.q.out
          U    ql/src/test/results/clientpositive/sort_merge_join_desc_6.q.out
          U    ql/src/test/results/clientpositive/sample8.q.out
          U    ql/src/test/results/clientpositive/rand_partitionpruner3.q.out
          U    ql/src/test/results/clientpositive/bucketcontext_7.q.out
          U    ql/src/test/results/clientpositive/bucketmapjoin7.q.out
          U    ql/src/test/results/clientpositive/bucketcontext_2.q.out
          U    ql/src/test/results/clientpositive/bucketmapjoin2.q.out
          U    ql/src/test/results/clientpositive/bucketmapjoin11.q.out
          U    ql/src/test/results/clientpositive/input42.q.out
          U    ql/src/test/results/clientpositive/join34.q.out
          U    ql/src/test/results/clientpositive/ppd_join_filter.q.out
          U    ql/src/test/results/clientpositive/union24.q.out
          U    ql/src/test/results/clientpositive/metadataonly1.q.out
          U    ql/src/test/results/clientpositive/input_part1.q.out
          U    ql/src/test/results/clientpositive/bucketmapjoin_negative2.q.out
          U    ql/src/test/results/clientpositive/join9.q.out
          U    ql/src/test/results/clientpositive/groupby_ppr_multi_distinct.q.out
          U    ql/src/test/results/clientpositive/sample5.q.out
          U    ql/src/test/results/clientpositive/bucketmapjoin9.q.out
          U    ql/src/test/results/clientpositive/bucketcontext_4.q.out
          U    ql/src/test/results/clientpositive/bucketmapjoin4.q.out
          U    ql/src/test/results/clientpositive/bucketmapjoin13.q.out
          U    ql/src/test/results/clientpositive/smb_mapjoin_13.q.out
          U    ql/src/test/results/clientpositive/alter_partition_coltype.q.out
          U    ql/src/java/org/apache/hadoop/hive/ql/plan/Statistics.java
          U    ql/src/java/org/apache/hadoop/hive/ql/stats/StatsUtils.java
          U    ql/src/java/org/apache/hadoop/hive/ql/optimizer/stats/annotation/StatsRulesProcFactory.java
          
          Fetching external item into 'hcatalog/src/test/e2e/harness'
          Updated external to revision 1545461.
          
          Updated to revision 1545461.
          + patchCommandPath=/data/hive-ptest/working/scratch/smart-apply-patch.sh
          + patchFilePath=/data/hive-ptest/working/scratch/build.patch
          + [[ -f /data/hive-ptest/working/scratch/build.patch ]]
          + chmod +x /data/hive-ptest/working/scratch/smart-apply-patch.sh
          + /data/hive-ptest/working/scratch/smart-apply-patch.sh /data/hive-ptest/working/scratch/build.patch
          Going to apply patch with: patch -p0
          patching file ql/src/java/org/apache/hadoop/hive/ql/exec/FunctionRegistry.java
          patching file ql/src/java/org/apache/hadoop/hive/ql/exec/vector/VectorizationContext.java
          patching file ql/src/java/org/apache/hadoop/hive/ql/optimizer/physical/Vectorizer.java
          patching file ql/src/java/org/apache/hadoop/hive/ql/udf/UDFCeil.java
          patching file ql/src/java/org/apache/hadoop/hive/ql/udf/UDFFloor.java
          patching file ql/src/java/org/apache/hadoop/hive/ql/udf/UDFOPNegative.java
          patching file ql/src/java/org/apache/hadoop/hive/ql/udf/UDFOPPositive.java
          patching file ql/src/java/org/apache/hadoop/hive/ql/udf/UDFPower.java
          patching file ql/src/test/org/apache/hadoop/hive/ql/exec/vector/TestVectorizationContext.java
          patching file ql/src/test/results/clientpositive/decimal_udf.q.out
          patching file ql/src/test/results/clientpositive/literal_decimal.q.out
          patching file ql/src/test/results/clientpositive/udf4.q.out
          patching file ql/src/test/results/clientpositive/udf7.q.out
          patching file ql/src/test/results/clientpositive/vectorization_short_regress.q.out
          patching file ql/src/test/results/clientpositive/vectorized_math_funcs.q.out
          patching file ql/src/test/results/compiler/plan/udf4.q.xml
          + [[ maven == \m\a\v\e\n ]]
          + rm -rf /data/hive-ptest/working/maven/org/apache/hive
          + mvn -B clean install -DskipTests -Dmaven.repo.local=/data/hive-ptest/working/maven
          [INFO] Scanning for projects...
          [INFO] ------------------------------------------------------------------------
          [INFO] Reactor Build Order:
          [INFO] 
          [INFO] Hive
          [INFO] Hive Ant Utilities
          [INFO] Hive Shims Common
          [INFO] Hive Shims 0.20
          [INFO] Hive Shims Secure Common
          [INFO] Hive Shims 0.20S
          [INFO] Hive Shims 0.23
          [INFO] Hive Shims
          [INFO] Hive Common
          [INFO] Hive Serde
          [INFO] Hive Metastore
          [INFO] Hive Query Language
          [INFO] Hive Service
          [INFO] Hive JDBC
          [INFO] Hive Beeline
          [INFO] Hive CLI
          [INFO] Hive Contrib
          [INFO] Hive HBase Handler
          [INFO] Hive HCatalog
          [INFO] Hive HCatalog Core
          [INFO] Hive HCatalog Pig Adapter
          [INFO] Hive HCatalog Server Extensions
          [INFO] Hive HCatalog Webhcat Java Client
          [INFO] Hive HCatalog Webhcat
          [INFO] Hive HCatalog HBase Storage Handler
          [INFO] Hive HWI
          [INFO] Hive ODBC
          [INFO] Hive Shims Aggregator
          [INFO] Hive TestUtils
          [INFO] Hive Packaging
          [INFO]                                                                         
          [INFO] ------------------------------------------------------------------------
          [INFO] Building Hive 0.13.0-SNAPSHOT
          [INFO] ------------------------------------------------------------------------
          [INFO] 
          [INFO] --- maven-clean-plugin:2.5:clean (default-clean) @ hive ---
          [INFO] Deleting /data/hive-ptest/working/apache-svn-trunk-source (includes = [datanucleus.log, derby.log], excludes = [])
          [INFO] 
          [INFO] --- maven-antrun-plugin:1.7:run (define-classpath) @ hive ---
          [INFO] Executing tasks
          
          main:
          [INFO] Executed tasks
          [INFO] 
          [INFO] --- maven-antrun-plugin:1.7:run (setup-test-dirs) @ hive ---
          [INFO] Executing tasks
          
          main:
              [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/target/tmp
              [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/target/warehouse
              [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/target/tmp/conf
               [copy] Copying 4 files to /data/hive-ptest/working/apache-svn-trunk-source/target/tmp/conf
          [INFO] Executed tasks
          [INFO] 
          [INFO] --- maven-install-plugin:2.4:install (default-install) @ hive ---
          [INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/pom.xml to /data/hive-ptest/working/maven/org/apache/hive/hive/0.13.0-SNAPSHOT/hive-0.13.0-SNAPSHOT.pom
          [INFO]                                                                         
          [INFO] ------------------------------------------------------------------------
          [INFO] Building Hive Ant Utilities 0.13.0-SNAPSHOT
          [INFO] ------------------------------------------------------------------------
          [INFO] 
          [INFO] --- maven-clean-plugin:2.5:clean (default-clean) @ hive-ant ---
          [INFO] Deleting /data/hive-ptest/working/apache-svn-trunk-source/ant (includes = [datanucleus.log, derby.log], excludes = [])
          [INFO] 
          [INFO] --- maven-resources-plugin:2.5:resources (default-resources) @ hive-ant ---
          [debug] execute contextualize
          [INFO] Using 'UTF-8' encoding to copy filtered resources.
          [INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-trunk-source/ant/src/main/resources
          [INFO] 
          [INFO] --- maven-antrun-plugin:1.7:run (define-classpath) @ hive-ant ---
          [INFO] Executing tasks
          
          main:
          [INFO] Executed tasks
          [INFO] 
          [INFO] --- maven-compiler-plugin:3.1:compile (default-compile) @ hive-ant ---
          [INFO] Compiling 5 source files to /data/hive-ptest/working/apache-svn-trunk-source/ant/target/classes
          [WARNING] Note: /data/hive-ptest/working/apache-svn-trunk-source/ant/src/org/apache/hadoop/hive/ant/QTestGenTask.java uses or overrides a deprecated API.
          [WARNING] Note: Recompile with -Xlint:deprecation for details.
          [WARNING] Note: /data/hive-ptest/working/apache-svn-trunk-source/ant/src/org/apache/hadoop/hive/ant/DistinctElementsClassPath.java uses unchecked or unsafe operations.
          [WARNING] Note: Recompile with -Xlint:unchecked for details.
          [INFO] 
          [INFO] --- maven-resources-plugin:2.5:testResources (default-testResources) @ hive-ant ---
          [debug] execute contextualize
          [INFO] Using 'UTF-8' encoding to copy filtered resources.
          [INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-trunk-source/ant/src/test/resources
          [INFO] 
          [INFO] --- maven-antrun-plugin:1.7:run (setup-test-dirs) @ hive-ant ---
          [INFO] Executing tasks
          
          main:
              [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/ant/target/tmp
              [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/ant/target/warehouse
              [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/ant/target/tmp/conf
               [copy] Copying 4 files to /data/hive-ptest/working/apache-svn-trunk-source/ant/target/tmp/conf
          [INFO] Executed tasks
          [INFO] 
          [INFO] --- maven-compiler-plugin:3.1:testCompile (default-testCompile) @ hive-ant ---
          [INFO] No sources to compile
          [INFO] 
          [INFO] --- maven-surefire-plugin:2.16:test (default-test) @ hive-ant ---
          [INFO] Tests are skipped.
          [INFO] 
          [INFO] --- maven-jar-plugin:2.2:jar (default-jar) @ hive-ant ---
          [INFO] Building jar: /data/hive-ptest/working/apache-svn-trunk-source/ant/target/hive-ant-0.13.0-SNAPSHOT.jar
          [INFO] 
          [INFO] --- maven-install-plugin:2.4:install (default-install) @ hive-ant ---
          [INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/ant/target/hive-ant-0.13.0-SNAPSHOT.jar to /data/hive-ptest/working/maven/org/apache/hive/hive-ant/0.13.0-SNAPSHOT/hive-ant-0.13.0-SNAPSHOT.jar
          [INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/ant/pom.xml to /data/hive-ptest/working/maven/org/apache/hive/hive-ant/0.13.0-SNAPSHOT/hive-ant-0.13.0-SNAPSHOT.pom
          [INFO]                                                                         
          [INFO] ------------------------------------------------------------------------
          [INFO] Building Hive Shims Common 0.13.0-SNAPSHOT
          [INFO] ------------------------------------------------------------------------
          [INFO] 
          [INFO] --- maven-clean-plugin:2.5:clean (default-clean) @ hive-shims-common ---
          [INFO] Deleting /data/hive-ptest/working/apache-svn-trunk-source/shims/common (includes = [datanucleus.log, derby.log], excludes = [])
          [INFO] 
          [INFO] --- maven-resources-plugin:2.5:resources (default-resources) @ hive-shims-common ---
          [debug] execute contextualize
          [INFO] Using 'UTF-8' encoding to copy filtered resources.
          [INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-trunk-source/shims/common/src/main/resources
          [INFO] 
          [INFO] --- maven-antrun-plugin:1.7:run (define-classpath) @ hive-shims-common ---
          [INFO] Executing tasks
          
          main:
          [INFO] Executed tasks
          [INFO] 
          [INFO] --- maven-compiler-plugin:3.1:compile (default-compile) @ hive-shims-common ---
          [INFO] Compiling 15 source files to /data/hive-ptest/working/apache-svn-trunk-source/shims/common/target/classes
          [WARNING] Note: Some input files use or override a deprecated API.
          [WARNING] Note: Recompile with -Xlint:deprecation for details.
          [INFO] 
          [INFO] --- maven-resources-plugin:2.5:testResources (default-testResources) @ hive-shims-common ---
          [debug] execute contextualize
          [INFO] Using 'UTF-8' encoding to copy filtered resources.
          [INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-trunk-source/shims/common/src/test/resources
          [INFO] 
          [INFO] --- maven-antrun-plugin:1.7:run (setup-test-dirs) @ hive-shims-common ---
          [INFO] Executing tasks
          
          main:
              [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/shims/common/target/tmp
              [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/shims/common/target/warehouse
              [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/shims/common/target/tmp/conf
               [copy] Copying 4 files to /data/hive-ptest/working/apache-svn-trunk-source/shims/common/target/tmp/conf
          [INFO] Executed tasks
          [INFO] 
          [INFO] --- maven-compiler-plugin:3.1:testCompile (default-testCompile) @ hive-shims-common ---
          [INFO] No sources to compile
          [INFO] 
          [INFO] --- maven-surefire-plugin:2.16:test (default-test) @ hive-shims-common ---
          [INFO] Tests are skipped.
          [INFO] 
          [INFO] --- maven-jar-plugin:2.2:jar (default-jar) @ hive-shims-common ---
          [INFO] Building jar: /data/hive-ptest/working/apache-svn-trunk-source/shims/common/target/hive-shims-common-0.13.0-SNAPSHOT.jar
          [INFO] 
          [INFO] --- maven-install-plugin:2.4:install (default-install) @ hive-shims-common ---
          [INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/shims/common/target/hive-shims-common-0.13.0-SNAPSHOT.jar to /data/hive-ptest/working/maven/org/apache/hive/shims/hive-shims-common/0.13.0-SNAPSHOT/hive-shims-common-0.13.0-SNAPSHOT.jar
          [INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/shims/common/pom.xml to /data/hive-ptest/working/maven/org/apache/hive/shims/hive-shims-common/0.13.0-SNAPSHOT/hive-shims-common-0.13.0-SNAPSHOT.pom
          [INFO]                                                                         
          [INFO] ------------------------------------------------------------------------
          [INFO] Building Hive Shims 0.20 0.13.0-SNAPSHOT
          [INFO] ------------------------------------------------------------------------
          [INFO] 
          [INFO] --- maven-clean-plugin:2.5:clean (default-clean) @ hive-shims-0.20 ---
          [INFO] Deleting /data/hive-ptest/working/apache-svn-trunk-source/shims/0.20 (includes = [datanucleus.log, derby.log], excludes = [])
          [INFO] 
          [INFO] --- maven-resources-plugin:2.5:resources (default-resources) @ hive-shims-0.20 ---
          [debug] execute contextualize
          [INFO] Using 'UTF-8' encoding to copy filtered resources.
          [INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-trunk-source/shims/0.20/src/main/resources
          [INFO] 
          [INFO] --- maven-antrun-plugin:1.7:run (define-classpath) @ hive-shims-0.20 ---
          [INFO] Executing tasks
          
          main:
          [INFO] Executed tasks
          [INFO] 
          [INFO] --- maven-compiler-plugin:3.1:compile (default-compile) @ hive-shims-0.20 ---
          [INFO] Compiling 2 source files to /data/hive-ptest/working/apache-svn-trunk-source/shims/0.20/target/classes
          [WARNING] Note: /data/hive-ptest/working/apache-svn-trunk-source/shims/0.20/src/main/java/org/apache/hadoop/hive/shims/Hadoop20Shims.java uses or overrides a deprecated API.
          [WARNING] Note: Recompile with -Xlint:deprecation for details.
          [WARNING] Note: /data/hive-ptest/working/apache-svn-trunk-source/shims/0.20/src/main/java/org/apache/hadoop/hive/shims/Hadoop20Shims.java uses unchecked or unsafe operations.
          [WARNING] Note: Recompile with -Xlint:unchecked for details.
          [INFO] 
          [INFO] --- maven-resources-plugin:2.5:testResources (default-testResources) @ hive-shims-0.20 ---
          [debug] execute contextualize
          [INFO] Using 'UTF-8' encoding to copy filtered resources.
          [INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-trunk-source/shims/0.20/src/test/resources
          [INFO] 
          [INFO] --- maven-antrun-plugin:1.7:run (setup-test-dirs) @ hive-shims-0.20 ---
          [INFO] Executing tasks
          
          main:
              [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/shims/0.20/target/tmp
              [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/shims/0.20/target/warehouse
              [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/shims/0.20/target/tmp/conf
               [copy] Copying 4 files to /data/hive-ptest/working/apache-svn-trunk-source/shims/0.20/target/tmp/conf
          [INFO] Executed tasks
          [INFO] 
          [INFO] --- maven-compiler-plugin:3.1:testCompile (default-testCompile) @ hive-shims-0.20 ---
          [INFO] No sources to compile
          [INFO] 
          [INFO] --- maven-surefire-plugin:2.16:test (default-test) @ hive-shims-0.20 ---
          [INFO] Tests are skipped.
          [INFO] 
          [INFO] --- maven-jar-plugin:2.2:jar (default-jar) @ hive-shims-0.20 ---
          [INFO] Building jar: /data/hive-ptest/working/apache-svn-trunk-source/shims/0.20/target/hive-shims-0.20-0.13.0-SNAPSHOT.jar
          [INFO] 
          [INFO] --- maven-install-plugin:2.4:install (default-install) @ hive-shims-0.20 ---
          [INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/shims/0.20/target/hive-shims-0.20-0.13.0-SNAPSHOT.jar to /data/hive-ptest/working/maven/org/apache/hive/shims/hive-shims-0.20/0.13.0-SNAPSHOT/hive-shims-0.20-0.13.0-SNAPSHOT.jar
          [INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/shims/0.20/pom.xml to /data/hive-ptest/working/maven/org/apache/hive/shims/hive-shims-0.20/0.13.0-SNAPSHOT/hive-shims-0.20-0.13.0-SNAPSHOT.pom
          [INFO]                                                                         
          [INFO] ------------------------------------------------------------------------
          [INFO] Building Hive Shims Secure Common 0.13.0-SNAPSHOT
          [INFO] ------------------------------------------------------------------------
          [INFO] 
          [INFO] --- maven-clean-plugin:2.5:clean (default-clean) @ hive-shims-common-secure ---
          [INFO] Deleting /data/hive-ptest/working/apache-svn-trunk-source/shims/common-secure (includes = [datanucleus.log, derby.log], excludes = [])
          [INFO] 
          [INFO] --- maven-resources-plugin:2.5:resources (default-resources) @ hive-shims-common-secure ---
          [debug] execute contextualize
          [INFO] Using 'UTF-8' encoding to copy filtered resources.
          [INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-trunk-source/shims/common-secure/src/main/resources
          [INFO] 
          [INFO] --- maven-antrun-plugin:1.7:run (define-classpath) @ hive-shims-common-secure ---
          [INFO] Executing tasks
          
          main:
          [INFO] Executed tasks
          [INFO] 
          [INFO] --- maven-compiler-plugin:3.1:compile (default-compile) @ hive-shims-common-secure ---
          [INFO] Compiling 12 source files to /data/hive-ptest/working/apache-svn-trunk-source/shims/common-secure/target/classes
          [WARNING] Note: /data/hive-ptest/working/apache-svn-trunk-source/shims/common-secure/src/main/java/org/apache/hadoop/hive/shims/HadoopShimsSecure.java uses or overrides a deprecated API.
          [WARNING] Note: Recompile with -Xlint:deprecation for details.
          [WARNING] Note: Some input files use unchecked or unsafe operations.
          [WARNING] Note: Recompile with -Xlint:unchecked for details.
          [INFO] 
          [INFO] --- maven-resources-plugin:2.5:testResources (default-testResources) @ hive-shims-common-secure ---
          [debug] execute contextualize
          [INFO] Using 'UTF-8' encoding to copy filtered resources.
          [INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-trunk-source/shims/common-secure/src/test/resources
          [INFO] 
          [INFO] --- maven-antrun-plugin:1.7:run (setup-test-dirs) @ hive-shims-common-secure ---
          [INFO] Executing tasks
          
          main:
              [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/shims/common-secure/target/tmp
              [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/shims/common-secure/target/warehouse
              [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/shims/common-secure/target/tmp/conf
               [copy] Copying 4 files to /data/hive-ptest/working/apache-svn-trunk-source/shims/common-secure/target/tmp/conf
          [INFO] Executed tasks
          [INFO] 
          [INFO] --- maven-compiler-plugin:3.1:testCompile (default-testCompile) @ hive-shims-common-secure ---
          [INFO] No sources to compile
          [INFO] 
          [INFO] --- maven-surefire-plugin:2.16:test (default-test) @ hive-shims-common-secure ---
          [INFO] Tests are skipped.
          [INFO] 
          [INFO] --- maven-jar-plugin:2.2:jar (default-jar) @ hive-shims-common-secure ---
          [INFO] Building jar: /data/hive-ptest/working/apache-svn-trunk-source/shims/common-secure/target/hive-shims-common-secure-0.13.0-SNAPSHOT.jar
          [INFO] 
          [INFO] --- maven-install-plugin:2.4:install (default-install) @ hive-shims-common-secure ---
          [INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/shims/common-secure/target/hive-shims-common-secure-0.13.0-SNAPSHOT.jar to /data/hive-ptest/working/maven/org/apache/hive/shims/hive-shims-common-secure/0.13.0-SNAPSHOT/hive-shims-common-secure-0.13.0-SNAPSHOT.jar
          [INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/shims/common-secure/pom.xml to /data/hive-ptest/working/maven/org/apache/hive/shims/hive-shims-common-secure/0.13.0-SNAPSHOT/hive-shims-common-secure-0.13.0-SNAPSHOT.pom
          [INFO]                                                                         
          [INFO] ------------------------------------------------------------------------
          [INFO] Building Hive Shims 0.20S 0.13.0-SNAPSHOT
          [INFO] ------------------------------------------------------------------------
          [INFO] 
          [INFO] --- maven-clean-plugin:2.5:clean (default-clean) @ hive-shims-0.20S ---
          [INFO] Deleting /data/hive-ptest/working/apache-svn-trunk-source/shims/0.20S (includes = [datanucleus.log, derby.log], excludes = [])
          [INFO] 
          [INFO] --- maven-resources-plugin:2.5:resources (default-resources) @ hive-shims-0.20S ---
          [debug] execute contextualize
          [INFO] Using 'UTF-8' encoding to copy filtered resources.
          [INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-trunk-source/shims/0.20S/src/main/resources
          [INFO] 
          [INFO] --- maven-antrun-plugin:1.7:run (define-classpath) @ hive-shims-0.20S ---
          [INFO] Executing tasks
          
          main:
          [INFO] Executed tasks
          [INFO] 
          [INFO] --- maven-compiler-plugin:3.1:compile (default-compile) @ hive-shims-0.20S ---
          [INFO] Compiling 3 source files to /data/hive-ptest/working/apache-svn-trunk-source/shims/0.20S/target/classes
          [WARNING] Note: /data/hive-ptest/working/apache-svn-trunk-source/shims/0.20S/src/main/java/org/apache/hadoop/hive/shims/Hadoop20SShims.java uses or overrides a deprecated API.
          [WARNING] Note: Recompile with -Xlint:deprecation for details.
          [INFO] 
          [INFO] --- maven-resources-plugin:2.5:testResources (default-testResources) @ hive-shims-0.20S ---
          [debug] execute contextualize
          [INFO] Using 'UTF-8' encoding to copy filtered resources.
          [INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-trunk-source/shims/0.20S/src/test/resources
          [INFO] 
          [INFO] --- maven-antrun-plugin:1.7:run (setup-test-dirs) @ hive-shims-0.20S ---
          [INFO] Executing tasks
          
          main:
              [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/shims/0.20S/target/tmp
              [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/shims/0.20S/target/warehouse
              [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/shims/0.20S/target/tmp/conf
               [copy] Copying 4 files to /data/hive-ptest/working/apache-svn-trunk-source/shims/0.20S/target/tmp/conf
          [INFO] Executed tasks
          [INFO] 
          [INFO] --- maven-compiler-plugin:3.1:testCompile (default-testCompile) @ hive-shims-0.20S ---
          [INFO] No sources to compile
          [INFO] 
          [INFO] --- maven-surefire-plugin:2.16:test (default-test) @ hive-shims-0.20S ---
          [INFO] Tests are skipped.
          [INFO] 
          [INFO] --- maven-jar-plugin:2.2:jar (default-jar) @ hive-shims-0.20S ---
          [INFO] Building jar: /data/hive-ptest/working/apache-svn-trunk-source/shims/0.20S/target/hive-shims-0.20S-0.13.0-SNAPSHOT.jar
          [INFO] 
          [INFO] --- maven-install-plugin:2.4:install (default-install) @ hive-shims-0.20S ---
          [INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/shims/0.20S/target/hive-shims-0.20S-0.13.0-SNAPSHOT.jar to /data/hive-ptest/working/maven/org/apache/hive/shims/hive-shims-0.20S/0.13.0-SNAPSHOT/hive-shims-0.20S-0.13.0-SNAPSHOT.jar
          [INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/shims/0.20S/pom.xml to /data/hive-ptest/working/maven/org/apache/hive/shims/hive-shims-0.20S/0.13.0-SNAPSHOT/hive-shims-0.20S-0.13.0-SNAPSHOT.pom
          [INFO]                                                                         
          [INFO] ------------------------------------------------------------------------
          [INFO] Building Hive Shims 0.23 0.13.0-SNAPSHOT
          [INFO] ------------------------------------------------------------------------
          [INFO] 
          [INFO] --- maven-clean-plugin:2.5:clean (default-clean) @ hive-shims-0.23 ---
          [INFO] Deleting /data/hive-ptest/working/apache-svn-trunk-source/shims/0.23 (includes = [datanucleus.log, derby.log], excludes = [])
          [INFO] 
          [INFO] --- maven-resources-plugin:2.5:resources (default-resources) @ hive-shims-0.23 ---
          [debug] execute contextualize
          [INFO] Using 'UTF-8' encoding to copy filtered resources.
          [INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-trunk-source/shims/0.23/src/main/resources
          [INFO] 
          [INFO] --- maven-antrun-plugin:1.7:run (define-classpath) @ hive-shims-0.23 ---
          [INFO] Executing tasks
          
          main:
          [INFO] Executed tasks
          [INFO] 
          [INFO] --- maven-compiler-plugin:3.1:compile (default-compile) @ hive-shims-0.23 ---
          [INFO] Compiling 3 source files to /data/hive-ptest/working/apache-svn-trunk-source/shims/0.23/target/classes
          [WARNING] Note: /data/hive-ptest/working/apache-svn-trunk-source/shims/0.23/src/main/java/org/apache/hadoop/hive/shims/Hadoop23Shims.java uses or overrides a deprecated API.
          [WARNING] Note: Recompile with -Xlint:deprecation for details.
          [INFO] 
          [INFO] --- maven-resources-plugin:2.5:testResources (default-testResources) @ hive-shims-0.23 ---
          [debug] execute contextualize
          [INFO] Using 'UTF-8' encoding to copy filtered resources.
          [INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-trunk-source/shims/0.23/src/test/resources
          [INFO] 
          [INFO] --- maven-antrun-plugin:1.7:run (setup-test-dirs) @ hive-shims-0.23 ---
          [INFO] Executing tasks
          
          main:
              [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/shims/0.23/target/tmp
              [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/shims/0.23/target/warehouse
              [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/shims/0.23/target/tmp/conf
               [copy] Copying 4 files to /data/hive-ptest/working/apache-svn-trunk-source/shims/0.23/target/tmp/conf
          [INFO] Executed tasks
          [INFO] 
          [INFO] --- maven-compiler-plugin:3.1:testCompile (default-testCompile) @ hive-shims-0.23 ---
          [INFO] No sources to compile
          [INFO] 
          [INFO] --- maven-surefire-plugin:2.16:test (default-test) @ hive-shims-0.23 ---
          [INFO] Tests are skipped.
          [INFO] 
          [INFO] --- maven-jar-plugin:2.2:jar (default-jar) @ hive-shims-0.23 ---
          [INFO] Building jar: /data/hive-ptest/working/apache-svn-trunk-source/shims/0.23/target/hive-shims-0.23-0.13.0-SNAPSHOT.jar
          [INFO] 
          [INFO] --- maven-install-plugin:2.4:install (default-install) @ hive-shims-0.23 ---
          [INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/shims/0.23/target/hive-shims-0.23-0.13.0-SNAPSHOT.jar to /data/hive-ptest/working/maven/org/apache/hive/shims/hive-shims-0.23/0.13.0-SNAPSHOT/hive-shims-0.23-0.13.0-SNAPSHOT.jar
          [INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/shims/0.23/pom.xml to /data/hive-ptest/working/maven/org/apache/hive/shims/hive-shims-0.23/0.13.0-SNAPSHOT/hive-shims-0.23-0.13.0-SNAPSHOT.pom
          [INFO]                                                                         
          [INFO] ------------------------------------------------------------------------
          [INFO] Building Hive Shims 0.13.0-SNAPSHOT
          [INFO] ------------------------------------------------------------------------
          [INFO] 
          [INFO] --- maven-clean-plugin:2.5:clean (default-clean) @ hive-shims ---
          [INFO] Deleting /data/hive-ptest/working/apache-svn-trunk-source/shims/assembly (includes = [datanucleus.log, derby.log], excludes = [])
          [INFO] 
          [INFO] --- maven-resources-plugin:2.5:resources (default-resources) @ hive-shims ---
          [debug] execute contextualize
          [INFO] Using 'UTF-8' encoding to copy filtered resources.
          [INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-trunk-source/shims/assembly/src/main/resources
          [INFO] 
          [INFO] --- maven-antrun-plugin:1.7:run (define-classpath) @ hive-shims ---
          [INFO] Executing tasks
          
          main:
          [INFO] Executed tasks
          [INFO] 
          [INFO] --- maven-compiler-plugin:3.1:compile (default-compile) @ hive-shims ---
          [INFO] No sources to compile
          [INFO] 
          [INFO] --- maven-resources-plugin:2.5:testResources (default-testResources) @ hive-shims ---
          [debug] execute contextualize
          [INFO] Using 'UTF-8' encoding to copy filtered resources.
          [INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-trunk-source/shims/assembly/src/test/resources
          [INFO] 
          [INFO] --- maven-antrun-plugin:1.7:run (setup-test-dirs) @ hive-shims ---
          [INFO] Executing tasks
          
          main:
              [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/shims/assembly/target/tmp
              [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/shims/assembly/target/warehouse
              [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/shims/assembly/target/tmp/conf
               [copy] Copying 4 files to /data/hive-ptest/working/apache-svn-trunk-source/shims/assembly/target/tmp/conf
          [INFO] Executed tasks
          [INFO] 
          [INFO] --- maven-compiler-plugin:3.1:testCompile (default-testCompile) @ hive-shims ---
          [INFO] No sources to compile
          [INFO] 
          [INFO] --- maven-surefire-plugin:2.16:test (default-test) @ hive-shims ---
          [INFO] Tests are skipped.
          [INFO] 
          [INFO] --- maven-jar-plugin:2.2:jar (default-jar) @ hive-shims ---
          [WARNING] JAR will be empty - no content was marked for inclusion!
          [INFO] Building jar: /data/hive-ptest/working/apache-svn-trunk-source/shims/assembly/target/hive-shims-0.13.0-SNAPSHOT.jar
          [INFO] 
          [INFO] --- maven-assembly-plugin:2.3:single (uberjar) @ hive-shims ---
          [INFO] Reading assembly descriptor: src/assemble/uberjar.xml
          [WARNING] Artifact: org.apache.hive:hive-shims:jar:0.13.0-SNAPSHOT references the same file as the assembly destination file. Moving it to a temporary location for inclusion.
          [INFO] META-INF/MANIFEST.MF already added, skipping
          [INFO] META-INF/MANIFEST.MF already added, skipping
          [INFO] META-INF/MANIFEST.MF already added, skipping
          [INFO] META-INF/MANIFEST.MF already added, skipping
          [INFO] META-INF/MANIFEST.MF already added, skipping
          [INFO] Building jar: /data/hive-ptest/working/apache-svn-trunk-source/shims/assembly/target/hive-shims-0.13.0-SNAPSHOT.jar
          [INFO] META-INF/MANIFEST.MF already added, skipping
          [INFO] META-INF/MANIFEST.MF already added, skipping
          [INFO] META-INF/MANIFEST.MF already added, skipping
          [INFO] META-INF/MANIFEST.MF already added, skipping
          [INFO] META-INF/MANIFEST.MF already added, skipping
          [WARNING] Configuration options: 'appendAssemblyId' is set to false, and 'classifier' is missing.
          Instead of attaching the assembly file: /data/hive-ptest/working/apache-svn-trunk-source/shims/assembly/target/hive-shims-0.13.0-SNAPSHOT.jar, it will become the file for main project artifact.
          NOTE: If multiple descriptors or descriptor-formats are provided for this project, the value of this file will be non-deterministic!
          [WARNING] Replacing pre-existing project main-artifact file: /data/hive-ptest/working/apache-svn-trunk-source/shims/assembly/target/archive-tmp/hive-shims-0.13.0-SNAPSHOT.jar
          with assembly file: /data/hive-ptest/working/apache-svn-trunk-source/shims/assembly/target/hive-shims-0.13.0-SNAPSHOT.jar
          [INFO] 
          [INFO] --- maven-install-plugin:2.4:install (default-install) @ hive-shims ---
          [INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/shims/assembly/target/hive-shims-0.13.0-SNAPSHOT.jar to /data/hive-ptest/working/maven/org/apache/hive/hive-shims/0.13.0-SNAPSHOT/hive-shims-0.13.0-SNAPSHOT.jar
          [INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/shims/assembly/pom.xml to /data/hive-ptest/working/maven/org/apache/hive/hive-shims/0.13.0-SNAPSHOT/hive-shims-0.13.0-SNAPSHOT.pom
          [INFO]                                                                         
          [INFO] ------------------------------------------------------------------------
          [INFO] Building Hive Common 0.13.0-SNAPSHOT
          [INFO] ------------------------------------------------------------------------
          [INFO] 
          [INFO] --- maven-clean-plugin:2.5:clean (default-clean) @ hive-common ---
          [INFO] Deleting /data/hive-ptest/working/apache-svn-trunk-source/common (includes = [datanucleus.log, derby.log], excludes = [])
          [INFO] 
          [INFO] --- maven-antrun-plugin:1.7:run (generate-version-annotation) @ hive-common ---
          [INFO] Executing tasks
          
          main:
          [INFO] Executed tasks
          [INFO] 
          [INFO] --- build-helper-maven-plugin:1.8:add-source (add-source) @ hive-common ---
          [INFO] Source directory: /data/hive-ptest/working/apache-svn-trunk-source/common/src/gen added.
          [INFO] 
          [INFO] --- maven-resources-plugin:2.5:resources (default-resources) @ hive-common ---
          [debug] execute contextualize
          [INFO] Using 'UTF-8' encoding to copy filtered resources.
          [INFO] Copying 1 resource
          [INFO] 
          [INFO] --- maven-antrun-plugin:1.7:run (define-classpath) @ hive-common ---
          [INFO] Executing tasks
          
          main:
          [INFO] Executed tasks
          [INFO] 
          [INFO] --- maven-compiler-plugin:3.1:compile (default-compile) @ hive-common ---
          [INFO] Compiling 31 source files to /data/hive-ptest/working/apache-svn-trunk-source/common/target/classes
          [WARNING] Note: /data/hive-ptest/working/apache-svn-trunk-source/common/src/java/org/apache/hadoop/hive/common/ObjectPair.java uses unchecked or unsafe operations.
          [WARNING] Note: Recompile with -Xlint:unchecked for details.
          [INFO] 
          [INFO] --- maven-resources-plugin:2.5:testResources (default-testResources) @ hive-common ---
          [debug] execute contextualize
          [INFO] Using 'UTF-8' encoding to copy filtered resources.
          [INFO] Copying 4 resources
          [INFO] 
          [INFO] --- maven-antrun-plugin:1.7:run (setup-test-dirs) @ hive-common ---
          [INFO] Executing tasks
          
          main:
              [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/common/target/tmp
              [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/common/target/warehouse
              [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/common/target/tmp/conf
               [copy] Copying 4 files to /data/hive-ptest/working/apache-svn-trunk-source/common/target/tmp/conf
          [INFO] Executed tasks
          [INFO] 
          [INFO] --- maven-compiler-plugin:3.1:testCompile (default-testCompile) @ hive-common ---
          [INFO] Compiling 8 source files to /data/hive-ptest/working/apache-svn-trunk-source/common/target/test-classes
          [INFO] 
          [INFO] --- maven-surefire-plugin:2.16:test (default-test) @ hive-common ---
          [INFO] Tests are skipped.
          [INFO] 
          [INFO] --- maven-jar-plugin:2.2:jar (default-jar) @ hive-common ---
          [INFO] Building jar: /data/hive-ptest/working/apache-svn-trunk-source/common/target/hive-common-0.13.0-SNAPSHOT.jar
          [INFO] 
          [INFO] --- maven-install-plugin:2.4:install (default-install) @ hive-common ---
          [INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/common/target/hive-common-0.13.0-SNAPSHOT.jar to /data/hive-ptest/working/maven/org/apache/hive/hive-common/0.13.0-SNAPSHOT/hive-common-0.13.0-SNAPSHOT.jar
          [INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/common/pom.xml to /data/hive-ptest/working/maven/org/apache/hive/hive-common/0.13.0-SNAPSHOT/hive-common-0.13.0-SNAPSHOT.pom
          [INFO]                                                                         
          [INFO] ------------------------------------------------------------------------
          [INFO] Building Hive Serde 0.13.0-SNAPSHOT
          [INFO] ------------------------------------------------------------------------
          [INFO] 
          [INFO] --- maven-clean-plugin:2.5:clean (default-clean) @ hive-serde ---
          [INFO] Deleting /data/hive-ptest/working/apache-svn-trunk-source/serde (includes = [datanucleus.log, derby.log], excludes = [])
          [INFO] 
          [INFO] --- build-helper-maven-plugin:1.8:add-source (add-source) @ hive-serde ---
          [INFO] Source directory: /data/hive-ptest/working/apache-svn-trunk-source/serde/src/gen/protobuf/gen-java added.
          [INFO] Source directory: /data/hive-ptest/working/apache-svn-trunk-source/serde/src/gen/thrift/gen-javabean added.
          [INFO] 
          [INFO] --- maven-resources-plugin:2.5:resources (default-resources) @ hive-serde ---
          [debug] execute contextualize
          [INFO] Using 'UTF-8' encoding to copy filtered resources.
          [INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-trunk-source/serde/src/main/resources
          [INFO] 
          [INFO] --- maven-antrun-plugin:1.7:run (define-classpath) @ hive-serde ---
          [INFO] Executing tasks
          
          main:
          [INFO] Executed tasks
          [INFO] 
          [INFO] --- maven-compiler-plugin:3.1:compile (default-compile) @ hive-serde ---
          [INFO] Compiling 351 source files to /data/hive-ptest/working/apache-svn-trunk-source/serde/target/classes
          [WARNING] Note: Some input files use or override a deprecated API.
          [WARNING] Note: Recompile with -Xlint:deprecation for details.
          [WARNING] Note: Some input files use unchecked or unsafe operations.
          [WARNING] Note: Recompile with -Xlint:unchecked for details.
          [INFO] 
          [INFO] --- maven-resources-plugin:2.5:testResources (default-testResources) @ hive-serde ---
          [debug] execute contextualize
          [INFO] Using 'UTF-8' encoding to copy filtered resources.
          [INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-trunk-source/serde/src/test/resources
          [INFO] 
          [INFO] --- maven-antrun-plugin:1.7:run (setup-test-dirs) @ hive-serde ---
          [INFO] Executing tasks
          
          main:
              [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/serde/target/tmp
              [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/serde/target/warehouse
              [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/serde/target/tmp/conf
               [copy] Copying 4 files to /data/hive-ptest/working/apache-svn-trunk-source/serde/target/tmp/conf
          [INFO] Executed tasks
          [INFO] 
          [INFO] --- maven-compiler-plugin:3.1:testCompile (default-testCompile) @ hive-serde ---
          [INFO] Compiling 41 source files to /data/hive-ptest/working/apache-svn-trunk-source/serde/target/test-classes
          [WARNING] Note: Some input files use or override a deprecated API.
          [WARNING] Note: Recompile with -Xlint:deprecation for details.
          [WARNING] Note: Some input files use unchecked or unsafe operations.
          [WARNING] Note: Recompile with -Xlint:unchecked for details.
          [INFO] 
          [INFO] --- maven-surefire-plugin:2.16:test (default-test) @ hive-serde ---
          [INFO] Tests are skipped.
          [INFO] 
          [INFO] --- maven-jar-plugin:2.2:jar (default-jar) @ hive-serde ---
          [INFO] Building jar: /data/hive-ptest/working/apache-svn-trunk-source/serde/target/hive-serde-0.13.0-SNAPSHOT.jar
          [INFO] 
          [INFO] --- maven-install-plugin:2.4:install (default-install) @ hive-serde ---
          [INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/serde/target/hive-serde-0.13.0-SNAPSHOT.jar to /data/hive-ptest/working/maven/org/apache/hive/hive-serde/0.13.0-SNAPSHOT/hive-serde-0.13.0-SNAPSHOT.jar
          [INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/serde/pom.xml to /data/hive-ptest/working/maven/org/apache/hive/hive-serde/0.13.0-SNAPSHOT/hive-serde-0.13.0-SNAPSHOT.pom
          [INFO]                                                                         
          [INFO] ------------------------------------------------------------------------
          [INFO] Building Hive Metastore 0.13.0-SNAPSHOT
          [INFO] ------------------------------------------------------------------------
          [INFO] 
          [INFO] --- maven-clean-plugin:2.5:clean (default-clean) @ hive-metastore ---
          [INFO] Deleting /data/hive-ptest/working/apache-svn-trunk-source/metastore (includes = [datanucleus.log, derby.log], excludes = [])
          [INFO] 
          [INFO] --- build-helper-maven-plugin:1.8:add-source (add-source) @ hive-metastore ---
          [INFO] Source directory: /data/hive-ptest/working/apache-svn-trunk-source/metastore/src/model added.
          [INFO] Source directory: /data/hive-ptest/working/apache-svn-trunk-source/metastore/src/gen/thrift/gen-javabean added.
          [INFO] 
          [INFO] --- antlr3-maven-plugin:3.4:antlr (default) @ hive-metastore ---
          [INFO] ANTLR: Processing source directory /data/hive-ptest/working/apache-svn-trunk-source/metastore/src/java
          ANTLR Parser Generator  Version 3.4
          org/apache/hadoop/hive/metastore/parser/Filter.g
          [INFO] 
          [INFO] --- maven-resources-plugin:2.5:resources (default-resources) @ hive-metastore ---
          [debug] execute contextualize
          [INFO] Using 'UTF-8' encoding to copy filtered resources.
          [INFO] Copying 1 resource
          [INFO] 
          [INFO] --- maven-antrun-plugin:1.7:run (define-classpath) @ hive-metastore ---
          [INFO] Executing tasks
          
          main:
          [INFO] Executed tasks
          [INFO] 
          [INFO] --- maven-compiler-plugin:3.1:compile (default-compile) @ hive-metastore ---
          [INFO] Compiling 132 source files to /data/hive-ptest/working/apache-svn-trunk-source/metastore/target/classes
          [WARNING] Note: Some input files use or override a deprecated API.
          [WARNING] Note: Recompile with -Xlint:deprecation for details.
          [WARNING] Note: Some input files use unchecked or unsafe operations.
          [WARNING] Note: Recompile with -Xlint:unchecked for details.
          [INFO] 
          [INFO] --- datanucleus-maven-plugin:3.3.0-release:enhance (default) @ hive-metastore ---
          [INFO] DataNucleus Enhancer (version 3.2.2) for API "JDO" using JRE "1.6"
          DataNucleus Enhancer : Classpath
          >>  /data/hive-ptest/working/maven/org/datanucleus/datanucleus-maven-plugin/3.3.0-release/datanucleus-maven-plugin-3.3.0-release.jar
          >>  /data/hive-ptest/working/maven/org/datanucleus/datanucleus-core/3.2.2/datanucleus-core-3.2.2.jar
          >>  /data/hive-ptest/working/maven/org/codehaus/plexus/plexus-utils/3.0.8/plexus-utils-3.0.8.jar
          >>  /data/hive-ptest/working/maven/org/codehaus/plexus/plexus-component-annotations/1.5.5/plexus-component-annotations-1.5.5.jar
          >>  /data/hive-ptest/working/maven/org/sonatype/sisu/sisu-inject-bean/2.3.0/sisu-inject-bean-2.3.0.jar
          >>  /data/hive-ptest/working/maven/org/sonatype/sisu/sisu-guice/3.1.0/sisu-guice-3.1.0-no_aop.jar
          >>  /data/hive-ptest/working/maven/org/sonatype/sisu/sisu-guava/0.9.9/sisu-guava-0.9.9.jar
          >>  /data/hive-ptest/working/maven/org/apache/xbean/xbean-reflect/3.4/xbean-reflect-3.4.jar
          >>  /data/hive-ptest/working/maven/log4j/log4j/1.2.12/log4j-1.2.12.jar
          >>  /data/hive-ptest/working/maven/commons-logging/commons-logging-api/1.1/commons-logging-api-1.1.jar
          >>  /data/hive-ptest/working/maven/com/google/collections/google-collections/1.0/google-collections-1.0.jar
          >>  /data/hive-ptest/working/maven/junit/junit/3.8.2/junit-3.8.2.jar
          >>  /data/hive-ptest/working/apache-svn-trunk-source/metastore/target/classes
          >>  /data/hive-ptest/working/apache-svn-trunk-source/serde/target/hive-serde-0.13.0-SNAPSHOT.jar
          >>  /data/hive-ptest/working/apache-svn-trunk-source/common/target/hive-common-0.13.0-SNAPSHOT.jar
          >>  /data/hive-ptest/working/maven/org/apache/commons/commons-compress/1.4.1/commons-compress-1.4.1.jar
          >>  /data/hive-ptest/working/maven/org/tukaani/xz/1.0/xz-1.0.jar
          >>  /data/hive-ptest/working/maven/commons-codec/commons-codec/1.4/commons-codec-1.4.jar
          >>  /data/hive-ptest/working/maven/org/apache/avro/avro/1.7.5/avro-1.7.5.jar
          >>  /data/hive-ptest/working/maven/org/codehaus/jackson/jackson-core-asl/1.9.2/jackson-core-asl-1.9.2.jar
          >>  /data/hive-ptest/working/maven/com/thoughtworks/paranamer/paranamer/2.3/paranamer-2.3.jar
          >>  /data/hive-ptest/working/maven/org/xerial/snappy/snappy-java/1.0.5/snappy-java-1.0.5.jar
          >>  /data/hive-ptest/working/apache-svn-trunk-source/shims/assembly/target/hive-shims-0.13.0-SNAPSHOT.jar
          >>  /data/hive-ptest/working/maven/com/google/guava/guava/11.0.2/guava-11.0.2.jar
          >>  /data/hive-ptest/working/maven/com/google/code/findbugs/jsr305/1.3.9/jsr305-1.3.9.jar
          >>  /data/hive-ptest/working/maven/commons-cli/commons-cli/1.2/commons-cli-1.2.jar
          >>  /data/hive-ptest/working/maven/commons-lang/commons-lang/2.4/commons-lang-2.4.jar
          >>  /data/hive-ptest/working/maven/commons-logging/commons-logging/1.1.3/commons-logging-1.1.3.jar
          >>  /data/hive-ptest/working/maven/org/apache/derby/derby/10.4.2.0/derby-10.4.2.0.jar
          >>  /data/hive-ptest/working/maven/org/datanucleus/datanucleus-api-jdo/3.2.1/datanucleus-api-jdo-3.2.1.jar
          >>  /data/hive-ptest/working/maven/org/datanucleus/datanucleus-rdbms/3.2.1/datanucleus-rdbms-3.2.1.jar
          >>  /data/hive-ptest/working/maven/javax/jdo/jdo-api/3.0.1/jdo-api-3.0.1.jar
          >>  /data/hive-ptest/working/maven/javax/transaction/jta/1.1/jta-1.1.jar
          >>  /data/hive-ptest/working/maven/org/antlr/antlr-runtime/3.4/antlr-runtime-3.4.jar
          >>  /data/hive-ptest/working/maven/org/antlr/stringtemplate/3.2.1/stringtemplate-3.2.1.jar
          >>  /data/hive-ptest/working/maven/antlr/antlr/2.7.7/antlr-2.7.7.jar
          >>  /data/hive-ptest/working/maven/org/apache/thrift/libfb303/0.9.0/libfb303-0.9.0.jar
          >>  /data/hive-ptest/working/maven/org/apache/thrift/libthrift/0.9.0/libthrift-0.9.0.jar
          >>  /data/hive-ptest/working/maven/org/apache/httpcomponents/httpclient/4.2.5/httpclient-4.2.5.jar
          >>  /data/hive-ptest/working/maven/org/apache/httpcomponents/httpcore/4.2.4/httpcore-4.2.4.jar
          >>  /data/hive-ptest/working/maven/org/apache/hadoop/hadoop-core/1.2.1/hadoop-core-1.2.1.jar
          >>  /data/hive-ptest/working/maven/xmlenc/xmlenc/0.52/xmlenc-0.52.jar
          >>  /data/hive-ptest/working/maven/com/sun/jersey/jersey-core/1.14/jersey-core-1.14.jar
          >>  /data/hive-ptest/working/maven/com/sun/jersey/jersey-json/1.14/jersey-json-1.14.jar
          >>  /data/hive-ptest/working/maven/org/codehaus/jettison/jettison/1.1/jettison-1.1.jar
          >>  /data/hive-ptest/working/maven/stax/stax-api/1.0.1/stax-api-1.0.1.jar
          >>  /data/hive-ptest/working/maven/com/sun/xml/bind/jaxb-impl/2.2.3-1/jaxb-impl-2.2.3-1.jar
          >>  /data/hive-ptest/working/maven/javax/xml/bind/jaxb-api/2.2.2/jaxb-api-2.2.2.jar
          >>  /data/hive-ptest/working/maven/javax/xml/stream/stax-api/1.0-2/stax-api-1.0-2.jar
          >>  /data/hive-ptest/working/maven/javax/activation/activation/1.1/activation-1.1.jar
          >>  /data/hive-ptest/working/maven/org/codehaus/jackson/jackson-jaxrs/1.9.2/jackson-jaxrs-1.9.2.jar
          >>  /data/hive-ptest/working/maven/org/codehaus/jackson/jackson-xc/1.9.2/jackson-xc-1.9.2.jar
          >>  /data/hive-ptest/working/maven/com/sun/jersey/jersey-server/1.14/jersey-server-1.14.jar
          >>  /data/hive-ptest/working/maven/asm/asm/3.1/asm-3.1.jar
          >>  /data/hive-ptest/working/maven/commons-io/commons-io/2.4/commons-io-2.4.jar
          >>  /data/hive-ptest/working/maven/commons-httpclient/commons-httpclient/3.0.1/commons-httpclient-3.0.1.jar
          >>  /data/hive-ptest/working/maven/org/apache/commons/commons-math/2.1/commons-math-2.1.jar
          >>  /data/hive-ptest/working/maven/commons-configuration/commons-configuration/1.6/commons-configuration-1.6.jar
          >>  /data/hive-ptest/working/maven/commons-collections/commons-collections/3.2.1/commons-collections-3.2.1.jar
          >>  /data/hive-ptest/working/maven/commons-digester/commons-digester/1.8/commons-digester-1.8.jar
          >>  /data/hive-ptest/working/maven/commons-beanutils/commons-beanutils/1.7.0/commons-beanutils-1.7.0.jar
          >>  /data/hive-ptest/working/maven/commons-beanutils/commons-beanutils-core/1.8.0/commons-beanutils-core-1.8.0.jar
          >>  /data/hive-ptest/working/maven/commons-net/commons-net/1.4.1/commons-net-1.4.1.jar
          >>  /data/hive-ptest/working/maven/org/mortbay/jetty/jetty/6.1.26/jetty-6.1.26.jar
          >>  /data/hive-ptest/working/maven/org/mortbay/jetty/servlet-api/2.5-20081211/servlet-api-2.5-20081211.jar
          >>  /data/hive-ptest/working/maven/org/mortbay/jetty/jetty-util/6.1.26/jetty-util-6.1.26.jar
          >>  /data/hive-ptest/working/maven/tomcat/jasper-runtime/5.5.12/jasper-runtime-5.5.12.jar
          >>  /data/hive-ptest/working/maven/tomcat/jasper-compiler/5.5.12/jasper-compiler-5.5.12.jar
          >>  /data/hive-ptest/working/maven/org/mortbay/jetty/jsp-api-2.1/6.1.14/jsp-api-2.1-6.1.14.jar
          >>  /data/hive-ptest/working/maven/org/mortbay/jetty/servlet-api-2.5/6.1.14/servlet-api-2.5-6.1.14.jar
          >>  /data/hive-ptest/working/maven/org/mortbay/jetty/jsp-2.1/6.1.14/jsp-2.1-6.1.14.jar
          >>  /data/hive-ptest/working/maven/ant/ant/1.6.5/ant-1.6.5.jar
          >>  /data/hive-ptest/working/maven/commons-el/commons-el/1.0/commons-el-1.0.jar
          >>  /data/hive-ptest/working/maven/net/java/dev/jets3t/jets3t/0.6.1/jets3t-0.6.1.jar
          >>  /data/hive-ptest/working/maven/hsqldb/hsqldb/1.8.0.10/hsqldb-1.8.0.10.jar
          >>  /data/hive-ptest/working/maven/oro/oro/2.0.8/oro-2.0.8.jar
          >>  /data/hive-ptest/working/maven/org/eclipse/jdt/core/3.1.1/core-3.1.1.jar
          >>  /data/hive-ptest/working/maven/org/codehaus/jackson/jackson-mapper-asl/1.9.2/jackson-mapper-asl-1.9.2.jar
          >>  /data/hive-ptest/working/maven/org/slf4j/slf4j-api/1.7.5/slf4j-api-1.7.5.jar
          >>  /data/hive-ptest/working/maven/org/slf4j/slf4j-log4j12/1.7.5/slf4j-log4j12-1.7.5.jar
          >>  /data/hive-ptest/working/maven/log4j/log4j/1.2.16/log4j-1.2.16.jar
          ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MDatabase
          ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MFieldSchema
          ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MType
          ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MTable
          ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MSerDeInfo
          ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MOrder
          ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MColumnDescriptor
          ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MStringList
          ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MStorageDescriptor
          ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MPartition
          ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MIndex
          ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MRole
          ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MRoleMap
          ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MGlobalPrivilege
          ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MDBPrivilege
          ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MTablePrivilege
          ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MPartitionPrivilege
          ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MTableColumnPrivilege
          ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MPartitionColumnPrivilege
          ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MPartitionEvent
          ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MMasterKey
          ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MDelegationToken
          ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MTableColumnStatistics
          ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MPartitionColumnStatistics
          ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MVersionTable
          DataNucleus Enhancer completed with success for 25 classes. Timings : input=594 ms, enhance=947 ms, total=1541 ms. Consult the log for full details
          
          [INFO] 
          [INFO] --- maven-resources-plugin:2.5:testResources (default-testResources) @ hive-metastore ---
          [debug] execute contextualize
          [INFO] Using 'UTF-8' encoding to copy filtered resources.
          [INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-trunk-source/metastore/src/test/resources
          [INFO] 
          [INFO] --- maven-antrun-plugin:1.7:run (setup-test-dirs) @ hive-metastore ---
          [INFO] Executing tasks
          
          main:
              [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/metastore/target/tmp
              [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/metastore/target/warehouse
              [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/metastore/target/tmp/conf
               [copy] Copying 4 files to /data/hive-ptest/working/apache-svn-trunk-source/metastore/target/tmp/conf
          [INFO] Executed tasks
          [INFO] 
          [INFO] --- maven-compiler-plugin:3.1:testCompile (default-testCompile) @ hive-metastore ---
          [INFO] Compiling 10 source files to /data/hive-ptest/working/apache-svn-trunk-source/metastore/target/test-classes
          [INFO] 
          [INFO] --- maven-surefire-plugin:2.16:test (default-test) @ hive-metastore ---
          [INFO] Tests are skipped.
          [INFO] 
          [INFO] --- maven-jar-plugin:2.2:jar (default-jar) @ hive-metastore ---
          [INFO] Building jar: /data/hive-ptest/working/apache-svn-trunk-source/metastore/target/hive-metastore-0.13.0-SNAPSHOT.jar
          [INFO] 
          [INFO] --- maven-jar-plugin:2.2:test-jar (default) @ hive-metastore ---
          [INFO] Building jar: /data/hive-ptest/working/apache-svn-trunk-source/metastore/target/hive-metastore-0.13.0-SNAPSHOT-tests.jar
          [INFO] 
          [INFO] --- maven-install-plugin:2.4:install (default-install) @ hive-metastore ---
          [INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/metastore/target/hive-metastore-0.13.0-SNAPSHOT.jar to /data/hive-ptest/working/maven/org/apache/hive/hive-metastore/0.13.0-SNAPSHOT/hive-metastore-0.13.0-SNAPSHOT.jar
          [INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/metastore/pom.xml to /data/hive-ptest/working/maven/org/apache/hive/hive-metastore/0.13.0-SNAPSHOT/hive-metastore-0.13.0-SNAPSHOT.pom
          [INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/metastore/target/hive-metastore-0.13.0-SNAPSHOT-tests.jar to /data/hive-ptest/working/maven/org/apache/hive/hive-metastore/0.13.0-SNAPSHOT/hive-metastore-0.13.0-SNAPSHOT-tests.jar
          [INFO]                                                                         
          [INFO] ------------------------------------------------------------------------
          [INFO] Building Hive Query Language 0.13.0-SNAPSHOT
          [INFO] ------------------------------------------------------------------------
          [INFO] 
          [INFO] --- maven-clean-plugin:2.5:clean (default-clean) @ hive-exec ---
          [INFO] Deleting /data/hive-ptest/working/apache-svn-trunk-source/ql (includes = [datanucleus.log, derby.log], excludes = [])
          [INFO] 
          [INFO] --- maven-antrun-plugin:1.7:run (generate-sources) @ hive-exec ---
          [INFO] Executing tasks
          
          main:
              [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/ql/target/generated-sources/java/org/apache/hadoop/hive/ql/exec/vector/expressions/gen
              [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/ql/target/generated-sources/java/org/apache/hadoop/hive/ql/exec/vector/expressions/aggregates/gen
              [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/ql/target/generated-test-sources/java/org/apache/hadoop/hive/ql/exec/vector/expressions/gen
          Generating vector expression code
          Generating vector expression test code
          [INFO] Executed tasks
          [INFO] 
          [INFO] --- build-helper-maven-plugin:1.8:add-source (add-source) @ hive-exec ---
          [INFO] Source directory: /data/hive-ptest/working/apache-svn-trunk-source/ql/src/gen/protobuf/gen-java added.
          [INFO] Source directory: /data/hive-ptest/working/apache-svn-trunk-source/ql/src/gen/thrift/gen-javabean added.
          [INFO] Source directory: /data/hive-ptest/working/apache-svn-trunk-source/ql/target/generated-sources/java added.
          [INFO] 
          [INFO] --- antlr3-maven-plugin:3.4:antlr (default) @ hive-exec ---
          [INFO] ANTLR: Processing source directory /data/hive-ptest/working/apache-svn-trunk-source/ql/src/java
          ANTLR Parser Generator  Version 3.4
          org/apache/hadoop/hive/ql/parse/HiveLexer.g
          org/apache/hadoop/hive/ql/parse/HiveParser.g
          warning(200): org/apache/hadoop/hive/ql/parse/HiveParser.g:872:5: 
          Decision can match input such as "Identifier KW_RENAME KW_TO" using multiple alternatives: 1, 10
          
          As a result, alternative(s) 10 were disabled for that input
          warning(200): org/apache/hadoop/hive/ql/parse/HiveParser.g:1177:5: 
          Decision can match input such as "KW_TEXTFILE" using multiple alternatives: 2, 6
          
          As a result, alternative(s) 6 were disabled for that input
          warning(200): org/apache/hadoop/hive/ql/parse/HiveParser.g:1177:5: 
          Decision can match input such as "KW_SEQUENCEFILE" using multiple alternatives: 1, 6
          
          As a result, alternative(s) 6 were disabled for that input
          warning(200): org/apache/hadoop/hive/ql/parse/HiveParser.g:1177:5: 
          Decision can match input such as "KW_ORCFILE" using multiple alternatives: 4, 6
          
          As a result, alternative(s) 6 were disabled for that input
          warning(200): org/apache/hadoop/hive/ql/parse/HiveParser.g:1177:5: 
          Decision can match input such as "KW_RCFILE" using multiple alternatives: 3, 6
          
          As a result, alternative(s) 6 were disabled for that input
          warning(200): org/apache/hadoop/hive/ql/parse/HiveParser.g:1190:23: 
          Decision can match input such as "KW_ELEM_TYPE" using multiple alternatives: 1, 4
          
          As a result, alternative(s) 4 were disabled for that input
          warning(200): org/apache/hadoop/hive/ql/parse/HiveParser.g:1190:23: 
          Decision can match input such as "KW_KEY_TYPE" using multiple alternatives: 2, 4
          
          As a result, alternative(s) 4 were disabled for that input
          warning(200): org/apache/hadoop/hive/ql/parse/HiveParser.g:1190:23: 
          Decision can match input such as "KW_VALUE_TYPE" using multiple alternatives: 3, 4
          
          As a result, alternative(s) 4 were disabled for that input
          warning(200): org/apache/hadoop/hive/ql/parse/HiveParser.g:1197:23: 
          Decision can match input such as "KW_ELEM_TYPE" using multiple alternatives: 1, 4
          
          As a result, alternative(s) 4 were disabled for that input
          warning(200): org/apache/hadoop/hive/ql/parse/HiveParser.g:1197:23: 
          Decision can match input such as "KW_VALUE_TYPE" using multiple alternatives: 3, 4
          
          As a result, alternative(s) 4 were disabled for that input
          warning(200): org/apache/hadoop/hive/ql/parse/HiveParser.g:1197:23: 
          Decision can match input such as "KW_KEY_TYPE" using multiple alternatives: 2, 4
          
          As a result, alternative(s) 4 were disabled for that input
          warning(200): org/apache/hadoop/hive/ql/parse/HiveParser.g:1215:29: 
          Decision can match input such as "KW_PRETTY {KW_ADD..KW_AFTER, KW_ALTER..KW_ANALYZE, KW_ARCHIVE..KW_CASCADE, KW_CHANGE, KW_CLUSTER..KW_COLLECTION, KW_COLUMNS..KW_CREATE, KW_CUBE, KW_CURSOR..KW_DATA, KW_DATABASES..KW_DISABLE, KW_DISTRIBUTE..KW_ELEM_TYPE, KW_ENABLE, KW_ESCAPED, KW_EXCLUSIVE..KW_EXPORT, KW_EXTERNAL..KW_FLOAT, KW_FOR..KW_FORMATTED, KW_FULL, KW_FUNCTIONS..KW_GROUPING, KW_HOLD_DDLTIME..KW_IDXPROPERTIES, KW_IGNORE..KW_ITEMS, KW_KEYS..KW_LEFT, KW_LIKE..KW_LONG, KW_MAPJOIN..KW_MINUS, KW_MSCK..KW_NOSCAN, KW_NO_DROP..KW_OFFLINE, KW_OPTION, KW_ORCFILE..KW_OUTPUTFORMAT, KW_OVERWRITE, KW_PARTITIONED..KW_PLUS, KW_PRETTY..KW_RECORDWRITER, KW_REGEXP..KW_SCHEMAS, KW_SEMI..KW_TABLES, KW_TBLPROPERTIES..KW_TEXTFILE, KW_TIMESTAMP..KW_TOUCH, KW_TRIGGER..KW_UNARCHIVE, KW_UNDO..KW_UNIONTYPE, KW_UNLOCK..KW_VALUE_TYPE, KW_VIEW, KW_WHILE, KW_WITH}" using multiple alternatives: 3, 4
          
          As a result, alternative(s) 4 were disabled for that input
          warning(200): org/apache/hadoop/hive/ql/parse/HiveParser.g:1215:29: 
          Decision can match input such as "KW_FORMATTED {KW_ADD..KW_AFTER, KW_ALTER..KW_ANALYZE, KW_ARCHIVE..KW_CASCADE, KW_CHANGE, KW_CLUSTER..KW_COLLECTION, KW_COLUMNS..KW_CREATE, KW_CUBE, KW_CURSOR..KW_DATA, KW_DATABASES..KW_DISABLE, KW_DISTRIBUTE..KW_ELEM_TYPE, KW_ENABLE, KW_ESCAPED, KW_EXCLUSIVE..KW_EXPORT, KW_EXTERNAL..KW_FLOAT, KW_FOR..KW_FORMATTED, KW_FULL, KW_FUNCTIONS..KW_GROUPING, KW_HOLD_DDLTIME..KW_IDXPROPERTIES, KW_IGNORE..KW_ITEMS, KW_KEYS..KW_LEFT, KW_LIKE..KW_LONG, KW_MAPJOIN..KW_MINUS, KW_MSCK..KW_NOSCAN, KW_NO_DROP..KW_OFFLINE, KW_OPTION, KW_ORCFILE..KW_OUTPUTFORMAT, KW_OVERWRITE, KW_PARTITIONED..KW_PLUS, KW_PRETTY..KW_RECORDWRITER, KW_REGEXP..KW_SCHEMAS, KW_SEMI..KW_TABLES, KW_TBLPROPERTIES..KW_TEXTFILE, KW_TIMESTAMP..KW_TOUCH, KW_TRIGGER..KW_UNARCHIVE, KW_UNDO..KW_UNIONTYPE, KW_UNLOCK..KW_VALUE_TYPE, KW_VIEW, KW_WHILE, KW_WITH}" using multiple alternatives: 1, 4
          
          As a result, alternative(s) 4 were disabled for that input
          warning(200): org/apache/hadoop/hive/ql/parse/HiveParser.g:1215:29: 
          Decision can match input such as "KW_PRETTY Identifier" using multiple alternatives: 3, 4
          
          As a result, alternative(s) 4 were disabled for that input
          warning(200): org/apache/hadoop/hive/ql/parse/HiveParser.g:1215:29: 
          Decision can match input such as "KW_FORMATTED Identifier" using multiple alternatives: 1, 4
          
          As a result, alternative(s) 4 were disabled for that input
          warning(200): org/apache/hadoop/hive/ql/parse/HiveParser.g:1215:29: 
          Decision can match input such as "KW_PRETTY KW_PARTITION" using multiple alternatives: 3, 4
          
          As a result, alternative(s) 4 were disabled for that input
          warning(200): org/apache/hadoop/hive/ql/parse/HiveParser.g:1215:29: 
          Decision can match input such as "KW_FORMATTED KW_PARTITION" using multiple alternatives: 1, 4
          
          As a result, alternative(s) 4 were disabled for that input
          warning(200): org/apache/hadoop/hive/ql/parse/HiveParser.g:1486:116: 
          Decision can match input such as "KW_STORED KW_AS KW_DIRECTORIES" using multiple alternatives: 1, 2
          
          As a result, alternative(s) 2 were disabled for that input
          warning(200): org/apache/hadoop/hive/ql/parse/HiveParser.g:1609:5: 
          Decision can match input such as "KW_STORED KW_AS KW_RCFILE" using multiple alternatives: 3, 7
          
          As a result, alternative(s) 7 were disabled for that input
          warning(200): org/apache/hadoop/hive/ql/parse/HiveParser.g:1609:5: 
          Decision can match input such as "KW_STORED KW_AS KW_SEQUENCEFILE" using multiple alternatives: 1, 7
          
          As a result, alternative(s) 7 were disabled for that input
          warning(200): org/apache/hadoop/hive/ql/parse/HiveParser.g:1609:5: 
          Decision can match input such as "KW_STORED KW_AS KW_ORCFILE" using multiple alternatives: 4, 7
          
          As a result, alternative(s) 7 were disabled for that input
          warning(200): org/apache/hadoop/hive/ql/parse/HiveParser.g:1609:5: 
          Decision can match input such as "KW_STORED KW_AS KW_TEXTFILE" using multiple alternatives: 2, 7
          
          As a result, alternative(s) 7 were disabled for that input
          warning(200): org/apache/hadoop/hive/ql/parse/HiveParser.g:1609:5: 
          Decision can match input such as "KW_STORED KW_AS KW_INPUTFORMAT" using multiple alternatives: 5, 7
          
          As a result, alternative(s) 7 were disabled for that input
          warning(200): SelectClauseParser.g:149:5: 
          Decision can match input such as "KW_NULL DOT {KW_ADD..KW_AFTER, KW_ALTER..KW_ANALYZE, KW_ARCHIVE..KW_CASCADE, KW_CHANGE, KW_CLUSTER..KW_COLLECTION, KW_COLUMNS..KW_CREATE, KW_CUBE, KW_CURSOR..KW_DATA, KW_DATABASES..KW_DISABLE, KW_DISTRIBUTE..KW_ELEM_TYPE, KW_ENABLE, KW_ESCAPED, KW_EXCLUSIVE..KW_EXPORT, KW_EXTERNAL..KW_FLOAT, KW_FOR..KW_FORMATTED, KW_FULL, KW_FUNCTIONS..KW_GROUPING, KW_HOLD_DDLTIME..KW_IDXPROPERTIES, KW_IGNORE..KW_ITEMS, KW_KEYS..KW_LEFT, KW_LIKE..KW_LONG, KW_MAPJOIN..KW_MINUS, KW_MSCK..KW_NOSCAN, KW_NO_DROP..KW_OFFLINE, KW_OPTION, KW_ORCFILE..KW_OUTPUTFORMAT, KW_OVERWRITE, KW_PARTITION..KW_PLUS, KW_PRETTY..KW_RECORDWRITER, KW_REGEXP..KW_SCHEMAS, KW_SEMI..KW_TABLES, KW_TBLPROPERTIES..KW_TEXTFILE, KW_TIMESTAMP..KW_TOUCH, KW_TRIGGER..KW_UNARCHIVE, KW_UNDO..KW_UNIONTYPE, KW_UNLOCK..KW_VALUE_TYPE, KW_VIEW, KW_WHILE, KW_WITH}" using multiple alternatives: 1, 2
          
          As a result, alternative(s) 2 were disabled for that input
          warning(200): SelectClauseParser.g:149:5: 
          Decision can match input such as "KW_NULL DOT Identifier" using multiple alternatives: 1, 2
          
          As a result, alternative(s) 2 were disabled for that input
          warning(200): FromClauseParser.g:127:2: 
          Decision can match input such as "KW_LATERAL KW_VIEW KW_OUTER" using multiple alternatives: 1, 2
          
          As a result, alternative(s) 2 were disabled for that input
          warning(200): FromClauseParser.g:179:25: 
          Decision can match input such as "LPAREN StringLiteral EQUAL" using multiple alternatives: 1, 2
          
          As a result, alternative(s) 2 were disabled for that input
          warning(200): FromClauseParser.g:179:25: 
          Decision can match input such as "LPAREN StringLiteral COMMA" using multiple alternatives: 1, 2
          
          As a result, alternative(s) 2 were disabled for that input
          warning(200): FromClauseParser.g:179:25: 
          Decision can match input such as "LPAREN StringLiteral RPAREN" using multiple alternatives: 1, 2
          
          As a result, alternative(s) 2 were disabled for that input
          warning(200): FromClauseParser.g:179:68: 
          Decision can match input such as "Identifier LPAREN BigintLiteral" using multiple alternatives: 1, 2
          
          As a result, alternative(s) 2 were disabled for that input
          warning(200): FromClauseParser.g:179:68: 
          Decision can match input such as "Identifier LPAREN KW_CAST" using multiple alternatives: 1, 2
          
          As a result, alternative(s) 2 were disabled for that input
          warning(200): FromClauseParser.g:179:68: 
          Decision can match input such as "Identifier LPAREN KW_IF" using multiple alternatives: 1, 2
          
          As a result, alternative(s) 2 were disabled for that input
          warning(200): FromClauseParser.g:179:68: 
          Decision can match input such as "Identifier LPAREN CharSetName" using multiple alternatives: 1, 2
          
          As a result, alternative(s) 2 were disabled for that input
          warning(200): FromClauseParser.g:179:68: 
          Decision can match input such as "Identifier LPAREN LPAREN" using multiple alternatives: 1, 2
          
          As a result, alternative(s) 2 were disabled for that input
          warning(200): FromClauseParser.g:179:68: 
          Decision can match input such as "Identifier LPAREN KW_EXISTS" using multiple alternatives: 1, 2
          
          As a result, alternative(s) 2 were disabled for that input
          warning(200): FromClauseParser.g:179:68: 
          Decision can match input such as "Identifier LPAREN KW_CASE" using multiple alternatives: 1, 2
          
          As a result, alternative(s) 2 were disabled for that input
          warning(200): FromClauseParser.g:179:68: 
          Decision can match input such as "Identifier LPAREN TinyintLiteral" using multiple alternatives: 1, 2
          
          As a result, alternative(s) 2 were disabled for that input
          warning(200): FromClauseParser.g:179:68: 
          Decision can match input such as "Identifier LPAREN KW_NULL" using multiple alternatives: 1, 2
          
          As a result, alternative(s) 2 were disabled for that input
          warning(200): FromClauseParser.g:179:68: 
          Decision can match input such as "Identifier LPAREN SmallintLiteral" using multiple alternatives: 1, 2
          
          As a result, alternative(s) 2 were disabled for that input
          warning(200): FromClauseParser.g:179:68: 
          Decision can match input such as "Identifier LPAREN KW_FALSE" using multiple alternatives: 1, 2
          
          As a result, alternative(s) 2 were disabled for that input
          warning(200): FromClauseParser.g:179:68: 
          Decision can match input such as "Identifier LPAREN KW_UNIONTYPE" using multiple alternatives: 1, 2
          
          As a result, alternative(s) 2 were disabled for that input
          warning(200): FromClauseParser.g:179:68: 
          Decision can match input such as "Identifier LPAREN Number" using multiple alternatives: 1, 2
          
          As a result, alternative(s) 2 were disabled for that input
          warning(200): FromClauseParser.g:179:68: 
          Decision can match input such as "Identifier LPAREN Identifier" using multiple alternatives: 1, 2
          
          As a result, alternative(s) 2 were disabled for that input
          warning(200): FromClauseParser.g:179:68: 
          Decision can match input such as "Identifier LPAREN StringLiteral" using multiple alternatives: 1, 2
          
          As a result, alternative(s) 2 were disabled for that input
          warning(200): FromClauseParser.g:179:68: 
          Decision can match input such as "Identifier LPAREN DecimalLiteral" using multiple alternatives: 1, 2
          
          As a result, alternative(s) 2 were disabled for that input
          warning(200): FromClauseParser.g:179:68: 
          Decision can match input such as "Identifier LPAREN {KW_ADD..KW_AFTER, KW_ALTER..KW_ANALYZE, KW_ARCHIVE, KW_AS..KW_CASCADE, KW_CHANGE, KW_CLUSTER..KW_COLLECTION, KW_COLUMNS..KW_CREATE, KW_CUBE, KW_CURSOR..KW_DATA, KW_DATABASES, KW_DATETIME..KW_DISABLE, KW_DISTRIBUTE..KW_ELEM_TYPE, KW_ENABLE, KW_ESCAPED, KW_EXCLUSIVE, KW_EXPLAIN..KW_EXPORT, KW_EXTERNAL, KW_FETCH..KW_FLOAT, KW_FOR..KW_FORMATTED, KW_FULL, KW_FUNCTIONS..KW_GROUPING, KW_HOLD_DDLTIME..KW_IDXPROPERTIES, KW_IGNORE..KW_ITEMS, KW_KEYS..KW_LEFT, KW_LIKE..KW_LONG, KW_MAPJOIN..KW_MINUS, KW_MSCK..KW_NOSCAN, KW_NO_DROP, KW_OF..KW_OFFLINE, KW_OPTION, KW_ORCFILE..KW_OUTPUTFORMAT, KW_OVERWRITE, KW_PARTITION..KW_PLUS, KW_PRETTY..KW_RECORDWRITER, KW_REGEXP..KW_SCHEMAS, KW_SEMI..KW_STRING, KW_TABLE..KW_TABLES, KW_TBLPROPERTIES..KW_TEXTFILE, KW_TIMESTAMP..KW_TOUCH, KW_TRIGGER, KW_TRUNCATE..KW_UNARCHIVE, KW_UNDO..KW_UNION, KW_UNLOCK..KW_VALUE_TYPE, KW_VIEW, KW_WHILE, KW_WITH}" using multiple alternatives: 1, 2
          
          As a result, alternative(s) 2 were disabled for that input
          warning(200): FromClauseParser.g:179:68: 
          Decision can match input such as "Identifier LPAREN KW_STRUCT" using multiple alternatives: 1, 2
          
          As a result, alternative(s) 2 were disabled for that input
          warning(200): FromClauseParser.g:179:68: 
          Decision can match input such as "Identifier LPAREN KW_DATE" using multiple alternatives: 1, 2
          
          As a result, alternative(s) 2 were disabled for that input
          warning(200): FromClauseParser.g:179:68: 
          Decision can match input such as "Identifier LPAREN KW_NOT" using multiple alternatives: 1, 2
          
          As a result, alternative(s) 2 were disabled for that input
          warning(200): FromClauseParser.g:179:68: 
          Decision can match input such as "Identifier LPAREN {MINUS, PLUS, TILDE}" using multiple alternatives: 1, 2
          
          As a result, alternative(s) 2 were disabled for that input
          warning(200): FromClauseParser.g:179:68: 
          Decision can match input such as "Identifier LPAREN KW_MAP" using multiple alternatives: 1, 2
          
          As a result, alternative(s) 2 were disabled for that input
          warning(200): FromClauseParser.g:179:68: 
          Decision can match input such as "Identifier LPAREN KW_TRUE" using multiple alternatives: 1, 2
          
          As a result, alternative(s) 2 were disabled for that input
          warning(200): FromClauseParser.g:179:68: 
          Decision can match input such as "Identifier LPAREN KW_ARRAY" using multiple alternatives: 1, 2
          
          As a result, alternative(s) 2 were disabled for that input
          warning(200): FromClauseParser.g:237:16: 
          Decision can match input such as "Identifier LPAREN BigintLiteral" using multiple alternatives: 1, 2
          
          As a result, alternative(s) 2 were disabled for that input
          warning(200): FromClauseParser.g:237:16: 
          Decision can match input such as "Identifier LPAREN KW_CAST" using multiple alternatives: 1, 2
          
          As a result, alternative(s) 2 were disabled for that input
          warning(200): FromClauseParser.g:237:16: 
          Decision can match input such as "Identifier LPAREN KW_IF" using multiple alternatives: 1, 2
          
          As a result, alternative(s) 2 were disabled for that input
          warning(200): FromClauseParser.g:237:16: 
          Decision can match input such as "Identifier LPAREN CharSetName" using multiple alternatives: 1, 2
          
          As a result, alternative(s) 2 were disabled for that input
          warning(200): FromClauseParser.g:237:16: 
          Decision can match input such as "Identifier LPAREN LPAREN" using multiple alternatives: 1, 2
          
          As a result, alternative(s) 2 were disabled for that input
          warning(200): FromClauseParser.g:237:16: 
          Decision can match input such as "Identifier LPAREN KW_EXISTS" using multiple alternatives: 1, 2
          
          As a result, alternative(s) 2 were disabled for that input
          warning(200): FromClauseParser.g:237:16: 
          Decision can match input such as "Identifier LPAREN KW_CASE" using multiple alternatives: 1, 2
          
          As a result, alternative(s) 2 were disabled for that input
          warning(200): FromClauseParser.g:237:16: 
          Decision can match input such as "Identifier LPAREN TinyintLiteral" using multiple alternatives: 1, 2
          
          As a result, alternative(s) 2 were disabled for that input
          warning(200): FromClauseParser.g:237:16: 
          Decision can match input such as "Identifier LPAREN KW_NULL" using multiple alternatives: 1, 2
          
          As a result, alternative(s) 2 were disabled for that input
          warning(200): FromClauseParser.g:237:16: 
          Decision can match input such as "Identifier LPAREN SmallintLiteral" using multiple alternatives: 1, 2
          
          As a result, alternative(s) 2 were disabled for that input
          warning(200): FromClauseParser.g:237:16: 
          Decision can match input such as "Identifier LPAREN KW_FALSE" using multiple alternatives: 1, 2
          
          As a result, alternative(s) 2 were disabled for that input
          warning(200): FromClauseParser.g:237:16: 
          Decision can match input such as "Identifier LPAREN KW_UNIONTYPE" using multiple alternatives: 1, 2
          
          As a result, alternative(s) 2 were disabled for that input
          warning(200): FromClauseParser.g:237:16: 
          Decision can match input such as "Identifier LPAREN Number" using multiple alternatives: 1, 2
          
          As a result, alternative(s) 2 were disabled for that input
          warning(200): FromClauseParser.g:237:16: 
          Decision can match input such as "Identifier LPAREN Identifier" using multiple alternatives: 1, 2
          
          As a result, alternative(s) 2 were disabled for that input
          warning(200): FromClauseParser.g:237:16: 
          Decision can match input such as "Identifier LPAREN StringLiteral" using multiple alternatives: 1, 2
          
          As a result, alternative(s) 2 were disabled for that input
          warning(200): FromClauseParser.g:237:16: 
          Decision can match input such as "Identifier LPAREN DecimalLiteral" using multiple alternatives: 1, 2
          
          As a result, alternative(s) 2 were disabled for that input
          warning(200): FromClauseParser.g:237:16: 
          Decision can match input such as "Identifier LPAREN {KW_ADD..KW_AFTER, KW_ALTER..KW_ANALYZE, KW_ARCHIVE, KW_AS..KW_CASCADE, KW_CHANGE, KW_CLUSTER..KW_COLLECTION, KW_COLUMNS..KW_CREATE, KW_CUBE, KW_CURSOR..KW_DATA, KW_DATABASES, KW_DATETIME..KW_DISABLE, KW_DISTRIBUTE..KW_ELEM_TYPE, KW_ENABLE, KW_ESCAPED, KW_EXCLUSIVE, KW_EXPLAIN..KW_EXPORT, KW_EXTERNAL, KW_FETCH..KW_FLOAT, KW_FOR..KW_FORMATTED, KW_FULL, KW_FUNCTIONS..KW_GROUPING, KW_HOLD_DDLTIME..KW_IDXPROPERTIES, KW_IGNORE..KW_ITEMS, KW_KEYS..KW_LEFT, KW_LIKE..KW_LONG, KW_MAPJOIN..KW_MINUS, KW_MSCK..KW_NOSCAN, KW_NO_DROP, KW_OF..KW_OFFLINE, KW_OPTION, KW_ORCFILE..KW_OUTPUTFORMAT, KW_OVERWRITE, KW_PARTITION..KW_PLUS, KW_PRETTY..KW_RECORDWRITER, KW_REGEXP..KW_SCHEMAS, KW_SEMI..KW_STRING, KW_TABLE..KW_TABLES, KW_TBLPROPERTIES..KW_TEXTFILE, KW_TIMESTAMP..KW_TOUCH, KW_TRIGGER, KW_TRUNCATE..KW_UNARCHIVE, KW_UNDO..KW_UNION, KW_UNLOCK..KW_VALUE_TYPE, KW_VIEW, KW_WHILE, KW_WITH}" using multiple alternatives: 1, 2
          
          As a result, alternative(s) 2 were disabled for that input
          warning(200): FromClauseParser.g:237:16: 
          Decision can match input such as "Identifier LPAREN KW_STRUCT" using multiple alternatives: 1, 2
          
          As a result, alternative(s) 2 were disabled for that input
          warning(200): FromClauseParser.g:237:16: 
          Decision can match input such as "Identifier LPAREN KW_DATE" using multiple alternatives: 1, 2
          
          As a result, alternative(s) 2 were disabled for that input
          warning(200): FromClauseParser.g:237:16: 
          Decision can match input such as "Identifier LPAREN KW_NOT" using multiple alternatives: 1, 2
          
          As a result, alternative(s) 2 were disabled for that input
          warning(200): FromClauseParser.g:237:16: 
          Decision can match input such as "Identifier LPAREN {MINUS, PLUS, TILDE}" using multiple alternatives: 1, 2
          
          As a result, alternative(s) 2 were disabled for that input
          warning(200): FromClauseParser.g:237:16: 
          Decision can match input such as "Identifier LPAREN KW_MAP" using multiple alternatives: 1, 2
          
          As a result, alternative(s) 2 were disabled for that input
          warning(200): FromClauseParser.g:237:16: 
          Decision can match input such as "Identifier LPAREN KW_TRUE" using multiple alternatives: 1, 2
          
          As a result, alternative(s) 2 were disabled for that input
          warning(200): FromClauseParser.g:237:16: 
          Decision can match input such as "Identifier LPAREN KW_ARRAY" using multiple alternatives: 1, 2
          
          As a result, alternative(s) 2 were disabled for that input
          warning(200): IdentifiersParser.g:68:4: 
          Decision can match input such as "LPAREN KW_NOT CharSetName" using multiple alternatives: 1, 2
          
          As a result, alternative(s) 2 were disabled for that input
          warning(200): IdentifiersParser.g:68:4: 
          Decision can match input such as "LPAREN KW_CASE CharSetName" using multiple alternatives: 1, 2
          
          As a result, alternative(s) 2 were disabled for that input
          warning(200): IdentifiersParser.g:68:4: 
          Decision can match input such as "LPAREN LPAREN CharSetName" using multiple alternatives: 1, 2
          
          As a result, alternative(s) 2 were disabled for that input
          warning(200): IdentifiersParser.g:68:4: 
          Decision can match input such as "LPAREN KW_NOT Number" using multiple alternatives: 1, 2
          
          As a result, alternative(s) 2 were disabled for that input
          warning(200): IdentifiersParser.g:68:4: 
          Decision can match input such as "LPAREN KW_NOT KW_UNIONTYPE" using multiple alternatives: 1, 2
          
          As a result, alternative(s) 2 were disabled for that input
          warning(200): IdentifiersParser.g:68:4: 
          Decision can match input such as "LPAREN KW_NULL {KW_LIKE, KW_REGEXP, KW_RLIKE}" using multiple alternatives: 1, 2
          
          As a result, alternative(s) 2 were disabled for that input
          warning(200): IdentifiersParser.g:68:4: 
          Decision can match input such as "LPAREN KW_NOT KW_DATE" using multiple alternatives: 1, 2
          
          As a result, alternative(s) 2 were disabled for that input
          warning(200): IdentifiersParser.g:68:4: 
          Decision can match input such as "LPAREN KW_NOT KW_NOT" using multiple alternatives: 1, 2
          
          As a result, alternative(s) 2 were disabled for that input
          warning(200): IdentifiersParser.g:68:4: 
          Decision can match input such as "LPAREN KW_CASE KW_NOT" using multiple alternatives: 1, 2
          
          As a result, alternative(s) 2 were disabled for that input
          warning(200): IdentifiersParser.g:68:4: 
          Decision can match input such as "LPAREN LPAREN KW_NOT" using multiple alternatives: 1, 2
          
          As a result, alternative(s) 2 were disabled for that input
          warning(200): IdentifiersParser.g:68:4: 
          Decision can match input such as "LPAREN KW_CASE Identifier" using multiple alternatives: 1, 2
          
          As a result, alternative(s) 2 were disabled for that input
          warning(200): IdentifiersParser.g:68:4: 
          Decision can match input such as "LPAREN KW_NULL AMPERSAND" using multiple alternatives: 1, 2
          
          As a result, alternative(s) 2 were disabled for that input
          warning(200): IdentifiersParser.g:68:4: 
          Decision can match input such as "LPAREN LPAREN SmallintLiteral" using multiple alternatives: 1, 2
          
          As a result, alternative(s) 2 were disabled for that input
          warning(200): IdentifiersParser.g:68:4: 
          Decision can match input such as "LPAREN LPAREN TinyintLiteral" using multiple alternatives: 1, 2
          
          As a result, alternative(s) 2 were disabled for that input
          warning(200): IdentifiersParser.g:68:4: 
          Decision can match input such as "LPAREN LPAREN BigintLiteral" using multiple alternatives: 1, 2
          
          As a result, alternative(s) 2 were disabled for that input
          warning(200): IdentifiersParser.g:68:4: 
          Decision can match input such as "LPAREN KW_NOT KW_FALSE" using multiple alternatives: 1, 2
          
          As a result, alternative(s) 2 were disabled for that input
          warning(200): IdentifiersParser.g:68:4: 
          Decision can match input such as "LPAREN KW_NOT KW_STRUCT" using multiple alternatives: 1, 2
          
          As a result, alternative(s) 2 were disabled for that input
          warning(200): IdentifiersParser.g:68:4: 
          Decision can match input such as "LPAREN KW_NOT KW_TRUE" using multiple alternatives: 1, 2
          
          As a result, alternative(s) 2 were disabled for that input
          warning(200): IdentifiersParser.g:68:4: 
          Decision can match input such as "LPAREN KW_NOT KW_ARRAY" using multiple alternatives: 1, 2
          
          As a result, alternative(s) 2 were disabled for that input
          warning(200): IdentifiersParser.g:68:4: 
          Decision can match input such as "LPAREN LPAREN Number" using multiple alternatives: 1, 2
          
          As a result, alternative(s) 2 were disabled for that input
          warning(200): IdentifiersParser.g:68:4: 
          Decision can match input such as "LPAREN KW_NULL KW_IS" using multiple alternatives: 1, 2
          
          As a result, alternative(s) 2 were disabled for that input
          warning(200): IdentifiersParser.g:68:4: 
          Decision can match input such as "LPAREN KW_NOT StringLiteral" using multiple alternatives: 1, 2
          
          As a result, alternative(s) 2 were disabled for that input
          warning(200): IdentifiersParser.g:68:4: 
          Decision can match input such as "LPAREN KW_NOT {MINUS, PLUS, TILDE}" using multiple alternatives: 1, 2
          
          As a result, alternative(s) 2 were disabled for that input
          warning(200): IdentifiersParser.g:68:4: 
          Decision can match input such as "LPAREN KW_CASE {MINUS, PLUS, TILDE}" using multiple alternatives: 1, 2
          
          As a result, alternative(s) 2 were disabled for that input
          warning(200): IdentifiersParser.g:68:4: 
          Decision can match input such as "LPAREN LPAREN {MINUS, PLUS, TILDE}" using multiple alternatives: 1, 2
          
          As a result, alternative(s) 2 were disabled for that input
          warning(200): IdentifiersParser.g:68:4: 
          Decision can match input such as "LPAREN KW_NOT KW_MAP" using multiple alternatives: 1, 2
          
          As a result, alternative(s) 2 were disabled for that input
          warning(200): IdentifiersParser.g:68:4: 
          Decision can match input such as "LPAREN KW_CASE KW_MAP" using multiple alternatives: 1, 2
          
          As a result, alternative(s) 2 were disabled for that input
          warning(200): IdentifiersParser.g:68:4: 
          Decision can match input such as "LPAREN LPAREN KW_MAP" using multiple alternatives: 1, 2
          
          As a result, alternative(s) 2 were disabled for that input
          warning(200): IdentifiersParser.g:68:4: 
          Decision can match input such as "LPAREN KW_NULL LPAREN" using multiple alternatives: 1, 2
          
          As a result, alternative(s) 2 were disabled for that input
          warning(200): IdentifiersParser.g:68:4: 
          Decision can match input such as "LPAREN LPAREN DecimalLiteral" using multiple alternatives: 1, 2
          
          As a result, alternative(s) 2 were disabled for that input
          warning(200): IdentifiersParser.g:68:4: 
          Decision can match input such as "LPAREN KW_NULL RPAREN" using multiple alternatives: 1, 2
          
          As a result, alternative(s) 2 were disabled for that input
          warning(200): IdentifiersParser.g:68:4: 
          Decision can match input such as "LPAREN StringLiteral StringLiteral" using multiple alternatives: 1, 2
          
          As a result, alternative(s) 2 were disabled for that input
          warning(200): IdentifiersParser.g:68:4: 
          Decision can match input such as "LPAREN KW_NOT KW_IF" using multiple alternatives: 1, 2
          
          As a result, alternative(s) 2 were disabled for that input
          warning(200): IdentifiersParser.g:68:4: 
          Decision can match input such as "LPAREN KW_CASE KW_TRUE" using multiple alternatives: 1, 2
          
          As a result, alternative(s) 2 were disabled for that input
          warning(200): IdentifiersParser.g:68:4: 
          Decision can match input such as "LPAREN KW_CASE KW_IF" using multiple alternatives: 1, 2
          
          As a result, alternative(s) 2 were disabled for that input
          warning(200): IdentifiersParser.g:68:4: 
          Decision can match input such as "LPAREN LPAREN KW_IF" using multiple alternatives: 1, 2
          
          As a result, alternative(s) 2 were disabled for that input
          warning(200): IdentifiersParser.g:68:4: 
          Decision can match input such as "LPAREN KW_NULL BITWISEOR" using multiple alternatives: 1, 2
          
          As a result, alternative(s) 2 were disabled for that input
          warning(200): IdentifiersParser.g:68:4: 
          Decision can match input such as "LPAREN KW_NULL DOT" using multiple alternatives: 1, 2
          
          As a result, alternative(s) 2 were disabled for that input
          warning(200): IdentifiersParser.g:68:4: 
          Decision can match input such as "LPAREN LPAREN KW_EXISTS" using multiple alternatives: 1, 2
          
          As a result, alternative(s) 2 were disabled for that input
          warning(200): IdentifiersParser.g:68:4: 
          Decision can match input such as "LPAREN KW_CASE StringLiteral" using multiple alternatives: 1, 2
          
          As a result, alternative(s) 2 were disabled for that input
          warning(200): IdentifiersParser.g:68:4: 
          Decision can match input such as "LPAREN KW_CASE KW_FALSE" using multiple alternatives: 1, 2
          
          As a result, alternative(s) 2 were disabled for that input
          warning(200): IdentifiersParser.g:68:4: 
          Decision can match input such as "LPAREN KW_NOT Identifier" using multiple alternatives: 1, 2
          
          As a result, alternative(s) 2 were disabled for that input
          warning(200): IdentifiersParser.g:68:4: 
          Decision can match input such as "LPAREN KW_NULL {DIV..DIVIDE, MOD, STAR}" using multiple alternatives: 1, 2
          
          As a result, alternative(s) 2 were disabled for that input
          warning(200): IdentifiersParser.g:68:4: 
          Decision can match input such as "LPAREN KW_CASE {KW_ADD..KW_AFTER, KW_ALTER..KW_ANALYZE, KW_ARCHIVE, KW_AS..KW_CASCADE, KW_CHANGE, KW_CLUSTER..KW_COLLECTION, KW_COLUMNS..KW_CREATE, KW_CUBE, KW_CURSOR..KW_DATA, KW_DATABASES, KW_DATETIME..KW_DISABLE, KW_DISTRIBUTE..KW_ELEM_TYPE, KW_ENABLE, KW_ESCAPED, KW_EXCLUSIVE, KW_EXPLAIN..KW_EXPORT, KW_EXTERNAL, KW_FETCH..KW_FLOAT, KW_FOR..KW_FORMATTED, KW_FULL, KW_FUNCTIONS..KW_GROUPING, KW_HOLD_DDLTIME..KW_IDXPROPERTIES, KW_IGNORE..KW_ITEMS, KW_KEYS..KW_LEFT, KW_LIKE..KW_LONG, KW_MAPJOIN..KW_MINUS, KW_MSCK..KW_NOSCAN, KW_NO_DROP, KW_OF..KW_OFFLINE, KW_OPTION, KW_ORCFILE..KW_OUTPUTFORMAT, KW_OVERWRITE, KW_PARTITION..KW_PLUS, KW_PRETTY..KW_RECORDWRITER, KW_REGEXP..KW_SCHEMAS, KW_SEMI..KW_STRING, KW_TABLE..KW_TABLES, KW_TBLPROPERTIES..KW_TEXTFILE, KW_TIMESTAMP..KW_TOUCH, KW_TRIGGER, KW_TRUNCATE..KW_UNARCHIVE, KW_UNDO..KW_UNION, KW_UNLOCK..KW_VALUE_TYPE, KW_VIEW, KW_WHILE, KW_WITH}" using multiple alternatives: 1, 2
          
          As a result, alternative(s) 2 were disabled for that input
          warning(200): IdentifiersParser.g:68:4: 
          Decision can match input such as "LPAREN KW_NULL NOTEQUAL" using multiple alternatives: 1, 2
          
          As a result, alternative(s) 2 were disabled for that input
          warning(200): IdentifiersParser.g:68:4: 
          Decision can match input such as "LPAREN KW_NOT KW_EXISTS" using multiple alternatives: 1, 2
          
          As a result, alternative(s) 2 were disabled for that input
          warning(200): IdentifiersParser.g:68:4: 
          Decision can match input such as "LPAREN KW_NULL EQUAL_NS" using multiple alternatives: 1, 2
          
          As a result, alternative(s) 2 were disabled for that input
          warning(200): IdentifiersParser.g:68:4: 
          Decision can match input such as "LPAREN KW_EXISTS LPAREN" using multiple alternatives: 1, 2
          
          As a result, alternative(s) 2 were disabled for that input
          warning(200): IdentifiersParser.g:68:4: 
          Decision can match input such as "LPAREN KW_NULL LESSTHAN" using multiple alternatives: 1, 2
          
          As a result, alternative(s) 2 were disabled for that input
          warning(200): IdentifiersParser.g:68:4: 
          Decision can match input such as "LPAREN KW_NULL LESSTHANOREQUALTO" using multiple alternatives: 1, 2
          
          As a result, alternative(s) 2 were disabled for that input
          warning(200): IdentifiersParser.g:68:4: 
          Decision can match input such as "LPAREN LPAREN KW_ARRAY" using multiple alternatives: 1, 2
          
          As a result, alternative(s) 2 were disabled for that input
          warning(200): IdentifiersParser.g:68:4: 
          Decision can match input such as "LPAREN LPAREN KW_NULL" using multiple alternatives: 1, 2
          
          As a result, alternative(s) 2 were disabled for that input
          warning(200): IdentifiersParser.g:68:4: 
          Decision can match input such as "LPAREN LPAREN KW_UNIONTYPE" using multiple alternatives: 1, 2
          
          As a result, alternative(s) 2 were disabled for that input
          warning(200): IdentifiersParser.g:68:4: 
          Decision can match input such as "LPAREN KW_NULL KW_OR" using multiple alternatives: 1, 2
          
          As a result, alternative(s) 2 were disabled for that input
          warning(200): IdentifiersParser.g:68:4: 
          Decision can match input such as "LPAREN KW_DATE StringLiteral" using multiple alternatives: 1, 2
          
          As a result, alternative(s) 2 were disabled for that input
          warning(200): IdentifiersParser.g:68:4: 
          Decision can match input such as "LPAREN KW_CASE KW_DATE" using multiple alternatives: 1, 2
          
          As a result, alternative(s) 2 were disabled for that input
          warning(200): IdentifiersParser.g:68:4: 
          Decision can match input such as "LPAREN LPAREN KW_STRUCT" using multiple alternatives: 1, 2
          
          As a result, alternative(s) 2 were disabled for that input
          warning(200): IdentifiersParser.g:68:4: 
          Decision can match input such as "LPAREN KW_NULL EQUAL" using multiple alternatives: 1, 2
          
          As a result, alternative(s) 2 were disabled for that input
          warning(200): IdentifiersParser.g:68:4: 
          Decision can match input such as "LPAREN KW_CAST LPAREN" using multiple alternatives: 1, 2
          
          As a result, alternative(s) 2 were disabled for that input
          warning(200): IdentifiersParser.g:68:4: 
          Decision can match input such as "LPAREN LPAREN LPAREN" using multiple alternatives: 1, 2
          
          As a result, alternative(s) 2 were disabled for that input
          warning(200): IdentifiersParser.g:68:4: 
          Decision can match input such as "LPAREN KW_NOT LPAREN" using multiple alternatives: 1, 2
          
          As a result, alternative(s) 2 were disabled for that input
          warning(200): IdentifiersParser.g:68:4: 
          Decision can match input such as "LPAREN KW_CASE LPAREN" using multiple alternatives: 1, 2
          
          As a result, alternative(s) 2 were disabled for that input
          warning(200): IdentifiersParser.g:68:4: 
          Decision can match input such as "LPAREN KW_CASE KW_STRUCT" using multiple alternatives: 1, 2
          
          As a result, alternative(s) 2 were disabled for that input
          warning(200): IdentifiersParser.g:68:4: 
          Decision can match input such as "LPAREN KW_CASE KW_UNIONTYPE" using multiple alternatives: 1, 2
          
          As a result, alternative(s) 2 were disabled for that input
          warning(200): IdentifiersParser.g:68:4: 
          Decision can match input such as "LPAREN KW_NULL GREATERTHAN" using multiple alternatives: 1, 2
          
          As a result, alternative(s) 2 were disabled for that input
          warning(200): IdentifiersParser.g:68:4: 
          Decision can match input such as "LPAREN KW_CASE KW_ARRAY" using multiple alternatives: 1, 2
          
          As a result, alternative(s) 2 were disabled for that input
          warning(200): IdentifiersParser.g:68:4: 
          Decision can match input such as "LPAREN LPAREN KW_DATE" using multiple alternatives: 1, 2
          
          As a result, alternative(s) 2 were disabled for that input
          warning(200): IdentifiersParser.g:68:4: 
          Decision can match input such as "LPAREN KW_NULL GREATERTHANOREQUALTO" using multiple alternatives: 1, 2
          
          As a result, alternative(s) 2 were disabled for that input
          warning(200): IdentifiersParser.g:68:4: 
          Decision can match input such as "LPAREN KW_CASE KW_NULL" using multiple alternatives: 1, 2
          
          As a result, alternative(s) 2 were disabled for that input
          warning(200): IdentifiersParser.g:68:4: 
          Decision can match input such as "LPAREN KW_CASE KW_EXISTS" using multiple alternatives: 1, 2
          
          As a result, alternative(s) 2 were disabled for that input
          warning(200): IdentifiersParser.g:68:4: 
          Decision can match input such as "LPAREN KW_CASE TinyintLiteral" using multiple alternatives: 1, 2
          
          As a result, alternative(s) 2 were disabled for that input
          warning(200): IdentifiersParser.g:68:4: 
          Decision can match input such as "LPAREN KW_NULL KW_BETWEEN" using multiple alternatives: 1, 2
          
          As a result, alternative(s) 2 were disabled for that input
          warning(200): IdentifiersParser.g:68:4: 
          Decision can match input such as "LPAREN KW_CASE SmallintLiteral" using multiple alternatives: 1, 2
          
          As a result, alternative(s) 2 were disabled for that input
          warning(200): IdentifiersParser.g:68:4: 
          Decision can match input such as "LPAREN KW_NULL {MINUS, PLUS}" using multiple alternatives: 1, 2
          
          As a result, alternative(s) 2 were disabled for that input
          warning(200): IdentifiersParser.g:68:4: 
          Decision can match input such as "LPAREN KW_NULL KW_NOT" using multiple alternatives: 1, 2
          
          As a result, alternative(s) 2 were disabled for that input
          warning(200): IdentifiersParser.g:68:4: 
          Decision can match input such as "LPAREN KW_CASE BigintLiteral" using multiple alternatives: 1, 2
          
          As a result, alternative(s) 2 were disabled for that input
          warning(200): IdentifiersParser.g:68:4: 
          Decision can match input such as "LPAREN KW_NULL LSQUARE" using multiple alternatives: 1, 2
          
          As a result, alternative(s) 2 were disabled for that input
          warning(200): IdentifiersParser.g:68:4: 
          Decision can match input such as "LPAREN KW_NOT {KW_ADD..KW_AFTER, KW_ALTER..KW_ANALYZE, KW_ARCHIVE, KW_AS..KW_CASCADE, KW_CHANGE, KW_CLUSTER..KW_COLLECTION, KW_COLUMNS..KW_CREATE, KW_CUBE, KW_CURSOR..KW_DATA, KW_DATABASES, KW_DATETIME..KW_DISABLE, KW_DISTRIBUTE..KW_ELEM_TYPE, KW_ENABLE, KW_ESCAPED, KW_EXCLUSIVE, KW_EXPLAIN..KW_EXPORT, KW_EXTERNAL, KW_FETCH..KW_FLOAT, KW_FOR..KW_FORMATTED, KW_FULL, KW_FUNCTIONS..KW_GROUPING, KW_HOLD_DDLTIME..KW_IDXPROPERTIES, KW_IGNORE..KW_ITEMS, KW_KEYS..KW_LEFT, KW_LIKE..KW_LONG, KW_MAPJOIN..KW_MINUS, KW_MSCK..KW_NOSCAN, KW_NO_DROP, KW_OF..KW_OFFLINE, KW_OPTION, KW_ORCFILE..KW_OUTPUTFORMAT, KW_OVERWRITE, KW_PARTITION..KW_PLUS, KW_PRETTY..KW_RECORDWRITER, KW_REGEXP..KW_SCHEMAS, KW_SEMI..KW_STRING, KW_TABLE..KW_TABLES, KW_TBLPROPERTIES..KW_TEXTFILE, KW_TIMESTAMP..KW_TOUCH, KW_TRIGGER, KW_TRUNCATE..KW_UNARCHIVE, KW_UNDO..KW_UNION, KW_UNLOCK..KW_VALUE_TYPE, KW_VIEW, KW_WHILE, KW_WITH}" using multiple alternatives: 1, 2
          
          As a result, alternative(s) 2 were disabled for that input
          warning(200): IdentifiersParser.g:68:4: 
          Decision can match input such as "LPAREN LPAREN KW_FALSE" using multiple alternatives: 1, 2
          
          As a result, alternative(s) 2 were disabled for that input
          warning(200): IdentifiersParser.g:68:4: 
          Decision can match input such as "LPAREN KW_NOT KW_NULL" using multiple alternatives: 1, 2
          
          As a result, alternative(s) 2 were disabled for that input
          warning(200): IdentifiersParser.g:68:4: 
          Decision can match input such as "LPAREN LPAREN {KW_ADD..KW_AFTER, KW_ALTER..KW_ANALYZE, KW_ARCHIVE, KW_AS..KW_CASCADE, KW_CHANGE, KW_CLUSTER..KW_COLLECTION, KW_COLUMNS..KW_CREATE, KW_CUBE, KW_CURSOR..KW_DATA, KW_DATABASES, KW_DATETIME..KW_DISABLE, KW_DISTRIBUTE..KW_ELEM_TYPE, KW_ENABLE, KW_ESCAPED, KW_EXCLUSIVE, KW_EXPLAIN..KW_EXPORT, KW_EXTERNAL, KW_FETCH..KW_FLOAT, KW_FOR..KW_FORMATTED, KW_FULL, KW_FUNCTIONS..KW_GROUPING, KW_HOLD_DDLTIME..KW_IDXPROPERTIES, KW_IGNORE..KW_ITEMS, KW_KEYS..KW_LEFT, KW_LIKE..KW_LONG, KW_MAPJOIN..KW_MINUS, KW_MSCK..KW_NOSCAN, KW_NO_DROP, KW_OF..KW_OFFLINE, KW_OPTION, KW_ORCFILE..KW_OUTPUTFORMAT, KW_OVERWRITE, KW_PARTITION..KW_PLUS, KW_PRETTY..KW_RECORDWRITER, KW_REGEXP..KW_SCHEMAS, KW_SEMI..KW_STRING, KW_TABLE..KW_TABLES, KW_TBLPROPERTIES..KW_TEXTFILE, KW_TIMESTAMP..KW_TOUCH, KW_TRIGGER, KW_TRUNCATE..KW_UNARCHIVE, KW_UNDO..KW_UNION, KW_UNLOCK..KW_VALUE_TYPE, KW_VIEW, KW_WHILE, KW_WITH}" using multiple alternatives: 1, 2
          
          As a result, alternative(s) 2 were disabled for that input
          warning(200): IdentifiersParser.g:68:4: 
          Decision can match input such as "LPAREN LPAREN KW_TRUE" using multiple alternatives: 1, 2
          
          As a result, alternative(s) 2 were disabled for that input
          warning(200): IdentifiersParser.g:68:4: 
          Decision can match input such as "LPAREN KW_CASE Number" using multiple alternatives: 1, 2
          
          As a result, alternative(s) 2 were disabled for that input
          warning(200): IdentifiersParser.g:68:4: 
          Decision can match input such as "LPAREN KW_NULL BITWISEXOR" using multiple alternatives: 1, 2
          
          As a result, alternative(s) 2 were disabled for that input
          warning(200): IdentifiersParser.g:68:4: 
          Decision can match input such as "LPAREN KW_NOT SmallintLiteral" using multiple alternatives: 1, 2
          
          As a result, alternative(s) 2 were disabled for that input
          warning(200): IdentifiersParser.g:68:4: 
          Decision can match input such as "LPAREN CharSetName CharSetLiteral" using multiple alternatives: 1, 2
          
          As a result, alternative(s) 2 were disabled for that input
          warning(200): IdentifiersParser.g:68:4: 
          Decision can match input such as "LPAREN LPAREN KW_CAST" using multiple alternatives: 1, 2
          
          As a result, alternative(s) 2 were disabled for that input
          warning(200): IdentifiersParser.g:68:4: 
          Decision can match input such as "LPAREN KW_NOT KW_CAST" using multiple alternatives: 1, 2
          
          As a result, alternative(s) 2 were disabled for that input
          warning(200): IdentifiersParser.g:68:4: 
          Decision can match input such as "LPAREN KW_CASE KW_CAST" using multiple alternatives: 1, 2
          
          As a result, alternative(s) 2 were disabled for that input
          warning(200): IdentifiersParser.g:68:4: 
          Decision can match input such as "LPAREN KW_NOT BigintLiteral" using multiple alternatives: 1, 2
          
          As a result, alternative(s) 2 were disabled for that input
          warning(200): IdentifiersParser.g:68:4: 
          Decision can match input such as "LPAREN KW_NULL KW_AND" using multiple alternatives: 1, 2
          
          As a result, alternative(s) 2 were disabled for that input
          warning(200): IdentifiersParser.g:68:4: 
          Decision can match input such as "LPAREN LPAREN KW_CASE" using multiple alternatives: 1, 2
          
          As a result, alternative(s) 2 were disabled for that input
          warning(200): IdentifiersParser.g:68:4: 
          Decision can match input such as "LPAREN KW_NULL KW_IN" using multiple alternatives: 1, 2
          
          As a result, alternative(s) 2 were disabled for that input
          warning(200): IdentifiersParser.g:68:4: 
          Decision can match input such as "LPAREN KW_NOT KW_CASE" using multiple alternatives: 1, 2
          
          As a result, alternative(s) 2 were disabled for that input
          warning(200): IdentifiersParser.g:68:4: 
          Decision can match input such as "LPAREN KW_CASE KW_CASE" using multiple alternatives: 1, 2
          
          As a result, alternative(s) 2 were disabled for that input
          warning(200): IdentifiersParser.g:68:4: 
          Decision can match input such as "LPAREN KW_NOT DecimalLiteral" using multiple alternatives: 1, 2
          
          As a result, alternative(s) 2 were disabled for that input
          warning(200): IdentifiersParser.g:68:4: 
          Decision can match input such as "LPAREN KW_NOT TinyintLiteral" using multiple alternatives: 1, 2
          
          As a result, alternative(s) 2 were disabled for that input
          warning(200): IdentifiersParser.g:68:4: 
          Decision can match input such as "LPAREN KW_CASE KW_WHEN" using multiple alternatives: 1, 2
          
          As a result, alternative(s) 2 were disabled for that input
          warning(200): IdentifiersParser.g:68:4: 
          Decision can match input such as "LPAREN LPAREN StringLiteral" using multiple alternatives: 1, 2
          
          As a result, alternative(s) 2 were disabled for that input
          warning(200): IdentifiersParser.g:68:4: 
          Decision can match input such as "LPAREN LPAREN Identifier" using multiple alternatives: 1, 2
          
          As a result, alternative(s) 2 were disabled for that input
          warning(200): IdentifiersParser.g:68:4: 
          Decision can match input such as "LPAREN KW_CASE DecimalLiteral" using multiple alternatives: 1, 2
          
          As a result, alternative(s) 2 were disabled for that input
          warning(200): IdentifiersParser.g:108:5: 
          Decision can match input such as "KW_ORDER KW_BY LPAREN" using multiple alternatives: 1, 2
          
          As a result, alternative(s) 2 were disabled for that input
          warning(200): IdentifiersParser.g:121:5: 
          Decision can match input such as "KW_CLUSTER KW_BY LPAREN" using multiple alternatives: 1, 2
          
          As a result, alternative(s) 2 were disabled for that input
          warning(200): IdentifiersParser.g:133:5: 
          Decision can match input such as "KW_PARTITION KW_BY LPAREN" using multiple alternatives: 1, 2
          
          As a result, alternative(s) 2 were disabled for that input
          warning(200): IdentifiersParser.g:144:5: 
          Decision can match input such as "KW_DISTRIBUTE KW_BY LPAREN" using multiple alternatives: 1, 2
          
          As a result, alternative(s) 2 were disabled for that input
          warning(200): IdentifiersParser.g:155:5: 
          Decision can match input such as "KW_SORT KW_BY LPAREN" using multiple alternatives: 1, 2
          
          As a result, alternative(s) 2 were disabled for that input
          warning(200): IdentifiersParser.g:172:7: 
          Decision can match input such as "STAR" using multiple alternatives: 1, 2
          
          As a result, alternative(s) 2 were disabled for that input
          warning(200): IdentifiersParser.g:185:5: 
          Decision can match input such as "KW_UNIONTYPE" using multiple alternatives: 5, 6
          
          As a result, alternative(s) 6 were disabled for that input
          warning(200): IdentifiersParser.g:185:5: 
          Decision can match input such as "KW_STRUCT" using multiple alternatives: 4, 6
          
          As a result, alternative(s) 6 were disabled for that input
          warning(200): IdentifiersParser.g:185:5: 
          Decision can match input such as "KW_ARRAY" using multiple alternatives: 2, 6
          
          As a result, alternative(s) 6 were disabled for that input
          warning(200): IdentifiersParser.g:267:5: 
          Decision can match input such as "KW_TRUE" using multiple alternatives: 3, 8
          
          As a result, alternative(s) 8 were disabled for that input
          warning(200): IdentifiersParser.g:267:5: 
          Decision can match input such as "KW_DATE StringLiteral" using multiple alternatives: 2, 3
          
          As a result, alternative(s) 3 were disabled for that input
          warning(200): IdentifiersParser.g:267:5: 
          Decision can match input such as "KW_FALSE" using multiple alternatives: 3, 8
          
          As a result, alternative(s) 8 were disabled for that input
          warning(200): IdentifiersParser.g:267:5: 
          Decision can match input such as "KW_NULL" using multiple alternatives: 1, 8
          
          As a result, alternative(s) 8 were disabled for that input
          warning(200): IdentifiersParser.g:399:5: 
          Decision can match input such as "{KW_LIKE, KW_REGEXP, KW_RLIKE} KW_INSERT KW_OVERWRITE" using multiple alternatives: 2, 9
          
          As a result, alternative(s) 9 were disabled for that input
          warning(200): IdentifiersParser.g:399:5: 
          Decision can match input such as "{KW_LIKE, KW_REGEXP, KW_RLIKE} KW_CLUSTER KW_BY" using multiple alternatives: 2, 9
          
          As a result, alternative(s) 9 were disabled for that input
          warning(200): IdentifiersParser.g:399:5: 
          Decision can match input such as "{KW_LIKE, KW_REGEXP, KW_RLIKE} KW_INSERT KW_INTO" using multiple alternatives: 2, 9
          
          As a result, alternative(s) 9 were disabled for that input
          warning(200): IdentifiersParser.g:399:5: 
          Decision can match input such as "{KW_LIKE, KW_REGEXP, KW_RLIKE} KW_ORDER KW_BY" using multiple alternatives: 2, 9
          
          As a result, alternative(s) 9 were disabled for that input
          warning(200): IdentifiersParser.g:399:5: 
          Decision can match input such as "{KW_LIKE, KW_REGEXP, KW_RLIKE} KW_MAP LPAREN" using multiple alternatives: 2, 9
          
          As a result, alternative(s) 9 were disabled for that input
          warning(200): IdentifiersParser.g:399:5: 
          Decision can match input such as "{KW_LIKE, KW_REGEXP, KW_RLIKE} KW_SORT KW_BY" using multiple alternatives: 2, 9
          
          As a result, alternative(s) 9 were disabled for that input
          warning(200): IdentifiersParser.g:399:5: 
          Decision can match input such as "{KW_LIKE, KW_REGEXP, KW_RLIKE} KW_LATERAL KW_VIEW" using multiple alternatives: 2, 9
          
          As a result, alternative(s) 9 were disabled for that input
          warning(200): IdentifiersParser.g:399:5: 
          Decision can match input such as "KW_BETWEEN KW_MAP LPAREN" using multiple alternatives: 8, 9
          
          As a result, alternative(s) 9 were disabled for that input
          warning(200): IdentifiersParser.g:399:5: 
          Decision can match input such as "{KW_LIKE, KW_REGEXP, KW_RLIKE} KW_GROUP KW_BY" using multiple alternatives: 2, 9
          
          As a result, alternative(s) 9 were disabled for that input
          warning(200): IdentifiersParser.g:399:5: 
          Decision can match input such as "{KW_LIKE, KW_REGEXP, KW_RLIKE} KW_DISTRIBUTE KW_BY" using multiple alternatives: 2, 9
          
          As a result, alternative(s) 9 were disabled for that input
          warning(200): IdentifiersParser.g:524:5: 
          Decision can match input such as "{AMPERSAND..BITWISEXOR, DIV..DIVIDE, EQUAL..EQUAL_NS, GREATERTHAN..GREATERTHANOREQUALTO, KW_AND, KW_ARRAY, KW_BETWEEN..KW_BOOLEAN, KW_CASE, KW_DOUBLE, KW_FLOAT, KW_IF, KW_IN, KW_INT, KW_LIKE, KW_MAP, KW_NOT, KW_OR, KW_REGEXP, KW_RLIKE, KW_SMALLINT, KW_STRING..KW_STRUCT, KW_TINYINT, KW_UNIONTYPE, KW_WHEN, LESSTHAN..LESSTHANOREQUALTO, MINUS..NOTEQUAL, PLUS, STAR, TILDE}" using multiple alternatives: 1, 3
          
          As a result, alternative(s) 3 were disabled for that input
          [INFO] 
          [INFO] --- maven-resources-plugin:2.5:resources (default-resources) @ hive-exec ---
          [debug] execute contextualize
          [INFO] Using 'UTF-8' encoding to copy filtered resources.
          [INFO] Copying 1 resource
          [INFO] 
          [INFO] --- maven-antrun-plugin:1.7:run (define-classpath) @ hive-exec ---
          [INFO] Executing tasks
          
          main:
          [INFO] Executed tasks
          [INFO] 
          [INFO] --- maven-compiler-plugin:3.1:compile (default-compile) @ hive-exec ---
          [INFO] Compiling 1390 source files to /data/hive-ptest/working/apache-svn-trunk-source/ql/target/classes
          [INFO] -------------------------------------------------------------
          [WARNING] COMPILATION WARNING : 
          [INFO] -------------------------------------------------------------
          [WARNING] Note: Some input files use or override a deprecated API.
          [WARNING] Note: Recompile with -Xlint:deprecation for details.
          [WARNING] Note: Some input files use unchecked or unsafe operations.
          [WARNING] Note: Recompile with -Xlint:unchecked for details.
          [INFO] 4 warnings 
          [INFO] -------------------------------------------------------------
          [INFO] -------------------------------------------------------------
          [ERROR] COMPILATION ERROR : 
          [INFO] -------------------------------------------------------------
          [ERROR] /data/hive-ptest/working/apache-svn-trunk-source/ql/src/java/org/apache/hadoop/hive/ql/optimizer/physical/Vectorizer.java:[122,45] cannot find symbol
          symbol  : class GenericUDFCeil
          location: package org.apache.hadoop.hive.ql.udf.generic
          [ERROR] /data/hive-ptest/working/apache-svn-trunk-source/ql/src/java/org/apache/hadoop/hive/ql/optimizer/physical/Vectorizer.java:[124,45] cannot find symbol
          symbol  : class GenericUDFFloor
          location: package org.apache.hadoop.hive.ql.udf.generic
          [ERROR] /data/hive-ptest/working/apache-svn-trunk-source/ql/src/java/org/apache/hadoop/hive/ql/optimizer/physical/Vectorizer.java:[137,45] cannot find symbol
          symbol  : class GenericUDFOPNegative
          location: package org.apache.hadoop.hive.ql.udf.generic
          [ERROR] /data/hive-ptest/working/apache-svn-trunk-source/ql/src/java/org/apache/hadoop/hive/ql/optimizer/physical/Vectorizer.java:[143,45] cannot find symbol
          symbol  : class GenericUDFOPPositive
          location: package org.apache.hadoop.hive.ql.udf.generic
          [ERROR] /data/hive-ptest/working/apache-svn-trunk-source/ql/src/java/org/apache/hadoop/hive/ql/optimizer/physical/Vectorizer.java:[144,45] cannot find symbol
          symbol  : class GenericUDFPower
          location: package org.apache.hadoop.hive.ql.udf.generic
          [ERROR] /data/hive-ptest/working/apache-svn-trunk-source/ql/src/java/org/apache/hadoop/hive/ql/exec/vector/VectorizationContext.java:[355,25] cannot find symbol
          symbol  : class GenericUDFOPNegative
          location: class org.apache.hadoop.hive.ql.exec.vector.VectorizationContext
          [ERROR] /data/hive-ptest/working/apache-svn-trunk-source/ql/src/java/org/apache/hadoop/hive/ql/exec/vector/VectorizationContext.java:[355,65] cannot find symbol
          symbol  : class GenericUDFOPPositive
          location: class org.apache.hadoop.hive.ql.exec.vector.VectorizationContext
          [ERROR] /data/hive-ptest/working/apache-svn-trunk-source/ql/src/java/org/apache/hadoop/hive/ql/exec/vector/VectorizationContext.java:[554,31] cannot find symbol
          symbol  : class GenericUDFOPPositive
          location: class org.apache.hadoop.hive.ql.exec.vector.VectorizationContext
          [ERROR] /data/hive-ptest/working/apache-svn-trunk-source/ql/src/java/org/apache/hadoop/hive/ql/optimizer/physical/Vectorizer.java:[185,30] cannot find symbol
          symbol  : class GenericUDFOPNegative
          location: class org.apache.hadoop.hive.ql.optimizer.physical.Vectorizer
          [ERROR] /data/hive-ptest/working/apache-svn-trunk-source/ql/src/java/org/apache/hadoop/hive/ql/optimizer/physical/Vectorizer.java:[186,30] cannot find symbol
          symbol  : class GenericUDFOPPositive
          location: class org.apache.hadoop.hive.ql.optimizer.physical.Vectorizer
          [ERROR] /data/hive-ptest/working/apache-svn-trunk-source/ql/src/java/org/apache/hadoop/hive/ql/optimizer/physical/Vectorizer.java:[224,30] cannot find symbol
          symbol  : class GenericUDFFloor
          location: class org.apache.hadoop.hive.ql.optimizer.physical.Vectorizer
          [ERROR] /data/hive-ptest/working/apache-svn-trunk-source/ql/src/java/org/apache/hadoop/hive/ql/optimizer/physical/Vectorizer.java:[225,30] cannot find symbol
          symbol  : class GenericUDFCeil
          location: class org.apache.hadoop.hive.ql.optimizer.physical.Vectorizer
          [ERROR] /data/hive-ptest/working/apache-svn-trunk-source/ql/src/java/org/apache/hadoop/hive/ql/optimizer/physical/Vectorizer.java:[231,30] cannot find symbol
          symbol  : class GenericUDFPower
          location: class org.apache.hadoop.hive.ql.optimizer.physical.Vectorizer
          [ERROR] /data/hive-ptest/working/apache-svn-trunk-source/ql/src/java/org/apache/hadoop/hive/ql/exec/FunctionRegistry.java:[195,33] cannot find symbol
          symbol  : class GenericUDFFloor
          location: class org.apache.hadoop.hive.ql.exec.FunctionRegistry
          [ERROR] /data/hive-ptest/working/apache-svn-trunk-source/ql/src/java/org/apache/hadoop/hive/ql/exec/FunctionRegistry.java:[197,32] cannot find symbol
          symbol  : class GenericUDFCeil
          location: class org.apache.hadoop.hive.ql.exec.FunctionRegistry
          [ERROR] /data/hive-ptest/working/apache-svn-trunk-source/ql/src/java/org/apache/hadoop/hive/ql/exec/FunctionRegistry.java:[198,35] cannot find symbol
          symbol  : class GenericUDFCeil
          location: class org.apache.hadoop.hive.ql.exec.FunctionRegistry
          [ERROR] /data/hive-ptest/working/apache-svn-trunk-source/ql/src/java/org/apache/hadoop/hive/ql/exec/FunctionRegistry.java:[212,33] cannot find symbol
          symbol  : class GenericUDFPower
          location: class org.apache.hadoop.hive.ql.exec.FunctionRegistry
          [ERROR] /data/hive-ptest/working/apache-svn-trunk-source/ql/src/java/org/apache/hadoop/hive/ql/exec/FunctionRegistry.java:[213,31] cannot find symbol
          symbol  : class GenericUDFPower
          location: class org.apache.hadoop.hive.ql.exec.FunctionRegistry
          [ERROR] /data/hive-ptest/working/apache-svn-trunk-source/ql/src/java/org/apache/hadoop/hive/ql/exec/FunctionRegistry.java:[255,36] cannot find symbol
          symbol  : class GenericUDFOPPositive
          location: class org.apache.hadoop.hive.ql.exec.FunctionRegistry
          [ERROR] /data/hive-ptest/working/apache-svn-trunk-source/ql/src/java/org/apache/hadoop/hive/ql/exec/FunctionRegistry.java:[256,36] cannot find symbol
          symbol  : class GenericUDFOPNegative
          location: class org.apache.hadoop.hive.ql.exec.FunctionRegistry
          [ERROR] /data/hive-ptest/working/apache-svn-trunk-source/ql/src/java/org/apache/hadoop/hive/ql/exec/FunctionRegistry.java:[1527,12] cannot find symbol
          symbol  : class GenericUDFOPPositive
          location: class org.apache.hadoop.hive.ql.exec.FunctionRegistry
          [INFO] 21 errors 
          [INFO] -------------------------------------------------------------
          [INFO] ------------------------------------------------------------------------
          [INFO] Reactor Summary:
          [INFO] 
          [INFO] Hive .............................................. SUCCESS [4.794s]
          [INFO] Hive Ant Utilities ................................ SUCCESS [6.988s]
          [INFO] Hive Shims Common ................................. SUCCESS [3.336s]
          [INFO] Hive Shims 0.20 ................................... SUCCESS [2.391s]
          [INFO] Hive Shims Secure Common .......................... SUCCESS [2.708s]
          [INFO] Hive Shims 0.20S .................................. SUCCESS [1.379s]
          [INFO] Hive Shims 0.23 ................................... SUCCESS [2.978s]
          [INFO] Hive Shims ........................................ SUCCESS [3.667s]
          [INFO] Hive Common ....................................... SUCCESS [7.570s]
          [INFO] Hive Serde ........................................ SUCCESS [15.242s]
          [INFO] Hive Metastore .................................... SUCCESS [26.331s]
          [INFO] Hive Query Language ............................... FAILURE [31.092s]
          [INFO] Hive Service ...................................... SKIPPED
          [INFO] Hive JDBC ......................................... SKIPPED
          [INFO] Hive Beeline ...................................... SKIPPED
          [INFO] Hive CLI .......................................... SKIPPED
          [INFO] Hive Contrib ...................................... SKIPPED
          [INFO] Hive HBase Handler ................................ SKIPPED
          [INFO] Hive HCatalog ..................................... SKIPPED
          [INFO] Hive HCatalog Core ................................ SKIPPED
          [INFO] Hive HCatalog Pig Adapter ......................... SKIPPED
          [INFO] Hive HCatalog Server Extensions ................... SKIPPED
          [INFO] Hive HCatalog Webhcat Java Client ................. SKIPPED
          [INFO] Hive HCatalog Webhcat ............................. SKIPPED
          [INFO] Hive HCatalog HBase Storage Handler ............... SKIPPED
          [INFO] Hive HWI .......................................... SKIPPED
          [INFO] Hive ODBC ......................................... SKIPPED
          [INFO] Hive Shims Aggregator ............................. SKIPPED
          [INFO] Hive TestUtils .................................... SKIPPED
          [INFO] Hive Packaging .................................... SKIPPED
          [INFO] ------------------------------------------------------------------------
          [INFO] BUILD FAILURE
          [INFO] ------------------------------------------------------------------------
          [INFO] Total time: 1:51.494s
          [INFO] Finished at: Mon Nov 25 18:53:52 EST 2013
          [INFO] Final Memory: 52M/381M
          [INFO] ------------------------------------------------------------------------
          [ERROR] Failed to execute goal org.apache.maven.plugins:maven-compiler-plugin:3.1:compile (default-compile) on project hive-exec: Compilation failure: Compilation failure:
          [ERROR] /data/hive-ptest/working/apache-svn-trunk-source/ql/src/java/org/apache/hadoop/hive/ql/optimizer/physical/Vectorizer.java:[122,45] cannot find symbol
          [ERROR] symbol  : class GenericUDFCeil
          [ERROR] location: package org.apache.hadoop.hive.ql.udf.generic
          [ERROR] /data/hive-ptest/working/apache-svn-trunk-source/ql/src/java/org/apache/hadoop/hive/ql/optimizer/physical/Vectorizer.java:[124,45] cannot find symbol
          [ERROR] symbol  : class GenericUDFFloor
          [ERROR] location: package org.apache.hadoop.hive.ql.udf.generic
          [ERROR] /data/hive-ptest/working/apache-svn-trunk-source/ql/src/java/org/apache/hadoop/hive/ql/optimizer/physical/Vectorizer.java:[137,45] cannot find symbol
          [ERROR] symbol  : class GenericUDFOPNegative
          [ERROR] location: package org.apache.hadoop.hive.ql.udf.generic
          [ERROR] /data/hive-ptest/working/apache-svn-trunk-source/ql/src/java/org/apache/hadoop/hive/ql/optimizer/physical/Vectorizer.java:[143,45] cannot find symbol
          [ERROR] symbol  : class GenericUDFOPPositive
          [ERROR] location: package org.apache.hadoop.hive.ql.udf.generic
          [ERROR] /data/hive-ptest/working/apache-svn-trunk-source/ql/src/java/org/apache/hadoop/hive/ql/optimizer/physical/Vectorizer.java:[144,45] cannot find symbol
          [ERROR] symbol  : class GenericUDFPower
          [ERROR] location: package org.apache.hadoop.hive.ql.udf.generic
          [ERROR] /data/hive-ptest/working/apache-svn-trunk-source/ql/src/java/org/apache/hadoop/hive/ql/exec/vector/VectorizationContext.java:[355,25] cannot find symbol
          [ERROR] symbol  : class GenericUDFOPNegative
          [ERROR] location: class org.apache.hadoop.hive.ql.exec.vector.VectorizationContext
          [ERROR] /data/hive-ptest/working/apache-svn-trunk-source/ql/src/java/org/apache/hadoop/hive/ql/exec/vector/VectorizationContext.java:[355,65] cannot find symbol
          [ERROR] symbol  : class GenericUDFOPPositive
          [ERROR] location: class org.apache.hadoop.hive.ql.exec.vector.VectorizationContext
          [ERROR] /data/hive-ptest/working/apache-svn-trunk-source/ql/src/java/org/apache/hadoop/hive/ql/exec/vector/VectorizationContext.java:[554,31] cannot find symbol
          [ERROR] symbol  : class GenericUDFOPPositive
          [ERROR] location: class org.apache.hadoop.hive.ql.exec.vector.VectorizationContext
          [ERROR] /data/hive-ptest/working/apache-svn-trunk-source/ql/src/java/org/apache/hadoop/hive/ql/optimizer/physical/Vectorizer.java:[185,30] cannot find symbol
          [ERROR] symbol  : class GenericUDFOPNegative
          [ERROR] location: class org.apache.hadoop.hive.ql.optimizer.physical.Vectorizer
          [ERROR] /data/hive-ptest/working/apache-svn-trunk-source/ql/src/java/org/apache/hadoop/hive/ql/optimizer/physical/Vectorizer.java:[186,30] cannot find symbol
          [ERROR] symbol  : class GenericUDFOPPositive
          [ERROR] location: class org.apache.hadoop.hive.ql.optimizer.physical.Vectorizer
          [ERROR] /data/hive-ptest/working/apache-svn-trunk-source/ql/src/java/org/apache/hadoop/hive/ql/optimizer/physical/Vectorizer.java:[224,30] cannot find symbol
          [ERROR] symbol  : class GenericUDFFloor
          [ERROR] location: class org.apache.hadoop.hive.ql.optimizer.physical.Vectorizer
          [ERROR] /data/hive-ptest/working/apache-svn-trunk-source/ql/src/java/org/apache/hadoop/hive/ql/optimizer/physical/Vectorizer.java:[225,30] cannot find symbol
          [ERROR] symbol  : class GenericUDFCeil
          [ERROR] location: class org.apache.hadoop.hive.ql.optimizer.physical.Vectorizer
          [ERROR] /data/hive-ptest/working/apache-svn-trunk-source/ql/src/java/org/apache/hadoop/hive/ql/optimizer/physical/Vectorizer.java:[231,30] cannot find symbol
          [ERROR] symbol  : class GenericUDFPower
          [ERROR] location: class org.apache.hadoop.hive.ql.optimizer.physical.Vectorizer
          [ERROR] /data/hive-ptest/working/apache-svn-trunk-source/ql/src/java/org/apache/hadoop/hive/ql/exec/FunctionRegistry.java:[195,33] cannot find symbol
          [ERROR] symbol  : class GenericUDFFloor
          [ERROR] location: class org.apache.hadoop.hive.ql.exec.FunctionRegistry
          [ERROR] /data/hive-ptest/working/apache-svn-trunk-source/ql/src/java/org/apache/hadoop/hive/ql/exec/FunctionRegistry.java:[197,32] cannot find symbol
          [ERROR] symbol  : class GenericUDFCeil
          [ERROR] location: class org.apache.hadoop.hive.ql.exec.FunctionRegistry
          [ERROR] /data/hive-ptest/working/apache-svn-trunk-source/ql/src/java/org/apache/hadoop/hive/ql/exec/FunctionRegistry.java:[198,35] cannot find symbol
          [ERROR] symbol  : class GenericUDFCeil
          [ERROR] location: class org.apache.hadoop.hive.ql.exec.FunctionRegistry
          [ERROR] /data/hive-ptest/working/apache-svn-trunk-source/ql/src/java/org/apache/hadoop/hive/ql/exec/FunctionRegistry.java:[212,33] cannot find symbol
          [ERROR] symbol  : class GenericUDFPower
          [ERROR] location: class org.apache.hadoop.hive.ql.exec.FunctionRegistry
          [ERROR] /data/hive-ptest/working/apache-svn-trunk-source/ql/src/java/org/apache/hadoop/hive/ql/exec/FunctionRegistry.java:[213,31] cannot find symbol
          [ERROR] symbol  : class GenericUDFPower
          [ERROR] location: class org.apache.hadoop.hive.ql.exec.FunctionRegistry
          [ERROR] /data/hive-ptest/working/apache-svn-trunk-source/ql/src/java/org/apache/hadoop/hive/ql/exec/FunctionRegistry.java:[255,36] cannot find symbol
          [ERROR] symbol  : class GenericUDFOPPositive
          [ERROR] location: class org.apache.hadoop.hive.ql.exec.FunctionRegistry
          [ERROR] /data/hive-ptest/working/apache-svn-trunk-source/ql/src/java/org/apache/hadoop/hive/ql/exec/FunctionRegistry.java:[256,36] cannot find symbol
          [ERROR] symbol  : class GenericUDFOPNegative
          [ERROR] location: class org.apache.hadoop.hive.ql.exec.FunctionRegistry
          [ERROR] /data/hive-ptest/working/apache-svn-trunk-source/ql/src/java/org/apache/hadoop/hive/ql/exec/FunctionRegistry.java:[1527,12] cannot find symbol
          [ERROR] symbol  : class GenericUDFOPPositive
          [ERROR] location: class org.apache.hadoop.hive.ql.exec.FunctionRegistry
          [ERROR] -> [Help 1]
          [ERROR] 
          [ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
          [ERROR] Re-run Maven using the -X switch to enable full debug logging.
          [ERROR] 
          [ERROR] For more information about the errors and possible solutions, please read the following articles:
          [ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
          [ERROR] 
          [ERROR] After correcting the problems, you can resume the build with the command
          [ERROR]   mvn <goals> -rf :hive-exec
          + exit 1
          '
          

          This message is automatically generated.

          ATTACHMENT ID: 12615729

          Show
          Hive QA added a comment - Overall : -1 no tests executed Here are the results of testing the latest attachment: https://issues.apache.org/jira/secure/attachment/12615729/HIVE-5706.4.patch Test results: http://bigtop01.cloudera.org:8080/job/PreCommit-HIVE-Build/440/testReport Console output: http://bigtop01.cloudera.org:8080/job/PreCommit-HIVE-Build/440/console Messages: Executing org.apache.hive.ptest.execution.PrepPhase Tests failed with: NonZeroExitCodeException: Command 'bash /data/hive-ptest/working/scratch/source-prep.sh' failed with exit status 1 and output '+ [[ -n '' ]] + export 'ANT_OPTS=-Xmx1g -XX:MaxPermSize=256m -Dhttp.proxyHost=localhost -Dhttp.proxyPort=3128' + ANT_OPTS='-Xmx1g -XX:MaxPermSize=256m -Dhttp.proxyHost=localhost -Dhttp.proxyPort=3128' + export 'M2_OPTS=-Xmx1g -XX:MaxPermSize=256m -Dhttp.proxyHost=localhost -Dhttp.proxyPort=3128' + M2_OPTS='-Xmx1g -XX:MaxPermSize=256m -Dhttp.proxyHost=localhost -Dhttp.proxyPort=3128' + cd /data/hive-ptest/working/ + tee /data/hive-ptest/logs/PreCommit-HIVE-Build-440/source-prep.txt + [[ false == \t\r\u\e ]] + mkdir -p maven ivy + [[ svn = \s\v\n ]] + [[ -n '' ]] + [[ -d apache-svn-trunk-source ]] + [[ ! -d apache-svn-trunk-source/.svn ]] + [[ ! -d apache-svn-trunk-source ]] + cd apache-svn-trunk-source + svn revert -R . Reverted 'packaging/pom.xml' Reverted 'itests/hcatalog-unit/pom.xml' Reverted 'hcatalog/storage-handlers/hbase/pom.xml' ++ egrep -v '^X|^Performing status on external' ++ awk '{print $2}' ++ svn status --no-ignore + rm -rf target datanucleus.log ant/target shims/target shims/0.20/target shims/assembly/target shims/0.20S/target shims/0.23/target shims/common/target shims/common-secure/target packaging/target hbase-handler/target testutils/target jdbc/target metastore/target itests/target itests/hcatalog-unit/target itests/test-serde/target itests/qtest/target itests/hive-unit/target itests/custom-serde/target itests/util/target hcatalog/target hcatalog/storage-handlers/hbase/target hcatalog/server-extensions/target hcatalog/core/target hcatalog/webhcat/svr/target hcatalog/webhcat/java-client/target hcatalog/hcatalog-pig-adapter/target hwi/target common/target common/src/gen service/target contrib/target serde/target beeline/target odbc/target cli/target ql/dependency-reduced-pom.xml ql/target + svn update U ql/src/test/queries/clientpositive/annotate_stats_groupby.q U ql/src/test/results/clientpositive/transform_ppr2.q.out U ql/src/test/results/clientpositive/pcr.q.out U ql/src/test/results/clientpositive/auto_sortmerge_join_11.q.out U ql/src/test/results/clientpositive/bucket_map_join_2.q.out U ql/src/test/results/clientpositive/sample7.q.out U ql/src/test/results/clientpositive/rand_partitionpruner2.q.out U ql/src/test/results/clientpositive/bucketcontext_6.q.out U ql/src/test/results/clientpositive/bucketmapjoin_negative.q.out U ql/src/test/results/clientpositive/udf_explode.q.out U ql/src/test/results/clientpositive/sample2.q.out U ql/src/test/results/clientpositive/router_join_ppr.q.out U ql/src/test/results/clientpositive/bucketcontext_1.q.out U ql/src/test/results/clientpositive/annotate_stats_filter.q.out U ql/src/test/results/clientpositive/stats11.q.out U ql/src/test/results/clientpositive/smb_mapjoin_15.q.out U ql/src/test/results/clientpositive/input23.q.out U ql/src/test/results/clientpositive/bucketmapjoin1.q.out U ql/src/test/results/clientpositive/join_map_ppr.q.out U ql/src/test/results/clientpositive/bucketmapjoin10.q.out U ql/src/test/results/clientpositive/join33.q.out U 
ql/src/test/results/clientpositive/groupby_sort_skew_1.q.out U ql/src/test/results/clientpositive/annotate_stats_limit.q.out U ql/src/test/results/clientpositive/louter_join_ppr.q.out U ql/src/test/results/clientpositive/annotate_stats_groupby.q.out U ql/src/test/results/clientpositive/sort_merge_join_desc_7.q.out U ql/src/test/results/clientpositive/sample9.q.out U ql/src/test/results/clientpositive/union_ppr.q.out U ql/src/test/results/clientpositive/bucketcontext_8.q.out U ql/src/test/results/clientpositive/udtf_explode.q.out U ql/src/test/results/clientpositive/auto_join_reordering_values.q.out U ql/src/test/results/clientpositive/sample4.q.out U ql/src/test/results/clientpositive/push_or.q.out U ql/src/test/results/clientpositive/bucketmapjoin8.q.out U ql/src/test/results/clientpositive/bucketcontext_3.q.out U ql/src/test/results/clientpositive/join17.q.out U ql/src/test/results/clientpositive/bucketmapjoin3.q.out U ql/src/test/results/clientpositive/annotate_stats_select.q.out U ql/src/test/results/clientpositive/join26.q.out U ql/src/test/results/clientpositive/bucketmapjoin12.q.out U ql/src/test/results/clientpositive/join35.q.out U ql/src/test/results/clientpositive/groupby_ppr.q.out U ql/src/test/results/clientpositive/input_part7.q.out U ql/src/test/results/clientpositive/sample10.q.out U ql/src/test/results/clientpositive/outer_join_ppr.q.out U ql/src/test/results/clientpositive/input_part2.q.out U ql/src/test/results/clientpositive/transform_ppr1.q.out U ql/src/test/results/clientpositive/regexp_extract.q.out U ql/src/test/results/clientpositive/join32_lessSize.q.out U ql/src/test/results/clientpositive/bucketmapjoin_negative3.q.out U ql/src/test/results/clientpositive/annotate_stats_join.q.out U ql/src/test/results/clientpositive/groupby_map_ppr_multi_distinct.q.out U ql/src/test/results/clientpositive/ppd_union_view.q.out U ql/src/test/results/clientpositive/bucket_map_join_1.q.out U ql/src/test/results/clientpositive/sample6.q.out U ql/src/test/results/clientpositive/rand_partitionpruner1.q.out U ql/src/test/results/clientpositive/bucketcontext_5.q.out U ql/src/test/results/clientpositive/sample1.q.out U ql/src/test/results/clientpositive/bucketmapjoin5.q.out U ql/src/test/results/clientpositive/filter_join_breaktask.q.out U ql/src/test/results/clientpositive/ppr_allchildsarenull.q.out U ql/src/test/results/clientpositive/input_part9.q.out U ql/src/test/results/clientpositive/join32.q.out U ql/src/test/results/clientpositive/annotate_stats_part.q.out U ql/src/test/results/clientpositive/load_dyn_part8.q.out U ql/src/test/results/clientpositive/join_filters_overlap.q.out U ql/src/test/results/clientpositive/union22.q.out U ql/src/test/results/clientpositive/groupby_sort_6.q.out U ql/src/test/results/clientpositive/auto_sortmerge_join_12.q.out U ql/src/test/results/clientpositive/groupby_sort_1.q.out U ql/src/test/results/clientpositive/groupby_map_ppr.q.out U ql/src/test/results/clientpositive/ppd_vc.q.out U ql/src/test/results/clientpositive/sort_merge_join_desc_6.q.out U ql/src/test/results/clientpositive/sample8.q.out U ql/src/test/results/clientpositive/rand_partitionpruner3.q.out U ql/src/test/results/clientpositive/bucketcontext_7.q.out U ql/src/test/results/clientpositive/bucketmapjoin7.q.out U ql/src/test/results/clientpositive/bucketcontext_2.q.out U ql/src/test/results/clientpositive/bucketmapjoin2.q.out U ql/src/test/results/clientpositive/bucketmapjoin11.q.out U ql/src/test/results/clientpositive/input42.q.out U ql/src/test/results/clientpositive/join34.q.out U 
ql/src/test/results/clientpositive/ppd_join_filter.q.out U ql/src/test/results/clientpositive/union24.q.out U ql/src/test/results/clientpositive/metadataonly1.q.out U ql/src/test/results/clientpositive/input_part1.q.out U ql/src/test/results/clientpositive/bucketmapjoin_negative2.q.out U ql/src/test/results/clientpositive/join9.q.out U ql/src/test/results/clientpositive/groupby_ppr_multi_distinct.q.out U ql/src/test/results/clientpositive/sample5.q.out U ql/src/test/results/clientpositive/bucketmapjoin9.q.out U ql/src/test/results/clientpositive/bucketcontext_4.q.out U ql/src/test/results/clientpositive/bucketmapjoin4.q.out U ql/src/test/results/clientpositive/bucketmapjoin13.q.out U ql/src/test/results/clientpositive/smb_mapjoin_13.q.out U ql/src/test/results/clientpositive/alter_partition_coltype.q.out U ql/src/java/org/apache/hadoop/hive/ql/plan/Statistics.java U ql/src/java/org/apache/hadoop/hive/ql/stats/StatsUtils.java U ql/src/java/org/apache/hadoop/hive/ql/optimizer/stats/annotation/StatsRulesProcFactory.java Fetching external item into 'hcatalog/src/test/e2e/harness' Updated external to revision 1545461. Updated to revision 1545461. + patchCommandPath=/data/hive-ptest/working/scratch/smart-apply-patch.sh + patchFilePath=/data/hive-ptest/working/scratch/build.patch + [[ -f /data/hive-ptest/working/scratch/build.patch ]] + chmod +x /data/hive-ptest/working/scratch/smart-apply-patch.sh + /data/hive-ptest/working/scratch/smart-apply-patch.sh /data/hive-ptest/working/scratch/build.patch Going to apply patch with: patch -p0 patching file ql/src/java/org/apache/hadoop/hive/ql/exec/FunctionRegistry.java patching file ql/src/java/org/apache/hadoop/hive/ql/exec/vector/VectorizationContext.java patching file ql/src/java/org/apache/hadoop/hive/ql/optimizer/physical/Vectorizer.java patching file ql/src/java/org/apache/hadoop/hive/ql/udf/UDFCeil.java patching file ql/src/java/org/apache/hadoop/hive/ql/udf/UDFFloor.java patching file ql/src/java/org/apache/hadoop/hive/ql/udf/UDFOPNegative.java patching file ql/src/java/org/apache/hadoop/hive/ql/udf/UDFOPPositive.java patching file ql/src/java/org/apache/hadoop/hive/ql/udf/UDFPower.java patching file ql/src/test/org/apache/hadoop/hive/ql/exec/vector/TestVectorizationContext.java patching file ql/src/test/results/clientpositive/decimal_udf.q.out patching file ql/src/test/results/clientpositive/literal_decimal.q.out patching file ql/src/test/results/clientpositive/udf4.q.out patching file ql/src/test/results/clientpositive/udf7.q.out patching file ql/src/test/results/clientpositive/vectorization_short_regress.q.out patching file ql/src/test/results/clientpositive/vectorized_math_funcs.q.out patching file ql/src/test/results/compiler/plan/udf4.q.xml + [[ maven == \m\a\v\e\n ]] + rm -rf /data/hive-ptest/working/maven/org/apache/hive + mvn -B clean install -DskipTests -Dmaven.repo.local=/data/hive-ptest/working/maven [INFO] Scanning for projects... 
[INFO] ------------------------------------------------------------------------ [INFO] Reactor Build Order: [INFO] [INFO] Hive [INFO] Hive Ant Utilities [INFO] Hive Shims Common [INFO] Hive Shims 0.20 [INFO] Hive Shims Secure Common [INFO] Hive Shims 0.20S [INFO] Hive Shims 0.23 [INFO] Hive Shims [INFO] Hive Common [INFO] Hive Serde [INFO] Hive Metastore [INFO] Hive Query Language [INFO] Hive Service [INFO] Hive JDBC [INFO] Hive Beeline [INFO] Hive CLI [INFO] Hive Contrib [INFO] Hive HBase Handler [INFO] Hive HCatalog [INFO] Hive HCatalog Core [INFO] Hive HCatalog Pig Adapter [INFO] Hive HCatalog Server Extensions [INFO] Hive HCatalog Webhcat Java Client [INFO] Hive HCatalog Webhcat [INFO] Hive HCatalog HBase Storage Handler [INFO] Hive HWI [INFO] Hive ODBC [INFO] Hive Shims Aggregator [INFO] Hive TestUtils [INFO] Hive Packaging [INFO] [INFO] ------------------------------------------------------------------------ [INFO] Building Hive 0.13.0-SNAPSHOT [INFO] ------------------------------------------------------------------------ [INFO] [INFO] --- maven-clean-plugin:2.5:clean (default-clean) @ hive --- [INFO] Deleting /data/hive-ptest/working/apache-svn-trunk-source (includes = [datanucleus.log, derby.log], excludes = []) [INFO] [INFO] --- maven-antrun-plugin:1.7:run (define-classpath) @ hive --- [INFO] Executing tasks main: [INFO] Executed tasks [INFO] [INFO] --- maven-antrun-plugin:1.7:run (setup-test-dirs) @ hive --- [INFO] Executing tasks main: [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/target/tmp [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/target/warehouse [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/target/tmp/conf [copy] Copying 4 files to /data/hive-ptest/working/apache-svn-trunk-source/target/tmp/conf [INFO] Executed tasks [INFO] [INFO] --- maven-install-plugin:2.4:install (default-install) @ hive --- [INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/pom.xml to /data/hive-ptest/working/maven/org/apache/hive/hive/0.13.0-SNAPSHOT/hive-0.13.0-SNAPSHOT.pom [INFO] [INFO] ------------------------------------------------------------------------ [INFO] Building Hive Ant Utilities 0.13.0-SNAPSHOT [INFO] ------------------------------------------------------------------------ [INFO] [INFO] --- maven-clean-plugin:2.5:clean (default-clean) @ hive-ant --- [INFO] Deleting /data/hive-ptest/working/apache-svn-trunk-source/ant (includes = [datanucleus.log, derby.log], excludes = []) [INFO] [INFO] --- maven-resources-plugin:2.5:resources (default-resources) @ hive-ant --- [debug] execute contextualize [INFO] Using 'UTF-8' encoding to copy filtered resources. [INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-trunk-source/ant/src/main/resources [INFO] [INFO] --- maven-antrun-plugin:1.7:run (define-classpath) @ hive-ant --- [INFO] Executing tasks main: [INFO] Executed tasks [INFO] [INFO] --- maven-compiler-plugin:3.1:compile (default-compile) @ hive-ant --- [INFO] Compiling 5 source files to /data/hive-ptest/working/apache-svn-trunk-source/ant/target/classes [WARNING] Note: /data/hive-ptest/working/apache-svn-trunk-source/ant/src/org/apache/hadoop/hive/ant/QTestGenTask.java uses or overrides a deprecated API. [WARNING] Note: Recompile with -Xlint:deprecation for details. [WARNING] Note: /data/hive-ptest/working/apache-svn-trunk-source/ant/src/org/apache/hadoop/hive/ant/DistinctElementsClassPath.java uses unchecked or unsafe operations. 
[WARNING] Note: Recompile with -Xlint:unchecked for details. [INFO] [INFO] --- maven-resources-plugin:2.5:testResources (default-testResources) @ hive-ant --- [debug] execute contextualize [INFO] Using 'UTF-8' encoding to copy filtered resources. [INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-trunk-source/ant/src/test/resources [INFO] [INFO] --- maven-antrun-plugin:1.7:run (setup-test-dirs) @ hive-ant --- [INFO] Executing tasks main: [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/ant/target/tmp [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/ant/target/warehouse [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/ant/target/tmp/conf [copy] Copying 4 files to /data/hive-ptest/working/apache-svn-trunk-source/ant/target/tmp/conf [INFO] Executed tasks [INFO] [INFO] --- maven-compiler-plugin:3.1:testCompile (default-testCompile) @ hive-ant --- [INFO] No sources to compile [INFO] [INFO] --- maven-surefire-plugin:2.16:test (default-test) @ hive-ant --- [INFO] Tests are skipped. [INFO] [INFO] --- maven-jar-plugin:2.2:jar (default-jar) @ hive-ant --- [INFO] Building jar: /data/hive-ptest/working/apache-svn-trunk-source/ant/target/hive-ant-0.13.0-SNAPSHOT.jar [INFO] [INFO] --- maven-install-plugin:2.4:install (default-install) @ hive-ant --- [INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/ant/target/hive-ant-0.13.0-SNAPSHOT.jar to /data/hive-ptest/working/maven/org/apache/hive/hive-ant/0.13.0-SNAPSHOT/hive-ant-0.13.0-SNAPSHOT.jar [INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/ant/pom.xml to /data/hive-ptest/working/maven/org/apache/hive/hive-ant/0.13.0-SNAPSHOT/hive-ant-0.13.0-SNAPSHOT.pom [INFO] [INFO] ------------------------------------------------------------------------ [INFO] Building Hive Shims Common 0.13.0-SNAPSHOT [INFO] ------------------------------------------------------------------------ [INFO] [INFO] --- maven-clean-plugin:2.5:clean (default-clean) @ hive-shims-common --- [INFO] Deleting /data/hive-ptest/working/apache-svn-trunk-source/shims/common (includes = [datanucleus.log, derby.log], excludes = []) [INFO] [INFO] --- maven-resources-plugin:2.5:resources (default-resources) @ hive-shims-common --- [debug] execute contextualize [INFO] Using 'UTF-8' encoding to copy filtered resources. [INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-trunk-source/shims/common/src/main/resources [INFO] [INFO] --- maven-antrun-plugin:1.7:run (define-classpath) @ hive-shims-common --- [INFO] Executing tasks main: [INFO] Executed tasks [INFO] [INFO] --- maven-compiler-plugin:3.1:compile (default-compile) @ hive-shims-common --- [INFO] Compiling 15 source files to /data/hive-ptest/working/apache-svn-trunk-source/shims/common/target/classes [WARNING] Note: Some input files use or override a deprecated API. [WARNING] Note: Recompile with -Xlint:deprecation for details. [INFO] [INFO] --- maven-resources-plugin:2.5:testResources (default-testResources) @ hive-shims-common --- [debug] execute contextualize [INFO] Using 'UTF-8' encoding to copy filtered resources. 
[INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-trunk-source/shims/common/src/test/resources [INFO] [INFO] --- maven-antrun-plugin:1.7:run (setup-test-dirs) @ hive-shims-common --- [INFO] Executing tasks main: [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/shims/common/target/tmp [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/shims/common/target/warehouse [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/shims/common/target/tmp/conf [copy] Copying 4 files to /data/hive-ptest/working/apache-svn-trunk-source/shims/common/target/tmp/conf [INFO] Executed tasks [INFO] [INFO] --- maven-compiler-plugin:3.1:testCompile (default-testCompile) @ hive-shims-common --- [INFO] No sources to compile [INFO] [INFO] --- maven-surefire-plugin:2.16:test (default-test) @ hive-shims-common --- [INFO] Tests are skipped. [INFO] [INFO] --- maven-jar-plugin:2.2:jar (default-jar) @ hive-shims-common --- [INFO] Building jar: /data/hive-ptest/working/apache-svn-trunk-source/shims/common/target/hive-shims-common-0.13.0-SNAPSHOT.jar [INFO] [INFO] --- maven-install-plugin:2.4:install (default-install) @ hive-shims-common --- [INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/shims/common/target/hive-shims-common-0.13.0-SNAPSHOT.jar to /data/hive-ptest/working/maven/org/apache/hive/shims/hive-shims-common/0.13.0-SNAPSHOT/hive-shims-common-0.13.0-SNAPSHOT.jar [INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/shims/common/pom.xml to /data/hive-ptest/working/maven/org/apache/hive/shims/hive-shims-common/0.13.0-SNAPSHOT/hive-shims-common-0.13.0-SNAPSHOT.pom [INFO] [INFO] ------------------------------------------------------------------------ [INFO] Building Hive Shims 0.20 0.13.0-SNAPSHOT [INFO] ------------------------------------------------------------------------ [INFO] [INFO] --- maven-clean-plugin:2.5:clean (default-clean) @ hive-shims-0.20 --- [INFO] Deleting /data/hive-ptest/working/apache-svn-trunk-source/shims/0.20 (includes = [datanucleus.log, derby.log], excludes = []) [INFO] [INFO] --- maven-resources-plugin:2.5:resources (default-resources) @ hive-shims-0.20 --- [debug] execute contextualize [INFO] Using 'UTF-8' encoding to copy filtered resources. [INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-trunk-source/shims/0.20/src/main/resources [INFO] [INFO] --- maven-antrun-plugin:1.7:run (define-classpath) @ hive-shims-0.20 --- [INFO] Executing tasks main: [INFO] Executed tasks [INFO] [INFO] --- maven-compiler-plugin:3.1:compile (default-compile) @ hive-shims-0.20 --- [INFO] Compiling 2 source files to /data/hive-ptest/working/apache-svn-trunk-source/shims/0.20/target/classes [WARNING] Note: /data/hive-ptest/working/apache-svn-trunk-source/shims/0.20/src/main/java/org/apache/hadoop/hive/shims/Hadoop20Shims.java uses or overrides a deprecated API. [WARNING] Note: Recompile with -Xlint:deprecation for details. [WARNING] Note: /data/hive-ptest/working/apache-svn-trunk-source/shims/0.20/src/main/java/org/apache/hadoop/hive/shims/Hadoop20Shims.java uses unchecked or unsafe operations. [WARNING] Note: Recompile with -Xlint:unchecked for details. [INFO] [INFO] --- maven-resources-plugin:2.5:testResources (default-testResources) @ hive-shims-0.20 --- [debug] execute contextualize [INFO] Using 'UTF-8' encoding to copy filtered resources. 
[INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-trunk-source/shims/0.20/src/test/resources [INFO] [INFO] --- maven-antrun-plugin:1.7:run (setup-test-dirs) @ hive-shims-0.20 --- [INFO] Executing tasks main: [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/shims/0.20/target/tmp [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/shims/0.20/target/warehouse [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/shims/0.20/target/tmp/conf [copy] Copying 4 files to /data/hive-ptest/working/apache-svn-trunk-source/shims/0.20/target/tmp/conf [INFO] Executed tasks [INFO] [INFO] --- maven-compiler-plugin:3.1:testCompile (default-testCompile) @ hive-shims-0.20 --- [INFO] No sources to compile [INFO] [INFO] --- maven-surefire-plugin:2.16:test (default-test) @ hive-shims-0.20 --- [INFO] Tests are skipped. [INFO] [INFO] --- maven-jar-plugin:2.2:jar (default-jar) @ hive-shims-0.20 --- [INFO] Building jar: /data/hive-ptest/working/apache-svn-trunk-source/shims/0.20/target/hive-shims-0.20-0.13.0-SNAPSHOT.jar [INFO] [INFO] --- maven-install-plugin:2.4:install (default-install) @ hive-shims-0.20 --- [INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/shims/0.20/target/hive-shims-0.20-0.13.0-SNAPSHOT.jar to /data/hive-ptest/working/maven/org/apache/hive/shims/hive-shims-0.20/0.13.0-SNAPSHOT/hive-shims-0.20-0.13.0-SNAPSHOT.jar [INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/shims/0.20/pom.xml to /data/hive-ptest/working/maven/org/apache/hive/shims/hive-shims-0.20/0.13.0-SNAPSHOT/hive-shims-0.20-0.13.0-SNAPSHOT.pom [INFO] [INFO] ------------------------------------------------------------------------ [INFO] Building Hive Shims Secure Common 0.13.0-SNAPSHOT [INFO] ------------------------------------------------------------------------ [INFO] [INFO] --- maven-clean-plugin:2.5:clean (default-clean) @ hive-shims-common-secure --- [INFO] Deleting /data/hive-ptest/working/apache-svn-trunk-source/shims/common-secure (includes = [datanucleus.log, derby.log], excludes = []) [INFO] [INFO] --- maven-resources-plugin:2.5:resources (default-resources) @ hive-shims-common-secure --- [debug] execute contextualize [INFO] Using 'UTF-8' encoding to copy filtered resources. [INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-trunk-source/shims/common-secure/src/main/resources [INFO] [INFO] --- maven-antrun-plugin:1.7:run (define-classpath) @ hive-shims-common-secure --- [INFO] Executing tasks main: [INFO] Executed tasks [INFO] [INFO] --- maven-compiler-plugin:3.1:compile (default-compile) @ hive-shims-common-secure --- [INFO] Compiling 12 source files to /data/hive-ptest/working/apache-svn-trunk-source/shims/common-secure/target/classes [WARNING] Note: /data/hive-ptest/working/apache-svn-trunk-source/shims/common-secure/src/main/java/org/apache/hadoop/hive/shims/HadoopShimsSecure.java uses or overrides a deprecated API. [WARNING] Note: Recompile with -Xlint:deprecation for details. [WARNING] Note: Some input files use unchecked or unsafe operations. [WARNING] Note: Recompile with -Xlint:unchecked for details. [INFO] [INFO] --- maven-resources-plugin:2.5:testResources (default-testResources) @ hive-shims-common-secure --- [debug] execute contextualize [INFO] Using 'UTF-8' encoding to copy filtered resources. 
[INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-trunk-source/shims/common-secure/src/test/resources [INFO] [INFO] --- maven-antrun-plugin:1.7:run (setup-test-dirs) @ hive-shims-common-secure --- [INFO] Executing tasks main: [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/shims/common-secure/target/tmp [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/shims/common-secure/target/warehouse [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/shims/common-secure/target/tmp/conf [copy] Copying 4 files to /data/hive-ptest/working/apache-svn-trunk-source/shims/common-secure/target/tmp/conf [INFO] Executed tasks [INFO] [INFO] --- maven-compiler-plugin:3.1:testCompile (default-testCompile) @ hive-shims-common-secure --- [INFO] No sources to compile [INFO] [INFO] --- maven-surefire-plugin:2.16:test (default-test) @ hive-shims-common-secure --- [INFO] Tests are skipped. [INFO] [INFO] --- maven-jar-plugin:2.2:jar (default-jar) @ hive-shims-common-secure --- [INFO] Building jar: /data/hive-ptest/working/apache-svn-trunk-source/shims/common-secure/target/hive-shims-common-secure-0.13.0-SNAPSHOT.jar [INFO] [INFO] --- maven-install-plugin:2.4:install (default-install) @ hive-shims-common-secure --- [INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/shims/common-secure/target/hive-shims-common-secure-0.13.0-SNAPSHOT.jar to /data/hive-ptest/working/maven/org/apache/hive/shims/hive-shims-common-secure/0.13.0-SNAPSHOT/hive-shims-common-secure-0.13.0-SNAPSHOT.jar [INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/shims/common-secure/pom.xml to /data/hive-ptest/working/maven/org/apache/hive/shims/hive-shims-common-secure/0.13.0-SNAPSHOT/hive-shims-common-secure-0.13.0-SNAPSHOT.pom [INFO] [INFO] ------------------------------------------------------------------------ [INFO] Building Hive Shims 0.20S 0.13.0-SNAPSHOT [INFO] ------------------------------------------------------------------------ [INFO] [INFO] --- maven-clean-plugin:2.5:clean (default-clean) @ hive-shims-0.20S --- [INFO] Deleting /data/hive-ptest/working/apache-svn-trunk-source/shims/0.20S (includes = [datanucleus.log, derby.log], excludes = []) [INFO] [INFO] --- maven-resources-plugin:2.5:resources (default-resources) @ hive-shims-0.20S --- [debug] execute contextualize [INFO] Using 'UTF-8' encoding to copy filtered resources. [INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-trunk-source/shims/0.20S/src/main/resources [INFO] [INFO] --- maven-antrun-plugin:1.7:run (define-classpath) @ hive-shims-0.20S --- [INFO] Executing tasks main: [INFO] Executed tasks [INFO] [INFO] --- maven-compiler-plugin:3.1:compile (default-compile) @ hive-shims-0.20S --- [INFO] Compiling 3 source files to /data/hive-ptest/working/apache-svn-trunk-source/shims/0.20S/target/classes [WARNING] Note: /data/hive-ptest/working/apache-svn-trunk-source/shims/0.20S/src/main/java/org/apache/hadoop/hive/shims/Hadoop20SShims.java uses or overrides a deprecated API. [WARNING] Note: Recompile with -Xlint:deprecation for details. [INFO] [INFO] --- maven-resources-plugin:2.5:testResources (default-testResources) @ hive-shims-0.20S --- [debug] execute contextualize [INFO] Using 'UTF-8' encoding to copy filtered resources. 
[INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-trunk-source/shims/0.20S/src/test/resources [INFO] [INFO] --- maven-antrun-plugin:1.7:run (setup-test-dirs) @ hive-shims-0.20S --- [INFO] Executing tasks main: [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/shims/0.20S/target/tmp [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/shims/0.20S/target/warehouse [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/shims/0.20S/target/tmp/conf [copy] Copying 4 files to /data/hive-ptest/working/apache-svn-trunk-source/shims/0.20S/target/tmp/conf [INFO] Executed tasks [INFO] [INFO] --- maven-compiler-plugin:3.1:testCompile (default-testCompile) @ hive-shims-0.20S --- [INFO] No sources to compile [INFO] [INFO] --- maven-surefire-plugin:2.16:test (default-test) @ hive-shims-0.20S --- [INFO] Tests are skipped. [INFO] [INFO] --- maven-jar-plugin:2.2:jar (default-jar) @ hive-shims-0.20S --- [INFO] Building jar: /data/hive-ptest/working/apache-svn-trunk-source/shims/0.20S/target/hive-shims-0.20S-0.13.0-SNAPSHOT.jar [INFO] [INFO] --- maven-install-plugin:2.4:install (default-install) @ hive-shims-0.20S --- [INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/shims/0.20S/target/hive-shims-0.20S-0.13.0-SNAPSHOT.jar to /data/hive-ptest/working/maven/org/apache/hive/shims/hive-shims-0.20S/0.13.0-SNAPSHOT/hive-shims-0.20S-0.13.0-SNAPSHOT.jar [INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/shims/0.20S/pom.xml to /data/hive-ptest/working/maven/org/apache/hive/shims/hive-shims-0.20S/0.13.0-SNAPSHOT/hive-shims-0.20S-0.13.0-SNAPSHOT.pom [INFO] [INFO] ------------------------------------------------------------------------ [INFO] Building Hive Shims 0.23 0.13.0-SNAPSHOT [INFO] ------------------------------------------------------------------------ [INFO] [INFO] --- maven-clean-plugin:2.5:clean (default-clean) @ hive-shims-0.23 --- [INFO] Deleting /data/hive-ptest/working/apache-svn-trunk-source/shims/0.23 (includes = [datanucleus.log, derby.log], excludes = []) [INFO] [INFO] --- maven-resources-plugin:2.5:resources (default-resources) @ hive-shims-0.23 --- [debug] execute contextualize [INFO] Using 'UTF-8' encoding to copy filtered resources. [INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-trunk-source/shims/0.23/src/main/resources [INFO] [INFO] --- maven-antrun-plugin:1.7:run (define-classpath) @ hive-shims-0.23 --- [INFO] Executing tasks main: [INFO] Executed tasks [INFO] [INFO] --- maven-compiler-plugin:3.1:compile (default-compile) @ hive-shims-0.23 --- [INFO] Compiling 3 source files to /data/hive-ptest/working/apache-svn-trunk-source/shims/0.23/target/classes [WARNING] Note: /data/hive-ptest/working/apache-svn-trunk-source/shims/0.23/src/main/java/org/apache/hadoop/hive/shims/Hadoop23Shims.java uses or overrides a deprecated API. [WARNING] Note: Recompile with -Xlint:deprecation for details. [INFO] [INFO] --- maven-resources-plugin:2.5:testResources (default-testResources) @ hive-shims-0.23 --- [debug] execute contextualize [INFO] Using 'UTF-8' encoding to copy filtered resources. 
[INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-trunk-source/shims/0.23/src/test/resources [INFO] [INFO] --- maven-antrun-plugin:1.7:run (setup-test-dirs) @ hive-shims-0.23 --- [INFO] Executing tasks main: [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/shims/0.23/target/tmp [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/shims/0.23/target/warehouse [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/shims/0.23/target/tmp/conf [copy] Copying 4 files to /data/hive-ptest/working/apache-svn-trunk-source/shims/0.23/target/tmp/conf [INFO] Executed tasks [INFO] [INFO] --- maven-compiler-plugin:3.1:testCompile (default-testCompile) @ hive-shims-0.23 --- [INFO] No sources to compile [INFO] [INFO] --- maven-surefire-plugin:2.16:test (default-test) @ hive-shims-0.23 --- [INFO] Tests are skipped. [INFO] [INFO] --- maven-jar-plugin:2.2:jar (default-jar) @ hive-shims-0.23 --- [INFO] Building jar: /data/hive-ptest/working/apache-svn-trunk-source/shims/0.23/target/hive-shims-0.23-0.13.0-SNAPSHOT.jar [INFO] [INFO] --- maven-install-plugin:2.4:install (default-install) @ hive-shims-0.23 --- [INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/shims/0.23/target/hive-shims-0.23-0.13.0-SNAPSHOT.jar to /data/hive-ptest/working/maven/org/apache/hive/shims/hive-shims-0.23/0.13.0-SNAPSHOT/hive-shims-0.23-0.13.0-SNAPSHOT.jar [INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/shims/0.23/pom.xml to /data/hive-ptest/working/maven/org/apache/hive/shims/hive-shims-0.23/0.13.0-SNAPSHOT/hive-shims-0.23-0.13.0-SNAPSHOT.pom [INFO] [INFO] ------------------------------------------------------------------------ [INFO] Building Hive Shims 0.13.0-SNAPSHOT [INFO] ------------------------------------------------------------------------ [INFO] [INFO] --- maven-clean-plugin:2.5:clean (default-clean) @ hive-shims --- [INFO] Deleting /data/hive-ptest/working/apache-svn-trunk-source/shims/assembly (includes = [datanucleus.log, derby.log], excludes = []) [INFO] [INFO] --- maven-resources-plugin:2.5:resources (default-resources) @ hive-shims --- [debug] execute contextualize [INFO] Using 'UTF-8' encoding to copy filtered resources. [INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-trunk-source/shims/assembly/src/main/resources [INFO] [INFO] --- maven-antrun-plugin:1.7:run (define-classpath) @ hive-shims --- [INFO] Executing tasks main: [INFO] Executed tasks [INFO] [INFO] --- maven-compiler-plugin:3.1:compile (default-compile) @ hive-shims --- [INFO] No sources to compile [INFO] [INFO] --- maven-resources-plugin:2.5:testResources (default-testResources) @ hive-shims --- [debug] execute contextualize [INFO] Using 'UTF-8' encoding to copy filtered resources. 
[INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-trunk-source/shims/assembly/src/test/resources [INFO] [INFO] --- maven-antrun-plugin:1.7:run (setup-test-dirs) @ hive-shims --- [INFO] Executing tasks main: [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/shims/assembly/target/tmp [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/shims/assembly/target/warehouse [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/shims/assembly/target/tmp/conf [copy] Copying 4 files to /data/hive-ptest/working/apache-svn-trunk-source/shims/assembly/target/tmp/conf [INFO] Executed tasks [INFO] [INFO] --- maven-compiler-plugin:3.1:testCompile (default-testCompile) @ hive-shims --- [INFO] No sources to compile [INFO] [INFO] --- maven-surefire-plugin:2.16:test (default-test) @ hive-shims --- [INFO] Tests are skipped. [INFO] [INFO] --- maven-jar-plugin:2.2:jar (default-jar) @ hive-shims --- [WARNING] JAR will be empty - no content was marked for inclusion! [INFO] Building jar: /data/hive-ptest/working/apache-svn-trunk-source/shims/assembly/target/hive-shims-0.13.0-SNAPSHOT.jar [INFO] [INFO] --- maven-assembly-plugin:2.3:single (uberjar) @ hive-shims --- [INFO] Reading assembly descriptor: src/assemble/uberjar.xml [WARNING] Artifact: org.apache.hive:hive-shims:jar:0.13.0-SNAPSHOT references the same file as the assembly destination file. Moving it to a temporary location for inclusion. [INFO] META-INF/MANIFEST.MF already added, skipping [INFO] META-INF/MANIFEST.MF already added, skipping [INFO] META-INF/MANIFEST.MF already added, skipping [INFO] META-INF/MANIFEST.MF already added, skipping [INFO] META-INF/MANIFEST.MF already added, skipping [INFO] Building jar: /data/hive-ptest/working/apache-svn-trunk-source/shims/assembly/target/hive-shims-0.13.0-SNAPSHOT.jar [INFO] META-INF/MANIFEST.MF already added, skipping [INFO] META-INF/MANIFEST.MF already added, skipping [INFO] META-INF/MANIFEST.MF already added, skipping [INFO] META-INF/MANIFEST.MF already added, skipping [INFO] META-INF/MANIFEST.MF already added, skipping [WARNING] Configuration options: 'appendAssemblyId' is set to false, and 'classifier' is missing. Instead of attaching the assembly file: /data/hive-ptest/working/apache-svn-trunk-source/shims/assembly/target/hive-shims-0.13.0-SNAPSHOT.jar, it will become the file for main project artifact. NOTE: If multiple descriptors or descriptor-formats are provided for this project, the value of this file will be non-deterministic! 
[WARNING] Replacing pre-existing project main-artifact file: /data/hive-ptest/working/apache-svn-trunk-source/shims/assembly/target/archive-tmp/hive-shims-0.13.0-SNAPSHOT.jar with assembly file: /data/hive-ptest/working/apache-svn-trunk-source/shims/assembly/target/hive-shims-0.13.0-SNAPSHOT.jar [INFO] [INFO] --- maven-install-plugin:2.4:install (default-install) @ hive-shims --- [INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/shims/assembly/target/hive-shims-0.13.0-SNAPSHOT.jar to /data/hive-ptest/working/maven/org/apache/hive/hive-shims/0.13.0-SNAPSHOT/hive-shims-0.13.0-SNAPSHOT.jar [INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/shims/assembly/pom.xml to /data/hive-ptest/working/maven/org/apache/hive/hive-shims/0.13.0-SNAPSHOT/hive-shims-0.13.0-SNAPSHOT.pom [INFO] [INFO] ------------------------------------------------------------------------ [INFO] Building Hive Common 0.13.0-SNAPSHOT [INFO] ------------------------------------------------------------------------ [INFO] [INFO] --- maven-clean-plugin:2.5:clean (default-clean) @ hive-common --- [INFO] Deleting /data/hive-ptest/working/apache-svn-trunk-source/common (includes = [datanucleus.log, derby.log], excludes = []) [INFO] [INFO] --- maven-antrun-plugin:1.7:run (generate-version-annotation) @ hive-common --- [INFO] Executing tasks main: [INFO] Executed tasks [INFO] [INFO] --- build-helper-maven-plugin:1.8:add-source (add-source) @ hive-common --- [INFO] Source directory: /data/hive-ptest/working/apache-svn-trunk-source/common/src/gen added. [INFO] [INFO] --- maven-resources-plugin:2.5:resources (default-resources) @ hive-common --- [debug] execute contextualize [INFO] Using 'UTF-8' encoding to copy filtered resources. [INFO] Copying 1 resource [INFO] [INFO] --- maven-antrun-plugin:1.7:run (define-classpath) @ hive-common --- [INFO] Executing tasks main: [INFO] Executed tasks [INFO] [INFO] --- maven-compiler-plugin:3.1:compile (default-compile) @ hive-common --- [INFO] Compiling 31 source files to /data/hive-ptest/working/apache-svn-trunk-source/common/target/classes [WARNING] Note: /data/hive-ptest/working/apache-svn-trunk-source/common/src/java/org/apache/hadoop/hive/common/ObjectPair.java uses unchecked or unsafe operations. [WARNING] Note: Recompile with -Xlint:unchecked for details. [INFO] [INFO] --- maven-resources-plugin:2.5:testResources (default-testResources) @ hive-common --- [debug] execute contextualize [INFO] Using 'UTF-8' encoding to copy filtered resources. [INFO] Copying 4 resources [INFO] [INFO] --- maven-antrun-plugin:1.7:run (setup-test-dirs) @ hive-common --- [INFO] Executing tasks main: [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/common/target/tmp [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/common/target/warehouse [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/common/target/tmp/conf [copy] Copying 4 files to /data/hive-ptest/working/apache-svn-trunk-source/common/target/tmp/conf [INFO] Executed tasks [INFO] [INFO] --- maven-compiler-plugin:3.1:testCompile (default-testCompile) @ hive-common --- [INFO] Compiling 8 source files to /data/hive-ptest/working/apache-svn-trunk-source/common/target/test-classes [INFO] [INFO] --- maven-surefire-plugin:2.16:test (default-test) @ hive-common --- [INFO] Tests are skipped. 
[INFO] [INFO] --- maven-jar-plugin:2.2:jar (default-jar) @ hive-common --- [INFO] Building jar: /data/hive-ptest/working/apache-svn-trunk-source/common/target/hive-common-0.13.0-SNAPSHOT.jar [INFO] [INFO] --- maven-install-plugin:2.4:install (default-install) @ hive-common --- [INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/common/target/hive-common-0.13.0-SNAPSHOT.jar to /data/hive-ptest/working/maven/org/apache/hive/hive-common/0.13.0-SNAPSHOT/hive-common-0.13.0-SNAPSHOT.jar [INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/common/pom.xml to /data/hive-ptest/working/maven/org/apache/hive/hive-common/0.13.0-SNAPSHOT/hive-common-0.13.0-SNAPSHOT.pom [INFO] [INFO] ------------------------------------------------------------------------ [INFO] Building Hive Serde 0.13.0-SNAPSHOT [INFO] ------------------------------------------------------------------------ [INFO] [INFO] --- maven-clean-plugin:2.5:clean (default-clean) @ hive-serde --- [INFO] Deleting /data/hive-ptest/working/apache-svn-trunk-source/serde (includes = [datanucleus.log, derby.log], excludes = []) [INFO] [INFO] --- build-helper-maven-plugin:1.8:add-source (add-source) @ hive-serde --- [INFO] Source directory: /data/hive-ptest/working/apache-svn-trunk-source/serde/src/gen/protobuf/gen-java added. [INFO] Source directory: /data/hive-ptest/working/apache-svn-trunk-source/serde/src/gen/thrift/gen-javabean added. [INFO] [INFO] --- maven-resources-plugin:2.5:resources (default-resources) @ hive-serde --- [debug] execute contextualize [INFO] Using 'UTF-8' encoding to copy filtered resources. [INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-trunk-source/serde/src/main/resources [INFO] [INFO] --- maven-antrun-plugin:1.7:run (define-classpath) @ hive-serde --- [INFO] Executing tasks main: [INFO] Executed tasks [INFO] [INFO] --- maven-compiler-plugin:3.1:compile (default-compile) @ hive-serde --- [INFO] Compiling 351 source files to /data/hive-ptest/working/apache-svn-trunk-source/serde/target/classes [WARNING] Note: Some input files use or override a deprecated API. [WARNING] Note: Recompile with -Xlint:deprecation for details. [WARNING] Note: Some input files use unchecked or unsafe operations. [WARNING] Note: Recompile with -Xlint:unchecked for details. [INFO] [INFO] --- maven-resources-plugin:2.5:testResources (default-testResources) @ hive-serde --- [debug] execute contextualize [INFO] Using 'UTF-8' encoding to copy filtered resources. [INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-trunk-source/serde/src/test/resources [INFO] [INFO] --- maven-antrun-plugin:1.7:run (setup-test-dirs) @ hive-serde --- [INFO] Executing tasks main: [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/serde/target/tmp [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/serde/target/warehouse [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/serde/target/tmp/conf [copy] Copying 4 files to /data/hive-ptest/working/apache-svn-trunk-source/serde/target/tmp/conf [INFO] Executed tasks [INFO] [INFO] --- maven-compiler-plugin:3.1:testCompile (default-testCompile) @ hive-serde --- [INFO] Compiling 41 source files to /data/hive-ptest/working/apache-svn-trunk-source/serde/target/test-classes [WARNING] Note: Some input files use or override a deprecated API. [WARNING] Note: Recompile with -Xlint:deprecation for details. [WARNING] Note: Some input files use unchecked or unsafe operations. 
[WARNING] Note: Recompile with -Xlint:unchecked for details. [INFO] [INFO] --- maven-surefire-plugin:2.16:test (default-test) @ hive-serde --- [INFO] Tests are skipped. [INFO] [INFO] --- maven-jar-plugin:2.2:jar (default-jar) @ hive-serde --- [INFO] Building jar: /data/hive-ptest/working/apache-svn-trunk-source/serde/target/hive-serde-0.13.0-SNAPSHOT.jar [INFO] [INFO] --- maven-install-plugin:2.4:install (default-install) @ hive-serde --- [INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/serde/target/hive-serde-0.13.0-SNAPSHOT.jar to /data/hive-ptest/working/maven/org/apache/hive/hive-serde/0.13.0-SNAPSHOT/hive-serde-0.13.0-SNAPSHOT.jar [INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/serde/pom.xml to /data/hive-ptest/working/maven/org/apache/hive/hive-serde/0.13.0-SNAPSHOT/hive-serde-0.13.0-SNAPSHOT.pom [INFO] [INFO] ------------------------------------------------------------------------ [INFO] Building Hive Metastore 0.13.0-SNAPSHOT [INFO] ------------------------------------------------------------------------ [INFO] [INFO] --- maven-clean-plugin:2.5:clean (default-clean) @ hive-metastore --- [INFO] Deleting /data/hive-ptest/working/apache-svn-trunk-source/metastore (includes = [datanucleus.log, derby.log], excludes = []) [INFO] [INFO] --- build-helper-maven-plugin:1.8:add-source (add-source) @ hive-metastore --- [INFO] Source directory: /data/hive-ptest/working/apache-svn-trunk-source/metastore/src/model added. [INFO] Source directory: /data/hive-ptest/working/apache-svn-trunk-source/metastore/src/gen/thrift/gen-javabean added. [INFO] [INFO] --- antlr3-maven-plugin:3.4:antlr (default) @ hive-metastore --- [INFO] ANTLR: Processing source directory /data/hive-ptest/working/apache-svn-trunk-source/metastore/src/java ANTLR Parser Generator Version 3.4 org/apache/hadoop/hive/metastore/parser/Filter.g [INFO] [INFO] --- maven-resources-plugin:2.5:resources (default-resources) @ hive-metastore --- [debug] execute contextualize [INFO] Using 'UTF-8' encoding to copy filtered resources. [INFO] Copying 1 resource [INFO] [INFO] --- maven-antrun-plugin:1.7:run (define-classpath) @ hive-metastore --- [INFO] Executing tasks main: [INFO] Executed tasks [INFO] [INFO] --- maven-compiler-plugin:3.1:compile (default-compile) @ hive-metastore --- [INFO] Compiling 132 source files to /data/hive-ptest/working/apache-svn-trunk-source/metastore/target/classes [WARNING] Note: Some input files use or override a deprecated API. [WARNING] Note: Recompile with -Xlint:deprecation for details. [WARNING] Note: Some input files use unchecked or unsafe operations. [WARNING] Note: Recompile with -Xlint:unchecked for details. 
[INFO] [INFO] --- datanucleus-maven-plugin:3.3.0-release:enhance (default) @ hive-metastore --- [INFO] DataNucleus Enhancer (version 3.2.2) for API "JDO" using JRE "1.6" DataNucleus Enhancer : Classpath >> /data/hive-ptest/working/maven/org/datanucleus/datanucleus-maven-plugin/3.3.0-release/datanucleus-maven-plugin-3.3.0-release.jar >> /data/hive-ptest/working/maven/org/datanucleus/datanucleus-core/3.2.2/datanucleus-core-3.2.2.jar >> /data/hive-ptest/working/maven/org/codehaus/plexus/plexus-utils/3.0.8/plexus-utils-3.0.8.jar >> /data/hive-ptest/working/maven/org/codehaus/plexus/plexus-component-annotations/1.5.5/plexus-component-annotations-1.5.5.jar >> /data/hive-ptest/working/maven/org/sonatype/sisu/sisu-inject-bean/2.3.0/sisu-inject-bean-2.3.0.jar >> /data/hive-ptest/working/maven/org/sonatype/sisu/sisu-guice/3.1.0/sisu-guice-3.1.0-no_aop.jar >> /data/hive-ptest/working/maven/org/sonatype/sisu/sisu-guava/0.9.9/sisu-guava-0.9.9.jar >> /data/hive-ptest/working/maven/org/apache/xbean/xbean-reflect/3.4/xbean-reflect-3.4.jar >> /data/hive-ptest/working/maven/log4j/log4j/1.2.12/log4j-1.2.12.jar >> /data/hive-ptest/working/maven/commons-logging/commons-logging-api/1.1/commons-logging-api-1.1.jar >> /data/hive-ptest/working/maven/com/google/collections/google-collections/1.0/google-collections-1.0.jar >> /data/hive-ptest/working/maven/junit/junit/3.8.2/junit-3.8.2.jar >> /data/hive-ptest/working/apache-svn-trunk-source/metastore/target/classes >> /data/hive-ptest/working/apache-svn-trunk-source/serde/target/hive-serde-0.13.0-SNAPSHOT.jar >> /data/hive-ptest/working/apache-svn-trunk-source/common/target/hive-common-0.13.0-SNAPSHOT.jar >> /data/hive-ptest/working/maven/org/apache/commons/commons-compress/1.4.1/commons-compress-1.4.1.jar >> /data/hive-ptest/working/maven/org/tukaani/xz/1.0/xz-1.0.jar >> /data/hive-ptest/working/maven/commons-codec/commons-codec/1.4/commons-codec-1.4.jar >> /data/hive-ptest/working/maven/org/apache/avro/avro/1.7.5/avro-1.7.5.jar >> /data/hive-ptest/working/maven/org/codehaus/jackson/jackson-core-asl/1.9.2/jackson-core-asl-1.9.2.jar >> /data/hive-ptest/working/maven/com/thoughtworks/paranamer/paranamer/2.3/paranamer-2.3.jar >> /data/hive-ptest/working/maven/org/xerial/snappy/snappy-java/1.0.5/snappy-java-1.0.5.jar >> /data/hive-ptest/working/apache-svn-trunk-source/shims/assembly/target/hive-shims-0.13.0-SNAPSHOT.jar >> /data/hive-ptest/working/maven/com/google/guava/guava/11.0.2/guava-11.0.2.jar >> /data/hive-ptest/working/maven/com/google/code/findbugs/jsr305/1.3.9/jsr305-1.3.9.jar >> /data/hive-ptest/working/maven/commons-cli/commons-cli/1.2/commons-cli-1.2.jar >> /data/hive-ptest/working/maven/commons-lang/commons-lang/2.4/commons-lang-2.4.jar >> /data/hive-ptest/working/maven/commons-logging/commons-logging/1.1.3/commons-logging-1.1.3.jar >> /data/hive-ptest/working/maven/org/apache/derby/derby/10.4.2.0/derby-10.4.2.0.jar >> /data/hive-ptest/working/maven/org/datanucleus/datanucleus-api-jdo/3.2.1/datanucleus-api-jdo-3.2.1.jar >> /data/hive-ptest/working/maven/org/datanucleus/datanucleus-rdbms/3.2.1/datanucleus-rdbms-3.2.1.jar >> /data/hive-ptest/working/maven/javax/jdo/jdo-api/3.0.1/jdo-api-3.0.1.jar >> /data/hive-ptest/working/maven/javax/transaction/jta/1.1/jta-1.1.jar >> /data/hive-ptest/working/maven/org/antlr/antlr-runtime/3.4/antlr-runtime-3.4.jar >> /data/hive-ptest/working/maven/org/antlr/stringtemplate/3.2.1/stringtemplate-3.2.1.jar >> /data/hive-ptest/working/maven/antlr/antlr/2.7.7/antlr-2.7.7.jar >> 
/data/hive-ptest/working/maven/org/apache/thrift/libfb303/0.9.0/libfb303-0.9.0.jar >> /data/hive-ptest/working/maven/org/apache/thrift/libthrift/0.9.0/libthrift-0.9.0.jar >> /data/hive-ptest/working/maven/org/apache/httpcomponents/httpclient/4.2.5/httpclient-4.2.5.jar >> /data/hive-ptest/working/maven/org/apache/httpcomponents/httpcore/4.2.4/httpcore-4.2.4.jar >> /data/hive-ptest/working/maven/org/apache/hadoop/hadoop-core/1.2.1/hadoop-core-1.2.1.jar >> /data/hive-ptest/working/maven/xmlenc/xmlenc/0.52/xmlenc-0.52.jar >> /data/hive-ptest/working/maven/com/sun/jersey/jersey-core/1.14/jersey-core-1.14.jar >> /data/hive-ptest/working/maven/com/sun/jersey/jersey-json/1.14/jersey-json-1.14.jar >> /data/hive-ptest/working/maven/org/codehaus/jettison/jettison/1.1/jettison-1.1.jar >> /data/hive-ptest/working/maven/stax/stax-api/1.0.1/stax-api-1.0.1.jar >> /data/hive-ptest/working/maven/com/sun/xml/bind/jaxb-impl/2.2.3-1/jaxb-impl-2.2.3-1.jar >> /data/hive-ptest/working/maven/javax/xml/bind/jaxb-api/2.2.2/jaxb-api-2.2.2.jar >> /data/hive-ptest/working/maven/javax/xml/stream/stax-api/1.0-2/stax-api-1.0-2.jar >> /data/hive-ptest/working/maven/javax/activation/activation/1.1/activation-1.1.jar >> /data/hive-ptest/working/maven/org/codehaus/jackson/jackson-jaxrs/1.9.2/jackson-jaxrs-1.9.2.jar >> /data/hive-ptest/working/maven/org/codehaus/jackson/jackson-xc/1.9.2/jackson-xc-1.9.2.jar >> /data/hive-ptest/working/maven/com/sun/jersey/jersey-server/1.14/jersey-server-1.14.jar >> /data/hive-ptest/working/maven/asm/asm/3.1/asm-3.1.jar >> /data/hive-ptest/working/maven/commons-io/commons-io/2.4/commons-io-2.4.jar >> /data/hive-ptest/working/maven/commons-httpclient/commons-httpclient/3.0.1/commons-httpclient-3.0.1.jar >> /data/hive-ptest/working/maven/org/apache/commons/commons-math/2.1/commons-math-2.1.jar >> /data/hive-ptest/working/maven/commons-configuration/commons-configuration/1.6/commons-configuration-1.6.jar >> /data/hive-ptest/working/maven/commons-collections/commons-collections/3.2.1/commons-collections-3.2.1.jar >> /data/hive-ptest/working/maven/commons-digester/commons-digester/1.8/commons-digester-1.8.jar >> /data/hive-ptest/working/maven/commons-beanutils/commons-beanutils/1.7.0/commons-beanutils-1.7.0.jar >> /data/hive-ptest/working/maven/commons-beanutils/commons-beanutils-core/1.8.0/commons-beanutils-core-1.8.0.jar >> /data/hive-ptest/working/maven/commons-net/commons-net/1.4.1/commons-net-1.4.1.jar >> /data/hive-ptest/working/maven/org/mortbay/jetty/jetty/6.1.26/jetty-6.1.26.jar >> /data/hive-ptest/working/maven/org/mortbay/jetty/servlet-api/2.5-20081211/servlet-api-2.5-20081211.jar >> /data/hive-ptest/working/maven/org/mortbay/jetty/jetty-util/6.1.26/jetty-util-6.1.26.jar >> /data/hive-ptest/working/maven/tomcat/jasper-runtime/5.5.12/jasper-runtime-5.5.12.jar >> /data/hive-ptest/working/maven/tomcat/jasper-compiler/5.5.12/jasper-compiler-5.5.12.jar >> /data/hive-ptest/working/maven/org/mortbay/jetty/jsp-api-2.1/6.1.14/jsp-api-2.1-6.1.14.jar >> /data/hive-ptest/working/maven/org/mortbay/jetty/servlet-api-2.5/6.1.14/servlet-api-2.5-6.1.14.jar >> /data/hive-ptest/working/maven/org/mortbay/jetty/jsp-2.1/6.1.14/jsp-2.1-6.1.14.jar >> /data/hive-ptest/working/maven/ant/ant/1.6.5/ant-1.6.5.jar >> /data/hive-ptest/working/maven/commons-el/commons-el/1.0/commons-el-1.0.jar >> /data/hive-ptest/working/maven/net/java/dev/jets3t/jets3t/0.6.1/jets3t-0.6.1.jar >> /data/hive-ptest/working/maven/hsqldb/hsqldb/1.8.0.10/hsqldb-1.8.0.10.jar >> /data/hive-ptest/working/maven/oro/oro/2.0.8/oro-2.0.8.jar >> 
/data/hive-ptest/working/maven/org/eclipse/jdt/core/3.1.1/core-3.1.1.jar >> /data/hive-ptest/working/maven/org/codehaus/jackson/jackson-mapper-asl/1.9.2/jackson-mapper-asl-1.9.2.jar >> /data/hive-ptest/working/maven/org/slf4j/slf4j-api/1.7.5/slf4j-api-1.7.5.jar >> /data/hive-ptest/working/maven/org/slf4j/slf4j-log4j12/1.7.5/slf4j-log4j12-1.7.5.jar >> /data/hive-ptest/working/maven/log4j/log4j/1.2.16/log4j-1.2.16.jar ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MDatabase ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MFieldSchema ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MType ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MTable ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MSerDeInfo ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MOrder ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MColumnDescriptor ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MStringList ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MStorageDescriptor ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MPartition ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MIndex ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MRole ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MRoleMap ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MGlobalPrivilege ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MDBPrivilege ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MTablePrivilege ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MPartitionPrivilege ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MTableColumnPrivilege ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MPartitionColumnPrivilege ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MPartitionEvent ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MMasterKey ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MDelegationToken ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MTableColumnStatistics ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MPartitionColumnStatistics ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MVersionTable DataNucleus Enhancer completed with success for 25 classes. Timings : input=594 ms, enhance=947 ms, total=1541 ms. Consult the log for full details [INFO] [INFO] --- maven-resources-plugin:2.5:testResources (default-testResources) @ hive-metastore --- [debug] execute contextualize [INFO] Using 'UTF-8' encoding to copy filtered resources. 
[INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-trunk-source/metastore/src/test/resources [INFO] [INFO] --- maven-antrun-plugin:1.7:run (setup-test-dirs) @ hive-metastore --- [INFO] Executing tasks main: [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/metastore/target/tmp [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/metastore/target/warehouse [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/metastore/target/tmp/conf [copy] Copying 4 files to /data/hive-ptest/working/apache-svn-trunk-source/metastore/target/tmp/conf [INFO] Executed tasks [INFO] [INFO] --- maven-compiler-plugin:3.1:testCompile (default-testCompile) @ hive-metastore --- [INFO] Compiling 10 source files to /data/hive-ptest/working/apache-svn-trunk-source/metastore/target/test-classes [INFO] [INFO] --- maven-surefire-plugin:2.16:test (default-test) @ hive-metastore --- [INFO] Tests are skipped. [INFO] [INFO] --- maven-jar-plugin:2.2:jar (default-jar) @ hive-metastore --- [INFO] Building jar: /data/hive-ptest/working/apache-svn-trunk-source/metastore/target/hive-metastore-0.13.0-SNAPSHOT.jar [INFO] [INFO] --- maven-jar-plugin:2.2:test-jar (default) @ hive-metastore --- [INFO] Building jar: /data/hive-ptest/working/apache-svn-trunk-source/metastore/target/hive-metastore-0.13.0-SNAPSHOT-tests.jar [INFO] [INFO] --- maven-install-plugin:2.4:install (default-install) @ hive-metastore --- [INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/metastore/target/hive-metastore-0.13.0-SNAPSHOT.jar to /data/hive-ptest/working/maven/org/apache/hive/hive-metastore/0.13.0-SNAPSHOT/hive-metastore-0.13.0-SNAPSHOT.jar [INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/metastore/pom.xml to /data/hive-ptest/working/maven/org/apache/hive/hive-metastore/0.13.0-SNAPSHOT/hive-metastore-0.13.0-SNAPSHOT.pom [INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/metastore/target/hive-metastore-0.13.0-SNAPSHOT-tests.jar to /data/hive-ptest/working/maven/org/apache/hive/hive-metastore/0.13.0-SNAPSHOT/hive-metastore-0.13.0-SNAPSHOT-tests.jar [INFO] [INFO] ------------------------------------------------------------------------ [INFO] Building Hive Query Language 0.13.0-SNAPSHOT [INFO] ------------------------------------------------------------------------ [INFO] [INFO] --- maven-clean-plugin:2.5:clean (default-clean) @ hive-exec --- [INFO] Deleting /data/hive-ptest/working/apache-svn-trunk-source/ql (includes = [datanucleus.log, derby.log], excludes = []) [INFO] [INFO] --- maven-antrun-plugin:1.7:run (generate-sources) @ hive-exec --- [INFO] Executing tasks main: [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/ql/target/generated-sources/java/org/apache/hadoop/hive/ql/exec/vector/expressions/gen [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/ql/target/generated-sources/java/org/apache/hadoop/hive/ql/exec/vector/expressions/aggregates/gen [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/ql/target/generated-test-sources/java/org/apache/hadoop/hive/ql/exec/vector/expressions/gen Generating vector expression code Generating vector expression test code [INFO] Executed tasks [INFO] [INFO] --- build-helper-maven-plugin:1.8:add-source (add-source) @ hive-exec --- [INFO] Source directory: /data/hive-ptest/working/apache-svn-trunk-source/ql/src/gen/protobuf/gen-java added. 
[INFO] Source directory: /data/hive-ptest/working/apache-svn-trunk-source/ql/src/gen/thrift/gen-javabean added. [INFO] Source directory: /data/hive-ptest/working/apache-svn-trunk-source/ql/target/generated-sources/java added. [INFO] [INFO] --- antlr3-maven-plugin:3.4:antlr (default) @ hive-exec --- [INFO] ANTLR: Processing source directory /data/hive-ptest/working/apache-svn-trunk-source/ql/src/java ANTLR Parser Generator Version 3.4 org/apache/hadoop/hive/ql/parse/HiveLexer.g org/apache/hadoop/hive/ql/parse/HiveParser.g warning(200): org/apache/hadoop/hive/ql/parse/HiveParser.g:872:5: Decision can match input such as "Identifier KW_RENAME KW_TO" using multiple alternatives: 1, 10 As a result, alternative(s) 10 were disabled for that input warning(200): org/apache/hadoop/hive/ql/parse/HiveParser.g:1177:5: Decision can match input such as "KW_TEXTFILE" using multiple alternatives: 2, 6 As a result, alternative(s) 6 were disabled for that input warning(200): org/apache/hadoop/hive/ql/parse/HiveParser.g:1177:5: Decision can match input such as "KW_SEQUENCEFILE" using multiple alternatives: 1, 6 As a result, alternative(s) 6 were disabled for that input warning(200): org/apache/hadoop/hive/ql/parse/HiveParser.g:1177:5: Decision can match input such as "KW_ORCFILE" using multiple alternatives: 4, 6 As a result, alternative(s) 6 were disabled for that input warning(200): org/apache/hadoop/hive/ql/parse/HiveParser.g:1177:5: Decision can match input such as "KW_RCFILE" using multiple alternatives: 3, 6 As a result, alternative(s) 6 were disabled for that input warning(200): org/apache/hadoop/hive/ql/parse/HiveParser.g:1190:23: Decision can match input such as "KW_ELEM_TYPE" using multiple alternatives: 1, 4 As a result, alternative(s) 4 were disabled for that input warning(200): org/apache/hadoop/hive/ql/parse/HiveParser.g:1190:23: Decision can match input such as "KW_KEY_TYPE" using multiple alternatives: 2, 4 As a result, alternative(s) 4 were disabled for that input warning(200): org/apache/hadoop/hive/ql/parse/HiveParser.g:1190:23: Decision can match input such as "KW_VALUE_TYPE" using multiple alternatives: 3, 4 As a result, alternative(s) 4 were disabled for that input warning(200): org/apache/hadoop/hive/ql/parse/HiveParser.g:1197:23: Decision can match input such as "KW_ELEM_TYPE" using multiple alternatives: 1, 4 As a result, alternative(s) 4 were disabled for that input warning(200): org/apache/hadoop/hive/ql/parse/HiveParser.g:1197:23: Decision can match input such as "KW_VALUE_TYPE" using multiple alternatives: 3, 4 As a result, alternative(s) 4 were disabled for that input warning(200): org/apache/hadoop/hive/ql/parse/HiveParser.g:1197:23: Decision can match input such as "KW_KEY_TYPE" using multiple alternatives: 2, 4 As a result, alternative(s) 4 were disabled for that input warning(200): org/apache/hadoop/hive/ql/parse/HiveParser.g:1215:29: Decision can match input such as "KW_PRETTY {KW_ADD..KW_AFTER, KW_ALTER..KW_ANALYZE, KW_ARCHIVE..KW_CASCADE, KW_CHANGE, KW_CLUSTER..KW_COLLECTION, KW_COLUMNS..KW_CREATE, KW_CUBE, KW_CURSOR..KW_DATA, KW_DATABASES..KW_DISABLE, KW_DISTRIBUTE..KW_ELEM_TYPE, KW_ENABLE, KW_ESCAPED, KW_EXCLUSIVE..KW_EXPORT, KW_EXTERNAL..KW_FLOAT, KW_FOR..KW_FORMATTED, KW_FULL, KW_FUNCTIONS..KW_GROUPING, KW_HOLD_DDLTIME..KW_IDXPROPERTIES, KW_IGNORE..KW_ITEMS, KW_KEYS..KW_LEFT, KW_LIKE..KW_LONG, KW_MAPJOIN..KW_MINUS, KW_MSCK..KW_NOSCAN, KW_NO_DROP..KW_OFFLINE, KW_OPTION, KW_ORCFILE..KW_OUTPUTFORMAT, KW_OVERWRITE, KW_PARTITIONED..KW_PLUS, KW_PRETTY..KW_RECORDWRITER, 
KW_REGEXP..KW_SCHEMAS, KW_SEMI..KW_TABLES, KW_TBLPROPERTIES..KW_TEXTFILE, KW_TIMESTAMP..KW_TOUCH, KW_TRIGGER..KW_UNARCHIVE, KW_UNDO..KW_UNIONTYPE, KW_UNLOCK..KW_VALUE_TYPE, KW_VIEW, KW_WHILE, KW_WITH}" using multiple alternatives: 3, 4 As a result, alternative(s) 4 were disabled for that input warning(200): org/apache/hadoop/hive/ql/parse/HiveParser.g:1215:29: Decision can match input such as "KW_FORMATTED {KW_ADD..KW_AFTER, KW_ALTER..KW_ANALYZE, KW_ARCHIVE..KW_CASCADE, KW_CHANGE, KW_CLUSTER..KW_COLLECTION, KW_COLUMNS..KW_CREATE, KW_CUBE, KW_CURSOR..KW_DATA, KW_DATABASES..KW_DISABLE, KW_DISTRIBUTE..KW_ELEM_TYPE, KW_ENABLE, KW_ESCAPED, KW_EXCLUSIVE..KW_EXPORT, KW_EXTERNAL..KW_FLOAT, KW_FOR..KW_FORMATTED, KW_FULL, KW_FUNCTIONS..KW_GROUPING, KW_HOLD_DDLTIME..KW_IDXPROPERTIES, KW_IGNORE..KW_ITEMS, KW_KEYS..KW_LEFT, KW_LIKE..KW_LONG, KW_MAPJOIN..KW_MINUS, KW_MSCK..KW_NOSCAN, KW_NO_DROP..KW_OFFLINE, KW_OPTION, KW_ORCFILE..KW_OUTPUTFORMAT, KW_OVERWRITE, KW_PARTITIONED..KW_PLUS, KW_PRETTY..KW_RECORDWRITER, KW_REGEXP..KW_SCHEMAS, KW_SEMI..KW_TABLES, KW_TBLPROPERTIES..KW_TEXTFILE, KW_TIMESTAMP..KW_TOUCH, KW_TRIGGER..KW_UNARCHIVE, KW_UNDO..KW_UNIONTYPE, KW_UNLOCK..KW_VALUE_TYPE, KW_VIEW, KW_WHILE, KW_WITH}" using multiple alternatives: 1, 4 As a result, alternative(s) 4 were disabled for that input warning(200): org/apache/hadoop/hive/ql/parse/HiveParser.g:1215:29: Decision can match input such as "KW_PRETTY Identifier" using multiple alternatives: 3, 4 As a result, alternative(s) 4 were disabled for that input warning(200): org/apache/hadoop/hive/ql/parse/HiveParser.g:1215:29: Decision can match input such as "KW_FORMATTED Identifier" using multiple alternatives: 1, 4 As a result, alternative(s) 4 were disabled for that input warning(200): org/apache/hadoop/hive/ql/parse/HiveParser.g:1215:29: Decision can match input such as "KW_PRETTY KW_PARTITION" using multiple alternatives: 3, 4 As a result, alternative(s) 4 were disabled for that input warning(200): org/apache/hadoop/hive/ql/parse/HiveParser.g:1215:29: Decision can match input such as "KW_FORMATTED KW_PARTITION" using multiple alternatives: 1, 4 As a result, alternative(s) 4 were disabled for that input warning(200): org/apache/hadoop/hive/ql/parse/HiveParser.g:1486:116: Decision can match input such as "KW_STORED KW_AS KW_DIRECTORIES" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): org/apache/hadoop/hive/ql/parse/HiveParser.g:1609:5: Decision can match input such as "KW_STORED KW_AS KW_RCFILE" using multiple alternatives: 3, 7 As a result, alternative(s) 7 were disabled for that input warning(200): org/apache/hadoop/hive/ql/parse/HiveParser.g:1609:5: Decision can match input such as "KW_STORED KW_AS KW_SEQUENCEFILE" using multiple alternatives: 1, 7 As a result, alternative(s) 7 were disabled for that input warning(200): org/apache/hadoop/hive/ql/parse/HiveParser.g:1609:5: Decision can match input such as "KW_STORED KW_AS KW_ORCFILE" using multiple alternatives: 4, 7 As a result, alternative(s) 7 were disabled for that input warning(200): org/apache/hadoop/hive/ql/parse/HiveParser.g:1609:5: Decision can match input such as "KW_STORED KW_AS KW_TEXTFILE" using multiple alternatives: 2, 7 As a result, alternative(s) 7 were disabled for that input warning(200): org/apache/hadoop/hive/ql/parse/HiveParser.g:1609:5: Decision can match input such as "KW_STORED KW_AS KW_INPUTFORMAT" using multiple alternatives: 5, 7 As a result, alternative(s) 7 were disabled for that input warning(200): 
SelectClauseParser.g:149:5: Decision can match input such as "KW_NULL DOT {KW_ADD..KW_AFTER, KW_ALTER..KW_ANALYZE, KW_ARCHIVE..KW_CASCADE, KW_CHANGE, KW_CLUSTER..KW_COLLECTION, KW_COLUMNS..KW_CREATE, KW_CUBE, KW_CURSOR..KW_DATA, KW_DATABASES..KW_DISABLE, KW_DISTRIBUTE..KW_ELEM_TYPE, KW_ENABLE, KW_ESCAPED, KW_EXCLUSIVE..KW_EXPORT, KW_EXTERNAL..KW_FLOAT, KW_FOR..KW_FORMATTED, KW_FULL, KW_FUNCTIONS..KW_GROUPING, KW_HOLD_DDLTIME..KW_IDXPROPERTIES, KW_IGNORE..KW_ITEMS, KW_KEYS..KW_LEFT, KW_LIKE..KW_LONG, KW_MAPJOIN..KW_MINUS, KW_MSCK..KW_NOSCAN, KW_NO_DROP..KW_OFFLINE, KW_OPTION, KW_ORCFILE..KW_OUTPUTFORMAT, KW_OVERWRITE, KW_PARTITION..KW_PLUS, KW_PRETTY..KW_RECORDWRITER, KW_REGEXP..KW_SCHEMAS, KW_SEMI..KW_TABLES, KW_TBLPROPERTIES..KW_TEXTFILE, KW_TIMESTAMP..KW_TOUCH, KW_TRIGGER..KW_UNARCHIVE, KW_UNDO..KW_UNIONTYPE, KW_UNLOCK..KW_VALUE_TYPE, KW_VIEW, KW_WHILE, KW_WITH}" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): SelectClauseParser.g:149:5: Decision can match input such as "KW_NULL DOT Identifier" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): FromClauseParser.g:127:2: Decision can match input such as "KW_LATERAL KW_VIEW KW_OUTER" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): FromClauseParser.g:179:25: Decision can match input such as "LPAREN StringLiteral EQUAL" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): FromClauseParser.g:179:25: Decision can match input such as "LPAREN StringLiteral COMMA" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): FromClauseParser.g:179:25: Decision can match input such as "LPAREN StringLiteral RPAREN" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): FromClauseParser.g:179:68: Decision can match input such as "Identifier LPAREN BigintLiteral" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): FromClauseParser.g:179:68: Decision can match input such as "Identifier LPAREN KW_CAST" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): FromClauseParser.g:179:68: Decision can match input such as "Identifier LPAREN KW_IF" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): FromClauseParser.g:179:68: Decision can match input such as "Identifier LPAREN CharSetName" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): FromClauseParser.g:179:68: Decision can match input such as "Identifier LPAREN LPAREN" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): FromClauseParser.g:179:68: Decision can match input such as "Identifier LPAREN KW_EXISTS" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): FromClauseParser.g:179:68: Decision can match input such as "Identifier LPAREN KW_CASE" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): FromClauseParser.g:179:68: Decision can match input such as "Identifier LPAREN TinyintLiteral" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): 
FromClauseParser.g:179:68: Decision can match input such as "Identifier LPAREN KW_NULL" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): FromClauseParser.g:179:68: Decision can match input such as "Identifier LPAREN SmallintLiteral" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): FromClauseParser.g:179:68: Decision can match input such as "Identifier LPAREN KW_FALSE" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): FromClauseParser.g:179:68: Decision can match input such as "Identifier LPAREN KW_UNIONTYPE" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): FromClauseParser.g:179:68: Decision can match input such as "Identifier LPAREN Number" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): FromClauseParser.g:179:68: Decision can match input such as "Identifier LPAREN Identifier" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): FromClauseParser.g:179:68: Decision can match input such as "Identifier LPAREN StringLiteral" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): FromClauseParser.g:179:68: Decision can match input such as "Identifier LPAREN DecimalLiteral" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): FromClauseParser.g:179:68: Decision can match input such as "Identifier LPAREN {KW_ADD..KW_AFTER, KW_ALTER..KW_ANALYZE, KW_ARCHIVE, KW_AS..KW_CASCADE, KW_CHANGE, KW_CLUSTER..KW_COLLECTION, KW_COLUMNS..KW_CREATE, KW_CUBE, KW_CURSOR..KW_DATA, KW_DATABASES, KW_DATETIME..KW_DISABLE, KW_DISTRIBUTE..KW_ELEM_TYPE, KW_ENABLE, KW_ESCAPED, KW_EXCLUSIVE, KW_EXPLAIN..KW_EXPORT, KW_EXTERNAL, KW_FETCH..KW_FLOAT, KW_FOR..KW_FORMATTED, KW_FULL, KW_FUNCTIONS..KW_GROUPING, KW_HOLD_DDLTIME..KW_IDXPROPERTIES, KW_IGNORE..KW_ITEMS, KW_KEYS..KW_LEFT, KW_LIKE..KW_LONG, KW_MAPJOIN..KW_MINUS, KW_MSCK..KW_NOSCAN, KW_NO_DROP, KW_OF..KW_OFFLINE, KW_OPTION, KW_ORCFILE..KW_OUTPUTFORMAT, KW_OVERWRITE, KW_PARTITION..KW_PLUS, KW_PRETTY..KW_RECORDWRITER, KW_REGEXP..KW_SCHEMAS, KW_SEMI..KW_STRING, KW_TABLE..KW_TABLES, KW_TBLPROPERTIES..KW_TEXTFILE, KW_TIMESTAMP..KW_TOUCH, KW_TRIGGER, KW_TRUNCATE..KW_UNARCHIVE, KW_UNDO..KW_UNION, KW_UNLOCK..KW_VALUE_TYPE, KW_VIEW, KW_WHILE, KW_WITH}" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): FromClauseParser.g:179:68: Decision can match input such as "Identifier LPAREN KW_STRUCT" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): FromClauseParser.g:179:68: Decision can match input such as "Identifier LPAREN KW_DATE" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): FromClauseParser.g:179:68: Decision can match input such as "Identifier LPAREN KW_NOT" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): FromClauseParser.g:179:68: Decision can match input such as "Identifier LPAREN {MINUS, PLUS, TILDE}" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): FromClauseParser.g:179:68: Decision can match input such as "Identifier LPAREN KW_MAP" using multiple 
alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): FromClauseParser.g:179:68: Decision can match input such as "Identifier LPAREN KW_TRUE" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): FromClauseParser.g:179:68: Decision can match input such as "Identifier LPAREN KW_ARRAY" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): FromClauseParser.g:237:16: Decision can match input such as "Identifier LPAREN BigintLiteral" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): FromClauseParser.g:237:16: Decision can match input such as "Identifier LPAREN KW_CAST" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): FromClauseParser.g:237:16: Decision can match input such as "Identifier LPAREN KW_IF" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): FromClauseParser.g:237:16: Decision can match input such as "Identifier LPAREN CharSetName" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): FromClauseParser.g:237:16: Decision can match input such as "Identifier LPAREN LPAREN" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): FromClauseParser.g:237:16: Decision can match input such as "Identifier LPAREN KW_EXISTS" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): FromClauseParser.g:237:16: Decision can match input such as "Identifier LPAREN KW_CASE" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): FromClauseParser.g:237:16: Decision can match input such as "Identifier LPAREN TinyintLiteral" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): FromClauseParser.g:237:16: Decision can match input such as "Identifier LPAREN KW_NULL" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): FromClauseParser.g:237:16: Decision can match input such as "Identifier LPAREN SmallintLiteral" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): FromClauseParser.g:237:16: Decision can match input such as "Identifier LPAREN KW_FALSE" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): FromClauseParser.g:237:16: Decision can match input such as "Identifier LPAREN KW_UNIONTYPE" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): FromClauseParser.g:237:16: Decision can match input such as "Identifier LPAREN Number" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): FromClauseParser.g:237:16: Decision can match input such as "Identifier LPAREN Identifier" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): FromClauseParser.g:237:16: Decision can match input such as "Identifier LPAREN StringLiteral" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): FromClauseParser.g:237:16: Decision can match input such as "Identifier LPAREN DecimalLiteral" using 
multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): FromClauseParser.g:237:16: Decision can match input such as "Identifier LPAREN {KW_ADD..KW_AFTER, KW_ALTER..KW_ANALYZE, KW_ARCHIVE, KW_AS..KW_CASCADE, KW_CHANGE, KW_CLUSTER..KW_COLLECTION, KW_COLUMNS..KW_CREATE, KW_CUBE, KW_CURSOR..KW_DATA, KW_DATABASES, KW_DATETIME..KW_DISABLE, KW_DISTRIBUTE..KW_ELEM_TYPE, KW_ENABLE, KW_ESCAPED, KW_EXCLUSIVE, KW_EXPLAIN..KW_EXPORT, KW_EXTERNAL, KW_FETCH..KW_FLOAT, KW_FOR..KW_FORMATTED, KW_FULL, KW_FUNCTIONS..KW_GROUPING, KW_HOLD_DDLTIME..KW_IDXPROPERTIES, KW_IGNORE..KW_ITEMS, KW_KEYS..KW_LEFT, KW_LIKE..KW_LONG, KW_MAPJOIN..KW_MINUS, KW_MSCK..KW_NOSCAN, KW_NO_DROP, KW_OF..KW_OFFLINE, KW_OPTION, KW_ORCFILE..KW_OUTPUTFORMAT, KW_OVERWRITE, KW_PARTITION..KW_PLUS, KW_PRETTY..KW_RECORDWRITER, KW_REGEXP..KW_SCHEMAS, KW_SEMI..KW_STRING, KW_TABLE..KW_TABLES, KW_TBLPROPERTIES..KW_TEXTFILE, KW_TIMESTAMP..KW_TOUCH, KW_TRIGGER, KW_TRUNCATE..KW_UNARCHIVE, KW_UNDO..KW_UNION, KW_UNLOCK..KW_VALUE_TYPE, KW_VIEW, KW_WHILE, KW_WITH}" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): FromClauseParser.g:237:16: Decision can match input such as "Identifier LPAREN KW_STRUCT" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): FromClauseParser.g:237:16: Decision can match input such as "Identifier LPAREN KW_DATE" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): FromClauseParser.g:237:16: Decision can match input such as "Identifier LPAREN KW_NOT" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): FromClauseParser.g:237:16: Decision can match input such as "Identifier LPAREN {MINUS, PLUS, TILDE}" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): FromClauseParser.g:237:16: Decision can match input such as "Identifier LPAREN KW_MAP" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): FromClauseParser.g:237:16: Decision can match input such as "Identifier LPAREN KW_TRUE" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): FromClauseParser.g:237:16: Decision can match input such as "Identifier LPAREN KW_ARRAY" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): IdentifiersParser.g:68:4: Decision can match input such as "LPAREN KW_NOT CharSetName" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): IdentifiersParser.g:68:4: Decision can match input such as "LPAREN KW_CASE CharSetName" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): IdentifiersParser.g:68:4: Decision can match input such as "LPAREN LPAREN CharSetName" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): IdentifiersParser.g:68:4: Decision can match input such as "LPAREN KW_NOT Number" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): IdentifiersParser.g:68:4: Decision can match input such as "LPAREN KW_NOT KW_UNIONTYPE" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): IdentifiersParser.g:68:4: Decision can 
match input such as "LPAREN KW_NULL {KW_LIKE, KW_REGEXP, KW_RLIKE}" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): IdentifiersParser.g:68:4: Decision can match input such as "LPAREN KW_NOT KW_DATE" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): IdentifiersParser.g:68:4: Decision can match input such as "LPAREN KW_NOT KW_NOT" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): IdentifiersParser.g:68:4: Decision can match input such as "LPAREN KW_CASE KW_NOT" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): IdentifiersParser.g:68:4: Decision can match input such as "LPAREN LPAREN KW_NOT" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): IdentifiersParser.g:68:4: Decision can match input such as "LPAREN KW_CASE Identifier" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): IdentifiersParser.g:68:4: Decision can match input such as "LPAREN KW_NULL AMPERSAND" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): IdentifiersParser.g:68:4: Decision can match input such as "LPAREN LPAREN SmallintLiteral" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): IdentifiersParser.g:68:4: Decision can match input such as "LPAREN LPAREN TinyintLiteral" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): IdentifiersParser.g:68:4: Decision can match input such as "LPAREN LPAREN BigintLiteral" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): IdentifiersParser.g:68:4: Decision can match input such as "LPAREN KW_NOT KW_FALSE" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): IdentifiersParser.g:68:4: Decision can match input such as "LPAREN KW_NOT KW_STRUCT" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): IdentifiersParser.g:68:4: Decision can match input such as "LPAREN KW_NOT KW_TRUE" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): IdentifiersParser.g:68:4: Decision can match input such as "LPAREN KW_NOT KW_ARRAY" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): IdentifiersParser.g:68:4: Decision can match input such as "LPAREN LPAREN Number" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): IdentifiersParser.g:68:4: Decision can match input such as "LPAREN KW_NULL KW_IS" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): IdentifiersParser.g:68:4: Decision can match input such as "LPAREN KW_NOT StringLiteral" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): IdentifiersParser.g:68:4: Decision can match input such as "LPAREN KW_NOT {MINUS, PLUS, TILDE}" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): IdentifiersParser.g:68:4: Decision can match input such as "LPAREN KW_CASE {MINUS, PLUS, TILDE}" using 
multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): IdentifiersParser.g:68:4: Decision can match input such as "LPAREN LPAREN {MINUS, PLUS, TILDE}" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): IdentifiersParser.g:68:4: Decision can match input such as "LPAREN KW_NOT KW_MAP" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): IdentifiersParser.g:68:4: Decision can match input such as "LPAREN KW_CASE KW_MAP" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): IdentifiersParser.g:68:4: Decision can match input such as "LPAREN LPAREN KW_MAP" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): IdentifiersParser.g:68:4: Decision can match input such as "LPAREN KW_NULL LPAREN" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): IdentifiersParser.g:68:4: Decision can match input such as "LPAREN LPAREN DecimalLiteral" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): IdentifiersParser.g:68:4: Decision can match input such as "LPAREN KW_NULL RPAREN" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): IdentifiersParser.g:68:4: Decision can match input such as "LPAREN StringLiteral StringLiteral" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): IdentifiersParser.g:68:4: Decision can match input such as "LPAREN KW_NOT KW_IF" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): IdentifiersParser.g:68:4: Decision can match input such as "LPAREN KW_CASE KW_TRUE" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): IdentifiersParser.g:68:4: Decision can match input such as "LPAREN KW_CASE KW_IF" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): IdentifiersParser.g:68:4: Decision can match input such as "LPAREN LPAREN KW_IF" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): IdentifiersParser.g:68:4: Decision can match input such as "LPAREN KW_NULL BITWISEOR" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): IdentifiersParser.g:68:4: Decision can match input such as "LPAREN KW_NULL DOT" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): IdentifiersParser.g:68:4: Decision can match input such as "LPAREN LPAREN KW_EXISTS" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): IdentifiersParser.g:68:4: Decision can match input such as "LPAREN KW_CASE StringLiteral" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): IdentifiersParser.g:68:4: Decision can match input such as "LPAREN KW_CASE KW_FALSE" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): IdentifiersParser.g:68:4: Decision can match input such as "LPAREN KW_NOT Identifier" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input 
warning(200): IdentifiersParser.g:68:4: Decision can match input such as "LPAREN KW_NULL {DIV..DIVIDE, MOD, STAR}" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): IdentifiersParser.g:68:4: Decision can match input such as "LPAREN KW_CASE {KW_ADD..KW_AFTER, KW_ALTER..KW_ANALYZE, KW_ARCHIVE, KW_AS..KW_CASCADE, KW_CHANGE, KW_CLUSTER..KW_COLLECTION, KW_COLUMNS..KW_CREATE, KW_CUBE, KW_CURSOR..KW_DATA, KW_DATABASES, KW_DATETIME..KW_DISABLE, KW_DISTRIBUTE..KW_ELEM_TYPE, KW_ENABLE, KW_ESCAPED, KW_EXCLUSIVE, KW_EXPLAIN..KW_EXPORT, KW_EXTERNAL, KW_FETCH..KW_FLOAT, KW_FOR..KW_FORMATTED, KW_FULL, KW_FUNCTIONS..KW_GROUPING, KW_HOLD_DDLTIME..KW_IDXPROPERTIES, KW_IGNORE..KW_ITEMS, KW_KEYS..KW_LEFT, KW_LIKE..KW_LONG, KW_MAPJOIN..KW_MINUS, KW_MSCK..KW_NOSCAN, KW_NO_DROP, KW_OF..KW_OFFLINE, KW_OPTION, KW_ORCFILE..KW_OUTPUTFORMAT, KW_OVERWRITE, KW_PARTITION..KW_PLUS, KW_PRETTY..KW_RECORDWRITER, KW_REGEXP..KW_SCHEMAS, KW_SEMI..KW_STRING, KW_TABLE..KW_TABLES, KW_TBLPROPERTIES..KW_TEXTFILE, KW_TIMESTAMP..KW_TOUCH, KW_TRIGGER, KW_TRUNCATE..KW_UNARCHIVE, KW_UNDO..KW_UNION, KW_UNLOCK..KW_VALUE_TYPE, KW_VIEW, KW_WHILE, KW_WITH}" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): IdentifiersParser.g:68:4: Decision can match input such as "LPAREN KW_NULL NOTEQUAL" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): IdentifiersParser.g:68:4: Decision can match input such as "LPAREN KW_NOT KW_EXISTS" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): IdentifiersParser.g:68:4: Decision can match input such as "LPAREN KW_NULL EQUAL_NS" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): IdentifiersParser.g:68:4: Decision can match input such as "LPAREN KW_EXISTS LPAREN" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): IdentifiersParser.g:68:4: Decision can match input such as "LPAREN KW_NULL LESSTHAN" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): IdentifiersParser.g:68:4: Decision can match input such as "LPAREN KW_NULL LESSTHANOREQUALTO" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): IdentifiersParser.g:68:4: Decision can match input such as "LPAREN LPAREN KW_ARRAY" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): IdentifiersParser.g:68:4: Decision can match input such as "LPAREN LPAREN KW_NULL" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): IdentifiersParser.g:68:4: Decision can match input such as "LPAREN LPAREN KW_UNIONTYPE" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): IdentifiersParser.g:68:4: Decision can match input such as "LPAREN KW_NULL KW_OR" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): IdentifiersParser.g:68:4: Decision can match input such as "LPAREN KW_DATE StringLiteral" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): IdentifiersParser.g:68:4: Decision can match input such as "LPAREN KW_CASE KW_DATE" using multiple alternatives: 1, 2 As a result, 
alternative(s) 2 were disabled for that input warning(200): IdentifiersParser.g:68:4: Decision can match input such as "LPAREN LPAREN KW_STRUCT" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): IdentifiersParser.g:68:4: Decision can match input such as "LPAREN KW_NULL EQUAL" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): IdentifiersParser.g:68:4: Decision can match input such as "LPAREN KW_CAST LPAREN" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): IdentifiersParser.g:68:4: Decision can match input such as "LPAREN LPAREN LPAREN" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): IdentifiersParser.g:68:4: Decision can match input such as "LPAREN KW_NOT LPAREN" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): IdentifiersParser.g:68:4: Decision can match input such as "LPAREN KW_CASE LPAREN" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): IdentifiersParser.g:68:4: Decision can match input such as "LPAREN KW_CASE KW_STRUCT" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): IdentifiersParser.g:68:4: Decision can match input such as "LPAREN KW_CASE KW_UNIONTYPE" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): IdentifiersParser.g:68:4: Decision can match input such as "LPAREN KW_NULL GREATERTHAN" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): IdentifiersParser.g:68:4: Decision can match input such as "LPAREN KW_CASE KW_ARRAY" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): IdentifiersParser.g:68:4: Decision can match input such as "LPAREN LPAREN KW_DATE" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): IdentifiersParser.g:68:4: Decision can match input such as "LPAREN KW_NULL GREATERTHANOREQUALTO" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): IdentifiersParser.g:68:4: Decision can match input such as "LPAREN KW_CASE KW_NULL" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): IdentifiersParser.g:68:4: Decision can match input such as "LPAREN KW_CASE KW_EXISTS" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): IdentifiersParser.g:68:4: Decision can match input such as "LPAREN KW_CASE TinyintLiteral" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): IdentifiersParser.g:68:4: Decision can match input such as "LPAREN KW_NULL KW_BETWEEN" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): IdentifiersParser.g:68:4: Decision can match input such as "LPAREN KW_CASE SmallintLiteral" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): IdentifiersParser.g:68:4: Decision can match input such as "LPAREN KW_NULL {MINUS, PLUS}" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): 
IdentifiersParser.g:68:4: Decision can match input such as "LPAREN KW_NULL KW_NOT" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): IdentifiersParser.g:68:4: Decision can match input such as "LPAREN KW_CASE BigintLiteral" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): IdentifiersParser.g:68:4: Decision can match input such as "LPAREN KW_NULL LSQUARE" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): IdentifiersParser.g:68:4: Decision can match input such as "LPAREN KW_NOT {KW_ADD..KW_AFTER, KW_ALTER..KW_ANALYZE, KW_ARCHIVE, KW_AS..KW_CASCADE, KW_CHANGE, KW_CLUSTER..KW_COLLECTION, KW_COLUMNS..KW_CREATE, KW_CUBE, KW_CURSOR..KW_DATA, KW_DATABASES, KW_DATETIME..KW_DISABLE, KW_DISTRIBUTE..KW_ELEM_TYPE, KW_ENABLE, KW_ESCAPED, KW_EXCLUSIVE, KW_EXPLAIN..KW_EXPORT, KW_EXTERNAL, KW_FETCH..KW_FLOAT, KW_FOR..KW_FORMATTED, KW_FULL, KW_FUNCTIONS..KW_GROUPING, KW_HOLD_DDLTIME..KW_IDXPROPERTIES, KW_IGNORE..KW_ITEMS, KW_KEYS..KW_LEFT, KW_LIKE..KW_LONG, KW_MAPJOIN..KW_MINUS, KW_MSCK..KW_NOSCAN, KW_NO_DROP, KW_OF..KW_OFFLINE, KW_OPTION, KW_ORCFILE..KW_OUTPUTFORMAT, KW_OVERWRITE, KW_PARTITION..KW_PLUS, KW_PRETTY..KW_RECORDWRITER, KW_REGEXP..KW_SCHEMAS, KW_SEMI..KW_STRING, KW_TABLE..KW_TABLES, KW_TBLPROPERTIES..KW_TEXTFILE, KW_TIMESTAMP..KW_TOUCH, KW_TRIGGER, KW_TRUNCATE..KW_UNARCHIVE, KW_UNDO..KW_UNION, KW_UNLOCK..KW_VALUE_TYPE, KW_VIEW, KW_WHILE, KW_WITH}" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): IdentifiersParser.g:68:4: Decision can match input such as "LPAREN LPAREN KW_FALSE" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): IdentifiersParser.g:68:4: Decision can match input such as "LPAREN KW_NOT KW_NULL" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): IdentifiersParser.g:68:4: Decision can match input such as "LPAREN LPAREN {KW_ADD..KW_AFTER, KW_ALTER..KW_ANALYZE, KW_ARCHIVE, KW_AS..KW_CASCADE, KW_CHANGE, KW_CLUSTER..KW_COLLECTION, KW_COLUMNS..KW_CREATE, KW_CUBE, KW_CURSOR..KW_DATA, KW_DATABASES, KW_DATETIME..KW_DISABLE, KW_DISTRIBUTE..KW_ELEM_TYPE, KW_ENABLE, KW_ESCAPED, KW_EXCLUSIVE, KW_EXPLAIN..KW_EXPORT, KW_EXTERNAL, KW_FETCH..KW_FLOAT, KW_FOR..KW_FORMATTED, KW_FULL, KW_FUNCTIONS..KW_GROUPING, KW_HOLD_DDLTIME..KW_IDXPROPERTIES, KW_IGNORE..KW_ITEMS, KW_KEYS..KW_LEFT, KW_LIKE..KW_LONG, KW_MAPJOIN..KW_MINUS, KW_MSCK..KW_NOSCAN, KW_NO_DROP, KW_OF..KW_OFFLINE, KW_OPTION, KW_ORCFILE..KW_OUTPUTFORMAT, KW_OVERWRITE, KW_PARTITION..KW_PLUS, KW_PRETTY..KW_RECORDWRITER, KW_REGEXP..KW_SCHEMAS, KW_SEMI..KW_STRING, KW_TABLE..KW_TABLES, KW_TBLPROPERTIES..KW_TEXTFILE, KW_TIMESTAMP..KW_TOUCH, KW_TRIGGER, KW_TRUNCATE..KW_UNARCHIVE, KW_UNDO..KW_UNION, KW_UNLOCK..KW_VALUE_TYPE, KW_VIEW, KW_WHILE, KW_WITH}" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): IdentifiersParser.g:68:4: Decision can match input such as "LPAREN LPAREN KW_TRUE" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): IdentifiersParser.g:68:4: Decision can match input such as "LPAREN KW_CASE Number" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): IdentifiersParser.g:68:4: Decision can match input such as "LPAREN KW_NULL BITWISEXOR" 
using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): IdentifiersParser.g:68:4: Decision can match input such as "LPAREN KW_NOT SmallintLiteral" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): IdentifiersParser.g:68:4: Decision can match input such as "LPAREN CharSetName CharSetLiteral" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): IdentifiersParser.g:68:4: Decision can match input such as "LPAREN LPAREN KW_CAST" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): IdentifiersParser.g:68:4: Decision can match input such as "LPAREN KW_NOT KW_CAST" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): IdentifiersParser.g:68:4: Decision can match input such as "LPAREN KW_CASE KW_CAST" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): IdentifiersParser.g:68:4: Decision can match input such as "LPAREN KW_NOT BigintLiteral" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): IdentifiersParser.g:68:4: Decision can match input such as "LPAREN KW_NULL KW_AND" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): IdentifiersParser.g:68:4: Decision can match input such as "LPAREN LPAREN KW_CASE" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): IdentifiersParser.g:68:4: Decision can match input such as "LPAREN KW_NULL KW_IN" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): IdentifiersParser.g:68:4: Decision can match input such as "LPAREN KW_NOT KW_CASE" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): IdentifiersParser.g:68:4: Decision can match input such as "LPAREN KW_CASE KW_CASE" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): IdentifiersParser.g:68:4: Decision can match input such as "LPAREN KW_NOT DecimalLiteral" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): IdentifiersParser.g:68:4: Decision can match input such as "LPAREN KW_NOT TinyintLiteral" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): IdentifiersParser.g:68:4: Decision can match input such as "LPAREN KW_CASE KW_WHEN" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): IdentifiersParser.g:68:4: Decision can match input such as "LPAREN LPAREN StringLiteral" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): IdentifiersParser.g:68:4: Decision can match input such as "LPAREN LPAREN Identifier" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): IdentifiersParser.g:68:4: Decision can match input such as "LPAREN KW_CASE DecimalLiteral" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): IdentifiersParser.g:108:5: Decision can match input such as "KW_ORDER KW_BY LPAREN" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were 
disabled for that input warning(200): IdentifiersParser.g:121:5: Decision can match input such as "KW_CLUSTER KW_BY LPAREN" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): IdentifiersParser.g:133:5: Decision can match input such as "KW_PARTITION KW_BY LPAREN" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): IdentifiersParser.g:144:5: Decision can match input such as "KW_DISTRIBUTE KW_BY LPAREN" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): IdentifiersParser.g:155:5: Decision can match input such as "KW_SORT KW_BY LPAREN" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): IdentifiersParser.g:172:7: Decision can match input such as "STAR" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): IdentifiersParser.g:185:5: Decision can match input such as "KW_UNIONTYPE" using multiple alternatives: 5, 6 As a result, alternative(s) 6 were disabled for that input warning(200): IdentifiersParser.g:185:5: Decision can match input such as "KW_STRUCT" using multiple alternatives: 4, 6 As a result, alternative(s) 6 were disabled for that input warning(200): IdentifiersParser.g:185:5: Decision can match input such as "KW_ARRAY" using multiple alternatives: 2, 6 As a result, alternative(s) 6 were disabled for that input warning(200): IdentifiersParser.g:267:5: Decision can match input such as "KW_TRUE" using multiple alternatives: 3, 8 As a result, alternative(s) 8 were disabled for that input warning(200): IdentifiersParser.g:267:5: Decision can match input such as "KW_DATE StringLiteral" using multiple alternatives: 2, 3 As a result, alternative(s) 3 were disabled for that input warning(200): IdentifiersParser.g:267:5: Decision can match input such as "KW_FALSE" using multiple alternatives: 3, 8 As a result, alternative(s) 8 were disabled for that input warning(200): IdentifiersParser.g:267:5: Decision can match input such as "KW_NULL" using multiple alternatives: 1, 8 As a result, alternative(s) 8 were disabled for that input warning(200): IdentifiersParser.g:399:5: Decision can match input such as "{KW_LIKE, KW_REGEXP, KW_RLIKE} KW_INSERT KW_OVERWRITE" using multiple alternatives: 2, 9 As a result, alternative(s) 9 were disabled for that input warning(200): IdentifiersParser.g:399:5: Decision can match input such as "{KW_LIKE, KW_REGEXP, KW_RLIKE} KW_CLUSTER KW_BY" using multiple alternatives: 2, 9 As a result, alternative(s) 9 were disabled for that input warning(200): IdentifiersParser.g:399:5: Decision can match input such as "{KW_LIKE, KW_REGEXP, KW_RLIKE} KW_INSERT KW_INTO" using multiple alternatives: 2, 9 As a result, alternative(s) 9 were disabled for that input warning(200): IdentifiersParser.g:399:5: Decision can match input such as "{KW_LIKE, KW_REGEXP, KW_RLIKE} KW_ORDER KW_BY" using multiple alternatives: 2, 9 As a result, alternative(s) 9 were disabled for that input warning(200): IdentifiersParser.g:399:5: Decision can match input such as "{KW_LIKE, KW_REGEXP, KW_RLIKE} KW_MAP LPAREN" using multiple alternatives: 2, 9 As a result, alternative(s) 9 were disabled for that input warning(200): IdentifiersParser.g:399:5: Decision can match input such as "{KW_LIKE, KW_REGEXP, KW_RLIKE} KW_SORT KW_BY" using multiple alternatives: 2, 9 As a result, alternative(s) 9 were disabled for that input warning(200): 
IdentifiersParser.g:399:5: Decision can match input such as "{KW_LIKE, KW_REGEXP, KW_RLIKE} KW_LATERAL KW_VIEW" using multiple alternatives: 2, 9 As a result, alternative(s) 9 were disabled for that input warning(200): IdentifiersParser.g:399:5: Decision can match input such as "KW_BETWEEN KW_MAP LPAREN" using multiple alternatives: 8, 9 As a result, alternative(s) 9 were disabled for that input warning(200): IdentifiersParser.g:399:5: Decision can match input such as "{KW_LIKE, KW_REGEXP, KW_RLIKE} KW_GROUP KW_BY" using multiple alternatives: 2, 9 As a result, alternative(s) 9 were disabled for that input warning(200): IdentifiersParser.g:399:5: Decision can match input such as "{KW_LIKE, KW_REGEXP, KW_RLIKE} KW_DISTRIBUTE KW_BY" using multiple alternatives: 2, 9 As a result, alternative(s) 9 were disabled for that input warning(200): IdentifiersParser.g:524:5: Decision can match input such as "{AMPERSAND..BITWISEXOR, DIV..DIVIDE, EQUAL..EQUAL_NS, GREATERTHAN..GREATERTHANOREQUALTO, KW_AND, KW_ARRAY, KW_BETWEEN..KW_BOOLEAN, KW_CASE, KW_DOUBLE, KW_FLOAT, KW_IF, KW_IN, KW_INT, KW_LIKE, KW_MAP, KW_NOT, KW_OR, KW_REGEXP, KW_RLIKE, KW_SMALLINT, KW_STRING..KW_STRUCT, KW_TINYINT, KW_UNIONTYPE, KW_WHEN, LESSTHAN..LESSTHANOREQUALTO, MINUS..NOTEQUAL, PLUS, STAR, TILDE}" using multiple alternatives: 1, 3 As a result, alternative(s) 3 were disabled for that input [INFO] [INFO] --- maven-resources-plugin:2.5:resources (default-resources) @ hive-exec --- [debug] execute contextualize [INFO] Using 'UTF-8' encoding to copy filtered resources. [INFO] Copying 1 resource [INFO] [INFO] --- maven-antrun-plugin:1.7:run (define-classpath) @ hive-exec --- [INFO] Executing tasks main: [INFO] Executed tasks [INFO] [INFO] --- maven-compiler-plugin:3.1:compile (default-compile) @ hive-exec --- [INFO] Compiling 1390 source files to /data/hive-ptest/working/apache-svn-trunk-source/ql/target/classes [INFO] ------------------------------------------------------------- [WARNING] COMPILATION WARNING : [INFO] ------------------------------------------------------------- [WARNING] Note: Some input files use or override a deprecated API. [WARNING] Note: Recompile with -Xlint:deprecation for details. [WARNING] Note: Some input files use unchecked or unsafe operations. [WARNING] Note: Recompile with -Xlint:unchecked for details. 
[INFO] 4 warnings [INFO] ------------------------------------------------------------- [INFO] ------------------------------------------------------------- [ERROR] COMPILATION ERROR : [INFO] ------------------------------------------------------------- [ERROR] /data/hive-ptest/working/apache-svn-trunk-source/ql/src/java/org/apache/hadoop/hive/ql/optimizer/physical/Vectorizer.java:[122,45] cannot find symbol symbol : class GenericUDFCeil location: package org.apache.hadoop.hive.ql.udf.generic [ERROR] /data/hive-ptest/working/apache-svn-trunk-source/ql/src/java/org/apache/hadoop/hive/ql/optimizer/physical/Vectorizer.java:[124,45] cannot find symbol symbol : class GenericUDFFloor location: package org.apache.hadoop.hive.ql.udf.generic [ERROR] /data/hive-ptest/working/apache-svn-trunk-source/ql/src/java/org/apache/hadoop/hive/ql/optimizer/physical/Vectorizer.java:[137,45] cannot find symbol symbol : class GenericUDFOPNegative location: package org.apache.hadoop.hive.ql.udf.generic [ERROR] /data/hive-ptest/working/apache-svn-trunk-source/ql/src/java/org/apache/hadoop/hive/ql/optimizer/physical/Vectorizer.java:[143,45] cannot find symbol symbol : class GenericUDFOPPositive location: package org.apache.hadoop.hive.ql.udf.generic [ERROR] /data/hive-ptest/working/apache-svn-trunk-source/ql/src/java/org/apache/hadoop/hive/ql/optimizer/physical/Vectorizer.java:[144,45] cannot find symbol symbol : class GenericUDFPower location: package org.apache.hadoop.hive.ql.udf.generic [ERROR] /data/hive-ptest/working/apache-svn-trunk-source/ql/src/java/org/apache/hadoop/hive/ql/exec/vector/VectorizationContext.java:[355,25] cannot find symbol symbol : class GenericUDFOPNegative location: class org.apache.hadoop.hive.ql.exec.vector.VectorizationContext [ERROR] /data/hive-ptest/working/apache-svn-trunk-source/ql/src/java/org/apache/hadoop/hive/ql/exec/vector/VectorizationContext.java:[355,65] cannot find symbol symbol : class GenericUDFOPPositive location: class org.apache.hadoop.hive.ql.exec.vector.VectorizationContext [ERROR] /data/hive-ptest/working/apache-svn-trunk-source/ql/src/java/org/apache/hadoop/hive/ql/exec/vector/VectorizationContext.java:[554,31] cannot find symbol symbol : class GenericUDFOPPositive location: class org.apache.hadoop.hive.ql.exec.vector.VectorizationContext [ERROR] /data/hive-ptest/working/apache-svn-trunk-source/ql/src/java/org/apache/hadoop/hive/ql/optimizer/physical/Vectorizer.java:[185,30] cannot find symbol symbol : class GenericUDFOPNegative location: class org.apache.hadoop.hive.ql.optimizer.physical.Vectorizer [ERROR] /data/hive-ptest/working/apache-svn-trunk-source/ql/src/java/org/apache/hadoop/hive/ql/optimizer/physical/Vectorizer.java:[186,30] cannot find symbol symbol : class GenericUDFOPPositive location: class org.apache.hadoop.hive.ql.optimizer.physical.Vectorizer [ERROR] /data/hive-ptest/working/apache-svn-trunk-source/ql/src/java/org/apache/hadoop/hive/ql/optimizer/physical/Vectorizer.java:[224,30] cannot find symbol symbol : class GenericUDFFloor location: class org.apache.hadoop.hive.ql.optimizer.physical.Vectorizer [ERROR] /data/hive-ptest/working/apache-svn-trunk-source/ql/src/java/org/apache/hadoop/hive/ql/optimizer/physical/Vectorizer.java:[225,30] cannot find symbol symbol : class GenericUDFCeil location: class org.apache.hadoop.hive.ql.optimizer.physical.Vectorizer [ERROR] /data/hive-ptest/working/apache-svn-trunk-source/ql/src/java/org/apache/hadoop/hive/ql/optimizer/physical/Vectorizer.java:[231,30] cannot find symbol symbol : class GenericUDFPower location: 
class org.apache.hadoop.hive.ql.optimizer.physical.Vectorizer [ERROR] /data/hive-ptest/working/apache-svn-trunk-source/ql/src/java/org/apache/hadoop/hive/ql/exec/FunctionRegistry.java:[195,33] cannot find symbol symbol : class GenericUDFFloor location: class org.apache.hadoop.hive.ql.exec.FunctionRegistry [ERROR] /data/hive-ptest/working/apache-svn-trunk-source/ql/src/java/org/apache/hadoop/hive/ql/exec/FunctionRegistry.java:[197,32] cannot find symbol symbol : class GenericUDFCeil location: class org.apache.hadoop.hive.ql.exec.FunctionRegistry [ERROR] /data/hive-ptest/working/apache-svn-trunk-source/ql/src/java/org/apache/hadoop/hive/ql/exec/FunctionRegistry.java:[198,35] cannot find symbol symbol : class GenericUDFCeil location: class org.apache.hadoop.hive.ql.exec.FunctionRegistry
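The failures above are all of one kind: Vectorizer, VectorizationContext, and FunctionRegistry refer to the new generic classes (GenericUDFCeil, GenericUDFFloor, GenericUDFOPNegative, GenericUDFOPPositive, GenericUDFPower) that this particular build of the patch did not yet provide. For readers unfamiliar with what "moving to a generic implementation" entails, the sketch below shows the shape of Hive's GenericUDF contract. It is illustrative only and is not code from the attached patches: the class name GenericUDFFloorSketch is invented for this note, and it assumes a single DOUBLE argument returning BIGINT, whereas the real implementations cover additional numeric types and share a common base class.

// Illustrative sketch of the GenericUDF contract, not the patch's actual code.
// Assumes one DOUBLE argument and a BIGINT result.
import org.apache.hadoop.hive.ql.exec.UDFArgumentException;
import org.apache.hadoop.hive.ql.exec.UDFArgumentLengthException;
import org.apache.hadoop.hive.ql.metadata.HiveException;
import org.apache.hadoop.hive.ql.udf.generic.GenericUDF;
import org.apache.hadoop.hive.serde2.objectinspector.ObjectInspector;
import org.apache.hadoop.hive.serde2.objectinspector.PrimitiveObjectInspector;
import org.apache.hadoop.hive.serde2.objectinspector.primitive.PrimitiveObjectInspectorFactory;
import org.apache.hadoop.hive.serde2.objectinspector.primitive.PrimitiveObjectInspectorUtils;
import org.apache.hadoop.io.LongWritable;

public class GenericUDFFloorSketch extends GenericUDF {

  private PrimitiveObjectInspector argOI;
  private final LongWritable result = new LongWritable();

  @Override
  public ObjectInspector initialize(ObjectInspector[] arguments) throws UDFArgumentException {
    // Validate the argument list once, at plan time.
    if (arguments.length != 1) {
      throw new UDFArgumentLengthException("floor() requires exactly one argument");
    }
    if (!(arguments[0] instanceof PrimitiveObjectInspector)) {
      throw new UDFArgumentException("floor() takes a primitive argument");
    }
    argOI = (PrimitiveObjectInspector) arguments[0];
    // Declare the return type; the planner uses this object inspector.
    return PrimitiveObjectInspectorFactory.writableLongObjectInspector;
  }

  @Override
  public Object evaluate(DeferredObject[] arguments) throws HiveException {
    Object value = arguments[0].get();
    if (value == null) {
      return null; // SQL semantics: floor(NULL) is NULL
    }
    double d = PrimitiveObjectInspectorUtils.getDouble(value, argOI);
    result.set((long) Math.floor(d));
    return result;
  }

  @Override
  public String getDisplayString(String[] children) {
    return "floor(" + children[0] + ")";
  }
}

The registration side of the change is what the FunctionRegistry errors point at: lines 195-198 are presumably where the patch wires the new GenericUDFFloor and GenericUDFCeil classes in as the implementations of floor, ceil, and ceiling.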