Hive: HIVE-5218

datanucleus does not work with MS SQLServer in Hive metastore

    Details

    • Type: Bug
    • Status: Resolved
    • Priority: Major
    • Resolution: Duplicate
    • Affects Version/s: 0.12.0
    • Fix Version/s: 0.13.0
    • Component/s: Metastore
    • Labels: None

      Description

      HIVE-3632 upgraded the datanucleus version to 3.2.x; however, this version of datanucleus doesn't work with SQL Server as the metastore. The problem is that datanucleus tries to use the fully qualified object name to find a table in the database but cannot find it.

      If I downgrade datanucleus to the version from HIVE-2084, SQL Server works fine.

      It could be a bug in datanucleus.

      This is the detailed exception I'm getting when using datanucleus 3.2.x with SQL Server:

      FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.DDLTask. MetaException(message:javax.jdo.JDOException: Exception thrown calling table.exists() for a2ee36af45e9f46c19e995bfd2d9b5fd1hivemetastore..SEQUENCE_TABLE
              at org.datanucleus.api.jdo.NucleusJDOHelper.getJDOExceptionForNucleusException(NucleusJDOHelper.java:596)
              at org.datanucleus.api.jdo.JDOPersistenceManager.jdoMakePersistent(JDOPersistenceManager.java:732)
      …
              at org.apache.hadoop.hive.metastore.RetryingRawStore.invoke(RetryingRawStore.java:111)
              at $Proxy0.createTable(Unknown Source)
              at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.create_table_core(HiveMetaStore.java:1071)
              at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.create_table_with_environment_context(HiveMetaStore.java:1104)
      …
              at $Proxy11.create_table_with_environment_context(Unknown Source)
              at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$Processor$create_table_with_environment_context.getResult(ThriftHiveMetastore.java:6417)
              at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$Processor$create_table_with_environment_context.getResult(ThriftHiveMetastore.java:6401)

      NestedThrowablesStackTrace:
      com.microsoft.sqlserver.jdbc.SQLServerException: There is already an object named 'SEQUENCE_TABLE' in the database.
              at com.microsoft.sqlserver.jdbc.SQLServerException.makeFromDatabaseError(SQLServerException.java:197)
              at com.microsoft.sqlserver.jdbc.SQLServerStatement.getNextResult(SQLServerStatement.java:1493)
              at com.microsoft.sqlserver.jdbc.SQLServerStatement.doExecuteStatement(SQLServerStatement.java:775)
              at com.microsoft.sqlserver.jdbc.SQLServerStatement$StmtExecCmd.doExecute(SQLServerStatement.java:676)
              at com.microsoft.sqlserver.jdbc.TDSCommand.execute(IOBuffer.java:4615)
              at com.microsoft.sqlserver.jdbc.SQLServerConnection.executeCommand(SQLServerConnection.java:1400)
              at com.microsoft.sqlserver.jdbc.SQLServerStatement.executeCommand(SQLServerStatement.java:179)
              at com.microsoft.sqlserver.jdbc.SQLServerStatement.executeStatement(SQLServerStatement.java:154)
              at com.microsoft.sqlserver.jdbc.SQLServerStatement.execute(SQLServerStatement.java:649)
              at com.jolbox.bonecp.StatementHandle.execute(StatementHandle.java:300)
              at org.datanucleus.store.rdbms.table.AbstractTable.executeDdlStatement(AbstractTable.java:760)
              at org.datanucleus.store.rdbms.table.AbstractTable.executeDdlStatementList(AbstractTable.java:711)
              at org.datanucleus.store.rdbms.table.AbstractTable.create(AbstractTable.java:425)
              at org.datanucleus.store.rdbms.table.AbstractTable.exists(AbstractTable.java:488)
              at org.datanucleus.store.rdbms.valuegenerator.TableGenerator.repositoryExists(TableGenerator.java:242)
              at org.datanucleus.store.rdbms.valuegenerator.AbstractRDBMSGenerator.obtainGenerationBlock(AbstractRDBMSGenerator.java:86)
              at org.datanucleus.store.valuegenerator.AbstractGenerator.obtainGenerationBlock(AbstractGenerator.java:197)
              at org.datanucleus.store.valuegenerator.AbstractGenerator.next(AbstractGenerator.java:105)
              at org.datanucleus.store.rdbms.RDBMSStoreManager.getStrategyValueForGenerator(RDBMSStoreManager.java:2019)
              at org.datanucleus.store.AbstractStoreManager.getStrategyValue(AbstractStoreManager.java:1385)
              at org.datanucleus.ExecutionContextImpl.newObjectId(ExecutionContextImpl.java:3727)
              at org.datanucleus.state.JDOStateManager.setIdentity(JDOStateManager.java:2574)
              at org.datanucleus.state.JDOStateManager.initialiseForPersistentNew(JDOStateManager.java:526)
              at org.datanucleus.state.ObjectProviderFactoryImpl.newForPersistentNew(ObjectProviderFactoryImpl.java:202)
              at org.datanucleus.ExecutionContextImpl.newObjectProviderForPersistentNew(ExecutionContextImpl.java:1326)
              at org.datanucleus.ExecutionContextImpl.persistObjectInternal(ExecutionContextImpl.java:2123)
              at org.datanucleus.ExecutionContextImpl.persistObjectWork(ExecutionContextImpl.java:1972)
              at org.datanucleus.ExecutionContextImpl.persistObject(ExecutionContextImpl.java:1820)
              at org.datanucleus.ExecutionContextThreadedImpl.persistObject(ExecutionContextThreadedImpl.java:217)
              at org.datanucleus.api.jdo.JDOPersistenceManager.jdoMakePersistent(JDOPersistenceManager.java:727)
              at org.datanucleus.api.jdo.JDOPersistenceManager.makePersistent(JDOPersistenceManager.java:752)
              at org.apache.hadoop.hive.metastore.ObjectStore.createTable(ObjectStore.java:646)
              at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
              at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
              at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
              at java.lang.reflect.Method.invoke(Method.java:601)
              at org.apache.hadoop.hive.metastore.RetryingRawStore.invoke(RetryingRawStore.java:111)
              at $Proxy0.createTable(Unknown Source)
              at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.create_table_core(HiveMetaStore.java:1071)
              at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.create_table_with_environment_context(HiveMetaStore.java:1104)
              at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
              at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
              at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
              at java.lang.reflect.Method.invoke(Method.java:601)
              at org.apache.hadoop.hive.metastore.RetryingHMSHandler.invoke(RetryingHMSHandler.java:103)
              at $Proxy11.create_table_with_environment_context(Unknown Source)
              at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$Processor$create_table_with_environment_context.getResult(ThriftHiveMetastore.java:6417)
              at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$Processor$create_table_with_environment_context.getResult(ThriftHiveMetastore.java:6401)
              at org.apache.thrift.ProcessFunction.process(ProcessFunction.java:39)
              at org.apache.thrift.TBaseProcessor.process(TBaseProcessor.java:39)
              at org.apache.hadoop.hive.metastore.TSetIpAddressProcessor.process(TSetIpAddressProcessor.java:48)
              at org.apache.thrift.server.TThreadPoolServer$WorkerProcess.run(TThreadPoolServer.java:206)
              at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1110)
              at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:603)
              at java.lang.Thread.run(Thread.java:722)
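
      The frames above show AbstractTable.exists() falling through to AbstractTable.create(): the catalog probe keyed on the fully qualified name ("hivemetastore..SEQUENCE_TABLE", with an empty schema component) misses, so DataNucleus tries to create a table the server already has. A minimal, hypothetical Java sketch of that failure mode (the map, names, and QualifiedNameMismatch class are illustrative, not DataNucleus code):

      ```java
      import java.util.HashMap;
      import java.util.Map;

      public class QualifiedNameMismatch {
          public static void main(String[] args) {
              // Simulated server-side catalog, keyed the way SQL Server
              // reports tables: catalog.schema.table with a real schema name.
              Map<String, Boolean> catalog = new HashMap<>();
              catalog.put("hivemetastore.dbo.SEQUENCE_TABLE", true);

              // The probe built by datanucleus 3.2.x per the exception
              // message above: note the empty schema component.
              String probe = "hivemetastore..SEQUENCE_TABLE";

              boolean exists = catalog.containsKey(probe);
              System.out.println("exists() probe hit: " + exists);
              if (!exists) {
                  // This is the step SQL Server rejects with "There is
                  // already an object named 'SEQUENCE_TABLE' in the database."
                  System.out.println("falling through to CREATE TABLE -> server error");
              }
          }
      }
      ```

      The point of the sketch is only that an existence check keyed on a malformed qualified name can never match what the server stores, so the check-then-create logic always takes the create path.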
      
      1. 0001-HIVE-5218-datanucleus-does-not-work-with-SQLServer-i.patch
        2 kB
        Sergey Soldatov
      2. HIVE-5218.2.patch
        0.5 kB
        Brock Noland
      3. HIVE-5218.patch
        3 kB
        shanyu zhao
      4. HIVE-5218-trunk.patch
        0.8 kB
        Thejas M Nair
      5. HIVE-5218-trunk.patch
        0.8 kB
        shanyu zhao
      6. HIVE-5218-v2.patch
        0.5 kB
        shanyu zhao

          Activity

          Konstantin Boudnik added a comment -

          With downgrade - as in HIVE-2084 - JDK7 stops working, actually.
          Brock Noland added a comment -

          Seems like a DN bug to me? Have you tried the latest version of DN?
          Konstantin Boudnik added a comment -

          Actually, it looks like 4900 isn't complete.
          Konstantin Boudnik added a comment -

          I meant HIVE-4900.
          Xuefu Zhang added a comment -

          Konstantin Boudnik It's certainly possible, but could you please be more specific about what is missing in HIVE-4900? Please feel free to provide a fix and attach your patch here so that we know what you're referring to.
          Sergey Soldatov added a comment -

          Could you please try the attached fix? If it still fails, could you please attach the logs?
          shanyu zhao added a comment -

          Sergey Soldatov I tried your patch, but it doesn't fix the problem.

          The problem is not with a user-defined table; it's actually with the SEQUENCE_TABLE table, which datanucleus uses internally to track the sequence number. I did a bit of research and found that it's a datanucleus bug; with my fix in datanucleus this problem went away.

          I submitted a JIRA there:
          http://www.datanucleus.org/servlet/jira/browse/NUCRDBMS-692

          However, for Hive, should we revert the datanucleus upgrade until the patch in NUCRDBMS-692 gets checked in (3.2.7 at the soonest)?
          Konstantin Boudnik added a comment -

          Hmm, this is interesting... I have tried this patch on top of HIVE-4900 and it seems to solve the problem with the datanucleus version currently used by Hive. Running the patched version of Hive with Postgres 9.2 as the metastore shows no problem.
          Xuefu Zhang added a comment -

          However, for hive should we revert the datanucleus upgrade until the patch in NUCRDBMS-692 get checked in (3.2.7 the soonest)?

          Before we make a decision, I think we need to evaluate the impact and possible workarounds. Maybe there is some sort of config that can be set. Also, I'm wondering whether the DN folks have confirmed the bug.
          shanyu zhao added a comment -

          Konstantin Boudnik This bug only happens with SQL Server as the metastore.

          I attached a patch that downgrades datanucleus to 3.0.x.

          I understand that this version of datanucleus doesn't work with JDK7, but I think it's more important that we don't release a version of Hive that doesn't work with SQL Server as the metastore.
          Brock Noland added a comment -

          I understand that this version of datanucleus doesn't work with JDK7. But I think it's more important that we don't release a version of hive that doesn't work with SQL Server as metastore.

          I don't agree. We don't even provide scripts for SQL Server:

          https://github.com/apache/hive/tree/trunk/metastore/scripts/upgrade

          whereas Java 6 is end of life.
          shanyu zhao added a comment -

          Brock Noland Thanks for your response. I understand that SQL Server-specific scripts are not provided for database upgrades. But are you saying that SQL Server is NOT a supported database for the metastore?

          Btw, SQL Server works fine with previous Hive releases, e.g. 0.11.0, so this would be a regression.
          Brock Noland added a comment -

          Hi,

          I don't want to block SQL Server support. However, SQL Server is just as supported as DB2, Informix, Sybase, SQLite, Teradata, Netezza, etc. It might work or it might not. There is no official support for any database; all "support" is de facto. Given there are no SQL scripts for SQL Server shipped with Hive, we cannot even claim de facto support for SQL Server.

          I'd be more than happy to review SQL Server scripts and review any changes required to upgrade to a version of DN with SQL Server support fixed, but reverting a change that has widespread community support is not the way toward SQL Server support.

          Until DN is able to provide a fix or workaround that we can incorporate in Hive, I suggest SQL Server users apply your patch on top of the Hive 0.12 release. Minor customization of Apache software is extremely common, as all Apache releases are source code, with the binaries provided only as a convenience to users.

          Brock
          Konstantin Boudnik added a comment -

          Let's make this ticket clear - this is MS SQLServer, not any SQL server.
          shanyu zhao added a comment -

          The datanucleus patch was committed:
          http://www.datanucleus.org/servlet/jira/browse/NUCRDBMS-692

          So we can move to 3.2.7 when it is released (according to their website, October 2013).
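
          Once 3.2.7 is available, the upgrade itself amounts to bumping the dependency version. A sketch of the Maven coordinates involved, assuming a pom-based build (the exact file or version property Hive uses may differ):

          ```xml
          <dependency>
            <groupId>org.datanucleus</groupId>
            <artifactId>datanucleus-rdbms</artifactId>
            <version>3.2.7</version>
          </dependency>
          ```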
          Andy Jefferson added a comment -

          FYI, 3.2.7 of datanucleus-rdbms is released.
          Brock Noland added a comment -

          Great! @shanyu, I'd be happy to review a patch upgrading to 3.2.7.
          shanyu zhao added a comment -

          Attaching a patch that upgrades datanucleus to the version with the change to support MS SQLServer 2005 and later.
          Brock Noland added a comment -

          Reuploading the patch with a correct name for testing.
          Hive QA added a comment -

          Overall: +1 all checks pass

          Here are the results of testing the latest attachment:
          https://issues.apache.org/jira/secure/attachment/12610570/HIVE-5218.2.patch

          SUCCESS: +1 4502 tests passed

          Test results: https://builds.apache.org/job/PreCommit-HIVE-Build/1270/testReport
          Console output: https://builds.apache.org/job/PreCommit-HIVE-Build/1270/console

          Messages:

          Executing org.apache.hive.ptest.execution.PrepPhase
          Executing org.apache.hive.ptest.execution.ExecutionPhase
          Executing org.apache.hive.ptest.execution.ReportingPhase

          This message is automatically generated.
          shanyu zhao added a comment -

          Created a new patch that is applicable on trunk (because trunk moved to Maven).
          Brock Noland added a comment -

          Thank you Shanyu! +1 pending tests
          Hive QA added a comment -

          Overall: -1 no tests executed

          Here are the results of testing the latest attachment:
          https://issues.apache.org/jira/secure/attachment/12612507/HIVE-5218-trunk.patch

          Test results: http://bigtop01.cloudera.org:8080/job/PreCommit-HIVE-Build/173/testReport
          Console output: http://bigtop01.cloudera.org:8080/job/PreCommit-HIVE-Build/173/console

          Messages:

          Executing org.apache.hive.ptest.execution.PrepPhase
          Tests failed with: NonZeroExitCodeException: Command 'bash /data/hive-ptest/working/scratch/source-prep.sh' failed with exit status 1 and output '+ [[ -n '' ]]
          + export 'ANT_OPTS=-Xmx1g -XX:MaxPermSize=256m -Dhttp.proxyHost=localhost -Dhttp.proxyPort=3128'
          + ANT_OPTS='-Xmx1g -XX:MaxPermSize=256m -Dhttp.proxyHost=localhost -Dhttp.proxyPort=3128'
          + export 'M2_OPTS=-Xmx1g -XX:MaxPermSize=256m -Dhttp.proxyHost=localhost -Dhttp.proxyPort=3128'
          + M2_OPTS='-Xmx1g -XX:MaxPermSize=256m -Dhttp.proxyHost=localhost -Dhttp.proxyPort=3128'
          + cd /data/hive-ptest/working/
          + tee /data/hive-ptest/logs/PreCommit-HIVE-Build-173/source-prep.txt
          + [[ false == \t\r\u\e ]]
          + mkdir -p maven ivy
          + [[ svn = \s\v\n ]]
          + [[ -n '' ]]
          + [[ -d apache-svn-trunk-source ]]
          + [[ ! -d apache-svn-trunk-source/.svn ]]
          + [[ ! -d apache-svn-trunk-source ]]
          + cd apache-svn-trunk-source
          + svn revert -R .
          Reverted 'hbase-handler/src/test/results/positive/hbase_stats2.q.out'
          Reverted 'hbase-handler/src/test/results/positive/hbase_stats3.q.out'
          Reverted 'hbase-handler/src/test/results/positive/hbase_stats.q.out'
          Reverted 'hbase-handler/src/test/results/positive/hbase_stats_empty_partition.q.out'
          Reverted 'metastore/src/java/org/apache/hadoop/hive/metastore/MetaStoreUtils.java'
          Reverted 'common/src/java/org/apache/hadoop/hive/common/StatsSetupConst.java'
          Reverted 'ql/src/test/results/clientnegative/unset_table_property.q.out'
          Reverted 'ql/src/test/results/clientnegative/stats_partialscan_autogether.q.out'
          Reverted 'ql/src/test/results/clientpositive/infer_bucket_sort.q.out'
          Reverted 'ql/src/test/results/clientpositive/groupby_ppr.q.out'
          Reverted 'ql/src/test/results/clientpositive/stats8.q.out'
          Reverted 'ql/src/test/results/clientpositive/infer_bucket_sort_num_buckets.q.out'
          Reverted 'ql/src/test/results/clientpositive/input_part7.q.out'
          Reverted 'ql/src/test/results/clientpositive/bucketmapjoin5.q.out'
          Reverted 'ql/src/test/results/clientpositive/pcr.q.out'
          Reverted 'ql/src/test/results/clientpositive/show_tblproperties.q.out'
          Reverted 'ql/src/test/results/clientpositive/infer_bucket_sort_map_operators.q.out'
          Reverted 'ql/src/test/results/clientpositive/stats3.q.out'
          Reverted 'ql/src/test/results/clientpositive/join33.q.out'
          Reverted 'ql/src/test/results/clientpositive/input_part2.q.out'
          Reverted 'ql/src/test/results/clientpositive/alter_partition_coltype.q.out'
          Reverted 'ql/src/test/results/clientpositive/stats_noscan_1.q.out'
          Reverted 'ql/src/test/results/clientpositive/auto_sortmerge_join_4.q.out'
          Reverted 'ql/src/test/results/clientpositive/infer_bucket_sort_dyn_part.q.out'
          Reverted 'ql/src/test/results/clientpositive/load_dyn_part8.q.out'
          Reverted 'ql/src/test/results/clientpositive/sample9.q.out'
          Reverted 'ql/src/test/results/clientpositive/infer_bucket_sort_merge.q.out'
          Reverted 'ql/src/test/results/clientpositive/describe_table.q.out'
          Reverted 'ql/src/test/results/clientpositive/groupby_map_ppr.q.out'
          Reverted 'ql/src/test/results/clientpositive/groupby_sort_6.q.out'
          Reverted 'ql/src/test/results/clientpositive/sample4.q.out'
          Reverted 'ql/src/test/results/clientpositive/stats18.q.out'
          Reverted 'ql/src/test/results/clientpositive/push_or.q.out'
          Reverted 'ql/src/test/results/clientpositive/bucketcontext_7.q.out'
          Reverted 'ql/src/test/results/clientpositive/groupby_sort_1.q.out'
          Reverted 'ql/src/test/results/clientpositive/stats13.q.out'
          Reverted 'ql/src/test/results/clientpositive/udf_reflect2.q.out'
          Reverted 'ql/src/test/results/clientpositive/auto_sortmerge_join_11.q.out'
          Reverted 'ql/src/test/results/clientpositive/infer_bucket_sort_convert_join.q.out'
          Reverted 'ql/src/test/results/clientpositive/rand_partitionpruner1.q.out'
          Reverted 'ql/src/test/results/clientpositive/combine2_hadoop20.q.out'
          Reverted 'ql/src/test/results/clientpositive/show_create_table_alter.q.out'
          Reverted 'ql/src/test/results/clientpositive/bucketcontext_2.q.out'
          Reverted 'ql/src/test/results/clientpositive/auto_join_reordering_values.q.out'
          Reverted 'ql/src/test/results/clientpositive/stats_only_null.q.out'
          Reverted 'ql/src/test/results/clientpositive/bucket2.q.out'
          Reverted 'ql/src/test/results/clientpositive/infer_bucket_sort_multi_insert.q.out'
          Reverted 'ql/src/test/results/clientpositive/groupby_map_ppr_multi_distinct.q.out'
          Reverted 'ql/src/test/results/clientpositive/parallel_orderby.q.out'
          Reverted 'ql/src/test/results/clientpositive/filter_join_breaktask.q.out'
          Reverted 'ql/src/test/results/clientpositive/sort_merge_join_desc_5.q.out'
          Reverted 'ql/src/test/results/clientpositive/join17.q.out'
          Reverted 'ql/src/test/results/clientpositive/input_part9.q.out'
          Reverted 'ql/src/test/results/clientpositive/bucketmapjoin7.q.out'
          Reverted 'ql/src/test/results/clientpositive/alter_table_serde2.q.out'
          Reverted 'ql/src/test/results/clientpositive/bucketmapjoin11.q.out'
          Reverted 'ql/src/test/results/clientpositive/join26.q.out'
          Reverted 'ql/src/test/results/clientpositive/bucketmapjoin_negative.q.out'
          Reverted 'ql/src/test/results/clientpositive/rcfile_default_format.q.out'
          Reverted 'ql/src/test/results/clientpositive/stats5.q.out'
          Reverted 'ql/src/test/results/clientpositive/ppd_join_filter.q.out'
          Reverted 'ql/src/test/results/clientpositive/join35.q.out'
          Reverted 'ql/src/test/results/clientpositive/bucketmapjoin2.q.out'
          Reverted 'ql/src/test/results/clientpositive/alter_partition_clusterby_sortby.q.out'
          Reverted 'ql/src/test/results/clientpositive/join_map_ppr.q.out'
          Reverted 'ql/src/test/results/clientpositive/stats0.q.out'
          Reverted 'ql/src/test/results/clientpositive/join9.q.out'
          Reverted 'ql/src/test/results/clientpositive/smb_mapjoin_11.q.out'
          Reverted 'ql/src/test/results/clientpositive/ppr_allchildsarenull.q.out'
          Reverted 'ql/src/test/results/clientpositive/auto_sortmerge_join_1.q.out'
          Reverted 'ql/src/test/results/clientpositive/sample6.q.out'
          Reverted 'ql/src/test/results/clientpositive/join_filters_overlap.q.out'
          Reverted 'ql/src/test/results/clientpositive/stats_empty_partition.q.out'
          Reverted 'ql/src/test/results/clientpositive/create_alter_list_bucketing_table1.q.out'
          Reverted 'ql/src/test/results/clientpositive/bucket_map_join_1.q.out'
          Reverted 'ql/src/test/results/clientpositive/sample1.q.out'
          Reverted 'ql/src/test/results/clientpositive/stats15.q.out'
          Reverted 'ql/src/test/results/clientpositive/stats_partscan_1.q.out'
          Reverted 'ql/src/test/results/clientpositive/reduce_deduplicate.q.out'
          Reverted 'ql/src/test/results/clientpositive/rand_partitionpruner3.q.out'
          Reverted 'ql/src/test/results/clientpositive/bucketcontext_4.q.out'
          Reverted 'ql/src/test/results/clientpositive/stats10.q.out'
          Reverted 'ql/src/test/results/clientpositive/bucket4.q.out'
          Reverted 'ql/src/test/results/clientpositive/udtf_explode.q.out'
          Reverted 'ql/src/test/results/clientpositive/merge3.q.out'
          Reverted 'ql/src/test/results/clientpositive/sort_merge_join_desc_7.q.out'
          Reverted 'ql/src/test/results/clientpositive/binary_output_format.q.out'
          Reverted 'ql/src/test/results/clientpositive/bucketmapjoin9.q.out'
          Reverted 'ql/src/test/results/clientpositive/bucketmapjoin13.q.out'
          Reverted 'ql/src/test/results/clientpositive/stats7.q.out'
          Reverted 'ql/src/test/results/clientpositive/bucketizedhiveinputformat.q.out'
          Reverted 'ql/src/test/results/clientpositive/bucketmapjoin4.q.out'
          Reverted 'ql/src/test/results/clientpositive/union22.q.out'
          Reverted 'ql/src/test/results/clientpositive/smb_mapjoin_13.q.out'
          Reverted 'ql/src/test/results/clientpositive/unset_table_view_property.q.out'
          Reverted 'ql/src/test/results/clientpositive/join32.q.out'
          Reverted 'ql/src/test/results/clientpositive/ctas_uses_database_location.q.out'
          Reverted 'ql/src/test/results/clientpositive/input_part1.q.out'
          Reverted 'ql/src/test/results/clientpositive/groupby_sort_skew_1.q.out'
          Reverted 'ql/src/test/results/clientpositive/auto_sortmerge_join_8.q.out'
          Reverted 'ql/src/test/results/clientpositive/columnstats_partlvl.q.out'
          Reverted 'ql/src/test/results/clientpositive/auto_sortmerge_join_3.q.out'
          Reverted 'ql/src/test/results/clientpositive/bucketmapjoin_negative3.q.out'
          Reverted 'ql/src/test/results/clientpositive/sample8.q.out'
          Reverted 'ql/src/test/results/clientpositive/transform_ppr2.q.out'
          Reverted 'ql/src/test/results/clientpositive/union_ppr.q.out'
          Reverted 'ql/src/test/results/clientpositive/serde_user_properties.q.out'
          Reverted 'ql/src/test/results/clientpositive/alter_table_not_sorted.q.out'
          Reverted 'ql/src/test/results/clientpositive/alter_numbuckets_partitioned_table.q.out'
          Reverted 'ql/src/test/results/clientpositive/ctas_hadoop20.q.out'
          Reverted 'ql/src/test/results/clientpositive/ppd_vc.q.out'
          Reverted 'ql/src/test/results/clientpositive/dynamic_partition_skip_default.q.out'
          Reverted 'ql/src/test/results/clientpositive/bucketcontext_6.q.out'
          Reverted 'ql/src/test/results/clientpositive/infer_bucket_sort_bucketed_table.q.out'
          Reverted 'ql/src/test/results/clientpositive/stats12.q.out'
          Reverted 'ql/src/test/results/clientpositive/router_join_ppr.q.out'
          Reverted 'ql/src/test/results/clientpositive/bucketcontext_1.q.out'
          Reverted 'ql/src/test/results/clientpositive/bucket1.q.out'
          Reverted 'ql/src/test/results/clientpositive/input42.q.out'
          Reverted 'ql/src/test/results/clientpositive/stats9.q.out'
          Reverted 'ql/src/test/results/clientpositive/infer_bucket_sort_grouping_operators.q.out'
          Reverted 'ql/src/test/results/clientpositive/bucketmapjoin10.q.out'
          Reverted 'ql/src/test/results/clientpositive/union24.q.out'
          Reverted 'ql/src/test/results/clientpositive/metadata_only_queries.q.out'
          Reverted 'ql/src/test/results/clientpositive/stats4.q.out'
          Reverted 'ql/src/test/results/clientpositive/columnstats_tbllvl.q.out'
          Reverted 'ql/src/test/results/clientpositive/smb_mapjoin_15.q.out'
          Reverted 'ql/src/test/results/clientpositive/join34.q.out'
          Reverted 'ql/src/test/results/clientpositive/bucketmapjoin1.q.out'
          Reverted 'ql/src/test/results/clientpositive/sample10.q.out'
          Reverted 'ql/src/test/results/clientpositive/stats_noscan_2.q.out'
          Reverted 'ql/src/test/results/clientpositive/auto_sortmerge_join_5.q.out'
          Reverted 'ql/src/test/results/clientpositive/louter_join_ppr.q.out'
          Reverted 'ql/src/test/results/clientpositive/sample5.q.out'
          Reverted 'ql/src/test/results/clientpositive/infer_bucket_sort_reducers_power_two.q.out'
          Reverted 'ql/src/test/results/clientpositive/stats19.q.out'
          Reverted 'ql/src/test/results/clientpositive/bucketcontext_8.q.out'
          Reverted 'ql/src/test/results/clientpositive/udf_explode.q.out'
          Reverted 'ql/src/test/results/clientpositive/alter_numbuckets_partitioned_table2.q.out'
          Reverted 'ql/src/test/results/clientpositive/stats14.q.out'
          Reverted 'ql/src/test/results/clientpositive/auto_sortmerge_join_12.q.out'
          Reverted 'ql/src/test/results/clientpositive/rand_partitionpruner2.q.out'
          Reverted 'ql/src/test/results/clientpositive/bucketcontext_3.q.out'
          Reverted 'ql/src/test/results/clientpositive/bucket3.q.out'
          Reverted 'ql/src/test/results/clientpositive/groupby_ppr_multi_distinct.q.out'
          Reverted 'ql/src/test/results/clientpositive/sort_merge_join_desc_6.q.out'
          Reverted 'ql/src/test/results/clientpositive/bucketmapjoin8.q.out'
          Reverted 'ql/src/test/results/clientpositive/bucketmapjoin12.q.out'
          Reverted 'ql/src/test/results/clientpositive/stats6.q.out'
          Reverted 'ql/src/test/results/clientpositive/bucketmapjoin3.q.out'
          Reverted 'ql/src/test/results/clientpositive/alter_skewed_table.q.out'
          Reverted 'ql/src/test/results/clientpositive/stats1.q.out'
          Reverted 'ql/src/test/results/clientpositive/smb_mapjoin_12.q.out'
          Reverted 'ql/src/test/results/clientpositive/join32_lessSize.q.out'
          Reverted 'ql/src/test/results/clientpositive/auto_sortmerge_join_7.q.out'
          Reverted 'ql/src/test/results/clientpositive/outer_join_ppr.q.out'
          Reverted 'ql/src/test/results/clientpositive/list_bucket_dml_10.q.out'
          Reverted 'ql/src/test/results/clientpositive/bucketmapjoin_negative2.q.out'
          Reverted 'ql/src/test/results/clientpositive/auto_sortmerge_join_2.q.out'
          Reverted 'ql/src/test/results/clientpositive/sample7.q.out'
          Reverted 'ql/src/test/results/clientpositive/transform_ppr1.q.out'
          Reverted 'ql/src/test/results/clientpositive/regexp_extract.q.out'
          Reverted 'ql/src/test/results/clientpositive/bucket_map_join_2.q.out'
          Reverted 'ql/src/test/results/clientpositive/sample2.q.out'
          Reverted 'ql/src/test/results/clientpositive/stats16.q.out'
          Reverted 'ql/src/test/results/clientpositive/disable_merge_for_bucketing.q.out'
          Reverted 'ql/src/test/results/clientpositive/ppd_union_view.q.out'
          Reverted 'ql/src/test/results/clientpositive/ctas_colname.q.out'
          Reverted 'ql/src/test/results/clientpositive/truncate_column.q.out'
          Reverted 'ql/src/test/results/clientpositive/bucketcontext_5.q.out'
          Reverted 'ql/src/test/results/clientpositive/describe_comment_nonascii.q.out'
          Reverted 'ql/src/test/results/clientpositive/stats11.q.out'
          Reverted 'ql/src/test/results/clientpositive/bucket5.q.out'
          Reverted 'ql/src/test/results/clientpositive/input23.q.out'
          Reverted 'ql/src/test/results/compiler/plan/join2.q.xml'
          Reverted 'ql/src/test/results/compiler/plan/input2.q.xml'
          Reverted 'ql/src/test/results/compiler/plan/join3.q.xml'
          Reverted 'ql/src/test/results/compiler/plan/input3.q.xml'
          Reverted 'ql/src/test/results/compiler/plan/join4.q.xml'
          Reverted 'ql/src/test/results/compiler/plan/input4.q.xml'
          Reverted 'ql/src/test/results/compiler/plan/join5.q.xml'
          Reverted 'ql/src/test/results/compiler/plan/input5.q.xml'
          Reverted 'ql/src/test/results/compiler/plan/join6.q.xml'
          Reverted 'ql/src/test/results/compiler/plan/input_testxpath2.q.xml'
          Reverted 'ql/src/test/results/compiler/plan/input6.q.xml'
          Reverted 'ql/src/test/results/compiler/plan/join7.q.xml'
          Reverted 'ql/src/test/results/compiler/plan/input7.q.xml'
          Reverted 'ql/src/test/results/compiler/plan/join8.q.xml'
          Reverted 'ql/src/test/results/compiler/plan/input_testsequencefile.q.xml'
          Reverted 'ql/src/test/results/compiler/plan/input8.q.xml'
          Reverted 'ql/src/test/results/compiler/plan/input9.q.xml'
          Reverted 'ql/src/test/results/compiler/plan/union.q.xml'
          Reverted 'ql/src/test/results/compiler/plan/udf1.q.xml'
          Reverted 'ql/src/test/results/compiler/plan/input_testxpath.q.xml'
          Reverted 'ql/src/test/results/compiler/plan/udf6.q.xml'
          Reverted 'ql/src/test/results/compiler/plan/input_part1.q.xml'
          Reverted 'ql/src/test/results/compiler/plan/groupby1.q.xml'
          Reverted 'ql/src/test/results/compiler/plan/groupby2.q.xml'
          Reverted 'ql/src/test/results/compiler/plan/udf_case.q.xml'
          Reverted 'ql/src/test/results/compiler/plan/groupby3.q.xml'
          Reverted 'ql/src/test/results/compiler/plan/subq.q.xml'
          Reverted 'ql/src/test/results/compiler/plan/groupby4.q.xml'
          Reverted 'ql/src/test/results/compiler/plan/groupby5.q.xml'
          Reverted 'ql/src/test/results/compiler/plan/groupby6.q.xml'
          Reverted 'ql/src/test/results/compiler/plan/case_sensitivity.q.xml'
          Reverted 'ql/src/test/results/compiler/plan/udf_when.q.xml'
          Reverted 'ql/src/test/results/compiler/plan/input20.q.xml'
          Reverted 'ql/src/test/results/compiler/plan/sample1.q.xml'
          Reverted 'ql/src/test/results/compiler/plan/sample2.q.xml'
          Reverted 'ql/src/test/results/compiler/plan/sample3.q.xml'
          Reverted 'ql/src/test/results/compiler/plan/sample4.q.xml'
          Reverted 'ql/src/test/results/compiler/plan/sample5.q.xml'
          Reverted 'ql/src/test/results/compiler/plan/sample6.q.xml'
          Reverted 'ql/src/test/results/compiler/plan/sample7.q.xml'
          Reverted 'ql/src/test/results/compiler/plan/cast1.q.xml'
          Reverted 'ql/src/test/results/compiler/plan/join1.q.xml'
          Reverted 'ql/src/test/results/compiler/plan/input1.q.xml'
          Reverted 'ql/src/test/queries/clientpositive/stats_only_null.q'
          Reverted 'ql/src/java/org/apache/hadoop/hive/ql/optimizer/StatsOptimizer.java'
          Reverted 'ql/src/java/org/apache/hadoop/hive/ql/metadata/Hive.java'
          Reverted 'ql/src/java/org/apache/hadoop/hive/ql/exec/StatsTask.java'
          ++ awk '{print $2}'
          ++ egrep -v '^X|^Performing status on external'
          ++ svn status --no-ignore
          + rm -rf target datanucleus.log ant/target shims/0.20/target shims/assembly/target shims/0.20S/target shims/0.23/target shims/common/target shims/common-secure/target metastore/target common/target common/src/gen serde/target ql/src/test/results/clientpositive/stats_invalidation.q.out ql/src/test/queries/clientpositive/stats_invalidation.q
          + svn update
          
          Fetching external item into 'hcatalog/src/test/e2e/harness'
          External at revision 1539718.
          
          At revision 1539718.
          + patchCommandPath=/data/hive-ptest/working/scratch/smart-apply-patch.sh
          + patchFilePath=/data/hive-ptest/working/scratch/build.patch
          + [[ -f /data/hive-ptest/working/scratch/build.patch ]]
          + chmod +x /data/hive-ptest/working/scratch/smart-apply-patch.sh
          + /data/hive-ptest/working/scratch/smart-apply-patch.sh /data/hive-ptest/working/scratch/build.patch
          Going to apply patch with: patch -p0
          (Stripping trailing CRs from patch.)
          patching file pom.xml
          Hunk #1 succeeded at 81 (offset 19 lines).
          + [[ maven == \m\a\v\e\n ]]
          + rm -rf /data/hive-ptest/working/maven/org/apache/hive
          + mvn -B clean install -DskipTests -Dmaven.repo.local=/data/hive-ptest/working/maven
          [INFO] Scanning for projects...
          [INFO] ------------------------------------------------------------------------
          [INFO] Reactor Build Order:
          [INFO] 
          [INFO] Hive
          [INFO] Hive Ant Utilities
          [INFO] Hive Shims Common
          [INFO] Hive Shims 0.20
          [INFO] Hive Shims Secure Common
          [INFO] Hive Shims 0.20S
          [INFO] Hive Shims 0.23
          [INFO] Hive Shims
          [INFO] Hive Common
          [INFO] Hive Serde
          [INFO] Hive Metastore
          [INFO] Hive Query Language
          [INFO] Hive Service
          [INFO] Hive JDBC
          [INFO] Hive Beeline
          [INFO] Hive CLI
          [INFO] Hive Contrib
          [INFO] Hive HBase Handler
          [INFO] Hive HCatalog
          [INFO] Hive HCatalog Core
          [INFO] Hive HCatalog Pig Adapter
          [INFO] Hive HCatalog Server Extensions
          [INFO] Hive HCatalog Webhcat Java Client
          [INFO] Hive HCatalog Webhcat
          [INFO] Hive HCatalog HBase Storage Handler
          [INFO] Hive HWI
          [INFO] Hive ODBC
          [INFO] Hive Shims Aggregator
          [INFO] Hive TestUtils
          [INFO] Hive Packaging
          [INFO]                                                                         
          [INFO] ------------------------------------------------------------------------
          [INFO] Building Hive 0.13.0-SNAPSHOT
          [INFO] ------------------------------------------------------------------------
          [INFO] 
          [INFO] --- maven-clean-plugin:2.5:clean (default-clean) @ hive ---
          [INFO] Deleting /data/hive-ptest/working/apache-svn-trunk-source (includes = [datanucleus.log, derby.log], excludes = [])
          [INFO] 
          [INFO] --- maven-antrun-plugin:1.7:run (define-classpath) @ hive ---
          [INFO] Executing tasks
          
          main:
          [INFO] Executed tasks
          [INFO] 
          [INFO] --- maven-antrun-plugin:1.7:run (setup-test-dirs) @ hive ---
          [INFO] Executing tasks
          
          main:
              [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/target/tmp
              [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/target/warehouse
              [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/target/tmp/conf
               [copy] Copying 4 files to /data/hive-ptest/working/apache-svn-trunk-source/target/tmp/conf
          [INFO] Executed tasks
          [INFO] 
          [INFO] --- maven-install-plugin:2.4:install (default-install) @ hive ---
          [INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/pom.xml to /data/hive-ptest/working/maven/org/apache/hive/hive/0.13.0-SNAPSHOT/hive-0.13.0-SNAPSHOT.pom
          [INFO]                                                                         
          [INFO] ------------------------------------------------------------------------
          [INFO] Building Hive Ant Utilities 0.13.0-SNAPSHOT
          [INFO] ------------------------------------------------------------------------
          [INFO] 
          [INFO] --- maven-clean-plugin:2.5:clean (default-clean) @ hive-ant ---
          [INFO] Deleting /data/hive-ptest/working/apache-svn-trunk-source/ant (includes = [datanucleus.log, derby.log], excludes = [])
          [INFO] 
          [INFO] --- maven-resources-plugin:2.5:resources (default-resources) @ hive-ant ---
          [debug] execute contextualize
          [INFO] Using 'UTF-8' encoding to copy filtered resources.
          [INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-trunk-source/ant/src/main/resources
          [INFO] 
          [INFO] --- maven-antrun-plugin:1.7:run (define-classpath) @ hive-ant ---
          [INFO] Executing tasks
          
          main:
          [INFO] Executed tasks
          [INFO] 
          [INFO] --- maven-compiler-plugin:3.1:compile (default-compile) @ hive-ant ---
          [INFO] Compiling 5 source files to /data/hive-ptest/working/apache-svn-trunk-source/ant/target/classes
          [WARNING] Note: /data/hive-ptest/working/apache-svn-trunk-source/ant/src/org/apache/hadoop/hive/ant/QTestGenTask.java uses or overrides a deprecated API.
          [WARNING] Note: Recompile with -Xlint:deprecation for details.
          [WARNING] Note: /data/hive-ptest/working/apache-svn-trunk-source/ant/src/org/apache/hadoop/hive/ant/DistinctElementsClassPath.java uses unchecked or unsafe operations.
          [WARNING] Note: Recompile with -Xlint:unchecked for details.
          [INFO] 
          [INFO] --- maven-resources-plugin:2.5:testResources (default-testResources) @ hive-ant ---
          [debug] execute contextualize
          [INFO] Using 'UTF-8' encoding to copy filtered resources.
          [INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-trunk-source/ant/src/test/resources
          [INFO] 
          [INFO] --- maven-antrun-plugin:1.7:run (setup-test-dirs) @ hive-ant ---
          [INFO] Executing tasks
          
          main:
              [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/ant/target/tmp
              [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/ant/target/warehouse
              [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/ant/target/tmp/conf
               [copy] Copying 4 files to /data/hive-ptest/working/apache-svn-trunk-source/ant/target/tmp/conf
          [INFO] Executed tasks
          [INFO] 
          [INFO] --- maven-compiler-plugin:3.1:testCompile (default-testCompile) @ hive-ant ---
          [INFO] No sources to compile
          [INFO] 
          [INFO] --- maven-surefire-plugin:2.16:test (default-test) @ hive-ant ---
          [INFO] Tests are skipped.
          [INFO] 
          [INFO] --- maven-jar-plugin:2.2:jar (default-jar) @ hive-ant ---
          [INFO] Building jar: /data/hive-ptest/working/apache-svn-trunk-source/ant/target/hive-ant-0.13.0-SNAPSHOT.jar
          [INFO] 
          [INFO] --- maven-install-plugin:2.4:install (default-install) @ hive-ant ---
          [INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/ant/target/hive-ant-0.13.0-SNAPSHOT.jar to /data/hive-ptest/working/maven/org/apache/hive/hive-ant/0.13.0-SNAPSHOT/hive-ant-0.13.0-SNAPSHOT.jar
          [INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/ant/pom.xml to /data/hive-ptest/working/maven/org/apache/hive/hive-ant/0.13.0-SNAPSHOT/hive-ant-0.13.0-SNAPSHOT.pom
          [INFO]                                                                         
          [INFO] ------------------------------------------------------------------------
          [INFO] Building Hive Shims Common 0.13.0-SNAPSHOT
          [INFO] ------------------------------------------------------------------------
          [INFO] 
          [INFO] --- maven-clean-plugin:2.5:clean (default-clean) @ hive-shims-common ---
          [INFO] Deleting /data/hive-ptest/working/apache-svn-trunk-source/shims/common (includes = [datanucleus.log, derby.log], excludes = [])
          [INFO] 
          [INFO] --- maven-resources-plugin:2.5:resources (default-resources) @ hive-shims-common ---
          [debug] execute contextualize
          [INFO] Using 'UTF-8' encoding to copy filtered resources.
          [INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-trunk-source/shims/common/src/main/resources
          [INFO] 
          [INFO] --- maven-antrun-plugin:1.7:run (define-classpath) @ hive-shims-common ---
          [INFO] Executing tasks
          
          main:
          [INFO] Executed tasks
          [INFO] 
          [INFO] --- maven-compiler-plugin:3.1:compile (default-compile) @ hive-shims-common ---
          [INFO] Compiling 15 source files to /data/hive-ptest/working/apache-svn-trunk-source/shims/common/target/classes
          [WARNING] Note: Some input files use or override a deprecated API.
          [WARNING] Note: Recompile with -Xlint:deprecation for details.
          [INFO] 
          [INFO] --- maven-resources-plugin:2.5:testResources (default-testResources) @ hive-shims-common ---
          [debug] execute contextualize
          [INFO] Using 'UTF-8' encoding to copy filtered resources.
          [INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-trunk-source/shims/common/src/test/resources
          [INFO] 
          [INFO] --- maven-antrun-plugin:1.7:run (setup-test-dirs) @ hive-shims-common ---
          [INFO] Executing tasks
          
          main:
              [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/shims/common/target/tmp
              [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/shims/common/target/warehouse
              [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/shims/common/target/tmp/conf
               [copy] Copying 4 files to /data/hive-ptest/working/apache-svn-trunk-source/shims/common/target/tmp/conf
          [INFO] Executed tasks
          [INFO] 
          [INFO] --- maven-compiler-plugin:3.1:testCompile (default-testCompile) @ hive-shims-common ---
          [INFO] No sources to compile
          [INFO] 
          [INFO] --- maven-surefire-plugin:2.16:test (default-test) @ hive-shims-common ---
          [INFO] Tests are skipped.
          [INFO] 
          [INFO] --- maven-jar-plugin:2.2:jar (default-jar) @ hive-shims-common ---
          [INFO] Building jar: /data/hive-ptest/working/apache-svn-trunk-source/shims/common/target/hive-shims-common-0.13.0-SNAPSHOT.jar
          [INFO] 
          [INFO] --- maven-install-plugin:2.4:install (default-install) @ hive-shims-common ---
          [INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/shims/common/target/hive-shims-common-0.13.0-SNAPSHOT.jar to /data/hive-ptest/working/maven/org/apache/hive/shims/hive-shims-common/0.13.0-SNAPSHOT/hive-shims-common-0.13.0-SNAPSHOT.jar
          [INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/shims/common/pom.xml to /data/hive-ptest/working/maven/org/apache/hive/shims/hive-shims-common/0.13.0-SNAPSHOT/hive-shims-common-0.13.0-SNAPSHOT.pom
          [INFO]                                                                         
          [INFO] ------------------------------------------------------------------------
          [INFO] Building Hive Shims 0.20 0.13.0-SNAPSHOT
          [INFO] ------------------------------------------------------------------------
          [INFO] 
          [INFO] --- maven-clean-plugin:2.5:clean (default-clean) @ hive-shims-0.20 ---
          [INFO] Deleting /data/hive-ptest/working/apache-svn-trunk-source/shims/0.20 (includes = [datanucleus.log, derby.log], excludes = [])
          [INFO] 
          [INFO] --- maven-resources-plugin:2.5:resources (default-resources) @ hive-shims-0.20 ---
          [debug] execute contextualize
          [INFO] Using 'UTF-8' encoding to copy filtered resources.
          [INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-trunk-source/shims/0.20/src/main/resources
          [INFO] 
          [INFO] --- maven-antrun-plugin:1.7:run (define-classpath) @ hive-shims-0.20 ---
          [INFO] Executing tasks
          
          main:
          [INFO] Executed tasks
          [INFO] 
          [INFO] --- maven-compiler-plugin:3.1:compile (default-compile) @ hive-shims-0.20 ---
          [INFO] Compiling 2 source files to /data/hive-ptest/working/apache-svn-trunk-source/shims/0.20/target/classes
          [WARNING] Note: /data/hive-ptest/working/apache-svn-trunk-source/shims/0.20/src/main/java/org/apache/hadoop/hive/shims/Hadoop20Shims.java uses or overrides a deprecated API.
          [WARNING] Note: Recompile with -Xlint:deprecation for details.
          [WARNING] Note: /data/hive-ptest/working/apache-svn-trunk-source/shims/0.20/src/main/java/org/apache/hadoop/hive/shims/Hadoop20Shims.java uses unchecked or unsafe operations.
          [WARNING] Note: Recompile with -Xlint:unchecked for details.
          [INFO] 
          [INFO] --- maven-resources-plugin:2.5:testResources (default-testResources) @ hive-shims-0.20 ---
          [debug] execute contextualize
          [INFO] Using 'UTF-8' encoding to copy filtered resources.
          [INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-trunk-source/shims/0.20/src/test/resources
          [INFO] 
          [INFO] --- maven-antrun-plugin:1.7:run (setup-test-dirs) @ hive-shims-0.20 ---
          [INFO] Executing tasks
          
          main:
              [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/shims/0.20/target/tmp
              [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/shims/0.20/target/warehouse
              [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/shims/0.20/target/tmp/conf
               [copy] Copying 4 files to /data/hive-ptest/working/apache-svn-trunk-source/shims/0.20/target/tmp/conf
          [INFO] Executed tasks
          [INFO] 
          [INFO] --- maven-compiler-plugin:3.1:testCompile (default-testCompile) @ hive-shims-0.20 ---
          [INFO] No sources to compile
          [INFO] 
          [INFO] --- maven-surefire-plugin:2.16:test (default-test) @ hive-shims-0.20 ---
          [INFO] Tests are skipped.
          [INFO] 
          [INFO] --- maven-jar-plugin:2.2:jar (default-jar) @ hive-shims-0.20 ---
          [INFO] Building jar: /data/hive-ptest/working/apache-svn-trunk-source/shims/0.20/target/hive-shims-0.20-0.13.0-SNAPSHOT.jar
          [INFO] 
          [INFO] --- maven-install-plugin:2.4:install (default-install) @ hive-shims-0.20 ---
          [INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/shims/0.20/target/hive-shims-0.20-0.13.0-SNAPSHOT.jar to /data/hive-ptest/working/maven/org/apache/hive/shims/hive-shims-0.20/0.13.0-SNAPSHOT/hive-shims-0.20-0.13.0-SNAPSHOT.jar
          [INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/shims/0.20/pom.xml to /data/hive-ptest/working/maven/org/apache/hive/shims/hive-shims-0.20/0.13.0-SNAPSHOT/hive-shims-0.20-0.13.0-SNAPSHOT.pom
          [INFO]                                                                         
          [INFO] ------------------------------------------------------------------------
          [INFO] Building Hive Shims Secure Common 0.13.0-SNAPSHOT
          [INFO] ------------------------------------------------------------------------
          [INFO] 
          [INFO] --- maven-clean-plugin:2.5:clean (default-clean) @ hive-shims-common-secure ---
          [INFO] Deleting /data/hive-ptest/working/apache-svn-trunk-source/shims/common-secure (includes = [datanucleus.log, derby.log], excludes = [])
          [INFO] 
          [INFO] --- maven-resources-plugin:2.5:resources (default-resources) @ hive-shims-common-secure ---
          [debug] execute contextualize
          [INFO] Using 'UTF-8' encoding to copy filtered resources.
          [INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-trunk-source/shims/common-secure/src/main/resources
          [INFO] 
          [INFO] --- maven-antrun-plugin:1.7:run (define-classpath) @ hive-shims-common-secure ---
          [INFO] Executing tasks
          
          main:
          [INFO] Executed tasks
          [INFO] 
          [INFO] --- maven-compiler-plugin:3.1:compile (default-compile) @ hive-shims-common-secure ---
          [INFO] Compiling 12 source files to /data/hive-ptest/working/apache-svn-trunk-source/shims/common-secure/target/classes
          [WARNING] Note: /data/hive-ptest/working/apache-svn-trunk-source/shims/common-secure/src/main/java/org/apache/hadoop/hive/shims/HadoopShimsSecure.java uses or overrides a deprecated API.
          [WARNING] Note: Recompile with -Xlint:deprecation for details.
          [WARNING] Note: Some input files use unchecked or unsafe operations.
          [WARNING] Note: Recompile with -Xlint:unchecked for details.
          [INFO] 
          [INFO] --- maven-resources-plugin:2.5:testResources (default-testResources) @ hive-shims-common-secure ---
          [debug] execute contextualize
          [INFO] Using 'UTF-8' encoding to copy filtered resources.
          [INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-trunk-source/shims/common-secure/src/test/resources
          [INFO] 
          [INFO] --- maven-antrun-plugin:1.7:run (setup-test-dirs) @ hive-shims-common-secure ---
          [INFO] Executing tasks
          
          main:
              [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/shims/common-secure/target/tmp
              [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/shims/common-secure/target/warehouse
              [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/shims/common-secure/target/tmp/conf
               [copy] Copying 4 files to /data/hive-ptest/working/apache-svn-trunk-source/shims/common-secure/target/tmp/conf
          [INFO] Executed tasks
          [INFO] 
          [INFO] --- maven-compiler-plugin:3.1:testCompile (default-testCompile) @ hive-shims-common-secure ---
          [INFO] No sources to compile
          [INFO] 
          [INFO] --- maven-surefire-plugin:2.16:test (default-test) @ hive-shims-common-secure ---
          [INFO] Tests are skipped.
          [INFO] 
          [INFO] --- maven-jar-plugin:2.2:jar (default-jar) @ hive-shims-common-secure ---
          [INFO] Building jar: /data/hive-ptest/working/apache-svn-trunk-source/shims/common-secure/target/hive-shims-common-secure-0.13.0-SNAPSHOT.jar
          [INFO] 
          [INFO] --- maven-install-plugin:2.4:install (default-install) @ hive-shims-common-secure ---
          [INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/shims/common-secure/target/hive-shims-common-secure-0.13.0-SNAPSHOT.jar to /data/hive-ptest/working/maven/org/apache/hive/shims/hive-shims-common-secure/0.13.0-SNAPSHOT/hive-shims-common-secure-0.13.0-SNAPSHOT.jar
          [INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/shims/common-secure/pom.xml to /data/hive-ptest/working/maven/org/apache/hive/shims/hive-shims-common-secure/0.13.0-SNAPSHOT/hive-shims-common-secure-0.13.0-SNAPSHOT.pom
          [INFO]                                                                         
          [INFO] ------------------------------------------------------------------------
          [INFO] Building Hive Shims 0.20S 0.13.0-SNAPSHOT
          [INFO] ------------------------------------------------------------------------
          [INFO] 
          [INFO] --- maven-clean-plugin:2.5:clean (default-clean) @ hive-shims-0.20S ---
          [INFO] Deleting /data/hive-ptest/working/apache-svn-trunk-source/shims/0.20S (includes = [datanucleus.log, derby.log], excludes = [])
          [INFO] 
          [INFO] --- maven-resources-plugin:2.5:resources (default-resources) @ hive-shims-0.20S ---
          [debug] execute contextualize
          [INFO] Using 'UTF-8' encoding to copy filtered resources.
          [INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-trunk-source/shims/0.20S/src/main/resources
          [INFO] 
          [INFO] --- maven-antrun-plugin:1.7:run (define-classpath) @ hive-shims-0.20S ---
          [INFO] Executing tasks
          
          main:
          [INFO] Executed tasks
          [INFO] 
          [INFO] --- maven-compiler-plugin:3.1:compile (default-compile) @ hive-shims-0.20S ---
          [INFO] Compiling 3 source files to /data/hive-ptest/working/apache-svn-trunk-source/shims/0.20S/target/classes
          [WARNING] Note: /data/hive-ptest/working/apache-svn-trunk-source/shims/0.20S/src/main/java/org/apache/hadoop/hive/shims/Hadoop20SShims.java uses or overrides a deprecated API.
          [WARNING] Note: Recompile with -Xlint:deprecation for details.
          [INFO] 
          [INFO] --- maven-resources-plugin:2.5:testResources (default-testResources) @ hive-shims-0.20S ---
          [debug] execute contextualize
          [INFO] Using 'UTF-8' encoding to copy filtered resources.
          [INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-trunk-source/shims/0.20S/src/test/resources
          [INFO] 
          [INFO] --- maven-antrun-plugin:1.7:run (setup-test-dirs) @ hive-shims-0.20S ---
          [INFO] Executing tasks
          
          main:
              [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/shims/0.20S/target/tmp
              [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/shims/0.20S/target/warehouse
              [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/shims/0.20S/target/tmp/conf
               [copy] Copying 4 files to /data/hive-ptest/working/apache-svn-trunk-source/shims/0.20S/target/tmp/conf
          [INFO] Executed tasks
          [INFO] 
          [INFO] --- maven-compiler-plugin:3.1:testCompile (default-testCompile) @ hive-shims-0.20S ---
          [INFO] No sources to compile
          [INFO] 
          [INFO] --- maven-surefire-plugin:2.16:test (default-test) @ hive-shims-0.20S ---
          [INFO] Tests are skipped.
          [INFO] 
          [INFO] --- maven-jar-plugin:2.2:jar (default-jar) @ hive-shims-0.20S ---
          [INFO] Building jar: /data/hive-ptest/working/apache-svn-trunk-source/shims/0.20S/target/hive-shims-0.20S-0.13.0-SNAPSHOT.jar
          [INFO] 
          [INFO] --- maven-install-plugin:2.4:install (default-install) @ hive-shims-0.20S ---
          [INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/shims/0.20S/target/hive-shims-0.20S-0.13.0-SNAPSHOT.jar to /data/hive-ptest/working/maven/org/apache/hive/shims/hive-shims-0.20S/0.13.0-SNAPSHOT/hive-shims-0.20S-0.13.0-SNAPSHOT.jar
          [INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/shims/0.20S/pom.xml to /data/hive-ptest/working/maven/org/apache/hive/shims/hive-shims-0.20S/0.13.0-SNAPSHOT/hive-shims-0.20S-0.13.0-SNAPSHOT.pom
          [INFO]                                                                         
          [INFO] ------------------------------------------------------------------------
          [INFO] Building Hive Shims 0.23 0.13.0-SNAPSHOT
          [INFO] ------------------------------------------------------------------------
          [INFO] 
          [INFO] --- maven-clean-plugin:2.5:clean (default-clean) @ hive-shims-0.23 ---
          [INFO] Deleting /data/hive-ptest/working/apache-svn-trunk-source/shims/0.23 (includes = [datanucleus.log, derby.log], excludes = [])
          [INFO] 
          [INFO] --- maven-resources-plugin:2.5:resources (default-resources) @ hive-shims-0.23 ---
          [debug] execute contextualize
          [INFO] Using 'UTF-8' encoding to copy filtered resources.
          [INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-trunk-source/shims/0.23/src/main/resources
          [INFO] 
          [INFO] --- maven-antrun-plugin:1.7:run (define-classpath) @ hive-shims-0.23 ---
          [INFO] Executing tasks
          
          main:
          [INFO] Executed tasks
          [INFO] 
          [INFO] --- maven-compiler-plugin:3.1:compile (default-compile) @ hive-shims-0.23 ---
          [INFO] Compiling 3 source files to /data/hive-ptest/working/apache-svn-trunk-source/shims/0.23/target/classes
          [WARNING] Note: /data/hive-ptest/working/apache-svn-trunk-source/shims/0.23/src/main/java/org/apache/hadoop/hive/shims/Hadoop23Shims.java uses or overrides a deprecated API.
          [WARNING] Note: Recompile with -Xlint:deprecation for details.
          [INFO] 
          [INFO] --- maven-resources-plugin:2.5:testResources (default-testResources) @ hive-shims-0.23 ---
          [debug] execute contextualize
          [INFO] Using 'UTF-8' encoding to copy filtered resources.
          [INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-trunk-source/shims/0.23/src/test/resources
          [INFO] 
          [INFO] --- maven-antrun-plugin:1.7:run (setup-test-dirs) @ hive-shims-0.23 ---
          [INFO] Executing tasks
          
          main:
              [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/shims/0.23/target/tmp
              [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/shims/0.23/target/warehouse
              [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/shims/0.23/target/tmp/conf
               [copy] Copying 4 files to /data/hive-ptest/working/apache-svn-trunk-source/shims/0.23/target/tmp/conf
          [INFO] Executed tasks
          [INFO] 
          [INFO] --- maven-compiler-plugin:3.1:testCompile (default-testCompile) @ hive-shims-0.23 ---
          [INFO] No sources to compile
          [INFO] 
          [INFO] --- maven-surefire-plugin:2.16:test (default-test) @ hive-shims-0.23 ---
          [INFO] Tests are skipped.
          [INFO] 
          [INFO] --- maven-jar-plugin:2.2:jar (default-jar) @ hive-shims-0.23 ---
          [INFO] Building jar: /data/hive-ptest/working/apache-svn-trunk-source/shims/0.23/target/hive-shims-0.23-0.13.0-SNAPSHOT.jar
          [INFO] 
          [INFO] --- maven-install-plugin:2.4:install (default-install) @ hive-shims-0.23 ---
          [INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/shims/0.23/target/hive-shims-0.23-0.13.0-SNAPSHOT.jar to /data/hive-ptest/working/maven/org/apache/hive/shims/hive-shims-0.23/0.13.0-SNAPSHOT/hive-shims-0.23-0.13.0-SNAPSHOT.jar
          [INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/shims/0.23/pom.xml to /data/hive-ptest/working/maven/org/apache/hive/shims/hive-shims-0.23/0.13.0-SNAPSHOT/hive-shims-0.23-0.13.0-SNAPSHOT.pom
          [INFO]                                                                         
          [INFO] ------------------------------------------------------------------------
          [INFO] Building Hive Shims 0.13.0-SNAPSHOT
          [INFO] ------------------------------------------------------------------------
          [INFO] 
          [INFO] --- maven-clean-plugin:2.5:clean (default-clean) @ hive-shims ---
          [INFO] Deleting /data/hive-ptest/working/apache-svn-trunk-source/shims/assembly (includes = [datanucleus.log, derby.log], excludes = [])
          [INFO] 
          [INFO] --- maven-resources-plugin:2.5:resources (default-resources) @ hive-shims ---
          [debug] execute contextualize
          [INFO] Using 'UTF-8' encoding to copy filtered resources.
          [INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-trunk-source/shims/assembly/src/main/resources
          [INFO] 
          [INFO] --- maven-antrun-plugin:1.7:run (define-classpath) @ hive-shims ---
          [INFO] Executing tasks
          
          main:
          [INFO] Executed tasks
          [INFO] 
          [INFO] --- maven-compiler-plugin:3.1:compile (default-compile) @ hive-shims ---
          [INFO] No sources to compile
          [INFO] 
          [INFO] --- maven-resources-plugin:2.5:testResources (default-testResources) @ hive-shims ---
          [debug] execute contextualize
          [INFO] Using 'UTF-8' encoding to copy filtered resources.
          [INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-trunk-source/shims/assembly/src/test/resources
          [INFO] 
          [INFO] --- maven-antrun-plugin:1.7:run (setup-test-dirs) @ hive-shims ---
          [INFO] Executing tasks
          
          main:
              [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/shims/assembly/target/tmp
              [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/shims/assembly/target/warehouse
              [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/shims/assembly/target/tmp/conf
               [copy] Copying 4 files to /data/hive-ptest/working/apache-svn-trunk-source/shims/assembly/target/tmp/conf
          [INFO] Executed tasks
          [INFO] 
          [INFO] --- maven-compiler-plugin:3.1:testCompile (default-testCompile) @ hive-shims ---
          [INFO] No sources to compile
          [INFO] 
          [INFO] --- maven-surefire-plugin:2.16:test (default-test) @ hive-shims ---
          [INFO] Tests are skipped.
          [INFO] 
          [INFO] --- maven-jar-plugin:2.2:jar (default-jar) @ hive-shims ---
          [WARNING] JAR will be empty - no content was marked for inclusion!
          [INFO] Building jar: /data/hive-ptest/working/apache-svn-trunk-source/shims/assembly/target/hive-shims-0.13.0-SNAPSHOT.jar
          [INFO] 
          [INFO] --- maven-assembly-plugin:2.3:single (uberjar) @ hive-shims ---
          [INFO] Reading assembly descriptor: src/assemble/uberjar.xml
          [WARNING] Artifact: org.apache.hive:hive-shims:jar:0.13.0-SNAPSHOT references the same file as the assembly destination file. Moving it to a temporary location for inclusion.
          [INFO] META-INF/MANIFEST.MF already added, skipping
          [INFO] META-INF/MANIFEST.MF already added, skipping
          [INFO] META-INF/MANIFEST.MF already added, skipping
          [INFO] META-INF/MANIFEST.MF already added, skipping
          [INFO] META-INF/MANIFEST.MF already added, skipping
          [INFO] Building jar: /data/hive-ptest/working/apache-svn-trunk-source/shims/assembly/target/hive-shims-0.13.0-SNAPSHOT.jar
          [INFO] META-INF/MANIFEST.MF already added, skipping
          [INFO] META-INF/MANIFEST.MF already added, skipping
          [INFO] META-INF/MANIFEST.MF already added, skipping
          [INFO] META-INF/MANIFEST.MF already added, skipping
          [INFO] META-INF/MANIFEST.MF already added, skipping
          [WARNING] Configuration options: 'appendAssemblyId' is set to false, and 'classifier' is missing.
          Instead of attaching the assembly file: /data/hive-ptest/working/apache-svn-trunk-source/shims/assembly/target/hive-shims-0.13.0-SNAPSHOT.jar, it will become the file for main project artifact.
          NOTE: If multiple descriptors or descriptor-formats are provided for this project, the value of this file will be non-deterministic!
          [WARNING] Replacing pre-existing project main-artifact file: /data/hive-ptest/working/apache-svn-trunk-source/shims/assembly/target/archive-tmp/hive-shims-0.13.0-SNAPSHOT.jar
          with assembly file: /data/hive-ptest/working/apache-svn-trunk-source/shims/assembly/target/hive-shims-0.13.0-SNAPSHOT.jar
          [INFO] 
          [INFO] --- maven-install-plugin:2.4:install (default-install) @ hive-shims ---
          [INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/shims/assembly/target/hive-shims-0.13.0-SNAPSHOT.jar to /data/hive-ptest/working/maven/org/apache/hive/hive-shims/0.13.0-SNAPSHOT/hive-shims-0.13.0-SNAPSHOT.jar
          [INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/shims/assembly/pom.xml to /data/hive-ptest/working/maven/org/apache/hive/hive-shims/0.13.0-SNAPSHOT/hive-shims-0.13.0-SNAPSHOT.pom
          [INFO]                                                                         
          [INFO] ------------------------------------------------------------------------
          [INFO] Building Hive Common 0.13.0-SNAPSHOT
          [INFO] ------------------------------------------------------------------------
          [INFO] 
          [INFO] --- maven-clean-plugin:2.5:clean (default-clean) @ hive-common ---
          [INFO] Deleting /data/hive-ptest/working/apache-svn-trunk-source/common (includes = [datanucleus.log, derby.log], excludes = [])
          [INFO] 
          [INFO] --- maven-antrun-plugin:1.7:run (generate-version-annotation) @ hive-common ---
          [INFO] Executing tasks
          
          main:
          [INFO] Executed tasks
          [INFO] 
          [INFO] --- build-helper-maven-plugin:1.8:add-source (add-source) @ hive-common ---
          [INFO] Source directory: /data/hive-ptest/working/apache-svn-trunk-source/common/src/gen added.
          [INFO] 
          [INFO] --- maven-resources-plugin:2.5:resources (default-resources) @ hive-common ---
          [debug] execute contextualize
          [INFO] Using 'UTF-8' encoding to copy filtered resources.
          [INFO] Copying 1 resource
          [INFO] 
          [INFO] --- maven-antrun-plugin:1.7:run (define-classpath) @ hive-common ---
          [INFO] Executing tasks
          
          main:
          [INFO] Executed tasks
          [INFO] 
          [INFO] --- maven-compiler-plugin:3.1:compile (default-compile) @ hive-common ---
          [INFO] Compiling 31 source files to /data/hive-ptest/working/apache-svn-trunk-source/common/target/classes
          [WARNING] Note: /data/hive-ptest/working/apache-svn-trunk-source/common/src/java/org/apache/hadoop/hive/common/ObjectPair.java uses unchecked or unsafe operations.
          [WARNING] Note: Recompile with -Xlint:unchecked for details.
          [INFO] 
          [INFO] --- maven-resources-plugin:2.5:testResources (default-testResources) @ hive-common ---
          [debug] execute contextualize
          [INFO] Using 'UTF-8' encoding to copy filtered resources.
          [INFO] Copying 4 resources
          [INFO] 
          [INFO] --- maven-antrun-plugin:1.7:run (setup-test-dirs) @ hive-common ---
          [INFO] Executing tasks
          
          main:
              [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/common/target/tmp
              [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/common/target/warehouse
              [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/common/target/tmp/conf
               [copy] Copying 4 files to /data/hive-ptest/working/apache-svn-trunk-source/common/target/tmp/conf
          [INFO] Executed tasks
          [INFO] 
          [INFO] --- maven-compiler-plugin:3.1:testCompile (default-testCompile) @ hive-common ---
          [INFO] Compiling 8 source files to /data/hive-ptest/working/apache-svn-trunk-source/common/target/test-classes
          [INFO] 
          [INFO] --- maven-surefire-plugin:2.16:test (default-test) @ hive-common ---
          [INFO] Tests are skipped.
          [INFO] 
          [INFO] --- maven-jar-plugin:2.2:jar (default-jar) @ hive-common ---
          [INFO] Building jar: /data/hive-ptest/working/apache-svn-trunk-source/common/target/hive-common-0.13.0-SNAPSHOT.jar
          [INFO] 
          [INFO] --- maven-install-plugin:2.4:install (default-install) @ hive-common ---
          [INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/common/target/hive-common-0.13.0-SNAPSHOT.jar to /data/hive-ptest/working/maven/org/apache/hive/hive-common/0.13.0-SNAPSHOT/hive-common-0.13.0-SNAPSHOT.jar
          [INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/common/pom.xml to /data/hive-ptest/working/maven/org/apache/hive/hive-common/0.13.0-SNAPSHOT/hive-common-0.13.0-SNAPSHOT.pom
          [INFO]                                                                         
          [INFO] ------------------------------------------------------------------------
          [INFO] Building Hive Serde 0.13.0-SNAPSHOT
          [INFO] ------------------------------------------------------------------------
          [INFO] 
          [INFO] --- maven-clean-plugin:2.5:clean (default-clean) @ hive-serde ---
          [INFO] Deleting /data/hive-ptest/working/apache-svn-trunk-source/serde (includes = [datanucleus.log, derby.log], excludes = [])
          [INFO] 
          [INFO] --- build-helper-maven-plugin:1.8:add-source (add-source) @ hive-serde ---
          [INFO] Source directory: /data/hive-ptest/working/apache-svn-trunk-source/serde/src/gen/protobuf/gen-java added.
          [INFO] Source directory: /data/hive-ptest/working/apache-svn-trunk-source/serde/src/gen/thrift/gen-javabean added.
          [INFO] 
          [INFO] --- maven-resources-plugin:2.5:resources (default-resources) @ hive-serde ---
          [debug] execute contextualize
          [INFO] Using 'UTF-8' encoding to copy filtered resources.
          [INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-trunk-source/serde/src/main/resources
          [INFO] 
          [INFO] --- maven-antrun-plugin:1.7:run (define-classpath) @ hive-serde ---
          [INFO] Executing tasks
          
          main:
          [INFO] Executed tasks
          [INFO] 
          [INFO] --- maven-compiler-plugin:3.1:compile (default-compile) @ hive-serde ---
          [INFO] Compiling 351 source files to /data/hive-ptest/working/apache-svn-trunk-source/serde/target/classes
          [WARNING] Note: Some input files use or override a deprecated API.
          [WARNING] Note: Recompile with -Xlint:deprecation for details.
          [WARNING] Note: Some input files use unchecked or unsafe operations.
          [WARNING] Note: Recompile with -Xlint:unchecked for details.
          [INFO] 
          [INFO] --- maven-resources-plugin:2.5:testResources (default-testResources) @ hive-serde ---
          [debug] execute contextualize
          [INFO] Using 'UTF-8' encoding to copy filtered resources.
          [INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-trunk-source/serde/src/test/resources
          [INFO] 
          [INFO] --- maven-antrun-plugin:1.7:run (setup-test-dirs) @ hive-serde ---
          [INFO] Executing tasks
          
          main:
              [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/serde/target/tmp
              [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/serde/target/warehouse
              [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/serde/target/tmp/conf
               [copy] Copying 4 files to /data/hive-ptest/working/apache-svn-trunk-source/serde/target/tmp/conf
          [INFO] Executed tasks
          [INFO] 
          [INFO] --- maven-compiler-plugin:3.1:testCompile (default-testCompile) @ hive-serde ---
          [INFO] Compiling 41 source files to /data/hive-ptest/working/apache-svn-trunk-source/serde/target/test-classes
          [WARNING] Note: Some input files use or override a deprecated API.
          [WARNING] Note: Recompile with -Xlint:deprecation for details.
          [WARNING] Note: Some input files use unchecked or unsafe operations.
          [WARNING] Note: Recompile with -Xlint:unchecked for details.
          [INFO] 
          [INFO] --- maven-surefire-plugin:2.16:test (default-test) @ hive-serde ---
          [INFO] Tests are skipped.
          [INFO] 
          [INFO] --- maven-jar-plugin:2.2:jar (default-jar) @ hive-serde ---
          [INFO] Building jar: /data/hive-ptest/working/apache-svn-trunk-source/serde/target/hive-serde-0.13.0-SNAPSHOT.jar
          [INFO] 
          [INFO] --- maven-install-plugin:2.4:install (default-install) @ hive-serde ---
          [INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/serde/target/hive-serde-0.13.0-SNAPSHOT.jar to /data/hive-ptest/working/maven/org/apache/hive/hive-serde/0.13.0-SNAPSHOT/hive-serde-0.13.0-SNAPSHOT.jar
          [INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/serde/pom.xml to /data/hive-ptest/working/maven/org/apache/hive/hive-serde/0.13.0-SNAPSHOT/hive-serde-0.13.0-SNAPSHOT.pom
          [INFO]                                                                         
          [INFO] ------------------------------------------------------------------------
          [INFO] Building Hive Metastore 0.13.0-SNAPSHOT
          [INFO] ------------------------------------------------------------------------
          Downloading: http://www.datanucleus.org/downloads/maven2/org/datanucleus/datanucleus-api-jdo/3.2.4/datanucleus-api-jdo-3.2.4.pom
          Downloading: http://repo.maven.apache.org/maven2/org/datanucleus/datanucleus-api-jdo/3.2.4/datanucleus-api-jdo-3.2.4.pom
          Downloaded: http://repo.maven.apache.org/maven2/org/datanucleus/datanucleus-api-jdo/3.2.4/datanucleus-api-jdo-3.2.4.pom (10 KB at 242.5 KB/sec)
          Downloading: http://www.datanucleus.org/downloads/maven2/org/datanucleus/datanucleus-core/3.2.8/datanucleus-core-3.2.8.pom
          Downloading: http://repo.maven.apache.org/maven2/org/datanucleus/datanucleus-core/3.2.8/datanucleus-core-3.2.8.pom
          Downloaded: http://repo.maven.apache.org/maven2/org/datanucleus/datanucleus-core/3.2.8/datanucleus-core-3.2.8.pom (14 KB at 569.1 KB/sec)
          Downloading: http://www.datanucleus.org/downloads/maven2/org/datanucleus/datanucleus-rdbms/3.2.7/datanucleus-rdbms-3.2.7.pom
          Downloading: http://repo.maven.apache.org/maven2/org/datanucleus/datanucleus-rdbms/3.2.7/datanucleus-rdbms-3.2.7.pom
          Downloaded: http://repo.maven.apache.org/maven2/org/datanucleus/datanucleus-rdbms/3.2.7/datanucleus-rdbms-3.2.7.pom (13 KB at 429.8 KB/sec)
          Downloading: http://www.datanucleus.org/downloads/maven2/org/datanucleus/datanucleus-api-jdo/3.2.4/datanucleus-api-jdo-3.2.4.jar
          Downloading: http://www.datanucleus.org/downloads/maven2/org/datanucleus/datanucleus-core/3.2.8/datanucleus-core-3.2.8.jar
          Downloading: http://www.datanucleus.org/downloads/maven2/org/datanucleus/datanucleus-rdbms/3.2.7/datanucleus-rdbms-3.2.7.jar
          Downloading: http://repo.maven.apache.org/maven2/org/datanucleus/datanucleus-core/3.2.8/datanucleus-core-3.2.8.jar
          Downloading: http://repo.maven.apache.org/maven2/org/datanucleus/datanucleus-api-jdo/3.2.4/datanucleus-api-jdo-3.2.4.jar
          Downloading: http://repo.maven.apache.org/maven2/org/datanucleus/datanucleus-rdbms/3.2.7/datanucleus-rdbms-3.2.7.jar
          Downloaded: http://repo.maven.apache.org/maven2/org/datanucleus/datanucleus-api-jdo/3.2.4/datanucleus-api-jdo-3.2.4.jar (332 KB at 1303.3 KB/sec)
          Downloaded: http://repo.maven.apache.org/maven2/org/datanucleus/datanucleus-rdbms/3.2.7/datanucleus-rdbms-3.2.7.jar (1731 KB at 5001.6 KB/sec)
          Downloaded: http://repo.maven.apache.org/maven2/org/datanucleus/datanucleus-core/3.2.8/datanucleus-core-3.2.8.jar (1773 KB at 5036.9 KB/sec)
          [INFO] 
          [INFO] --- maven-clean-plugin:2.5:clean (default-clean) @ hive-metastore ---
          [INFO] Deleting /data/hive-ptest/working/apache-svn-trunk-source/metastore (includes = [datanucleus.log, derby.log], excludes = [])
          [INFO] 
          [INFO] --- build-helper-maven-plugin:1.8:add-source (add-source) @ hive-metastore ---
          [INFO] Source directory: /data/hive-ptest/working/apache-svn-trunk-source/metastore/src/model added.
          [INFO] Source directory: /data/hive-ptest/working/apache-svn-trunk-source/metastore/src/gen/thrift/gen-javabean added.
          [INFO] 
          [INFO] --- antlr3-maven-plugin:3.4:antlr (default) @ hive-metastore ---
          [INFO] ANTLR: Processing source directory /data/hive-ptest/working/apache-svn-trunk-source/metastore/src/java
          ANTLR Parser Generator  Version 3.4
          org/apache/hadoop/hive/metastore/parser/Filter.g
          [INFO] 
          [INFO] --- maven-resources-plugin:2.5:resources (default-resources) @ hive-metastore ---
          [debug] execute contextualize
          [INFO] Using 'UTF-8' encoding to copy filtered resources.
          [INFO] Copying 1 resource
          [INFO] 
          [INFO] --- maven-antrun-plugin:1.7:run (define-classpath) @ hive-metastore ---
          [INFO] Executing tasks
          
          main:
          [INFO] Executed tasks
          [INFO] 
          [INFO] --- maven-compiler-plugin:3.1:compile (default-compile) @ hive-metastore ---
          [INFO] Compiling 132 source files to /data/hive-ptest/working/apache-svn-trunk-source/metastore/target/classes
          [WARNING] Note: Some input files use or override a deprecated API.
          [WARNING] Note: Recompile with -Xlint:deprecation for details.
          [WARNING] Note: Some input files use unchecked or unsafe operations.
          [WARNING] Note: Recompile with -Xlint:unchecked for details.
          [INFO] 
          [INFO] --- datanucleus-maven-plugin:3.3.0-release:enhance (default) @ hive-metastore ---
          [INFO] DataNucleus Enhancer (version 3.2.8) for API "JDO" using JRE "1.6"
          DataNucleus Enhancer : Classpath
          >>  /data/hive-ptest/working/maven/org/datanucleus/datanucleus-maven-plugin/3.3.0-release/datanucleus-maven-plugin-3.3.0-release.jar
          >>  /data/hive-ptest/working/maven/org/datanucleus/datanucleus-core/3.2.8/datanucleus-core-3.2.8.jar
          >>  /data/hive-ptest/working/maven/org/codehaus/plexus/plexus-utils/3.0.8/plexus-utils-3.0.8.jar
          >>  /data/hive-ptest/working/maven/org/codehaus/plexus/plexus-component-annotations/1.5.5/plexus-component-annotations-1.5.5.jar
          >>  /data/hive-ptest/working/maven/org/sonatype/sisu/sisu-inject-bean/2.3.0/sisu-inject-bean-2.3.0.jar
          >>  /data/hive-ptest/working/maven/org/sonatype/sisu/sisu-guice/3.1.0/sisu-guice-3.1.0-no_aop.jar
          >>  /data/hive-ptest/working/maven/org/sonatype/sisu/sisu-guava/0.9.9/sisu-guava-0.9.9.jar
          >>  /data/hive-ptest/working/maven/org/apache/xbean/xbean-reflect/3.4/xbean-reflect-3.4.jar
          >>  /data/hive-ptest/working/maven/log4j/log4j/1.2.12/log4j-1.2.12.jar
          >>  /data/hive-ptest/working/maven/commons-logging/commons-logging-api/1.1/commons-logging-api-1.1.jar
          >>  /data/hive-ptest/working/maven/com/google/collections/google-collections/1.0/google-collections-1.0.jar
          >>  /data/hive-ptest/working/maven/junit/junit/3.8.2/junit-3.8.2.jar
          >>  /data/hive-ptest/working/apache-svn-trunk-source/metastore/target/classes
          >>  /data/hive-ptest/working/apache-svn-trunk-source/serde/target/hive-serde-0.13.0-SNAPSHOT.jar
          >>  /data/hive-ptest/working/apache-svn-trunk-source/common/target/hive-common-0.13.0-SNAPSHOT.jar
          >>  /data/hive-ptest/working/maven/org/apache/commons/commons-compress/1.4.1/commons-compress-1.4.1.jar
          >>  /data/hive-ptest/working/maven/org/tukaani/xz/1.0/xz-1.0.jar
          >>  /data/hive-ptest/working/maven/commons-codec/commons-codec/1.4/commons-codec-1.4.jar
          >>  /data/hive-ptest/working/maven/org/apache/avro/avro/1.7.1/avro-1.7.1.jar
          >>  /data/hive-ptest/working/maven/org/codehaus/jackson/jackson-core-asl/1.8.8/jackson-core-asl-1.8.8.jar
          >>  /data/hive-ptest/working/maven/com/thoughtworks/paranamer/paranamer/2.3/paranamer-2.3.jar
          >>  /data/hive-ptest/working/maven/org/xerial/snappy/snappy-java/1.0.4.1/snappy-java-1.0.4.1.jar
          >>  /data/hive-ptest/working/apache-svn-trunk-source/shims/assembly/target/hive-shims-0.13.0-SNAPSHOT.jar
          >>  /data/hive-ptest/working/apache-svn-trunk-source/shims/common/target/hive-shims-common-0.13.0-SNAPSHOT.jar
          >>  /data/hive-ptest/working/apache-svn-trunk-source/shims/0.20/target/hive-shims-0.20-0.13.0-SNAPSHOT.jar
          >>  /data/hive-ptest/working/apache-svn-trunk-source/shims/common-secure/target/hive-shims-common-secure-0.13.0-SNAPSHOT.jar
          >>  /data/hive-ptest/working/maven/org/apache/zookeeper/zookeeper/3.4.3/zookeeper-3.4.3.jar
          >>  /data/hive-ptest/working/maven/jline/jline/0.9.94/jline-0.9.94.jar
          >>  /data/hive-ptest/working/maven/org/jboss/netty/netty/3.2.2.Final/netty-3.2.2.Final.jar
          >>  /data/hive-ptest/working/apache-svn-trunk-source/shims/0.20S/target/hive-shims-0.20S-0.13.0-SNAPSHOT.jar
          >>  /data/hive-ptest/working/apache-svn-trunk-source/shims/0.23/target/hive-shims-0.23-0.13.0-SNAPSHOT.jar
          >>  /data/hive-ptest/working/maven/com/google/guava/guava/11.0.2/guava-11.0.2.jar
          >>  /data/hive-ptest/working/maven/com/google/code/findbugs/jsr305/1.3.9/jsr305-1.3.9.jar
          >>  /data/hive-ptest/working/maven/commons-cli/commons-cli/1.2/commons-cli-1.2.jar
          >>  /data/hive-ptest/working/maven/commons-lang/commons-lang/2.4/commons-lang-2.4.jar
          >>  /data/hive-ptest/working/maven/commons-logging/commons-logging/1.1.3/commons-logging-1.1.3.jar
          >>  /data/hive-ptest/working/maven/org/apache/derby/derby/10.4.2.0/derby-10.4.2.0.jar
          >>  /data/hive-ptest/working/maven/org/datanucleus/datanucleus-api-jdo/3.2.4/datanucleus-api-jdo-3.2.4.jar
          >>  /data/hive-ptest/working/maven/org/datanucleus/datanucleus-rdbms/3.2.7/datanucleus-rdbms-3.2.7.jar
          >>  /data/hive-ptest/working/maven/javax/jdo/jdo-api/3.0.1/jdo-api-3.0.1.jar
          >>  /data/hive-ptest/working/maven/javax/transaction/jta/1.1/jta-1.1.jar
          >>  /data/hive-ptest/working/maven/org/antlr/antlr-runtime/3.4/antlr-runtime-3.4.jar
          >>  /data/hive-ptest/working/maven/org/antlr/stringtemplate/3.2.1/stringtemplate-3.2.1.jar
          >>  /data/hive-ptest/working/maven/antlr/antlr/2.7.7/antlr-2.7.7.jar
          >>  /data/hive-ptest/working/maven/org/apache/thrift/libfb303/0.9.0/libfb303-0.9.0.jar
          >>  /data/hive-ptest/working/maven/org/apache/thrift/libthrift/0.9.0/libthrift-0.9.0.jar
          >>  /data/hive-ptest/working/maven/org/apache/httpcomponents/httpclient/4.1.3/httpclient-4.1.3.jar
          >>  /data/hive-ptest/working/maven/org/apache/httpcomponents/httpcore/4.1.3/httpcore-4.1.3.jar
          >>  /data/hive-ptest/working/maven/org/apache/hadoop/hadoop-core/1.2.1/hadoop-core-1.2.1.jar
          >>  /data/hive-ptest/working/maven/xmlenc/xmlenc/0.52/xmlenc-0.52.jar
          >>  /data/hive-ptest/working/maven/com/sun/jersey/jersey-core/1.8/jersey-core-1.8.jar
          >>  /data/hive-ptest/working/maven/com/sun/jersey/jersey-json/1.8/jersey-json-1.8.jar
          >>  /data/hive-ptest/working/maven/org/codehaus/jettison/jettison/1.1/jettison-1.1.jar
          >>  /data/hive-ptest/working/maven/stax/stax-api/1.0.1/stax-api-1.0.1.jar
          >>  /data/hive-ptest/working/maven/com/sun/xml/bind/jaxb-impl/2.2.3-1/jaxb-impl-2.2.3-1.jar
          >>  /data/hive-ptest/working/maven/javax/xml/bind/jaxb-api/2.2.2/jaxb-api-2.2.2.jar
          >>  /data/hive-ptest/working/maven/javax/xml/stream/stax-api/1.0-2/stax-api-1.0-2.jar
          >>  /data/hive-ptest/working/maven/javax/activation/activation/1.1/activation-1.1.jar
          >>  /data/hive-ptest/working/maven/org/codehaus/jackson/jackson-jaxrs/1.7.1/jackson-jaxrs-1.7.1.jar
          >>  /data/hive-ptest/working/maven/org/codehaus/jackson/jackson-xc/1.7.1/jackson-xc-1.7.1.jar
          >>  /data/hive-ptest/working/maven/com/sun/jersey/jersey-server/1.8/jersey-server-1.8.jar
          >>  /data/hive-ptest/working/maven/asm/asm/3.1/asm-3.1.jar
          >>  /data/hive-ptest/working/maven/commons-io/commons-io/2.1/commons-io-2.1.jar
          >>  /data/hive-ptest/working/maven/commons-httpclient/commons-httpclient/3.0.1/commons-httpclient-3.0.1.jar
          >>  /data/hive-ptest/working/maven/org/apache/commons/commons-math/2.1/commons-math-2.1.jar
          >>  /data/hive-ptest/working/maven/commons-configuration/commons-configuration/1.6/commons-configuration-1.6.jar
          >>  /data/hive-ptest/working/maven/commons-collections/commons-collections/3.2.1/commons-collections-3.2.1.jar
          >>  /data/hive-ptest/working/maven/commons-digester/commons-digester/1.8/commons-digester-1.8.jar
          >>  /data/hive-ptest/working/maven/commons-beanutils/commons-beanutils/1.7.0/commons-beanutils-1.7.0.jar
          >>  /data/hive-ptest/working/maven/commons-beanutils/commons-beanutils-core/1.8.0/commons-beanutils-core-1.8.0.jar
          >>  /data/hive-ptest/working/maven/commons-net/commons-net/1.4.1/commons-net-1.4.1.jar
          >>  /data/hive-ptest/working/maven/org/mortbay/jetty/jetty/6.1.26/jetty-6.1.26.jar
          >>  /data/hive-ptest/working/maven/org/mortbay/jetty/servlet-api/2.5-20081211/servlet-api-2.5-20081211.jar
          >>  /data/hive-ptest/working/maven/org/mortbay/jetty/jetty-util/6.1.26/jetty-util-6.1.26.jar
          >>  /data/hive-ptest/working/maven/tomcat/jasper-runtime/5.5.12/jasper-runtime-5.5.12.jar
          >>  /data/hive-ptest/working/maven/tomcat/jasper-compiler/5.5.12/jasper-compiler-5.5.12.jar
          >>  /data/hive-ptest/working/maven/org/mortbay/jetty/jsp-api-2.1/6.1.14/jsp-api-2.1-6.1.14.jar
          >>  /data/hive-ptest/working/maven/org/mortbay/jetty/servlet-api-2.5/6.1.14/servlet-api-2.5-6.1.14.jar
          >>  /data/hive-ptest/working/maven/org/mortbay/jetty/jsp-2.1/6.1.14/jsp-2.1-6.1.14.jar
          >>  /data/hive-ptest/working/maven/ant/ant/1.6.5/ant-1.6.5.jar
          >>  /data/hive-ptest/working/maven/commons-el/commons-el/1.0/commons-el-1.0.jar
          >>  /data/hive-ptest/working/maven/net/java/dev/jets3t/jets3t/0.6.1/jets3t-0.6.1.jar
          >>  /data/hive-ptest/working/maven/hsqldb/hsqldb/1.8.0.10/hsqldb-1.8.0.10.jar
          >>  /data/hive-ptest/working/maven/oro/oro/2.0.8/oro-2.0.8.jar
          >>  /data/hive-ptest/working/maven/org/eclipse/jdt/core/3.1.1/core-3.1.1.jar
          >>  /data/hive-ptest/working/maven/org/codehaus/jackson/jackson-mapper-asl/1.8.8/jackson-mapper-asl-1.8.8.jar
          >>  /data/hive-ptest/working/maven/org/slf4j/slf4j-api/1.6.1/slf4j-api-1.6.1.jar
          >>  /data/hive-ptest/working/maven/org/slf4j/slf4j-log4j12/1.6.1/slf4j-log4j12-1.6.1.jar
          >>  /data/hive-ptest/working/maven/log4j/log4j/1.2.16/log4j-1.2.16.jar
          ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MDatabase
          ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MFieldSchema
          ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MType
          ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MTable
          ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MSerDeInfo
          ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MOrder
          ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MColumnDescriptor
          ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MStringList
          ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MStorageDescriptor
          ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MPartition
          ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MIndex
          ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MRole
          ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MRoleMap
          ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MGlobalPrivilege
          ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MDBPrivilege
          ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MTablePrivilege
          ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MPartitionPrivilege
          ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MTableColumnPrivilege
          ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MPartitionColumnPrivilege
          ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MPartitionEvent
          ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MMasterKey
          ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MDelegationToken
          ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MTableColumnStatistics
          ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MPartitionColumnStatistics
          ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MVersionTable
          DataNucleus Enhancer completed with success for 25 classes. Timings : input=592 ms, enhance=926 ms, total=1518 ms. Consult the log for full details
          
          [INFO] 
          [INFO] --- maven-resources-plugin:2.5:testResources (default-testResources) @ hive-metastore ---
          [debug] execute contextualize
          [INFO] Using 'UTF-8' encoding to copy filtered resources.
          [INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-trunk-source/metastore/src/test/resources
          [INFO] 
          [INFO] --- maven-antrun-plugin:1.7:run (setup-test-dirs) @ hive-metastore ---
          [INFO] Executing tasks
          
          main:
              [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/metastore/target/tmp
              [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/metastore/target/warehouse
              [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/metastore/target/tmp/conf
               [copy] Copying 4 files to /data/hive-ptest/working/apache-svn-trunk-source/metastore/target/tmp/conf
          [INFO] Executed tasks
          [INFO] 
          [INFO] --- maven-compiler-plugin:3.1:testCompile (default-testCompile) @ hive-metastore ---
          [INFO] Compiling 10 source files to /data/hive-ptest/working/apache-svn-trunk-source/metastore/target/test-classes
          [INFO] 
          [INFO] --- maven-surefire-plugin:2.16:test (default-test) @ hive-metastore ---
          [INFO] Tests are skipped.
          [INFO] 
          [INFO] --- maven-jar-plugin:2.2:jar (default-jar) @ hive-metastore ---
          [INFO] Building jar: /data/hive-ptest/working/apache-svn-trunk-source/metastore/target/hive-metastore-0.13.0-SNAPSHOT.jar
          [INFO] 
          [INFO] --- maven-jar-plugin:2.2:test-jar (default) @ hive-metastore ---
          [INFO] Building jar: /data/hive-ptest/working/apache-svn-trunk-source/metastore/target/hive-metastore-0.13.0-SNAPSHOT-tests.jar
          [INFO] 
          [INFO] --- maven-install-plugin:2.4:install (default-install) @ hive-metastore ---
          [INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/metastore/target/hive-metastore-0.13.0-SNAPSHOT.jar to /data/hive-ptest/working/maven/org/apache/hive/hive-metastore/0.13.0-SNAPSHOT/hive-metastore-0.13.0-SNAPSHOT.jar
          [INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/metastore/pom.xml to /data/hive-ptest/working/maven/org/apache/hive/hive-metastore/0.13.0-SNAPSHOT/hive-metastore-0.13.0-SNAPSHOT.pom
          [INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/metastore/target/hive-metastore-0.13.0-SNAPSHOT-tests.jar to /data/hive-ptest/working/maven/org/apache/hive/hive-metastore/0.13.0-SNAPSHOT/hive-metastore-0.13.0-SNAPSHOT-tests.jar
          [INFO]                                                                         
          [INFO] ------------------------------------------------------------------------
          [INFO] Building Hive Query Language 0.13.0-SNAPSHOT
          [INFO] ------------------------------------------------------------------------
          [INFO] ------------------------------------------------------------------------
          [INFO] Reactor Summary:
          [INFO] 
          [INFO] Hive .............................................. SUCCESS [2.514s]
          [INFO] Hive Ant Utilities ................................ SUCCESS [6.814s]
          [INFO] Hive Shims Common ................................. SUCCESS [2.807s]
          [INFO] Hive Shims 0.20 ................................... SUCCESS [1.785s]
          [INFO] Hive Shims Secure Common .......................... SUCCESS [3.234s]
          [INFO] Hive Shims 0.20S .................................. SUCCESS [1.388s]
          [INFO] Hive Shims 0.23 ................................... SUCCESS [3.196s]
          [INFO] Hive Shims ........................................ SUCCESS [3.323s]
          [INFO] Hive Common ....................................... SUCCESS [4.364s]
          [INFO] Hive Serde ........................................ SUCCESS [11.472s]
          [INFO] Hive Metastore .................................... SUCCESS [26.922s]
          [INFO] Hive Query Language ............................... FAILURE [0.612s]
          [INFO] Hive Service ...................................... SKIPPED
          [INFO] Hive JDBC ......................................... SKIPPED
          [INFO] Hive Beeline ...................................... SKIPPED
          [INFO] Hive CLI .......................................... SKIPPED
          [INFO] Hive Contrib ...................................... SKIPPED
          [INFO] Hive HBase Handler ................................ SKIPPED
          [INFO] Hive HCatalog ..................................... SKIPPED
          [INFO] Hive HCatalog Core ................................ SKIPPED
          [INFO] Hive HCatalog Pig Adapter ......................... SKIPPED
          [INFO] Hive HCatalog Server Extensions ................... SKIPPED
          [INFO] Hive HCatalog Webhcat Java Client ................. SKIPPED
          [INFO] Hive HCatalog Webhcat ............................. SKIPPED
          [INFO] Hive HCatalog HBase Storage Handler ............... SKIPPED
          [INFO] Hive HWI .......................................... SKIPPED
          [INFO] Hive ODBC ......................................... SKIPPED
          [INFO] Hive Shims Aggregator ............................. SKIPPED
          [INFO] Hive TestUtils .................................... SKIPPED
          [INFO] Hive Packaging .................................... SKIPPED
          [INFO] ------------------------------------------------------------------------
          [INFO] BUILD FAILURE
          [INFO] ------------------------------------------------------------------------
          [INFO] Total time: 1:10.700s
          [INFO] Finished at: Thu Nov 07 12:12:01 EST 2013
          [INFO] Final Memory: 42M/192M
          [INFO] ------------------------------------------------------------------------
          [ERROR] Failed to execute goal on project hive-exec: Could not resolve dependencies for project org.apache.hive:hive-exec:jar:0.13.0-SNAPSHOT: Could not find artifact org.apache.hive:hive-shims:jar:uberjar:0.13.0-SNAPSHOT -> [Help 1]
          [ERROR] 
          [ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
          [ERROR] Re-run Maven using the -X switch to enable full debug logging.
          [ERROR] 
          [ERROR] For more information about the errors and possible solutions, please read the following articles:
          [ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/DependencyResolutionException
          [ERROR] 
          [ERROR] After correcting the problems, you can resume the build with the command
          [ERROR]   mvn <goals> -rf :hive-exec
          + exit 1
          '
          

          This message is automatically generated.

          ATTACHMENT ID: 12612507

          Hive QA added a comment - Overall : -1 no tests executed Here are the results of testing the latest attachment: https://issues.apache.org/jira/secure/attachment/12612507/HIVE-5218-trunk.patch Test results: http://bigtop01.cloudera.org:8080/job/PreCommit-HIVE-Build/173/testReport Console output: http://bigtop01.cloudera.org:8080/job/PreCommit-HIVE-Build/173/console Messages: Executing org.apache.hive.ptest.execution.PrepPhase Tests failed with: NonZeroExitCodeException: Command 'bash /data/hive-ptest/working/scratch/source-prep.sh' failed with exit status 1 and output '+ [[ -n '' ]] + export 'ANT_OPTS=-Xmx1g -XX:MaxPermSize=256m -Dhttp.proxyHost=localhost -Dhttp.proxyPort=3128' + ANT_OPTS='-Xmx1g -XX:MaxPermSize=256m -Dhttp.proxyHost=localhost -Dhttp.proxyPort=3128' + export 'M2_OPTS=-Xmx1g -XX:MaxPermSize=256m -Dhttp.proxyHost=localhost -Dhttp.proxyPort=3128' + M2_OPTS='-Xmx1g -XX:MaxPermSize=256m -Dhttp.proxyHost=localhost -Dhttp.proxyPort=3128' + cd /data/hive-ptest/working/ + tee /data/hive-ptest/logs/PreCommit-HIVE-Build-173/source-prep.txt + [[ false == \t\r\u\e ]] + mkdir -p maven ivy + [[ svn = \s\v\n ]] + [[ -n '' ]] + [[ -d apache-svn-trunk-source ]] + [[ ! -d apache-svn-trunk-source/.svn ]] + [[ ! -d apache-svn-trunk-source ]] + cd apache-svn-trunk-source + svn revert -R . 
Reverted 'hbase-handler/src/test/results/positive/hbase_stats2.q.out' Reverted 'hbase-handler/src/test/results/positive/hbase_stats3.q.out' Reverted 'hbase-handler/src/test/results/positive/hbase_stats.q.out' Reverted 'hbase-handler/src/test/results/positive/hbase_stats_empty_partition.q.out' Reverted 'metastore/src/java/org/apache/hadoop/hive/metastore/MetaStoreUtils.java' Reverted 'common/src/java/org/apache/hadoop/hive/common/StatsSetupConst.java' Reverted 'ql/src/test/results/clientnegative/unset_table_property.q.out' Reverted 'ql/src/test/results/clientnegative/stats_partialscan_autogether.q.out' Reverted 'ql/src/test/results/clientpositive/infer_bucket_sort.q.out' Reverted 'ql/src/test/results/clientpositive/groupby_ppr.q.out' Reverted 'ql/src/test/results/clientpositive/stats8.q.out' Reverted 'ql/src/test/results/clientpositive/infer_bucket_sort_num_buckets.q.out' Reverted 'ql/src/test/results/clientpositive/input_part7.q.out' Reverted 'ql/src/test/results/clientpositive/bucketmapjoin5.q.out' Reverted 'ql/src/test/results/clientpositive/pcr.q.out' Reverted 'ql/src/test/results/clientpositive/show_tblproperties.q.out' Reverted 'ql/src/test/results/clientpositive/infer_bucket_sort_map_operators.q.out' Reverted 'ql/src/test/results/clientpositive/stats3.q.out' Reverted 'ql/src/test/results/clientpositive/join33.q.out' Reverted 'ql/src/test/results/clientpositive/input_part2.q.out' Reverted 'ql/src/test/results/clientpositive/alter_partition_coltype.q.out' Reverted 'ql/src/test/results/clientpositive/stats_noscan_1.q.out' Reverted 'ql/src/test/results/clientpositive/auto_sortmerge_join_4.q.out' Reverted 'ql/src/test/results/clientpositive/infer_bucket_sort_dyn_part.q.out' Reverted 'ql/src/test/results/clientpositive/load_dyn_part8.q.out' Reverted 'ql/src/test/results/clientpositive/sample9.q.out' Reverted 'ql/src/test/results/clientpositive/infer_bucket_sort_merge.q.out' Reverted 'ql/src/test/results/clientpositive/describe_table.q.out' Reverted 
'ql/src/test/results/clientpositive/groupby_map_ppr.q.out' Reverted 'ql/src/test/results/clientpositive/groupby_sort_6.q.out' Reverted 'ql/src/test/results/clientpositive/sample4.q.out' Reverted 'ql/src/test/results/clientpositive/stats18.q.out' Reverted 'ql/src/test/results/clientpositive/push_or.q.out' Reverted 'ql/src/test/results/clientpositive/bucketcontext_7.q.out' Reverted 'ql/src/test/results/clientpositive/groupby_sort_1.q.out' Reverted 'ql/src/test/results/clientpositive/stats13.q.out' Reverted 'ql/src/test/results/clientpositive/udf_reflect2.q.out' Reverted 'ql/src/test/results/clientpositive/auto_sortmerge_join_11.q.out' Reverted 'ql/src/test/results/clientpositive/infer_bucket_sort_convert_join.q.out' Reverted 'ql/src/test/results/clientpositive/rand_partitionpruner1.q.out' Reverted 'ql/src/test/results/clientpositive/combine2_hadoop20.q.out' Reverted 'ql/src/test/results/clientpositive/show_create_table_alter.q.out' Reverted 'ql/src/test/results/clientpositive/bucketcontext_2.q.out' Reverted 'ql/src/test/results/clientpositive/auto_join_reordering_values.q.out' Reverted 'ql/src/test/results/clientpositive/stats_only_null.q.out' Reverted 'ql/src/test/results/clientpositive/bucket2.q.out' Reverted 'ql/src/test/results/clientpositive/infer_bucket_sort_multi_insert.q.out' Reverted 'ql/src/test/results/clientpositive/groupby_map_ppr_multi_distinct.q.out' Reverted 'ql/src/test/results/clientpositive/parallel_orderby.q.out' Reverted 'ql/src/test/results/clientpositive/filter_join_breaktask.q.out' Reverted 'ql/src/test/results/clientpositive/sort_merge_join_desc_5.q.out' Reverted 'ql/src/test/results/clientpositive/join17.q.out' Reverted 'ql/src/test/results/clientpositive/input_part9.q.out' Reverted 'ql/src/test/results/clientpositive/bucketmapjoin7.q.out' Reverted 'ql/src/test/results/clientpositive/alter_table_serde2.q.out' Reverted 'ql/src/test/results/clientpositive/bucketmapjoin11.q.out' Reverted 'ql/src/test/results/clientpositive/join26.q.out' 
Reverted 'ql/src/test/results/clientpositive/bucketmapjoin_negative.q.out' Reverted 'ql/src/test/results/clientpositive/rcfile_default_format.q.out' Reverted 'ql/src/test/results/clientpositive/stats5.q.out' Reverted 'ql/src/test/results/clientpositive/ppd_join_filter.q.out' Reverted 'ql/src/test/results/clientpositive/join35.q.out' Reverted 'ql/src/test/results/clientpositive/bucketmapjoin2.q.out' Reverted 'ql/src/test/results/clientpositive/alter_partition_clusterby_sortby.q.out' Reverted 'ql/src/test/results/clientpositive/join_map_ppr.q.out' Reverted 'ql/src/test/results/clientpositive/stats0.q.out' Reverted 'ql/src/test/results/clientpositive/join9.q.out' Reverted 'ql/src/test/results/clientpositive/smb_mapjoin_11.q.out' Reverted 'ql/src/test/results/clientpositive/ppr_allchildsarenull.q.out' Reverted 'ql/src/test/results/clientpositive/auto_sortmerge_join_1.q.out' Reverted 'ql/src/test/results/clientpositive/sample6.q.out' Reverted 'ql/src/test/results/clientpositive/join_filters_overlap.q.out' Reverted 'ql/src/test/results/clientpositive/stats_empty_partition.q.out' Reverted 'ql/src/test/results/clientpositive/create_alter_list_bucketing_table1.q.out' Reverted 'ql/src/test/results/clientpositive/bucket_map_join_1.q.out' Reverted 'ql/src/test/results/clientpositive/sample1.q.out' Reverted 'ql/src/test/results/clientpositive/stats15.q.out' Reverted 'ql/src/test/results/clientpositive/stats_partscan_1.q.out' Reverted 'ql/src/test/results/clientpositive/reduce_deduplicate.q.out' Reverted 'ql/src/test/results/clientpositive/rand_partitionpruner3.q.out' Reverted 'ql/src/test/results/clientpositive/bucketcontext_4.q.out' Reverted 'ql/src/test/results/clientpositive/stats10.q.out' Reverted 'ql/src/test/results/clientpositive/bucket4.q.out' Reverted 'ql/src/test/results/clientpositive/udtf_explode.q.out' Reverted 'ql/src/test/results/clientpositive/merge3.q.out' Reverted 'ql/src/test/results/clientpositive/sort_merge_join_desc_7.q.out' Reverted 
'ql/src/test/results/clientpositive/binary_output_format.q.out' Reverted 'ql/src/test/results/clientpositive/bucketmapjoin9.q.out' Reverted 'ql/src/test/results/clientpositive/bucketmapjoin13.q.out' Reverted 'ql/src/test/results/clientpositive/stats7.q.out' Reverted 'ql/src/test/results/clientpositive/bucketizedhiveinputformat.q.out' Reverted 'ql/src/test/results/clientpositive/bucketmapjoin4.q.out' Reverted 'ql/src/test/results/clientpositive/union22.q.out' Reverted 'ql/src/test/results/clientpositive/smb_mapjoin_13.q.out' Reverted 'ql/src/test/results/clientpositive/unset_table_view_property.q.out' Reverted 'ql/src/test/results/clientpositive/join32.q.out' Reverted 'ql/src/test/results/clientpositive/ctas_uses_database_location.q.out' Reverted 'ql/src/test/results/clientpositive/input_part1.q.out' Reverted 'ql/src/test/results/clientpositive/groupby_sort_skew_1.q.out' Reverted 'ql/src/test/results/clientpositive/auto_sortmerge_join_8.q.out' Reverted 'ql/src/test/results/clientpositive/columnstats_partlvl.q.out' Reverted 'ql/src/test/results/clientpositive/auto_sortmerge_join_3.q.out' Reverted 'ql/src/test/results/clientpositive/bucketmapjoin_negative3.q.out' Reverted 'ql/src/test/results/clientpositive/sample8.q.out' Reverted 'ql/src/test/results/clientpositive/transform_ppr2.q.out' Reverted 'ql/src/test/results/clientpositive/union_ppr.q.out' Reverted 'ql/src/test/results/clientpositive/serde_user_properties.q.out' Reverted 'ql/src/test/results/clientpositive/alter_table_not_sorted.q.out' Reverted 'ql/src/test/results/clientpositive/alter_numbuckets_partitioned_table.q.out' Reverted 'ql/src/test/results/clientpositive/ctas_hadoop20.q.out' Reverted 'ql/src/test/results/clientpositive/ppd_vc.q.out' Reverted 'ql/src/test/results/clientpositive/dynamic_partition_skip_default.q.out' Reverted 'ql/src/test/results/clientpositive/bucketcontext_6.q.out' Reverted 'ql/src/test/results/clientpositive/infer_bucket_sort_bucketed_table.q.out' Reverted 
'ql/src/test/results/clientpositive/stats12.q.out' Reverted 'ql/src/test/results/clientpositive/router_join_ppr.q.out' Reverted 'ql/src/test/results/clientpositive/bucketcontext_1.q.out' Reverted 'ql/src/test/results/clientpositive/bucket1.q.out' Reverted 'ql/src/test/results/clientpositive/input42.q.out' Reverted 'ql/src/test/results/clientpositive/stats9.q.out' Reverted 'ql/src/test/results/clientpositive/infer_bucket_sort_grouping_operators.q.out' Reverted 'ql/src/test/results/clientpositive/bucketmapjoin10.q.out' Reverted 'ql/src/test/results/clientpositive/union24.q.out' Reverted 'ql/src/test/results/clientpositive/metadata_only_queries.q.out' Reverted 'ql/src/test/results/clientpositive/stats4.q.out' Reverted 'ql/src/test/results/clientpositive/columnstats_tbllvl.q.out' Reverted 'ql/src/test/results/clientpositive/smb_mapjoin_15.q.out' Reverted 'ql/src/test/results/clientpositive/join34.q.out' Reverted 'ql/src/test/results/clientpositive/bucketmapjoin1.q.out' Reverted 'ql/src/test/results/clientpositive/sample10.q.out' Reverted 'ql/src/test/results/clientpositive/stats_noscan_2.q.out' Reverted 'ql/src/test/results/clientpositive/auto_sortmerge_join_5.q.out' Reverted 'ql/src/test/results/clientpositive/louter_join_ppr.q.out' Reverted 'ql/src/test/results/clientpositive/sample5.q.out' Reverted 'ql/src/test/results/clientpositive/infer_bucket_sort_reducers_power_two.q.out' Reverted 'ql/src/test/results/clientpositive/stats19.q.out' Reverted 'ql/src/test/results/clientpositive/bucketcontext_8.q.out' Reverted 'ql/src/test/results/clientpositive/udf_explode.q.out' Reverted 'ql/src/test/results/clientpositive/alter_numbuckets_partitioned_table2.q.out' Reverted 'ql/src/test/results/clientpositive/stats14.q.out' Reverted 'ql/src/test/results/clientpositive/auto_sortmerge_join_12.q.out' Reverted 'ql/src/test/results/clientpositive/rand_partitionpruner2.q.out' Reverted 'ql/src/test/results/clientpositive/bucketcontext_3.q.out' Reverted 
'ql/src/test/results/clientpositive/bucket3.q.out' Reverted 'ql/src/test/results/clientpositive/groupby_ppr_multi_distinct.q.out' Reverted 'ql/src/test/results/clientpositive/sort_merge_join_desc_6.q.out' Reverted 'ql/src/test/results/clientpositive/bucketmapjoin8.q.out' Reverted 'ql/src/test/results/clientpositive/bucketmapjoin12.q.out' Reverted 'ql/src/test/results/clientpositive/stats6.q.out' Reverted 'ql/src/test/results/clientpositive/bucketmapjoin3.q.out' Reverted 'ql/src/test/results/clientpositive/alter_skewed_table.q.out' Reverted 'ql/src/test/results/clientpositive/stats1.q.out' Reverted 'ql/src/test/results/clientpositive/smb_mapjoin_12.q.out' Reverted 'ql/src/test/results/clientpositive/join32_lessSize.q.out' Reverted 'ql/src/test/results/clientpositive/auto_sortmerge_join_7.q.out' Reverted 'ql/src/test/results/clientpositive/outer_join_ppr.q.out' Reverted 'ql/src/test/results/clientpositive/list_bucket_dml_10.q.out' Reverted 'ql/src/test/results/clientpositive/bucketmapjoin_negative2.q.out' Reverted 'ql/src/test/results/clientpositive/auto_sortmerge_join_2.q.out' Reverted 'ql/src/test/results/clientpositive/sample7.q.out' Reverted 'ql/src/test/results/clientpositive/transform_ppr1.q.out' Reverted 'ql/src/test/results/clientpositive/regexp_extract.q.out' Reverted 'ql/src/test/results/clientpositive/bucket_map_join_2.q.out' Reverted 'ql/src/test/results/clientpositive/sample2.q.out' Reverted 'ql/src/test/results/clientpositive/stats16.q.out' Reverted 'ql/src/test/results/clientpositive/disable_merge_for_bucketing.q.out' Reverted 'ql/src/test/results/clientpositive/ppd_union_view.q.out' Reverted 'ql/src/test/results/clientpositive/ctas_colname.q.out' Reverted 'ql/src/test/results/clientpositive/truncate_column.q.out' Reverted 'ql/src/test/results/clientpositive/bucketcontext_5.q.out' Reverted 'ql/src/test/results/clientpositive/describe_comment_nonascii.q.out' Reverted 'ql/src/test/results/clientpositive/stats11.q.out' Reverted 
'ql/src/test/results/clientpositive/bucket5.q.out' Reverted 'ql/src/test/results/clientpositive/input23.q.out' Reverted 'ql/src/test/results/compiler/plan/join2.q.xml' Reverted 'ql/src/test/results/compiler/plan/input2.q.xml' Reverted 'ql/src/test/results/compiler/plan/join3.q.xml' Reverted 'ql/src/test/results/compiler/plan/input3.q.xml' Reverted 'ql/src/test/results/compiler/plan/join4.q.xml' Reverted 'ql/src/test/results/compiler/plan/input4.q.xml' Reverted 'ql/src/test/results/compiler/plan/join5.q.xml' Reverted 'ql/src/test/results/compiler/plan/input5.q.xml' Reverted 'ql/src/test/results/compiler/plan/join6.q.xml' Reverted 'ql/src/test/results/compiler/plan/input_testxpath2.q.xml' Reverted 'ql/src/test/results/compiler/plan/input6.q.xml' Reverted 'ql/src/test/results/compiler/plan/join7.q.xml' Reverted 'ql/src/test/results/compiler/plan/input7.q.xml' Reverted 'ql/src/test/results/compiler/plan/join8.q.xml' Reverted 'ql/src/test/results/compiler/plan/input_testsequencefile.q.xml' Reverted 'ql/src/test/results/compiler/plan/input8.q.xml' Reverted 'ql/src/test/results/compiler/plan/input9.q.xml' Reverted 'ql/src/test/results/compiler/plan/union.q.xml' Reverted 'ql/src/test/results/compiler/plan/udf1.q.xml' Reverted 'ql/src/test/results/compiler/plan/input_testxpath.q.xml' Reverted 'ql/src/test/results/compiler/plan/udf6.q.xml' Reverted 'ql/src/test/results/compiler/plan/input_part1.q.xml' Reverted 'ql/src/test/results/compiler/plan/groupby1.q.xml' Reverted 'ql/src/test/results/compiler/plan/groupby2.q.xml' Reverted 'ql/src/test/results/compiler/plan/udf_case.q.xml' Reverted 'ql/src/test/results/compiler/plan/groupby3.q.xml' Reverted 'ql/src/test/results/compiler/plan/subq.q.xml' Reverted 'ql/src/test/results/compiler/plan/groupby4.q.xml' Reverted 'ql/src/test/results/compiler/plan/groupby5.q.xml' Reverted 'ql/src/test/results/compiler/plan/groupby6.q.xml' Reverted 'ql/src/test/results/compiler/plan/case_sensitivity.q.xml' Reverted 
'ql/src/test/results/compiler/plan/udf_when.q.xml' Reverted 'ql/src/test/results/compiler/plan/input20.q.xml' Reverted 'ql/src/test/results/compiler/plan/sample1.q.xml' Reverted 'ql/src/test/results/compiler/plan/sample2.q.xml' Reverted 'ql/src/test/results/compiler/plan/sample3.q.xml' Reverted 'ql/src/test/results/compiler/plan/sample4.q.xml' Reverted 'ql/src/test/results/compiler/plan/sample5.q.xml' Reverted 'ql/src/test/results/compiler/plan/sample6.q.xml' Reverted 'ql/src/test/results/compiler/plan/sample7.q.xml' Reverted 'ql/src/test/results/compiler/plan/cast1.q.xml' Reverted 'ql/src/test/results/compiler/plan/join1.q.xml' Reverted 'ql/src/test/results/compiler/plan/input1.q.xml' Reverted 'ql/src/test/queries/clientpositive/stats_only_null.q' Reverted 'ql/src/java/org/apache/hadoop/hive/ql/optimizer/StatsOptimizer.java' Reverted 'ql/src/java/org/apache/hadoop/hive/ql/metadata/Hive.java' Reverted 'ql/src/java/org/apache/hadoop/hive/ql/exec/StatsTask.java' ++ awk '{print $2}' ++ egrep -v '^X|^Performing status on external' ++ svn status --no-ignore + rm -rf target datanucleus.log ant/target shims/0.20/target shims/assembly/target shims/0.20S/target shims/0.23/target shims/common/target shims/common-secure/target metastore/target common/target common/src/gen serde/target ql/src/test/results/clientpositive/stats_invalidation.q.out ql/src/test/queries/clientpositive/stats_invalidation.q + svn update Fetching external item into 'hcatalog/src/test/e2e/harness' External at revision 1539718. At revision 1539718. + patchCommandPath=/data/hive-ptest/working/scratch/smart-apply-patch.sh + patchFilePath=/data/hive-ptest/working/scratch/build.patch + [[ -f /data/hive-ptest/working/scratch/build.patch ]] + chmod +x /data/hive-ptest/working/scratch/smart-apply-patch.sh + /data/hive-ptest/working/scratch/smart-apply-patch.sh /data/hive-ptest/working/scratch/build.patch Going to apply patch with: patch -p0 (Stripping trailing CRs from patch.) 
patching file pom.xml Hunk #1 succeeded at 81 (offset 19 lines). + [[ maven == \m\a\v\e\n ]] + rm -rf /data/hive-ptest/working/maven/org/apache/hive + mvn -B clean install -DskipTests -Dmaven.repo.local=/data/hive-ptest/working/maven [INFO] Scanning for projects... [INFO] ------------------------------------------------------------------------ [INFO] Reactor Build Order: [INFO] [INFO] Hive [INFO] Hive Ant Utilities [INFO] Hive Shims Common [INFO] Hive Shims 0.20 [INFO] Hive Shims Secure Common [INFO] Hive Shims 0.20S [INFO] Hive Shims 0.23 [INFO] Hive Shims [INFO] Hive Common [INFO] Hive Serde [INFO] Hive Metastore [INFO] Hive Query Language [INFO] Hive Service [INFO] Hive JDBC [INFO] Hive Beeline [INFO] Hive CLI [INFO] Hive Contrib [INFO] Hive HBase Handler [INFO] Hive HCatalog [INFO] Hive HCatalog Core [INFO] Hive HCatalog Pig Adapter [INFO] Hive HCatalog Server Extensions [INFO] Hive HCatalog Webhcat Java Client [INFO] Hive HCatalog Webhcat [INFO] Hive HCatalog HBase Storage Handler [INFO] Hive HWI [INFO] Hive ODBC [INFO] Hive Shims Aggregator [INFO] Hive TestUtils [INFO] Hive Packaging [INFO] [INFO] ------------------------------------------------------------------------ [INFO] Building Hive 0.13.0-SNAPSHOT [INFO] ------------------------------------------------------------------------ [INFO] [INFO] --- maven-clean-plugin:2.5:clean (default-clean) @ hive --- [INFO] Deleting /data/hive-ptest/working/apache-svn-trunk-source (includes = [datanucleus.log, derby.log], excludes = []) [INFO] [INFO] --- maven-antrun-plugin:1.7:run (define-classpath) @ hive --- [INFO] Executing tasks main: [INFO] Executed tasks [INFO] [INFO] --- maven-antrun-plugin:1.7:run (setup-test-dirs) @ hive --- [INFO] Executing tasks main: [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/target/tmp [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/target/warehouse [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/target/tmp/conf [copy] 
Copying 4 files to /data/hive-ptest/working/apache-svn-trunk-source/target/tmp/conf [INFO] Executed tasks [INFO] [INFO] --- maven-install-plugin:2.4:install (default-install) @ hive --- [INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/pom.xml to /data/hive-ptest/working/maven/org/apache/hive/hive/0.13.0-SNAPSHOT/hive-0.13.0-SNAPSHOT.pom [INFO] [INFO] ------------------------------------------------------------------------ [INFO] Building Hive Ant Utilities 0.13.0-SNAPSHOT [INFO] ------------------------------------------------------------------------ [INFO] [INFO] --- maven-clean-plugin:2.5:clean (default-clean) @ hive-ant --- [INFO] Deleting /data/hive-ptest/working/apache-svn-trunk-source/ant (includes = [datanucleus.log, derby.log], excludes = []) [INFO] [INFO] --- maven-resources-plugin:2.5:resources (default-resources) @ hive-ant --- [debug] execute contextualize [INFO] Using 'UTF-8' encoding to copy filtered resources. [INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-trunk-source/ant/src/main/resources [INFO] [INFO] --- maven-antrun-plugin:1.7:run (define-classpath) @ hive-ant --- [INFO] Executing tasks main: [INFO] Executed tasks [INFO] [INFO] --- maven-compiler-plugin:3.1:compile (default-compile) @ hive-ant --- [INFO] Compiling 5 source files to /data/hive-ptest/working/apache-svn-trunk-source/ant/target/classes [WARNING] Note: /data/hive-ptest/working/apache-svn-trunk-source/ant/src/org/apache/hadoop/hive/ant/QTestGenTask.java uses or overrides a deprecated API. [WARNING] Note: Recompile with -Xlint:deprecation for details. [WARNING] Note: /data/hive-ptest/working/apache-svn-trunk-source/ant/src/org/apache/hadoop/hive/ant/DistinctElementsClassPath.java uses unchecked or unsafe operations. [WARNING] Note: Recompile with -Xlint:unchecked for details. 
[INFO] [INFO] --- maven-resources-plugin:2.5:testResources (default-testResources) @ hive-ant --- [debug] execute contextualize [INFO] Using 'UTF-8' encoding to copy filtered resources. [INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-trunk-source/ant/src/test/resources [INFO] [INFO] --- maven-antrun-plugin:1.7:run (setup-test-dirs) @ hive-ant --- [INFO] Executing tasks main: [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/ant/target/tmp [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/ant/target/warehouse [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/ant/target/tmp/conf [copy] Copying 4 files to /data/hive-ptest/working/apache-svn-trunk-source/ant/target/tmp/conf [INFO] Executed tasks [INFO] [INFO] --- maven-compiler-plugin:3.1:testCompile (default-testCompile) @ hive-ant --- [INFO] No sources to compile [INFO] [INFO] --- maven-surefire-plugin:2.16:test (default-test) @ hive-ant --- [INFO] Tests are skipped. 
[INFO] [INFO] --- maven-jar-plugin:2.2:jar (default-jar) @ hive-ant --- [INFO] Building jar: /data/hive-ptest/working/apache-svn-trunk-source/ant/target/hive-ant-0.13.0-SNAPSHOT.jar [INFO] [INFO] --- maven-install-plugin:2.4:install (default-install) @ hive-ant --- [INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/ant/target/hive-ant-0.13.0-SNAPSHOT.jar to /data/hive-ptest/working/maven/org/apache/hive/hive-ant/0.13.0-SNAPSHOT/hive-ant-0.13.0-SNAPSHOT.jar [INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/ant/pom.xml to /data/hive-ptest/working/maven/org/apache/hive/hive-ant/0.13.0-SNAPSHOT/hive-ant-0.13.0-SNAPSHOT.pom [INFO] [INFO] ------------------------------------------------------------------------ [INFO] Building Hive Shims Common 0.13.0-SNAPSHOT [INFO] ------------------------------------------------------------------------ [INFO] [INFO] --- maven-clean-plugin:2.5:clean (default-clean) @ hive-shims-common --- [INFO] Deleting /data/hive-ptest/working/apache-svn-trunk-source/shims/common (includes = [datanucleus.log, derby.log], excludes = []) [INFO] [INFO] --- maven-resources-plugin:2.5:resources (default-resources) @ hive-shims-common --- [debug] execute contextualize [INFO] Using 'UTF-8' encoding to copy filtered resources. [INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-trunk-source/shims/common/src/main/resources [INFO] [INFO] --- maven-antrun-plugin:1.7:run (define-classpath) @ hive-shims-common --- [INFO] Executing tasks main: [INFO] Executed tasks [INFO] [INFO] --- maven-compiler-plugin:3.1:compile (default-compile) @ hive-shims-common --- [INFO] Compiling 15 source files to /data/hive-ptest/working/apache-svn-trunk-source/shims/common/target/classes [WARNING] Note: Some input files use or override a deprecated API. [WARNING] Note: Recompile with -Xlint:deprecation for details. 
[INFO]
[INFO] --- maven-resources-plugin:2.5:testResources (default-testResources) @ hive-shims-common ---
[debug] execute contextualize
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-trunk-source/shims/common/src/test/resources
[INFO]
[INFO] --- maven-antrun-plugin:1.7:run (setup-test-dirs) @ hive-shims-common ---
[INFO] Executing tasks
main:
[mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/shims/common/target/tmp
[mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/shims/common/target/warehouse
[mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/shims/common/target/tmp/conf
[copy] Copying 4 files to /data/hive-ptest/working/apache-svn-trunk-source/shims/common/target/tmp/conf
[INFO] Executed tasks
[INFO]
[INFO] --- maven-compiler-plugin:3.1:testCompile (default-testCompile) @ hive-shims-common ---
[INFO] No sources to compile
[INFO]
[INFO] --- maven-surefire-plugin:2.16:test (default-test) @ hive-shims-common ---
[INFO] Tests are skipped.
[INFO]
[INFO] --- maven-jar-plugin:2.2:jar (default-jar) @ hive-shims-common ---
[INFO] Building jar: /data/hive-ptest/working/apache-svn-trunk-source/shims/common/target/hive-shims-common-0.13.0-SNAPSHOT.jar
[INFO]
[INFO] --- maven-install-plugin:2.4:install (default-install) @ hive-shims-common ---
[INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/shims/common/target/hive-shims-common-0.13.0-SNAPSHOT.jar to /data/hive-ptest/working/maven/org/apache/hive/shims/hive-shims-common/0.13.0-SNAPSHOT/hive-shims-common-0.13.0-SNAPSHOT.jar
[INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/shims/common/pom.xml to /data/hive-ptest/working/maven/org/apache/hive/shims/hive-shims-common/0.13.0-SNAPSHOT/hive-shims-common-0.13.0-SNAPSHOT.pom
[INFO]
[INFO] ------------------------------------------------------------------------
[INFO] Building Hive Shims 0.20 0.13.0-SNAPSHOT
[INFO] ------------------------------------------------------------------------
[INFO]
[INFO] --- maven-clean-plugin:2.5:clean (default-clean) @ hive-shims-0.20 ---
[INFO] Deleting /data/hive-ptest/working/apache-svn-trunk-source/shims/0.20 (includes = [datanucleus.log, derby.log], excludes = [])
[INFO]
[INFO] --- maven-resources-plugin:2.5:resources (default-resources) @ hive-shims-0.20 ---
[debug] execute contextualize
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-trunk-source/shims/0.20/src/main/resources
[INFO]
[INFO] --- maven-antrun-plugin:1.7:run (define-classpath) @ hive-shims-0.20 ---
[INFO] Executing tasks
main:
[INFO] Executed tasks
[INFO]
[INFO] --- maven-compiler-plugin:3.1:compile (default-compile) @ hive-shims-0.20 ---
[INFO] Compiling 2 source files to /data/hive-ptest/working/apache-svn-trunk-source/shims/0.20/target/classes
[WARNING] Note: /data/hive-ptest/working/apache-svn-trunk-source/shims/0.20/src/main/java/org/apache/hadoop/hive/shims/Hadoop20Shims.java uses or overrides a deprecated API.
[WARNING] Note: Recompile with -Xlint:deprecation for details.
[WARNING] Note: /data/hive-ptest/working/apache-svn-trunk-source/shims/0.20/src/main/java/org/apache/hadoop/hive/shims/Hadoop20Shims.java uses unchecked or unsafe operations.
[WARNING] Note: Recompile with -Xlint:unchecked for details.
[INFO]
[INFO] --- maven-resources-plugin:2.5:testResources (default-testResources) @ hive-shims-0.20 ---
[debug] execute contextualize
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-trunk-source/shims/0.20/src/test/resources
[INFO]
[INFO] --- maven-antrun-plugin:1.7:run (setup-test-dirs) @ hive-shims-0.20 ---
[INFO] Executing tasks
main:
[mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/shims/0.20/target/tmp
[mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/shims/0.20/target/warehouse
[mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/shims/0.20/target/tmp/conf
[copy] Copying 4 files to /data/hive-ptest/working/apache-svn-trunk-source/shims/0.20/target/tmp/conf
[INFO] Executed tasks
[INFO]
[INFO] --- maven-compiler-plugin:3.1:testCompile (default-testCompile) @ hive-shims-0.20 ---
[INFO] No sources to compile
[INFO]
[INFO] --- maven-surefire-plugin:2.16:test (default-test) @ hive-shims-0.20 ---
[INFO] Tests are skipped.
[INFO]
[INFO] --- maven-jar-plugin:2.2:jar (default-jar) @ hive-shims-0.20 ---
[INFO] Building jar: /data/hive-ptest/working/apache-svn-trunk-source/shims/0.20/target/hive-shims-0.20-0.13.0-SNAPSHOT.jar
[INFO]
[INFO] --- maven-install-plugin:2.4:install (default-install) @ hive-shims-0.20 ---
[INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/shims/0.20/target/hive-shims-0.20-0.13.0-SNAPSHOT.jar to /data/hive-ptest/working/maven/org/apache/hive/shims/hive-shims-0.20/0.13.0-SNAPSHOT/hive-shims-0.20-0.13.0-SNAPSHOT.jar
[INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/shims/0.20/pom.xml to /data/hive-ptest/working/maven/org/apache/hive/shims/hive-shims-0.20/0.13.0-SNAPSHOT/hive-shims-0.20-0.13.0-SNAPSHOT.pom
[INFO]
[INFO] ------------------------------------------------------------------------
[INFO] Building Hive Shims Secure Common 0.13.0-SNAPSHOT
[INFO] ------------------------------------------------------------------------
[INFO]
[INFO] --- maven-clean-plugin:2.5:clean (default-clean) @ hive-shims-common-secure ---
[INFO] Deleting /data/hive-ptest/working/apache-svn-trunk-source/shims/common-secure (includes = [datanucleus.log, derby.log], excludes = [])
[INFO]
[INFO] --- maven-resources-plugin:2.5:resources (default-resources) @ hive-shims-common-secure ---
[debug] execute contextualize
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-trunk-source/shims/common-secure/src/main/resources
[INFO]
[INFO] --- maven-antrun-plugin:1.7:run (define-classpath) @ hive-shims-common-secure ---
[INFO] Executing tasks
main:
[INFO] Executed tasks
[INFO]
[INFO] --- maven-compiler-plugin:3.1:compile (default-compile) @ hive-shims-common-secure ---
[INFO] Compiling 12 source files to /data/hive-ptest/working/apache-svn-trunk-source/shims/common-secure/target/classes
[WARNING] Note: /data/hive-ptest/working/apache-svn-trunk-source/shims/common-secure/src/main/java/org/apache/hadoop/hive/shims/HadoopShimsSecure.java uses or overrides a deprecated API.
[WARNING] Note: Recompile with -Xlint:deprecation for details.
[WARNING] Note: Some input files use unchecked or unsafe operations.
[WARNING] Note: Recompile with -Xlint:unchecked for details.
[INFO]
[INFO] --- maven-resources-plugin:2.5:testResources (default-testResources) @ hive-shims-common-secure ---
[debug] execute contextualize
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-trunk-source/shims/common-secure/src/test/resources
[INFO]
[INFO] --- maven-antrun-plugin:1.7:run (setup-test-dirs) @ hive-shims-common-secure ---
[INFO] Executing tasks
main:
[mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/shims/common-secure/target/tmp
[mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/shims/common-secure/target/warehouse
[mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/shims/common-secure/target/tmp/conf
[copy] Copying 4 files to /data/hive-ptest/working/apache-svn-trunk-source/shims/common-secure/target/tmp/conf
[INFO] Executed tasks
[INFO]
[INFO] --- maven-compiler-plugin:3.1:testCompile (default-testCompile) @ hive-shims-common-secure ---
[INFO] No sources to compile
[INFO]
[INFO] --- maven-surefire-plugin:2.16:test (default-test) @ hive-shims-common-secure ---
[INFO] Tests are skipped.
[INFO]
[INFO] --- maven-jar-plugin:2.2:jar (default-jar) @ hive-shims-common-secure ---
[INFO] Building jar: /data/hive-ptest/working/apache-svn-trunk-source/shims/common-secure/target/hive-shims-common-secure-0.13.0-SNAPSHOT.jar
[INFO]
[INFO] --- maven-install-plugin:2.4:install (default-install) @ hive-shims-common-secure ---
[INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/shims/common-secure/target/hive-shims-common-secure-0.13.0-SNAPSHOT.jar to /data/hive-ptest/working/maven/org/apache/hive/shims/hive-shims-common-secure/0.13.0-SNAPSHOT/hive-shims-common-secure-0.13.0-SNAPSHOT.jar
[INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/shims/common-secure/pom.xml to /data/hive-ptest/working/maven/org/apache/hive/shims/hive-shims-common-secure/0.13.0-SNAPSHOT/hive-shims-common-secure-0.13.0-SNAPSHOT.pom
[INFO]
[INFO] ------------------------------------------------------------------------
[INFO] Building Hive Shims 0.20S 0.13.0-SNAPSHOT
[INFO] ------------------------------------------------------------------------
[INFO]
[INFO] --- maven-clean-plugin:2.5:clean (default-clean) @ hive-shims-0.20S ---
[INFO] Deleting /data/hive-ptest/working/apache-svn-trunk-source/shims/0.20S (includes = [datanucleus.log, derby.log], excludes = [])
[INFO]
[INFO] --- maven-resources-plugin:2.5:resources (default-resources) @ hive-shims-0.20S ---
[debug] execute contextualize
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-trunk-source/shims/0.20S/src/main/resources
[INFO]
[INFO] --- maven-antrun-plugin:1.7:run (define-classpath) @ hive-shims-0.20S ---
[INFO] Executing tasks
main:
[INFO] Executed tasks
[INFO]
[INFO] --- maven-compiler-plugin:3.1:compile (default-compile) @ hive-shims-0.20S ---
[INFO] Compiling 3 source files to /data/hive-ptest/working/apache-svn-trunk-source/shims/0.20S/target/classes
[WARNING] Note: /data/hive-ptest/working/apache-svn-trunk-source/shims/0.20S/src/main/java/org/apache/hadoop/hive/shims/Hadoop20SShims.java uses or overrides a deprecated API.
[WARNING] Note: Recompile with -Xlint:deprecation for details.
[INFO]
[INFO] --- maven-resources-plugin:2.5:testResources (default-testResources) @ hive-shims-0.20S ---
[debug] execute contextualize
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-trunk-source/shims/0.20S/src/test/resources
[INFO]
[INFO] --- maven-antrun-plugin:1.7:run (setup-test-dirs) @ hive-shims-0.20S ---
[INFO] Executing tasks
main:
[mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/shims/0.20S/target/tmp
[mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/shims/0.20S/target/warehouse
[mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/shims/0.20S/target/tmp/conf
[copy] Copying 4 files to /data/hive-ptest/working/apache-svn-trunk-source/shims/0.20S/target/tmp/conf
[INFO] Executed tasks
[INFO]
[INFO] --- maven-compiler-plugin:3.1:testCompile (default-testCompile) @ hive-shims-0.20S ---
[INFO] No sources to compile
[INFO]
[INFO] --- maven-surefire-plugin:2.16:test (default-test) @ hive-shims-0.20S ---
[INFO] Tests are skipped.
[INFO]
[INFO] --- maven-jar-plugin:2.2:jar (default-jar) @ hive-shims-0.20S ---
[INFO] Building jar: /data/hive-ptest/working/apache-svn-trunk-source/shims/0.20S/target/hive-shims-0.20S-0.13.0-SNAPSHOT.jar
[INFO]
[INFO] --- maven-install-plugin:2.4:install (default-install) @ hive-shims-0.20S ---
[INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/shims/0.20S/target/hive-shims-0.20S-0.13.0-SNAPSHOT.jar to /data/hive-ptest/working/maven/org/apache/hive/shims/hive-shims-0.20S/0.13.0-SNAPSHOT/hive-shims-0.20S-0.13.0-SNAPSHOT.jar
[INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/shims/0.20S/pom.xml to /data/hive-ptest/working/maven/org/apache/hive/shims/hive-shims-0.20S/0.13.0-SNAPSHOT/hive-shims-0.20S-0.13.0-SNAPSHOT.pom
[INFO]
[INFO] ------------------------------------------------------------------------
[INFO] Building Hive Shims 0.23 0.13.0-SNAPSHOT
[INFO] ------------------------------------------------------------------------
[INFO]
[INFO] --- maven-clean-plugin:2.5:clean (default-clean) @ hive-shims-0.23 ---
[INFO] Deleting /data/hive-ptest/working/apache-svn-trunk-source/shims/0.23 (includes = [datanucleus.log, derby.log], excludes = [])
[INFO]
[INFO] --- maven-resources-plugin:2.5:resources (default-resources) @ hive-shims-0.23 ---
[debug] execute contextualize
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-trunk-source/shims/0.23/src/main/resources
[INFO]
[INFO] --- maven-antrun-plugin:1.7:run (define-classpath) @ hive-shims-0.23 ---
[INFO] Executing tasks
main:
[INFO] Executed tasks
[INFO]
[INFO] --- maven-compiler-plugin:3.1:compile (default-compile) @ hive-shims-0.23 ---
[INFO] Compiling 3 source files to /data/hive-ptest/working/apache-svn-trunk-source/shims/0.23/target/classes
[WARNING] Note: /data/hive-ptest/working/apache-svn-trunk-source/shims/0.23/src/main/java/org/apache/hadoop/hive/shims/Hadoop23Shims.java uses or overrides a deprecated API.
[WARNING] Note: Recompile with -Xlint:deprecation for details.
[INFO]
[INFO] --- maven-resources-plugin:2.5:testResources (default-testResources) @ hive-shims-0.23 ---
[debug] execute contextualize
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-trunk-source/shims/0.23/src/test/resources
[INFO]
[INFO] --- maven-antrun-plugin:1.7:run (setup-test-dirs) @ hive-shims-0.23 ---
[INFO] Executing tasks
main:
[mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/shims/0.23/target/tmp
[mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/shims/0.23/target/warehouse
[mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/shims/0.23/target/tmp/conf
[copy] Copying 4 files to /data/hive-ptest/working/apache-svn-trunk-source/shims/0.23/target/tmp/conf
[INFO] Executed tasks
[INFO]
[INFO] --- maven-compiler-plugin:3.1:testCompile (default-testCompile) @ hive-shims-0.23 ---
[INFO] No sources to compile
[INFO]
[INFO] --- maven-surefire-plugin:2.16:test (default-test) @ hive-shims-0.23 ---
[INFO] Tests are skipped.
[INFO]
[INFO] --- maven-jar-plugin:2.2:jar (default-jar) @ hive-shims-0.23 ---
[INFO] Building jar: /data/hive-ptest/working/apache-svn-trunk-source/shims/0.23/target/hive-shims-0.23-0.13.0-SNAPSHOT.jar
[INFO]
[INFO] --- maven-install-plugin:2.4:install (default-install) @ hive-shims-0.23 ---
[INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/shims/0.23/target/hive-shims-0.23-0.13.0-SNAPSHOT.jar to /data/hive-ptest/working/maven/org/apache/hive/shims/hive-shims-0.23/0.13.0-SNAPSHOT/hive-shims-0.23-0.13.0-SNAPSHOT.jar
[INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/shims/0.23/pom.xml to /data/hive-ptest/working/maven/org/apache/hive/shims/hive-shims-0.23/0.13.0-SNAPSHOT/hive-shims-0.23-0.13.0-SNAPSHOT.pom
[INFO]
[INFO] ------------------------------------------------------------------------
[INFO] Building Hive Shims 0.13.0-SNAPSHOT
[INFO] ------------------------------------------------------------------------
[INFO]
[INFO] --- maven-clean-plugin:2.5:clean (default-clean) @ hive-shims ---
[INFO] Deleting /data/hive-ptest/working/apache-svn-trunk-source/shims/assembly (includes = [datanucleus.log, derby.log], excludes = [])
[INFO]
[INFO] --- maven-resources-plugin:2.5:resources (default-resources) @ hive-shims ---
[debug] execute contextualize
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-trunk-source/shims/assembly/src/main/resources
[INFO]
[INFO] --- maven-antrun-plugin:1.7:run (define-classpath) @ hive-shims ---
[INFO] Executing tasks
main:
[INFO] Executed tasks
[INFO]
[INFO] --- maven-compiler-plugin:3.1:compile (default-compile) @ hive-shims ---
[INFO] No sources to compile
[INFO]
[INFO] --- maven-resources-plugin:2.5:testResources (default-testResources) @ hive-shims ---
[debug] execute contextualize
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-trunk-source/shims/assembly/src/test/resources
[INFO]
[INFO] --- maven-antrun-plugin:1.7:run (setup-test-dirs) @ hive-shims ---
[INFO] Executing tasks
main:
[mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/shims/assembly/target/tmp
[mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/shims/assembly/target/warehouse
[mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/shims/assembly/target/tmp/conf
[copy] Copying 4 files to /data/hive-ptest/working/apache-svn-trunk-source/shims/assembly/target/tmp/conf
[INFO] Executed tasks
[INFO]
[INFO] --- maven-compiler-plugin:3.1:testCompile (default-testCompile) @ hive-shims ---
[INFO] No sources to compile
[INFO]
[INFO] --- maven-surefire-plugin:2.16:test (default-test) @ hive-shims ---
[INFO] Tests are skipped.
[INFO]
[INFO] --- maven-jar-plugin:2.2:jar (default-jar) @ hive-shims ---
[WARNING] JAR will be empty - no content was marked for inclusion!
[INFO] Building jar: /data/hive-ptest/working/apache-svn-trunk-source/shims/assembly/target/hive-shims-0.13.0-SNAPSHOT.jar
[INFO]
[INFO] --- maven-assembly-plugin:2.3:single (uberjar) @ hive-shims ---
[INFO] Reading assembly descriptor: src/assemble/uberjar.xml
[WARNING] Artifact: org.apache.hive:hive-shims:jar:0.13.0-SNAPSHOT references the same file as the assembly destination file. Moving it to a temporary location for inclusion.
[INFO] META-INF/MANIFEST.MF already added, skipping
[INFO] META-INF/MANIFEST.MF already added, skipping
[INFO] META-INF/MANIFEST.MF already added, skipping
[INFO] META-INF/MANIFEST.MF already added, skipping
[INFO] META-INF/MANIFEST.MF already added, skipping
[INFO] Building jar: /data/hive-ptest/working/apache-svn-trunk-source/shims/assembly/target/hive-shims-0.13.0-SNAPSHOT.jar
[INFO] META-INF/MANIFEST.MF already added, skipping
[INFO] META-INF/MANIFEST.MF already added, skipping
[INFO] META-INF/MANIFEST.MF already added, skipping
[INFO] META-INF/MANIFEST.MF already added, skipping
[INFO] META-INF/MANIFEST.MF already added, skipping
[WARNING] Configuration options: 'appendAssemblyId' is set to false, and 'classifier' is missing.
Instead of attaching the assembly file: /data/hive-ptest/working/apache-svn-trunk-source/shims/assembly/target/hive-shims-0.13.0-SNAPSHOT.jar, it will become the file for main project artifact.
NOTE: If multiple descriptors or descriptor-formats are provided for this project, the value of this file will be non-deterministic!
[WARNING] Replacing pre-existing project main-artifact file: /data/hive-ptest/working/apache-svn-trunk-source/shims/assembly/target/archive-tmp/hive-shims-0.13.0-SNAPSHOT.jar with assembly file: /data/hive-ptest/working/apache-svn-trunk-source/shims/assembly/target/hive-shims-0.13.0-SNAPSHOT.jar
[INFO]
[INFO] --- maven-install-plugin:2.4:install (default-install) @ hive-shims ---
[INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/shims/assembly/target/hive-shims-0.13.0-SNAPSHOT.jar to /data/hive-ptest/working/maven/org/apache/hive/hive-shims/0.13.0-SNAPSHOT/hive-shims-0.13.0-SNAPSHOT.jar
[INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/shims/assembly/pom.xml to /data/hive-ptest/working/maven/org/apache/hive/hive-shims/0.13.0-SNAPSHOT/hive-shims-0.13.0-SNAPSHOT.pom
[INFO]
[INFO] ------------------------------------------------------------------------
[INFO] Building Hive Common 0.13.0-SNAPSHOT
[INFO] ------------------------------------------------------------------------
[INFO]
[INFO] --- maven-clean-plugin:2.5:clean (default-clean) @ hive-common ---
[INFO] Deleting /data/hive-ptest/working/apache-svn-trunk-source/common (includes = [datanucleus.log, derby.log], excludes = [])
[INFO]
[INFO] --- maven-antrun-plugin:1.7:run (generate-version-annotation) @ hive-common ---
[INFO] Executing tasks
main:
[INFO] Executed tasks
[INFO]
[INFO] --- build-helper-maven-plugin:1.8:add-source (add-source) @ hive-common ---
[INFO] Source directory: /data/hive-ptest/working/apache-svn-trunk-source/common/src/gen added.
[INFO]
[INFO] --- maven-resources-plugin:2.5:resources (default-resources) @ hive-common ---
[debug] execute contextualize
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] Copying 1 resource
[INFO]
[INFO] --- maven-antrun-plugin:1.7:run (define-classpath) @ hive-common ---
[INFO] Executing tasks
main:
[INFO] Executed tasks
[INFO]
[INFO] --- maven-compiler-plugin:3.1:compile (default-compile) @ hive-common ---
[INFO] Compiling 31 source files to /data/hive-ptest/working/apache-svn-trunk-source/common/target/classes
[WARNING] Note: /data/hive-ptest/working/apache-svn-trunk-source/common/src/java/org/apache/hadoop/hive/common/ObjectPair.java uses unchecked or unsafe operations.
[WARNING] Note: Recompile with -Xlint:unchecked for details.
[INFO]
[INFO] --- maven-resources-plugin:2.5:testResources (default-testResources) @ hive-common ---
[debug] execute contextualize
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] Copying 4 resources
[INFO]
[INFO] --- maven-antrun-plugin:1.7:run (setup-test-dirs) @ hive-common ---
[INFO] Executing tasks
main:
[mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/common/target/tmp
[mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/common/target/warehouse
[mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/common/target/tmp/conf
[copy] Copying 4 files to /data/hive-ptest/working/apache-svn-trunk-source/common/target/tmp/conf
[INFO] Executed tasks
[INFO]
[INFO] --- maven-compiler-plugin:3.1:testCompile (default-testCompile) @ hive-common ---
[INFO] Compiling 8 source files to /data/hive-ptest/working/apache-svn-trunk-source/common/target/test-classes
[INFO]
[INFO] --- maven-surefire-plugin:2.16:test (default-test) @ hive-common ---
[INFO] Tests are skipped.
[INFO]
[INFO] --- maven-jar-plugin:2.2:jar (default-jar) @ hive-common ---
[INFO] Building jar: /data/hive-ptest/working/apache-svn-trunk-source/common/target/hive-common-0.13.0-SNAPSHOT.jar
[INFO]
[INFO] --- maven-install-plugin:2.4:install (default-install) @ hive-common ---
[INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/common/target/hive-common-0.13.0-SNAPSHOT.jar to /data/hive-ptest/working/maven/org/apache/hive/hive-common/0.13.0-SNAPSHOT/hive-common-0.13.0-SNAPSHOT.jar
[INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/common/pom.xml to /data/hive-ptest/working/maven/org/apache/hive/hive-common/0.13.0-SNAPSHOT/hive-common-0.13.0-SNAPSHOT.pom
[INFO]
[INFO] ------------------------------------------------------------------------
[INFO] Building Hive Serde 0.13.0-SNAPSHOT
[INFO] ------------------------------------------------------------------------
[INFO]
[INFO] --- maven-clean-plugin:2.5:clean (default-clean) @ hive-serde ---
[INFO] Deleting /data/hive-ptest/working/apache-svn-trunk-source/serde (includes = [datanucleus.log, derby.log], excludes = [])
[INFO]
[INFO] --- build-helper-maven-plugin:1.8:add-source (add-source) @ hive-serde ---
[INFO] Source directory: /data/hive-ptest/working/apache-svn-trunk-source/serde/src/gen/protobuf/gen-java added.
[INFO] Source directory: /data/hive-ptest/working/apache-svn-trunk-source/serde/src/gen/thrift/gen-javabean added.
[INFO]
[INFO] --- maven-resources-plugin:2.5:resources (default-resources) @ hive-serde ---
[debug] execute contextualize
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-trunk-source/serde/src/main/resources
[INFO]
[INFO] --- maven-antrun-plugin:1.7:run (define-classpath) @ hive-serde ---
[INFO] Executing tasks
main:
[INFO] Executed tasks
[INFO]
[INFO] --- maven-compiler-plugin:3.1:compile (default-compile) @ hive-serde ---
[INFO] Compiling 351 source files to /data/hive-ptest/working/apache-svn-trunk-source/serde/target/classes
[WARNING] Note: Some input files use or override a deprecated API.
[WARNING] Note: Recompile with -Xlint:deprecation for details.
[WARNING] Note: Some input files use unchecked or unsafe operations.
[WARNING] Note: Recompile with -Xlint:unchecked for details.
[INFO]
[INFO] --- maven-resources-plugin:2.5:testResources (default-testResources) @ hive-serde ---
[debug] execute contextualize
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-trunk-source/serde/src/test/resources
[INFO]
[INFO] --- maven-antrun-plugin:1.7:run (setup-test-dirs) @ hive-serde ---
[INFO] Executing tasks
main:
[mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/serde/target/tmp
[mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/serde/target/warehouse
[mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/serde/target/tmp/conf
[copy] Copying 4 files to /data/hive-ptest/working/apache-svn-trunk-source/serde/target/tmp/conf
[INFO] Executed tasks
[INFO]
[INFO] --- maven-compiler-plugin:3.1:testCompile (default-testCompile) @ hive-serde ---
[INFO] Compiling 41 source files to /data/hive-ptest/working/apache-svn-trunk-source/serde/target/test-classes
[WARNING] Note: Some input files use or override a deprecated API.
[WARNING] Note: Recompile with -Xlint:deprecation for details.
[WARNING] Note: Some input files use unchecked or unsafe operations.
[WARNING] Note: Recompile with -Xlint:unchecked for details.
[INFO]
[INFO] --- maven-surefire-plugin:2.16:test (default-test) @ hive-serde ---
[INFO] Tests are skipped.
[INFO]
[INFO] --- maven-jar-plugin:2.2:jar (default-jar) @ hive-serde ---
[INFO] Building jar: /data/hive-ptest/working/apache-svn-trunk-source/serde/target/hive-serde-0.13.0-SNAPSHOT.jar
[INFO]
[INFO] --- maven-install-plugin:2.4:install (default-install) @ hive-serde ---
[INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/serde/target/hive-serde-0.13.0-SNAPSHOT.jar to /data/hive-ptest/working/maven/org/apache/hive/hive-serde/0.13.0-SNAPSHOT/hive-serde-0.13.0-SNAPSHOT.jar
[INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/serde/pom.xml to /data/hive-ptest/working/maven/org/apache/hive/hive-serde/0.13.0-SNAPSHOT/hive-serde-0.13.0-SNAPSHOT.pom
[INFO]
[INFO] ------------------------------------------------------------------------
[INFO] Building Hive Metastore 0.13.0-SNAPSHOT
[INFO] ------------------------------------------------------------------------
Downloading: http://www.datanucleus.org/downloads/maven2/org/datanucleus/datanucleus-api-jdo/3.2.4/datanucleus-api-jdo-3.2.4.pom
Downloading: http://repo.maven.apache.org/maven2/org/datanucleus/datanucleus-api-jdo/3.2.4/datanucleus-api-jdo-3.2.4.pom
Downloaded: http://repo.maven.apache.org/maven2/org/datanucleus/datanucleus-api-jdo/3.2.4/datanucleus-api-jdo-3.2.4.pom (10 KB at 242.5 KB/sec)
Downloading: http://www.datanucleus.org/downloads/maven2/org/datanucleus/datanucleus-core/3.2.8/datanucleus-core-3.2.8.pom
Downloading: http://repo.maven.apache.org/maven2/org/datanucleus/datanucleus-core/3.2.8/datanucleus-core-3.2.8.pom
Downloaded: http://repo.maven.apache.org/maven2/org/datanucleus/datanucleus-core/3.2.8/datanucleus-core-3.2.8.pom (14 KB at 569.1 KB/sec)
Downloading: http://www.datanucleus.org/downloads/maven2/org/datanucleus/datanucleus-rdbms/3.2.7/datanucleus-rdbms-3.2.7.pom
Downloading: http://repo.maven.apache.org/maven2/org/datanucleus/datanucleus-rdbms/3.2.7/datanucleus-rdbms-3.2.7.pom
Downloaded: http://repo.maven.apache.org/maven2/org/datanucleus/datanucleus-rdbms/3.2.7/datanucleus-rdbms-3.2.7.pom (13 KB at 429.8 KB/sec)
Downloading: http://www.datanucleus.org/downloads/maven2/org/datanucleus/datanucleus-api-jdo/3.2.4/datanucleus-api-jdo-3.2.4.jar
Downloading: http://www.datanucleus.org/downloads/maven2/org/datanucleus/datanucleus-core/3.2.8/datanucleus-core-3.2.8.jar
Downloading: http://www.datanucleus.org/downloads/maven2/org/datanucleus/datanucleus-rdbms/3.2.7/datanucleus-rdbms-3.2.7.jar
Downloading: http://repo.maven.apache.org/maven2/org/datanucleus/datanucleus-core/3.2.8/datanucleus-core-3.2.8.jar
Downloading: http://repo.maven.apache.org/maven2/org/datanucleus/datanucleus-api-jdo/3.2.4/datanucleus-api-jdo-3.2.4.jar
Downloading: http://repo.maven.apache.org/maven2/org/datanucleus/datanucleus-rdbms/3.2.7/datanucleus-rdbms-3.2.7.jar
Downloaded: http://repo.maven.apache.org/maven2/org/datanucleus/datanucleus-api-jdo/3.2.4/datanucleus-api-jdo-3.2.4.jar (332 KB at 1303.3 KB/sec)
Downloaded: http://repo.maven.apache.org/maven2/org/datanucleus/datanucleus-rdbms/3.2.7/datanucleus-rdbms-3.2.7.jar (1731 KB at 5001.6 KB/sec)
Downloaded: http://repo.maven.apache.org/maven2/org/datanucleus/datanucleus-core/3.2.8/datanucleus-core-3.2.8.jar (1773 KB at 5036.9 KB/sec)
[INFO]
[INFO] --- maven-clean-plugin:2.5:clean (default-clean) @ hive-metastore ---
[INFO] Deleting /data/hive-ptest/working/apache-svn-trunk-source/metastore (includes = [datanucleus.log, derby.log], excludes = [])
[INFO]
[INFO] --- build-helper-maven-plugin:1.8:add-source (add-source) @ hive-metastore ---
[INFO] Source directory: /data/hive-ptest/working/apache-svn-trunk-source/metastore/src/model added.
[INFO] Source directory: /data/hive-ptest/working/apache-svn-trunk-source/metastore/src/gen/thrift/gen-javabean added.
[INFO]
[INFO] --- antlr3-maven-plugin:3.4:antlr (default) @ hive-metastore ---
[INFO] ANTLR: Processing source directory /data/hive-ptest/working/apache-svn-trunk-source/metastore/src/java
ANTLR Parser Generator Version 3.4
org/apache/hadoop/hive/metastore/parser/Filter.g
[INFO]
[INFO] --- maven-resources-plugin:2.5:resources (default-resources) @ hive-metastore ---
[debug] execute contextualize
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] Copying 1 resource
[INFO]
[INFO] --- maven-antrun-plugin:1.7:run (define-classpath) @ hive-metastore ---
[INFO] Executing tasks
main:
[INFO] Executed tasks
[INFO]
[INFO] --- maven-compiler-plugin:3.1:compile (default-compile) @ hive-metastore ---
[INFO] Compiling 132 source files to /data/hive-ptest/working/apache-svn-trunk-source/metastore/target/classes
[WARNING] Note: Some input files use or override a deprecated API.
[WARNING] Note: Recompile with -Xlint:deprecation for details.
[WARNING] Note: Some input files use unchecked or unsafe operations.
[WARNING] Note: Recompile with -Xlint:unchecked for details.
[INFO] [INFO] --- datanucleus-maven-plugin:3.3.0-release:enhance (default) @ hive-metastore --- [INFO] DataNucleus Enhancer (version 3.2.8) for API "JDO" using JRE "1.6" DataNucleus Enhancer : Classpath >> /data/hive-ptest/working/maven/org/datanucleus/datanucleus-maven-plugin/3.3.0-release/datanucleus-maven-plugin-3.3.0-release.jar >> /data/hive-ptest/working/maven/org/datanucleus/datanucleus-core/3.2.8/datanucleus-core-3.2.8.jar >> /data/hive-ptest/working/maven/org/codehaus/plexus/plexus-utils/3.0.8/plexus-utils-3.0.8.jar >> /data/hive-ptest/working/maven/org/codehaus/plexus/plexus-component-annotations/1.5.5/plexus-component-annotations-1.5.5.jar >> /data/hive-ptest/working/maven/org/sonatype/sisu/sisu-inject-bean/2.3.0/sisu-inject-bean-2.3.0.jar >> /data/hive-ptest/working/maven/org/sonatype/sisu/sisu-guice/3.1.0/sisu-guice-3.1.0-no_aop.jar >> /data/hive-ptest/working/maven/org/sonatype/sisu/sisu-guava/0.9.9/sisu-guava-0.9.9.jar >> /data/hive-ptest/working/maven/org/apache/xbean/xbean-reflect/3.4/xbean-reflect-3.4.jar >> /data/hive-ptest/working/maven/log4j/log4j/1.2.12/log4j-1.2.12.jar >> /data/hive-ptest/working/maven/commons-logging/commons-logging-api/1.1/commons-logging-api-1.1.jar >> /data/hive-ptest/working/maven/com/google/collections/google-collections/1.0/google-collections-1.0.jar >> /data/hive-ptest/working/maven/junit/junit/3.8.2/junit-3.8.2.jar >> /data/hive-ptest/working/apache-svn-trunk-source/metastore/target/classes >> /data/hive-ptest/working/apache-svn-trunk-source/serde/target/hive-serde-0.13.0-SNAPSHOT.jar >> /data/hive-ptest/working/apache-svn-trunk-source/common/target/hive-common-0.13.0-SNAPSHOT.jar >> /data/hive-ptest/working/maven/org/apache/commons/commons-compress/1.4.1/commons-compress-1.4.1.jar >> /data/hive-ptest/working/maven/org/tukaani/xz/1.0/xz-1.0.jar >> /data/hive-ptest/working/maven/commons-codec/commons-codec/1.4/commons-codec-1.4.jar >> /data/hive-ptest/working/maven/org/apache/avro/avro/1.7.1/avro-1.7.1.jar >> 
>> /data/hive-ptest/working/maven/org/codehaus/jackson/jackson-core-asl/1.8.8/jackson-core-asl-1.8.8.jar
>> /data/hive-ptest/working/maven/com/thoughtworks/paranamer/paranamer/2.3/paranamer-2.3.jar
>> /data/hive-ptest/working/maven/org/xerial/snappy/snappy-java/1.0.4.1/snappy-java-1.0.4.1.jar
>> /data/hive-ptest/working/apache-svn-trunk-source/shims/assembly/target/hive-shims-0.13.0-SNAPSHOT.jar
>> /data/hive-ptest/working/apache-svn-trunk-source/shims/common/target/hive-shims-common-0.13.0-SNAPSHOT.jar
>> /data/hive-ptest/working/apache-svn-trunk-source/shims/0.20/target/hive-shims-0.20-0.13.0-SNAPSHOT.jar
>> /data/hive-ptest/working/apache-svn-trunk-source/shims/common-secure/target/hive-shims-common-secure-0.13.0-SNAPSHOT.jar
>> /data/hive-ptest/working/maven/org/apache/zookeeper/zookeeper/3.4.3/zookeeper-3.4.3.jar
>> /data/hive-ptest/working/maven/jline/jline/0.9.94/jline-0.9.94.jar
>> /data/hive-ptest/working/maven/org/jboss/netty/netty/3.2.2.Final/netty-3.2.2.Final.jar
>> /data/hive-ptest/working/apache-svn-trunk-source/shims/0.20S/target/hive-shims-0.20S-0.13.0-SNAPSHOT.jar
>> /data/hive-ptest/working/apache-svn-trunk-source/shims/0.23/target/hive-shims-0.23-0.13.0-SNAPSHOT.jar
>> /data/hive-ptest/working/maven/com/google/guava/guava/11.0.2/guava-11.0.2.jar
>> /data/hive-ptest/working/maven/com/google/code/findbugs/jsr305/1.3.9/jsr305-1.3.9.jar
>> /data/hive-ptest/working/maven/commons-cli/commons-cli/1.2/commons-cli-1.2.jar
>> /data/hive-ptest/working/maven/commons-lang/commons-lang/2.4/commons-lang-2.4.jar
>> /data/hive-ptest/working/maven/commons-logging/commons-logging/1.1.3/commons-logging-1.1.3.jar
>> /data/hive-ptest/working/maven/org/apache/derby/derby/10.4.2.0/derby-10.4.2.0.jar
>> /data/hive-ptest/working/maven/org/datanucleus/datanucleus-api-jdo/3.2.4/datanucleus-api-jdo-3.2.4.jar
>> /data/hive-ptest/working/maven/org/datanucleus/datanucleus-rdbms/3.2.7/datanucleus-rdbms-3.2.7.jar
>> /data/hive-ptest/working/maven/javax/jdo/jdo-api/3.0.1/jdo-api-3.0.1.jar
>> /data/hive-ptest/working/maven/javax/transaction/jta/1.1/jta-1.1.jar
>> /data/hive-ptest/working/maven/org/antlr/antlr-runtime/3.4/antlr-runtime-3.4.jar
>> /data/hive-ptest/working/maven/org/antlr/stringtemplate/3.2.1/stringtemplate-3.2.1.jar
>> /data/hive-ptest/working/maven/antlr/antlr/2.7.7/antlr-2.7.7.jar
>> /data/hive-ptest/working/maven/org/apache/thrift/libfb303/0.9.0/libfb303-0.9.0.jar
>> /data/hive-ptest/working/maven/org/apache/thrift/libthrift/0.9.0/libthrift-0.9.0.jar
>> /data/hive-ptest/working/maven/org/apache/httpcomponents/httpclient/4.1.3/httpclient-4.1.3.jar
>> /data/hive-ptest/working/maven/org/apache/httpcomponents/httpcore/4.1.3/httpcore-4.1.3.jar
>> /data/hive-ptest/working/maven/org/apache/hadoop/hadoop-core/1.2.1/hadoop-core-1.2.1.jar
>> /data/hive-ptest/working/maven/xmlenc/xmlenc/0.52/xmlenc-0.52.jar
>> /data/hive-ptest/working/maven/com/sun/jersey/jersey-core/1.8/jersey-core-1.8.jar
>> /data/hive-ptest/working/maven/com/sun/jersey/jersey-json/1.8/jersey-json-1.8.jar
>> /data/hive-ptest/working/maven/org/codehaus/jettison/jettison/1.1/jettison-1.1.jar
>> /data/hive-ptest/working/maven/stax/stax-api/1.0.1/stax-api-1.0.1.jar
>> /data/hive-ptest/working/maven/com/sun/xml/bind/jaxb-impl/2.2.3-1/jaxb-impl-2.2.3-1.jar
>> /data/hive-ptest/working/maven/javax/xml/bind/jaxb-api/2.2.2/jaxb-api-2.2.2.jar
>> /data/hive-ptest/working/maven/javax/xml/stream/stax-api/1.0-2/stax-api-1.0-2.jar
>> /data/hive-ptest/working/maven/javax/activation/activation/1.1/activation-1.1.jar
>> /data/hive-ptest/working/maven/org/codehaus/jackson/jackson-jaxrs/1.7.1/jackson-jaxrs-1.7.1.jar
>> /data/hive-ptest/working/maven/org/codehaus/jackson/jackson-xc/1.7.1/jackson-xc-1.7.1.jar
>> /data/hive-ptest/working/maven/com/sun/jersey/jersey-server/1.8/jersey-server-1.8.jar
>> /data/hive-ptest/working/maven/asm/asm/3.1/asm-3.1.jar
>> /data/hive-ptest/working/maven/commons-io/commons-io/2.1/commons-io-2.1.jar
>> /data/hive-ptest/working/maven/commons-httpclient/commons-httpclient/3.0.1/commons-httpclient-3.0.1.jar
>> /data/hive-ptest/working/maven/org/apache/commons/commons-math/2.1/commons-math-2.1.jar
>> /data/hive-ptest/working/maven/commons-configuration/commons-configuration/1.6/commons-configuration-1.6.jar
>> /data/hive-ptest/working/maven/commons-collections/commons-collections/3.2.1/commons-collections-3.2.1.jar
>> /data/hive-ptest/working/maven/commons-digester/commons-digester/1.8/commons-digester-1.8.jar
>> /data/hive-ptest/working/maven/commons-beanutils/commons-beanutils/1.7.0/commons-beanutils-1.7.0.jar
>> /data/hive-ptest/working/maven/commons-beanutils/commons-beanutils-core/1.8.0/commons-beanutils-core-1.8.0.jar
>> /data/hive-ptest/working/maven/commons-net/commons-net/1.4.1/commons-net-1.4.1.jar
>> /data/hive-ptest/working/maven/org/mortbay/jetty/jetty/6.1.26/jetty-6.1.26.jar
>> /data/hive-ptest/working/maven/org/mortbay/jetty/servlet-api/2.5-20081211/servlet-api-2.5-20081211.jar
>> /data/hive-ptest/working/maven/org/mortbay/jetty/jetty-util/6.1.26/jetty-util-6.1.26.jar
>> /data/hive-ptest/working/maven/tomcat/jasper-runtime/5.5.12/jasper-runtime-5.5.12.jar
>> /data/hive-ptest/working/maven/tomcat/jasper-compiler/5.5.12/jasper-compiler-5.5.12.jar
>> /data/hive-ptest/working/maven/org/mortbay/jetty/jsp-api-2.1/6.1.14/jsp-api-2.1-6.1.14.jar
>> /data/hive-ptest/working/maven/org/mortbay/jetty/servlet-api-2.5/6.1.14/servlet-api-2.5-6.1.14.jar
>> /data/hive-ptest/working/maven/org/mortbay/jetty/jsp-2.1/6.1.14/jsp-2.1-6.1.14.jar
>> /data/hive-ptest/working/maven/ant/ant/1.6.5/ant-1.6.5.jar
>> /data/hive-ptest/working/maven/commons-el/commons-el/1.0/commons-el-1.0.jar
>> /data/hive-ptest/working/maven/net/java/dev/jets3t/jets3t/0.6.1/jets3t-0.6.1.jar
>> /data/hive-ptest/working/maven/hsqldb/hsqldb/1.8.0.10/hsqldb-1.8.0.10.jar
>> /data/hive-ptest/working/maven/oro/oro/2.0.8/oro-2.0.8.jar
>> /data/hive-ptest/working/maven/org/eclipse/jdt/core/3.1.1/core-3.1.1.jar
>> /data/hive-ptest/working/maven/org/codehaus/jackson/jackson-mapper-asl/1.8.8/jackson-mapper-asl-1.8.8.jar
>> /data/hive-ptest/working/maven/org/slf4j/slf4j-api/1.6.1/slf4j-api-1.6.1.jar
>> /data/hive-ptest/working/maven/org/slf4j/slf4j-log4j12/1.6.1/slf4j-log4j12-1.6.1.jar
>> /data/hive-ptest/working/maven/log4j/log4j/1.2.16/log4j-1.2.16.jar
ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MDatabase
ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MFieldSchema
ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MType
ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MTable
ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MSerDeInfo
ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MOrder
ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MColumnDescriptor
ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MStringList
ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MStorageDescriptor
ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MPartition
ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MIndex
ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MRole
ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MRoleMap
ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MGlobalPrivilege
ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MDBPrivilege
ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MTablePrivilege
ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MPartitionPrivilege
ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MTableColumnPrivilege
ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MPartitionColumnPrivilege
ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MPartitionEvent
ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MMasterKey
ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MDelegationToken
ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MTableColumnStatistics
ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MPartitionColumnStatistics
ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MVersionTable
DataNucleus Enhancer completed with success for 25 classes. Timings : input=592 ms, enhance=926 ms, total=1518 ms. Consult the log for full details
[INFO]
[INFO] --- maven-resources-plugin:2.5:testResources (default-testResources) @ hive-metastore ---
[debug] execute contextualize
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-trunk-source/metastore/src/test/resources
[INFO]
[INFO] --- maven-antrun-plugin:1.7:run (setup-test-dirs) @ hive-metastore ---
[INFO] Executing tasks
main:
[mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/metastore/target/tmp
[mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/metastore/target/warehouse
[mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/metastore/target/tmp/conf
[copy] Copying 4 files to /data/hive-ptest/working/apache-svn-trunk-source/metastore/target/tmp/conf
[INFO] Executed tasks
[INFO]
[INFO] --- maven-compiler-plugin:3.1:testCompile (default-testCompile) @ hive-metastore ---
[INFO] Compiling 10 source files to /data/hive-ptest/working/apache-svn-trunk-source/metastore/target/test-classes
[INFO]
[INFO] --- maven-surefire-plugin:2.16:test (default-test) @ hive-metastore ---
[INFO] Tests are skipped.
[INFO]
[INFO] --- maven-jar-plugin:2.2:jar (default-jar) @ hive-metastore ---
[INFO] Building jar: /data/hive-ptest/working/apache-svn-trunk-source/metastore/target/hive-metastore-0.13.0-SNAPSHOT.jar
[INFO]
[INFO] --- maven-jar-plugin:2.2:test-jar (default) @ hive-metastore ---
[INFO] Building jar: /data/hive-ptest/working/apache-svn-trunk-source/metastore/target/hive-metastore-0.13.0-SNAPSHOT-tests.jar
[INFO]
[INFO] --- maven-install-plugin:2.4:install (default-install) @ hive-metastore ---
[INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/metastore/target/hive-metastore-0.13.0-SNAPSHOT.jar to /data/hive-ptest/working/maven/org/apache/hive/hive-metastore/0.13.0-SNAPSHOT/hive-metastore-0.13.0-SNAPSHOT.jar
[INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/metastore/pom.xml to /data/hive-ptest/working/maven/org/apache/hive/hive-metastore/0.13.0-SNAPSHOT/hive-metastore-0.13.0-SNAPSHOT.pom
[INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/metastore/target/hive-metastore-0.13.0-SNAPSHOT-tests.jar to /data/hive-ptest/working/maven/org/apache/hive/hive-metastore/0.13.0-SNAPSHOT/hive-metastore-0.13.0-SNAPSHOT-tests.jar
[INFO]
[INFO] ------------------------------------------------------------------------
[INFO] Building Hive Query Language 0.13.0-SNAPSHOT
[INFO] ------------------------------------------------------------------------
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO]
[INFO] Hive .............................................. SUCCESS [2.514s]
[INFO] Hive Ant Utilities ................................ SUCCESS [6.814s]
[INFO] Hive Shims Common ................................. SUCCESS [2.807s]
[INFO] Hive Shims 0.20 ................................... SUCCESS [1.785s]
[INFO] Hive Shims Secure Common .......................... SUCCESS [3.234s]
[INFO] Hive Shims 0.20S .................................. SUCCESS [1.388s]
[INFO] Hive Shims 0.23 ................................... SUCCESS [3.196s]
[INFO] Hive Shims ........................................ SUCCESS [3.323s]
[INFO] Hive Common ....................................... SUCCESS [4.364s]
[INFO] Hive Serde ........................................ SUCCESS [11.472s]
[INFO] Hive Metastore .................................... SUCCESS [26.922s]
[INFO] Hive Query Language ............................... FAILURE [0.612s]
[INFO] Hive Service ...................................... SKIPPED
[INFO] Hive JDBC ......................................... SKIPPED
[INFO] Hive Beeline ...................................... SKIPPED
[INFO] Hive CLI .......................................... SKIPPED
[INFO] Hive Contrib ...................................... SKIPPED
[INFO] Hive HBase Handler ................................ SKIPPED
[INFO] Hive HCatalog ..................................... SKIPPED
[INFO] Hive HCatalog Core ................................ SKIPPED
[INFO] Hive HCatalog Pig Adapter ......................... SKIPPED
[INFO] Hive HCatalog Server Extensions ................... SKIPPED
[INFO] Hive HCatalog Webhcat Java Client ................. SKIPPED
[INFO] Hive HCatalog Webhcat ............................. SKIPPED
[INFO] Hive HCatalog HBase Storage Handler ............... SKIPPED
[INFO] Hive HWI .......................................... SKIPPED
[INFO] Hive ODBC ......................................... SKIPPED
[INFO] Hive Shims Aggregator ............................. SKIPPED
[INFO] Hive TestUtils .................................... SKIPPED
[INFO] Hive Packaging .................................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 1:10.700s
[INFO] Finished at: Thu Nov 07 12:12:01 EST 2013
[INFO] Final Memory: 42M/192M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal on project hive-exec: Could not resolve dependencies for project org.apache.hive:hive-exec:jar:0.13.0-SNAPSHOT: Could not find artifact org.apache.hive:hive-shims:jar:uberjar:0.13.0-SNAPSHOT -> [Help 1]
[ERROR]
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR]
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/DependencyResolutionException
[ERROR]
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hive-exec
+ exit 1

This message is automatically generated.

ATTACHMENT ID: 12612507
          Thejas M Nair added a comment -

          reattaching patch to run tests

          Hive QA added a comment -

          Overall: +1 all checks pass

          Here are the results of testing the latest attachment:
          https://issues.apache.org/jira/secure/attachment/12613423/HIVE-5218-trunk.patch

          SUCCESS: +1 4604 tests passed

          Test results: http://bigtop01.cloudera.org:8080/job/PreCommit-HIVE-Build/261/testReport
          Console output: http://bigtop01.cloudera.org:8080/job/PreCommit-HIVE-Build/261/console

          Messages:

          Executing org.apache.hive.ptest.execution.PrepPhase
          Executing org.apache.hive.ptest.execution.ExecutionPhase
          Executing org.apache.hive.ptest.execution.ReportingPhase
          

          This message is automatically generated.

          ATTACHMENT ID: 12613423

          Xuefu Zhang added a comment -

          I'd just like to point out that passing unit tests might not be enough, as these tests run against embedded Derby. There is a way to hack the build so that the tests run against other DBs, such as MySQL. Also, I'd think manual testing of other aspects might be necessary, such as an existing metastore from previous release(s), the upgrade path, etc.

          Thejas M Nair added a comment -

          Xuefu Zhang This is just a minor version upgrade of datanucleus (from one 3.2.x version to a newer 3.2.x version), and I would expect datanucleus to have tests running against various databases (especially MySQL).
          Have you run the unit tests with MySQL before? Is it just a matter of changing data/conf/hive-site.xml?

          Xuefu Zhang added a comment -

          Thejas M Nair I think you're right that this is a minor upgrade, and I expect that DN has its tests with MySQL. My concern is more about whether the new DN works for Hive plus other DBs. In HIVE-3632, we actually found problems during manual testing, and some flags were set on the Hive side in order to work with metadata created in previous releases. It's unlikely that we will have a similar issue, but it might be good to be on the safe side.

          Yes, it's a matter of changing the config.
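
          For readers following this exchange, pointing the unit tests at MySQL instead of embedded Derby means swapping the JDO connection properties in data/conf/hive-site.xml. The fragment below is a minimal sketch of that change; the host, database name, and credentials are hypothetical placeholders, not values taken from this issue.

          <!-- Sketch: point the Hive metastore at MySQL instead of embedded Derby.
               Host, database name, user, and password below are hypothetical. -->
          <property>
            <name>javax.jdo.option.ConnectionURL</name>
            <value>jdbc:mysql://localhost:3306/hivemetastore?createDatabaseIfNotExist=true</value>
          </property>
          <property>
            <name>javax.jdo.option.ConnectionDriverName</name>
            <value>com.mysql.jdbc.Driver</value>
          </property>
          <property>
            <name>javax.jdo.option.ConnectionUserName</name>
            <value>hiveuser</value>
          </property>
          <property>
            <name>javax.jdo.option.ConnectionPassword</name>
            <value>hivepassword</value>
          </property>

          The MySQL JDBC driver jar would also need to be on the classpath for this to work.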

          Thejas M Nair added a comment -

          Marking as duplicate. HIVE-5099 has a patch that upgrades to a newer datanucleus version.


            People

            • Assignee: shanyu zhao
            • Reporter: shanyu zhao
            • Votes: 0
            • Watchers: 9