Details

    • Type: Bug
    • Status: Closed
    • Priority: Blocker
    • Resolution: Fixed
    • Affects Version/s: 0.14.0
    • Fix Version/s: 2.4.0, 3.0.0
    • Component/s: Transactions
    • Labels: None

    Description

      create table T(a int, b int) clustered by (a) into 2 buckets stored as orc TBLPROPERTIES('transactional'='false')
      insert into T(a,b) values(1,2)
      insert into T(a,b) values(1,3)
      alter table T SET TBLPROPERTIES ('transactional'='true')
      

      //we should now have bucket files 000001_0 and 000001_0_copy_1

      but OrcRawRecordMerger.OriginalReaderPair.next() doesn't know that there can be copy_N files, so it numbers the rows of each file in a bucket from 0, generating duplicate ROW__IDs

      select ROW__ID, INPUT__FILE__NAME, a, b from T
      

      produces

      {"transactionid":0,"bucketid":1,"rowid":0},file:/Users/ekoifman/dev/hiverwgit/ql/target/tmp/org.apache.hadoop.hive.ql.TestTxnCommands.../warehouse/nonacidorctbl/000001_0,1,2
      {"transactionid":0,"bucketid":1,"rowid":0},file:/Users/ekoifman/dev/hiverwgit/ql/target/tmp/org.apache.hadoop.hive.ql.TestTxnCommands.../warehouse/nonacidorctbl/000001_0_copy_1,1,3
      

      [~owen.omalley], do you have any thoughts on a good way to handle this?

      The attached patch has a few changes to make Acid even recognize copy_N files, but this is just a prerequisite. The new UT demonstrates the issue.
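The collision can be illustrated with a standalone sketch (the regex, class, and method names here are illustrative assumptions, not Hive's actual implementation): each original bucket file, including its copy_N variants, has its rows numbered from 0, so two files belonging to the same bucket produce identical (transactionid, bucketid, rowid) triples.

```java
import java.util.*;
import java.util.regex.*;

public class CopyFileDemo {
    // Parse the bucket number from names like "000001_0" or "000001_0_copy_1".
    // This regex is an illustration; Hive's real parsing lives elsewhere (e.g. AcidUtils).
    static final Pattern BUCKET = Pattern.compile("^(\\d{6})_\\d+(?:_copy_(\\d+))?$");

    static int bucketOf(String fileName) {
        Matcher m = BUCKET.matcher(fileName);
        if (!m.matches()) throw new IllegalArgumentException(fileName);
        return Integer.parseInt(m.group(1));
    }

    // Simulate numbering each file's rows from 0, as OriginalReaderPair does:
    // returns the synthetic ROW__IDs generated for the given (file -> row count) map.
    static List<String> rowIds(Map<String, Integer> fileRowCounts) {
        List<String> ids = new ArrayList<>();
        for (Map.Entry<String, Integer> e : fileRowCounts.entrySet()) {
            int bucket = bucketOf(e.getKey());
            for (long rowid = 0; rowid < e.getValue(); rowid++) {
                ids.add("{txnid:0, bucketid:" + bucket + ", rowid:" + rowid + "}");
            }
        }
        return ids;
    }

    public static void main(String[] args) {
        Map<String, Integer> files = new LinkedHashMap<>();
        files.put("000001_0", 1);          // insert of (1,2)
        files.put("000001_0_copy_1", 1);   // insert of (1,3)
        // Both files yield rowid 0 for bucket 1 -> duplicate ROW__IDs.
        System.out.println(rowIds(files));
    }
}
```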

      Furthermore,

      alter table T compact 'major'
      select ROW__ID, INPUT__FILE__NAME, a, b from T order by b
      

      produces

      {"transactionid":0,"bucketid":1,"rowid":0}	file:/Users/ekoifman/dev/hiverwgit/ql/target/tmp/org.apache.hadoop.hive.ql.TestTxnCommands....warehouse/nonacidorctbl/base_-9223372036854775808/bucket_00001	1	2
      

      HIVE-16177.04.patch has TestTxnCommands.testNonAcidToAcidConversion0() demonstrating this.

      This is because the compactor doesn't handle copy_N files either (it skips them).

      Attachments

        1. HIVE-16177.20-branch-2.patch
          97 kB
          Eugene Koifman
        2. HIVE-16177.19-branch-2.patch
          97 kB
          Eugene Koifman
        3. HIVE-16177.18-branch-2.patch
          95 kB
          Eugene Koifman
        4. HIVE-16177.18.patch
          96 kB
          Eugene Koifman
        5. HIVE-16177.17.patch
          96 kB
          Eugene Koifman
        6. HIVE-16177.16.patch
          100 kB
          Eugene Koifman
        7. HIVE-16177.15.patch
          100 kB
          Eugene Koifman
        8. HIVE-16177.14.patch
          74 kB
          Eugene Koifman
        9. HIVE-16177.11.patch
          96 kB
          Eugene Koifman
        10. HIVE-16177.10.patch
          96 kB
          Eugene Koifman
        11. HIVE-16177.09.patch
          97 kB
          Eugene Koifman
        12. HIVE-16177.08.patch
          85 kB
          Eugene Koifman
        13. HIVE-16177.07.patch
          79 kB
          Eugene Koifman
        14. HIVE-16177.04.patch
          43 kB
          Eugene Koifman
        15. HIVE-16177.02.patch
          9 kB
          Eugene Koifman
        16. HIVE-16177.01.patch
          8 kB
          Eugene Koifman

        Issue Links

        Activity

          ekoifman Eugene Koifman created issue -
          ekoifman Eugene Koifman made changes -
          Field Original Value New Value
          Attachment HIVE-16177.01.patch [ 12857391 ]
          ekoifman Eugene Koifman made changes -
          Description [updated]
          ekoifman Eugene Koifman made changes -
          Attachment HIVE-16177.02.patch [ 12857407 ]
          ekoifman Eugene Koifman made changes -
          Attachment HIVE-16177.02.patch [ 12857410 ]
          ekoifman Eugene Koifman made changes -
          Attachment HIVE-16177.02.patch [ 12857407 ]
          ekoifman Eugene Koifman made changes -
          Assignee Eugene Koifman [ ekoifman ]

          sershe Sergey Shelukhin added a comment -

          Bucket handling in Hive in general is completely screwed, and inconsistent in different places (e.g. sampling and, IIRC, some other code would just take files in order, regardless of names, even if there are fewer or more files than needed).

          Maybe there needs to be some work to enforce it better via some central utility or manager class that would get all files for a bucket and validate buckets more strictly.
          ekoifman Eugene Koifman added a comment - - edited

          We currently only allow converting a bucketed ORC table to Acid.
          Some possibilities:
          Check how splits are done for "isOriginal" files. If RecordReader.getRowNumber() is not smart enough to produce an ordinal from the beginning of the file, then we must be creating 1 split per bucket.
          If getRowNumber() is smart enough to produce a number from the beginning of the file, we may be splitting each file.

          Either way, OriginalReaderPair could look at which copy_N it has and look for all copy_M files with M<N and get number of rows in each. Sum up these row counts and use that as a starting point from which to number rows in copy_N.

          Alternatively, we can make each split include all files in a bucket in order and just keep numbering the rows.

          Having a smart getRowNumber() would be better since it allows splitting the table into many pieces; otherwise the read parallelism is limited to the number of buckets. For a large pre-acid table this may look like a big drop in performance once it's converted to acid but before the first major compaction.

          Another possibility is to assign a different transaction ID to each copy_N file - we'd have to use a < 0 number - maybe the simplest fix if it works

          Another thought: pick some number X (e.g. 10^12).
          For each copy_N, start numbering rows from N*X. This allows us to support up to approximately 10^12 rows per file, and up to 10^6 files in each bucket. OriginalReaderPair can throw if it ever sees a rowcount > X in any file. X can be configurable. This works if we indeed send all files for a given bucket to a single split.
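The two numbering schemes above can be sketched side by side (method names and the choice of X are illustrative assumptions, not the committed fix): summing the row counts of earlier copy_M files, or reserving a fixed stride X of row IDs per copy file.

```java
public class RowIdSchemes {
    // Scheme 1: copy_N starts numbering after the total rows of copy_0 .. copy_(N-1).
    static long startBySumming(long[] rowCountsOfEarlierCopies) {
        long start = 0;
        for (long count : rowCountsOfEarlierCopies) {
            start += count;
        }
        return start;
    }

    // Scheme 2: reserve a fixed stride X per file; copy_N starts at N * X.
    // Throws if a file exceeds X rows, since its IDs would spill into copy_(N+1)'s range.
    static long startByStride(int copyN, long rowsInFile, long strideX) {
        if (rowsInFile > strideX) {
            throw new IllegalStateException(
                "file has " + rowsInFile + " rows, but stride is only " + strideX);
        }
        return copyN * strideX;
    }

    public static void main(String[] args) {
        long x = 1_000_000_000_000L; // 10^12, as suggested above
        System.out.println(startBySumming(new long[]{5, 3})); // copy_2 starts at 8
        System.out.println(startByStride(2, 3, x));           // copy_2 starts at 2 * 10^12
    }
}
```

Scheme 1 needs to read the footers of all earlier copies; scheme 2 only needs the file's own name and row count, at the cost of capping rows per file at X.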

          ekoifman Eugene Koifman made changes -
          Link This issue relates to HIVE-14366 [ HIVE-14366 ]
          ekoifman Eugene Koifman made changes -
          Link This issue relates to HIVE-13961 [ HIVE-13961 ]
          ekoifman Eugene Koifman made changes -
          Link This issue relates to HIVE-12724 [ HIVE-12724 ]
          ekoifman Eugene Koifman made changes -
          Priority Critical [ 2 ] Blocker [ 1 ]
          ekoifman Eugene Koifman made changes -
          Affects Version/s 0.14.0 [ 12326450 ]
          ekoifman Eugene Koifman made changes -
          Description [updated]
          ekoifman Eugene Koifman made changes -
          Attachment HIVE-16177.04.patch [ 12858916 ]
          ekoifman Eugene Koifman made changes -
          Link This issue is broken by HIVE-15199 [ HIVE-15199 ]
          ekoifman Eugene Koifman added a comment -

          looks like HIVE-15199 introduced copy_N files


          sershe Sergey Shelukhin added a comment -

          I think it existed earlier than that... see the first comment on that jira
          ekoifman Eugene Koifman added a comment - - edited

          yes, you're right: https://github.com/apache/hive/blob/branch-0.8/ql/src/java/org/apache/hadoop/hive/ql/metadata/Hive.java#L1813 (and maybe earlier)
          ekoifman Eugene Koifman made changes -
          Link This issue is broken by HIVE-15199 [ HIVE-15199 ]
          ekoifman Eugene Koifman added a comment - - edited

          For the Compactor we should create an org.apache.hadoop.hive.ql.io.orc.Reader that can wrap individual Readers for all copy_N files for a given bucket and return rows in order.

          Also, the AcidInputFormat should throw if it finds a directory layout it doesn't understand. This should never happen for data written after the table is made acid (Tez + CTAS + Union ?) but can happen for non-acid tables converted to acid (before major compaction):

          [5:17 PM] Sergey Shelukhin: 1) list bucketing
          [5:18 PM] Sergey Shelukhin: 2) any time, if the MR recursive-whatever setting is enabled
          [5:18 PM] Sergey Shelukhin: 3) iirc Hive can produce it sometimes from unions but I'm not sure

          Probably alter table T SET TBLPROPERTIES ('transactional'='true') should check the table to make sure it has a directory structure Acid can handle, and fail if not. This may be expensive for a table with lots of partitions.

          TezCompiler.java has

          // We require the use of recursive input dirs for union processing
              conf.setBoolean("mapred.input.dir.recursive", true);
          
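The wrapping-Reader idea can be sketched as a generic concatenating iterator (a plain-Java illustration, not the actual org.apache.hadoop.hive.ql.io.orc.Reader API): take the per-copy readers in file order and drain each in turn, so rows come back in copy_0, copy_1, ... order for the bucket.

```java
import java.util.*;

// Concatenates several per-file readers into one stream of rows.
public class ConcatReader<T> implements Iterator<T> {
    private final Iterator<Iterator<T>> readers;
    private Iterator<T> current = Collections.emptyIterator();

    // 'perCopyReaders' must be supplied in copy_0, copy_1, ... order.
    public ConcatReader(List<Iterator<T>> perCopyReaders) {
        this.readers = perCopyReaders.iterator();
    }

    @Override
    public boolean hasNext() {
        // Skip over exhausted readers until one has rows left.
        while (!current.hasNext() && readers.hasNext()) {
            current = readers.next();
        }
        return current.hasNext();
    }

    @Override
    public T next() {
        if (!hasNext()) throw new NoSuchElementException();
        return current.next();
    }

    public static void main(String[] args) {
        ConcatReader<Integer> r = new ConcatReader<>(Arrays.asList(
            Arrays.asList(1, 2).iterator(),   // rows from 000001_0
            Arrays.asList(3).iterator()));    // rows from 000001_0_copy_1
        while (r.hasNext()) {
            System.out.println(r.next());
        }
    }
}
```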
          ekoifman Eugene Koifman added a comment - - edited

          TODO: see that TransactionalValidationListener does the right thing - for example, list bucketing should be visible in metadata
          https://cwiki.apache.org/confluence/display/Hive/ListBucketing

          ekoifman Eugene Koifman made changes -
          Link This issue relates to HIVE-11525 [ HIVE-11525 ]
          ekoifman Eugene Koifman made changes -
          Link This issue relates to HIVE-15899 [ HIVE-15899 ]
          ekoifman Eugene Koifman made changes -
          Attachment HIVE-16177.07.patch [ 12868598 ]
          ekoifman Eugene Koifman made changes -
          Status Open [ 1 ] Patch Available [ 10002 ]
          hiveqa Hive QA added a comment -

          Here are the results of testing the latest attachment:
          https://issues.apache.org/jira/secure/attachment/12868598/HIVE-16177.07.patch

          SUCCESS: +1 due to 6 test(s) being added or modified.

          ERROR: -1 due to 12 failed/errored test(s), 10744 tests executed
          Failed tests:

          org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver[insert_orig_table] (batchId=58)
          org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver[insert_values_orig_table] (batchId=54)
          org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver[insert_values_orig_table_use_metadata] (batchId=60)
          org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver[merge4] (batchId=12)
          org.apache.hadoop.hive.cli.TestMiniLlapCliDriver.testCliDriver[table_nonprintable] (batchId=140)
          org.apache.hadoop.hive.cli.TestMiniLlapLocalCliDriver.testCliDriver[insert_orig_table] (batchId=155)
          org.apache.hadoop.hive.cli.TestMiniLlapLocalCliDriver.testCliDriver[vector_if_expr] (batchId=144)
          org.apache.hadoop.hive.cli.TestMiniTezCliDriver.testCliDriver[explainanalyze_3] (batchId=97)
          org.apache.hadoop.hive.cli.TestMiniTezCliDriver.testCliDriver[explainuser_3] (batchId=97)
          org.apache.hadoop.hive.ql.TestTxnCommands.testNonAcidToAcidConversion0 (batchId=278)
          org.apache.hadoop.hive.ql.TestTxnCommands2.testMultiInsert1 (batchId=267)
          org.apache.hadoop.hive.ql.TestTxnCommands2WithSplitUpdate.testMultiInsert2 (batchId=277)
          

          Test results: https://builds.apache.org/job/PreCommit-HIVE-Build/5313/testReport
          Console output: https://builds.apache.org/job/PreCommit-HIVE-Build/5313/console
          Test logs: http://104.198.109.242/logs/PreCommit-HIVE-Build-5313/

          Messages:

          Executing org.apache.hive.ptest.execution.TestCheckPhase
          Executing org.apache.hive.ptest.execution.PrepPhase
          Executing org.apache.hive.ptest.execution.ExecutionPhase
          Executing org.apache.hive.ptest.execution.ReportingPhase
          Tests exited with: TestsFailedException: 12 tests failed
          

          This message is automatically generated.

          ATTACHMENT ID: 12868598 - PreCommit-HIVE-Build

          ekoifman Eugene Koifman made changes -
          Attachment HIVE-16177.08.patch [ 12868654 ]
          hiveqa Hive QA added a comment -

          Here are the results of testing the latest attachment:
          https://issues.apache.org/jira/secure/attachment/12868654/HIVE-16177.08.patch

          SUCCESS: +1 due to 6 test(s) being added or modified.

          ERROR: -1 due to 12 failed/errored test(s), 10744 tests executed
          Failed tests:

          org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver[insert_orig_table] (batchId=58)
          org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver[insert_values_orig_table] (batchId=54)
          org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver[insert_values_orig_table_use_metadata] (batchId=60)
          org.apache.hadoop.hive.cli.TestMiniLlapLocalCliDriver.testCliDriver[columnstats_part_coltype] (batchId=156)
          org.apache.hadoop.hive.cli.TestMiniLlapLocalCliDriver.testCliDriver[insert_orig_table] (batchId=155)
          org.apache.hadoop.hive.cli.TestMiniLlapLocalCliDriver.testCliDriver[vector_if_expr] (batchId=144)
          org.apache.hadoop.hive.cli.TestMiniLlapLocalCliDriver.testCliDriver[vector_join30] (batchId=149)
          org.apache.hadoop.hive.cli.TestMiniTezCliDriver.testCliDriver[explainanalyze_2] (batchId=98)
          org.apache.hadoop.hive.cli.TestMiniTezCliDriver.testCliDriver[explainanalyze_3] (batchId=97)
          org.apache.hadoop.hive.cli.TestMiniTezCliDriver.testCliDriver[explainuser_3] (batchId=97)
          org.apache.hadoop.hive.ql.TestTxnCommands.testNonAcidToAcidConversion0 (batchId=278)
          org.apache.hadoop.hive.ql.TestTxnCommands2WithSplitUpdateAndVectorization.testMultiInsert1 (batchId=275)
          

          Test results: https://builds.apache.org/job/PreCommit-HIVE-Build/5321/testReport
          Console output: https://builds.apache.org/job/PreCommit-HIVE-Build/5321/console
          Test logs: http://104.198.109.242/logs/PreCommit-HIVE-Build-5321/

          Messages:

          Executing org.apache.hive.ptest.execution.TestCheckPhase
          Executing org.apache.hive.ptest.execution.PrepPhase
          Executing org.apache.hive.ptest.execution.ExecutionPhase
          Executing org.apache.hive.ptest.execution.ReportingPhase
          Tests exited with: TestsFailedException: 12 tests failed
          

          This message is automatically generated.

          ATTACHMENT ID: 12868654 - PreCommit-HIVE-Build

          ekoifman Eugene Koifman made changes -
          Attachment HIVE-16177.09.patch [ 12868978 ]
          ekoifman Eugene Koifman made changes -
          Attachment HIVE-16177.10.patch [ 12869080 ]
          hiveqa Hive QA added a comment -

          Here are the results of testing the latest attachment:
          https://issues.apache.org/jira/secure/attachment/12869080/HIVE-16177.10.patch

          ERROR: -1 due to build exiting with an error

          Test results: https://builds.apache.org/job/PreCommit-HIVE-Build/5364/testReport
          Console output: https://builds.apache.org/job/PreCommit-HIVE-Build/5364/console
          Test logs: http://104.198.109.242/logs/PreCommit-HIVE-Build-5364/

          Messages:

          Executing org.apache.hive.ptest.execution.TestCheckPhase
          Executing org.apache.hive.ptest.execution.PrepPhase
          Tests exited with: NonZeroExitCodeException
          Command 'bash /data/hiveptest/working/scratch/source-prep.sh' failed with exit status 1 and output '+ date '+%Y-%m-%d %T.%3N'
          2017-05-20 03:19:05.518
          + [[ -n /usr/lib/jvm/java-8-openjdk-amd64 ]]
          + export JAVA_HOME=/usr/lib/jvm/java-8-openjdk-amd64
          + JAVA_HOME=/usr/lib/jvm/java-8-openjdk-amd64
          + export PATH=/usr/lib/jvm/java-8-openjdk-amd64/bin/:/usr/local/bin:/usr/bin:/bin:/usr/local/games:/usr/games
          + PATH=/usr/lib/jvm/java-8-openjdk-amd64/bin/:/usr/local/bin:/usr/bin:/bin:/usr/local/games:/usr/games
          + export 'ANT_OPTS=-Xmx1g -XX:MaxPermSize=256m '
          + ANT_OPTS='-Xmx1g -XX:MaxPermSize=256m '
          + export 'MAVEN_OPTS=-Xmx1g '
          + MAVEN_OPTS='-Xmx1g '
          + cd /data/hiveptest/working/
          + tee /data/hiveptest/logs/PreCommit-HIVE-Build-5364/source-prep.txt
          + [[ false == \t\r\u\e ]]
          + mkdir -p maven ivy
          + [[ git = \s\v\n ]]
          + [[ git = \g\i\t ]]
          + [[ -z master ]]
          + [[ -d apache-github-source-source ]]
          + [[ ! -d apache-github-source-source/.git ]]
          + [[ ! -d apache-github-source-source ]]
          + date '+%Y-%m-%d %T.%3N'
          2017-05-20 03:19:05.521
          + cd apache-github-source-source
          + git fetch origin
          + git reset --hard HEAD
          HEAD is now at bb2f25c HIVE-16724 : increase session timeout for LLAP ZK token manager (Sergey Shelukhin, reviewed by Jason Dere)
          + git clean -f -d
          Removing metastore/src/java/org/apache/hadoop/hive/metastore/IMetaStoreSchemaInfo.java
          Removing metastore/src/java/org/apache/hadoop/hive/metastore/MetaStoreSchemaInfoFactory.java
          Removing metastore/src/test/org/apache/hadoop/hive/metastore/TestMetaStoreSchemaFactory.java
          + git checkout master
          Already on 'master'
          Your branch is up-to-date with 'origin/master'.
          + git reset --hard origin/master
          HEAD is now at bb2f25c HIVE-16724 : increase session timeout for LLAP ZK token manager (Sergey Shelukhin, reviewed by Jason Dere)
          + git merge --ff-only origin/master
          Already up-to-date.
          + date '+%Y-%m-%d %T.%3N'
          2017-05-20 03:19:06.351
          + patchCommandPath=/data/hiveptest/working/scratch/smart-apply-patch.sh
          + patchFilePath=/data/hiveptest/working/scratch/build.patch
          + [[ -f /data/hiveptest/working/scratch/build.patch ]]
          + chmod +x /data/hiveptest/working/scratch/smart-apply-patch.sh
          + /data/hiveptest/working/scratch/smart-apply-patch.sh /data/hiveptest/working/scratch/build.patch
          Going to apply patch with: patch -p0
          patching file itests/hive-unit/src/test/java/org/apache/hadoop/hive/ql/TestAcidOnTez.java
          patching file ql/src/java/org/apache/hadoop/hive/ql/exec/Utilities.java
          patching file ql/src/java/org/apache/hadoop/hive/ql/io/AcidInputFormat.java
          patching file ql/src/java/org/apache/hadoop/hive/ql/io/AcidOutputFormat.java
          patching file ql/src/java/org/apache/hadoop/hive/ql/io/AcidUtils.java
          patching file ql/src/java/org/apache/hadoop/hive/ql/io/orc/ExternalCache.java
          patching file ql/src/java/org/apache/hadoop/hive/ql/io/orc/OrcInputFormat.java
          patching file ql/src/java/org/apache/hadoop/hive/ql/io/orc/OrcRawRecordMerger.java
          patching file ql/src/java/org/apache/hadoop/hive/ql/io/orc/VectorizedOrcAcidRowBatchReader.java
          patching file ql/src/java/org/apache/hadoop/hive/ql/metadata/Hive.java
          patching file ql/src/java/org/apache/hadoop/hive/ql/txn/compactor/CompactorMR.java
          patching file ql/src/test/org/apache/hadoop/hive/ql/TestTxnCommands.java
          patching file ql/src/test/org/apache/hadoop/hive/ql/TestTxnCommands2.java
          patching file ql/src/test/org/apache/hadoop/hive/ql/io/TestAcidUtils.java
          patching file ql/src/test/org/apache/hadoop/hive/ql/io/orc/TestInputOutputFormat.java
          patching file ql/src/test/org/apache/hadoop/hive/ql/io/orc/TestOrcRawRecordMerger.java
          patching file ql/src/test/org/apache/hadoop/hive/ql/io/orc/TestVectorizedOrcAcidRowBatchReader.java
          patching file shims/0.23/src/main/java/org/apache/hadoop/hive/shims/Hadoop23Shims.java
          patching file shims/common/src/main/java/org/apache/hadoop/hive/shims/HadoopShims.java
          + [[ maven == \m\a\v\e\n ]]
          + rm -rf /data/hiveptest/working/maven/org/apache/hive
          + mvn -B clean install -DskipTests -T 4 -q -Dmaven.repo.local=/data/hiveptest/working/maven
          ANTLR Parser Generator  Version 3.5.2
          Output file /data/hiveptest/working/apache-github-source-source/metastore/target/generated-sources/antlr3/org/apache/hadoop/hive/metastore/parser/FilterParser.java does not exist: must build /data/hiveptest/working/apache-github-source-source/metastore/src/java/org/apache/hadoop/hive/metastore/parser/Filter.g
          org/apache/hadoop/hive/metastore/parser/Filter.g
          DataNucleus Enhancer (version 4.1.17) for API "JDO"
          DataNucleus Enhancer : Classpath
          >>  /usr/share/maven/boot/plexus-classworlds-2.x.jar
          ENHANCED (Persistable) : org.apache.hadoop.hive.metastore.model.MDatabase
          ENHANCED (Persistable) : org.apache.hadoop.hive.metastore.model.MFieldSchema
          ENHANCED (Persistable) : org.apache.hadoop.hive.metastore.model.MType
          ENHANCED (Persistable) : org.apache.hadoop.hive.metastore.model.MTable
          ENHANCED (Persistable) : org.apache.hadoop.hive.metastore.model.MConstraint
          ENHANCED (Persistable) : org.apache.hadoop.hive.metastore.model.MSerDeInfo
          ENHANCED (Persistable) : org.apache.hadoop.hive.metastore.model.MOrder
          ENHANCED (Persistable) : org.apache.hadoop.hive.metastore.model.MColumnDescriptor
          ENHANCED (Persistable) : org.apache.hadoop.hive.metastore.model.MStringList
          ENHANCED (Persistable) : org.apache.hadoop.hive.metastore.model.MStorageDescriptor
          ENHANCED (Persistable) : org.apache.hadoop.hive.metastore.model.MPartition
          ENHANCED (Persistable) : org.apache.hadoop.hive.metastore.model.MIndex
          ENHANCED (Persistable) : org.apache.hadoop.hive.metastore.model.MRole
          ENHANCED (Persistable) : org.apache.hadoop.hive.metastore.model.MRoleMap
          ENHANCED (Persistable) : org.apache.hadoop.hive.metastore.model.MGlobalPrivilege
          ENHANCED (Persistable) : org.apache.hadoop.hive.metastore.model.MDBPrivilege
          ENHANCED (Persistable) : org.apache.hadoop.hive.metastore.model.MTablePrivilege
          ENHANCED (Persistable) : org.apache.hadoop.hive.metastore.model.MPartitionPrivilege
          ENHANCED (Persistable) : org.apache.hadoop.hive.metastore.model.MTableColumnPrivilege
          ENHANCED (Persistable) : org.apache.hadoop.hive.metastore.model.MPartitionColumnPrivilege
          ENHANCED (Persistable) : org.apache.hadoop.hive.metastore.model.MPartitionEvent
          ENHANCED (Persistable) : org.apache.hadoop.hive.metastore.model.MMasterKey
          ENHANCED (Persistable) : org.apache.hadoop.hive.metastore.model.MDelegationToken
          ENHANCED (Persistable) : org.apache.hadoop.hive.metastore.model.MTableColumnStatistics
          ENHANCED (Persistable) : org.apache.hadoop.hive.metastore.model.MPartitionColumnStatistics
          ENHANCED (Persistable) : org.apache.hadoop.hive.metastore.model.MVersionTable
          ENHANCED (Persistable) : org.apache.hadoop.hive.metastore.model.MMetastoreDBProperties
          ENHANCED (Persistable) : org.apache.hadoop.hive.metastore.model.MResourceUri
          ENHANCED (Persistable) : org.apache.hadoop.hive.metastore.model.MFunction
          ENHANCED (Persistable) : org.apache.hadoop.hive.metastore.model.MNotificationLog
          ENHANCED (Persistable) : org.apache.hadoop.hive.metastore.model.MNotificationNextId
          DataNucleus Enhancer completed with success for 31 classes. Timings : input=154 ms, enhance=213 ms, total=367 ms. Consult the log for full details
          ANTLR Parser Generator  Version 3.5.2
          Output file /data/hiveptest/working/apache-github-source-source/ql/target/generated-sources/antlr3/org/apache/hadoop/hive/ql/parse/HiveLexer.java does not exist: must build /data/hiveptest/working/apache-github-source-source/ql/src/java/org/apache/hadoop/hive/ql/parse/HiveLexer.g
          org/apache/hadoop/hive/ql/parse/HiveLexer.g
          Output file /data/hiveptest/working/apache-github-source-source/ql/target/generated-sources/antlr3/org/apache/hadoop/hive/ql/parse/HiveParser.java does not exist: must build /data/hiveptest/working/apache-github-source-source/ql/src/java/org/apache/hadoop/hive/ql/parse/HiveParser.g
          org/apache/hadoop/hive/ql/parse/HiveParser.g
          Output file /data/hiveptest/working/apache-github-source-source/ql/target/generated-sources/antlr3/org/apache/hadoop/hive/ql/parse/HintParser.java does not exist: must build /data/hiveptest/working/apache-github-source-source/ql/src/java/org/apache/hadoop/hive/ql/parse/HintParser.g
          org/apache/hadoop/hive/ql/parse/HintParser.g
          Generating vector expression code
          Generating vector expression test code
          [ERROR] COMPILATION ERROR : 
          [ERROR] /data/hiveptest/working/apache-github-source-source/ql/src/test/org/apache/hadoop/hive/ql/TestTxnCommands.java:[20,28] package javafx.scene.control does not exist
          [ERROR] Failed to execute goal org.apache.maven.plugins:maven-compiler-plugin:3.6.1:testCompile (default-testCompile) on project hive-exec: Compilation failure
          [ERROR] /data/hiveptest/working/apache-github-source-source/ql/src/test/org/apache/hadoop/hive/ql/TestTxnCommands.java:[20,28] package javafx.scene.control does not exist
          [ERROR] -> [Help 1]
          [ERROR] 
          [ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
          [ERROR] Re-run Maven using the -X switch to enable full debug logging.
          [ERROR] 
          [ERROR] For more information about the errors and possible solutions, please read the following articles:
          [ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
          [ERROR] 
          [ERROR] After correcting the problems, you can resume the build with the command
          [ERROR]   mvn <goals> -rf :hive-exec
          + exit 1
          '
          

          This message is automatically generated.

          ATTACHMENT ID: 12869080 - PreCommit-HIVE-Build

          ekoifman Eugene Koifman made changes -
          Attachment HIVE-16177.11.patch [ 12869126 ]
          hiveqa Hive QA added a comment -

          Here are the results of testing the latest attachment:
          https://issues.apache.org/jira/secure/attachment/12869126/HIVE-16177.11.patch

          ERROR: -1 due to build exiting with an error

          Test results: https://builds.apache.org/job/PreCommit-HIVE-Build/5369/testReport
          Console output: https://builds.apache.org/job/PreCommit-HIVE-Build/5369/console
          Test logs: http://104.198.109.242/logs/PreCommit-HIVE-Build-5369/

          Messages:

          Executing org.apache.hive.ptest.execution.TestCheckPhase
          Executing org.apache.hive.ptest.execution.PrepPhase
          Tests exited with: NonZeroExitCodeException
          Command 'bash /data/hiveptest/working/scratch/source-prep.sh' failed with exit status 1 and output '+ date '+%Y-%m-%d %T.%3N'
          2017-05-20 16:43:10.293
          + [[ -n /usr/lib/jvm/java-8-openjdk-amd64 ]]
          + export JAVA_HOME=/usr/lib/jvm/java-8-openjdk-amd64
          + JAVA_HOME=/usr/lib/jvm/java-8-openjdk-amd64
          + export PATH=/usr/lib/jvm/java-8-openjdk-amd64/bin/:/usr/local/bin:/usr/bin:/bin:/usr/local/games:/usr/games
          + PATH=/usr/lib/jvm/java-8-openjdk-amd64/bin/:/usr/local/bin:/usr/bin:/bin:/usr/local/games:/usr/games
          + export 'ANT_OPTS=-Xmx1g -XX:MaxPermSize=256m '
          + ANT_OPTS='-Xmx1g -XX:MaxPermSize=256m '
          + export 'MAVEN_OPTS=-Xmx1g '
          + MAVEN_OPTS='-Xmx1g '
          + cd /data/hiveptest/working/
          + tee /data/hiveptest/logs/PreCommit-HIVE-Build-5369/source-prep.txt
          + [[ false == \t\r\u\e ]]
          + mkdir -p maven ivy
          + [[ git = \s\v\n ]]
          + [[ git = \g\i\t ]]
          + [[ -z master ]]
          + [[ -d apache-github-source-source ]]
          + [[ ! -d apache-github-source-source/.git ]]
          + [[ ! -d apache-github-source-source ]]
          + date '+%Y-%m-%d %T.%3N'
          2017-05-20 16:43:10.296
          + cd apache-github-source-source
          + git fetch origin
          + git reset --hard HEAD
          HEAD is now at 7429f5f HIVE-16717: Extend shared scan optimizer to handle partitions (Jesus Camacho Rodriguez, reviewed by Ashutosh Chauhan)
          + git clean -f -d
          Removing ql/src/test/queries/clientpositive/partition_pruning.q
          Removing ql/src/test/results/clientpositive/llap/partition_pruning.q.out
          Removing ql/src/test/results/clientpositive/partition_pruning.q.out
          + git checkout master
          Already on 'master'
          Your branch is up-to-date with 'origin/master'.
          + git reset --hard origin/master
          HEAD is now at 7429f5f HIVE-16717: Extend shared scan optimizer to handle partitions (Jesus Camacho Rodriguez, reviewed by Ashutosh Chauhan)
          + git merge --ff-only origin/master
          Already up-to-date.
          + date '+%Y-%m-%d %T.%3N'
          2017-05-20 16:43:11.077
          + patchCommandPath=/data/hiveptest/working/scratch/smart-apply-patch.sh
          + patchFilePath=/data/hiveptest/working/scratch/build.patch
          + [[ -f /data/hiveptest/working/scratch/build.patch ]]
          + chmod +x /data/hiveptest/working/scratch/smart-apply-patch.sh
          + /data/hiveptest/working/scratch/smart-apply-patch.sh /data/hiveptest/working/scratch/build.patch
          Going to apply patch with: patch -p0
          patching file itests/hive-unit/src/test/java/org/apache/hadoop/hive/ql/TestAcidOnTez.java
          patching file ql/src/java/org/apache/hadoop/hive/ql/exec/Utilities.java
          patching file ql/src/java/org/apache/hadoop/hive/ql/io/AcidInputFormat.java
          patching file ql/src/java/org/apache/hadoop/hive/ql/io/AcidOutputFormat.java
          patching file ql/src/java/org/apache/hadoop/hive/ql/io/AcidUtils.java
          patching file ql/src/java/org/apache/hadoop/hive/ql/io/orc/ExternalCache.java
          patching file ql/src/java/org/apache/hadoop/hive/ql/io/orc/OrcInputFormat.java
          patching file ql/src/java/org/apache/hadoop/hive/ql/io/orc/OrcRawRecordMerger.java
          patching file ql/src/java/org/apache/hadoop/hive/ql/io/orc/VectorizedOrcAcidRowBatchReader.java
          patching file ql/src/java/org/apache/hadoop/hive/ql/metadata/Hive.java
          patching file ql/src/java/org/apache/hadoop/hive/ql/txn/compactor/CompactorMR.java
          patching file ql/src/test/org/apache/hadoop/hive/ql/TestTxnCommands.java
          patching file ql/src/test/org/apache/hadoop/hive/ql/TestTxnCommands2.java
          patching file ql/src/test/org/apache/hadoop/hive/ql/io/TestAcidUtils.java
          patching file ql/src/test/org/apache/hadoop/hive/ql/io/orc/TestInputOutputFormat.java
          patching file ql/src/test/org/apache/hadoop/hive/ql/io/orc/TestOrcRawRecordMerger.java
          patching file ql/src/test/org/apache/hadoop/hive/ql/io/orc/TestVectorizedOrcAcidRowBatchReader.java
          patching file shims/0.23/src/main/java/org/apache/hadoop/hive/shims/Hadoop23Shims.java
          patching file shims/common/src/main/java/org/apache/hadoop/hive/shims/HadoopShims.java
          + [[ maven == \m\a\v\e\n ]]
          + rm -rf /data/hiveptest/working/maven/org/apache/hive
          + mvn -B clean install -DskipTests -T 4 -q -Dmaven.repo.local=/data/hiveptest/working/maven
          ANTLR Parser Generator  Version 3.5.2
          Output file /data/hiveptest/working/apache-github-source-source/metastore/target/generated-sources/antlr3/org/apache/hadoop/hive/metastore/parser/FilterParser.java does not exist: must build /data/hiveptest/working/apache-github-source-source/metastore/src/java/org/apache/hadoop/hive/metastore/parser/Filter.g
          org/apache/hadoop/hive/metastore/parser/Filter.g
          DataNucleus Enhancer (version 4.1.17) for API "JDO"
          DataNucleus Enhancer : Classpath
          >>  /usr/share/maven/boot/plexus-classworlds-2.x.jar
          ENHANCED (Persistable) : org.apache.hadoop.hive.metastore.model.MDatabase
          ENHANCED (Persistable) : org.apache.hadoop.hive.metastore.model.MFieldSchema
          ENHANCED (Persistable) : org.apache.hadoop.hive.metastore.model.MType
          ENHANCED (Persistable) : org.apache.hadoop.hive.metastore.model.MTable
          ENHANCED (Persistable) : org.apache.hadoop.hive.metastore.model.MConstraint
          ENHANCED (Persistable) : org.apache.hadoop.hive.metastore.model.MSerDeInfo
          ENHANCED (Persistable) : org.apache.hadoop.hive.metastore.model.MOrder
          ENHANCED (Persistable) : org.apache.hadoop.hive.metastore.model.MColumnDescriptor
          ENHANCED (Persistable) : org.apache.hadoop.hive.metastore.model.MStringList
          ENHANCED (Persistable) : org.apache.hadoop.hive.metastore.model.MStorageDescriptor
          ENHANCED (Persistable) : org.apache.hadoop.hive.metastore.model.MPartition
          ENHANCED (Persistable) : org.apache.hadoop.hive.metastore.model.MIndex
          ENHANCED (Persistable) : org.apache.hadoop.hive.metastore.model.MRole
          ENHANCED (Persistable) : org.apache.hadoop.hive.metastore.model.MRoleMap
          ENHANCED (Persistable) : org.apache.hadoop.hive.metastore.model.MGlobalPrivilege
          ENHANCED (Persistable) : org.apache.hadoop.hive.metastore.model.MDBPrivilege
          ENHANCED (Persistable) : org.apache.hadoop.hive.metastore.model.MTablePrivilege
          ENHANCED (Persistable) : org.apache.hadoop.hive.metastore.model.MPartitionPrivilege
          ENHANCED (Persistable) : org.apache.hadoop.hive.metastore.model.MTableColumnPrivilege
          ENHANCED (Persistable) : org.apache.hadoop.hive.metastore.model.MPartitionColumnPrivilege
          ENHANCED (Persistable) : org.apache.hadoop.hive.metastore.model.MPartitionEvent
          ENHANCED (Persistable) : org.apache.hadoop.hive.metastore.model.MMasterKey
          ENHANCED (Persistable) : org.apache.hadoop.hive.metastore.model.MDelegationToken
          ENHANCED (Persistable) : org.apache.hadoop.hive.metastore.model.MTableColumnStatistics
          ENHANCED (Persistable) : org.apache.hadoop.hive.metastore.model.MPartitionColumnStatistics
          ENHANCED (Persistable) : org.apache.hadoop.hive.metastore.model.MVersionTable
          ENHANCED (Persistable) : org.apache.hadoop.hive.metastore.model.MMetastoreDBProperties
          ENHANCED (Persistable) : org.apache.hadoop.hive.metastore.model.MResourceUri
          ENHANCED (Persistable) : org.apache.hadoop.hive.metastore.model.MFunction
          ENHANCED (Persistable) : org.apache.hadoop.hive.metastore.model.MNotificationLog
          ENHANCED (Persistable) : org.apache.hadoop.hive.metastore.model.MNotificationNextId
          DataNucleus Enhancer completed with success for 31 classes. Timings : input=169 ms, enhance=167 ms, total=336 ms. Consult the log for full details
          ANTLR Parser Generator  Version 3.5.2
          Output file /data/hiveptest/working/apache-github-source-source/ql/target/generated-sources/antlr3/org/apache/hadoop/hive/ql/parse/HiveLexer.java does not exist: must build /data/hiveptest/working/apache-github-source-source/ql/src/java/org/apache/hadoop/hive/ql/parse/HiveLexer.g
          org/apache/hadoop/hive/ql/parse/HiveLexer.g
          Output file /data/hiveptest/working/apache-github-source-source/ql/target/generated-sources/antlr3/org/apache/hadoop/hive/ql/parse/HiveParser.java does not exist: must build /data/hiveptest/working/apache-github-source-source/ql/src/java/org/apache/hadoop/hive/ql/parse/HiveParser.g
          org/apache/hadoop/hive/ql/parse/HiveParser.g
          Output file /data/hiveptest/working/apache-github-source-source/ql/target/generated-sources/antlr3/org/apache/hadoop/hive/ql/parse/HintParser.java does not exist: must build /data/hiveptest/working/apache-github-source-source/ql/src/java/org/apache/hadoop/hive/ql/parse/HintParser.g
          org/apache/hadoop/hive/ql/parse/HintParser.g
          Generating vector expression code
          Generating vector expression test code
          [ERROR] COMPILATION ERROR : 
          [ERROR] /data/hiveptest/working/apache-github-source-source/ql/src/test/org/apache/hadoop/hive/ql/TestTxnCommands.java:[20,28] package javafx.scene.control does not exist
          [ERROR] Failed to execute goal org.apache.maven.plugins:maven-compiler-plugin:3.6.1:testCompile (default-testCompile) on project hive-exec: Compilation failure
          [ERROR] /data/hiveptest/working/apache-github-source-source/ql/src/test/org/apache/hadoop/hive/ql/TestTxnCommands.java:[20,28] package javafx.scene.control does not exist
          [ERROR] -> [Help 1]
          [ERROR] 
          [ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
          [ERROR] Re-run Maven using the -X switch to enable full debug logging.
          [ERROR] 
          [ERROR] For more information about the errors and possible solutions, please read the following articles:
          [ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
          [ERROR] 
          [ERROR] After correcting the problems, you can resume the build with the command
          [ERROR]   mvn <goals> -rf :hive-exec
          + exit 1
          '
          

          This message is automatically generated.

          ATTACHMENT ID: 12869126 - PreCommit-HIVE-Build

: org.apache.hadoop.hive.metastore.model.MNotificationNextId DataNucleus Enhancer completed with success for 31 classes. Timings : input=169 ms, enhance=167 ms, total=336 ms. Consult the log for full details ANTLR Parser Generator Version 3.5.2 Output file /data/hiveptest/working/apache-github-source-source/ql/target/generated-sources/antlr3/org/apache/hadoop/hive/ql/parse/HiveLexer.java does not exist: must build /data/hiveptest/working/apache-github-source-source/ql/src/java/org/apache/hadoop/hive/ql/parse/HiveLexer.g org/apache/hadoop/hive/ql/parse/HiveLexer.g Output file /data/hiveptest/working/apache-github-source-source/ql/target/generated-sources/antlr3/org/apache/hadoop/hive/ql/parse/HiveParser.java does not exist: must build /data/hiveptest/working/apache-github-source-source/ql/src/java/org/apache/hadoop/hive/ql/parse/HiveParser.g org/apache/hadoop/hive/ql/parse/HiveParser.g Output file /data/hiveptest/working/apache-github-source-source/ql/target/generated-sources/antlr3/org/apache/hadoop/hive/ql/parse/HintParser.java does not exist: must build /data/hiveptest/working/apache-github-source-source/ql/src/java/org/apache/hadoop/hive/ql/parse/HintParser.g org/apache/hadoop/hive/ql/parse/HintParser.g Generating vector expression code Generating vector expression test code [ERROR] COMPILATION ERROR : [ERROR] /data/hiveptest/working/apache-github-source-source/ql/src/test/org/apache/hadoop/hive/ql/TestTxnCommands.java:[20,28] package javafx.scene.control does not exist [ERROR] Failed to execute goal org.apache.maven.plugins:maven-compiler-plugin:3.6.1:testCompile (default-testCompile) on project hive-exec: Compilation failure [ERROR] /data/hiveptest/working/apache-github-source-source/ql/src/test/org/apache/hadoop/hive/ql/TestTxnCommands.java:[20,28] package javafx.scene.control does not exist [ERROR] -> [Help 1] [ERROR] [ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch. 
[ERROR] Re-run Maven using the -X switch to enable full debug logging. [ERROR] [ERROR] For more information about the errors and possible solutions, please read the following articles: [ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException [ERROR] [ERROR] After correcting the problems, you can resume the build with the command [ERROR] mvn <goals> -rf :hive-exec + exit 1 ' This message is automatically generated. ATTACHMENT ID: 12869126 - PreCommit-HIVE-Build
          ekoifman Eugene Koifman made changes -
          Attachment HIVE-16177.14.patch [ 12869322 ]
          hiveqa Hive QA added a comment -

          Here are the results of testing the latest attachment:
          https://issues.apache.org/jira/secure/attachment/12869322/HIVE-16177.14.patch

          SUCCESS: +1 due to 6 test(s) being added or modified.

          ERROR: -1 due to 5 failed/errored test(s), 10746 tests executed
          Failed tests:

          org.apache.hadoop.hive.cli.TestBeeLineDriver.testCliDriver[materialized_view_create_rewrite] (batchId=236)
          org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver[insert_orig_table] (batchId=58)
          org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver[insert_values_orig_table] (batchId=54)
          org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver[insert_values_orig_table_use_metadata] (batchId=60)
          org.apache.hadoop.hive.cli.TestMiniLlapLocalCliDriver.testCliDriver[insert_orig_table] (batchId=155)
          

          Test results: https://builds.apache.org/job/PreCommit-HIVE-Build/5385/testReport
          Console output: https://builds.apache.org/job/PreCommit-HIVE-Build/5385/console
          Test logs: http://104.198.109.242/logs/PreCommit-HIVE-Build-5385/

          Messages:

          Executing org.apache.hive.ptest.execution.TestCheckPhase
          Executing org.apache.hive.ptest.execution.PrepPhase
          Executing org.apache.hive.ptest.execution.ExecutionPhase
          Executing org.apache.hive.ptest.execution.ReportingPhase
          Tests exited with: TestsFailedException: 5 tests failed
          

          This message is automatically generated.

          ATTACHMENT ID: 12869322 - PreCommit-HIVE-Build

          ekoifman Eugene Koifman made changes -
          Link This issue is related to HIVE-16732 [ HIVE-16732 ]
          ekoifman Eugene Koifman added a comment -

          patch 15 - adjusted some tests that expose HIVE-16732 through an assert in this ticket

          ekoifman Eugene Koifman made changes -
          Attachment HIVE-16177.15.patch [ 12869348 ]
          ekoifman Eugene Koifman added a comment -

          [~owen.omalley], could you review patch 15 please

          hiveqa Hive QA added a comment -

          Here are the results of testing the latest attachment:
          https://issues.apache.org/jira/secure/attachment/12869348/HIVE-16177.15.patch

          SUCCESS: +1 due to 9 test(s) being added or modified.

          ERROR: -1 due to 1 failed/errored test(s), 10749 tests executed
          Failed tests:

          org.apache.hadoop.hive.cli.TestMiniLlapLocalCliDriver.testCliDriver[vector_if_expr] (batchId=144)
          

          Test results: https://builds.apache.org/job/PreCommit-HIVE-Build/5390/testReport
          Console output: https://builds.apache.org/job/PreCommit-HIVE-Build/5390/console
          Test logs: http://104.198.109.242/logs/PreCommit-HIVE-Build-5390/

          Messages:

          Executing org.apache.hive.ptest.execution.TestCheckPhase
          Executing org.apache.hive.ptest.execution.PrepPhase
          Executing org.apache.hive.ptest.execution.ExecutionPhase
          Executing org.apache.hive.ptest.execution.ReportingPhase
          Tests exited with: TestsFailedException: 1 tests failed
          

          This message is automatically generated.

          ATTACHMENT ID: 12869348 - PreCommit-HIVE-Build

          ekoifman Eugene Koifman added a comment -

          patch 15 - failure not related

          ekoifman Eugene Koifman made changes -
          Remote Link This issue links to "Review Board (Web Link)" [ 83283 ]
          omalley Owen O'Malley added a comment -
          • You still have some trailing space issues.
          • Why are you sorting the files?
          • Your comment on HadoopShims.HdfsFileStatusWithId.compareTo() is pretty confusing. You need more context on why you are sorting them and need a particular sort order. Since there is only one implementation in the shims, it doesn't seem like it is appropriate in shims. I'd suggest making a comparator in AcidUtils.
          • Why is the totalSize going down so much in the test results?

          I'm still going through the record merger change.

          ekoifman Eugene Koifman added a comment - - edited

          The file list is sorted to make sure there is consistent ordering for both read and compact.
          Compaction needs to process the whole list of files (for a bucket) and assign ROW_IDs consistently.
          For read, OrcRawRecordMerger just has a split from some file, so I need to make sure to order them the same way so that the "offset" for the current file is computed the same way as for compaction.

          Since Hive doesn't restrict the layout of files in a table very well, sorting is the most general way to do this.
          For example, suppose some "feature" places bucket files in subdirectories; sorting the whole list of "original" files makes this work with any directory layout.

          Same goes for when we allow non-bucketed tables - files can be anywhere and they need to be "numbered" consistently. Sorting seems like the simplest way to do this.

          Putting a Comparator in AcidUtils makes sense.

          "totalSize" is probably because I run the tests on Mac. Stats often differ on Mac.

          ekoifman Eugene Koifman added a comment -

          patch 16 addresses RB comments

          ekoifman Eugene Koifman made changes -
          Attachment HIVE-16177.16.patch [ 12874345 ]
          hiveqa Hive QA added a comment -

          Here are the results of testing the latest attachment:
          https://issues.apache.org/jira/secure/attachment/12874345/HIVE-16177.16.patch

          SUCCESS: +1 due to 9 test(s) being added or modified.

          ERROR: -1 due to 16 failed/errored test(s), 10849 tests executed
          Failed tests:

          org.apache.hadoop.hive.cli.TestBeeLineDriver.testCliDriver[insert_overwrite_local_directory_1] (batchId=238)
          org.apache.hadoop.hive.cli.TestBlobstoreCliDriver.testCliDriver[rcfile_buckets] (batchId=241)
          org.apache.hadoop.hive.cli.TestBlobstoreCliDriver.testCliDriver[zero_rows_blobstore] (batchId=241)
          org.apache.hadoop.hive.cli.TestMiniLlapLocalCliDriver.testCliDriver[tez_smb_main] (batchId=150)
          org.apache.hadoop.hive.cli.TestMiniLlapLocalCliDriver.testCliDriver[vector_if_expr] (batchId=146)
          org.apache.hadoop.hive.cli.TestMiniTezCliDriver.testCliDriver[explainanalyze_2] (batchId=99)
          org.apache.hadoop.hive.cli.TestPerfCliDriver.testCliDriver[query16] (batchId=233)
          org.apache.hadoop.hive.cli.TestPerfCliDriver.testCliDriver[query23] (batchId=233)
          org.apache.hadoop.hive.cli.TestPerfCliDriver.testCliDriver[query94] (batchId=233)
          org.apache.hadoop.hive.cli.TestSparkCliDriver.testCliDriver[union24] (batchId=125)
          org.apache.hadoop.hive.ql.parse.TestReplicationScenariosAcrossInstances.testBootstrapFunctionReplication (batchId=217)
          org.apache.hadoop.hive.ql.parse.TestReplicationScenariosAcrossInstances.testCreateFunctionIncrementalReplication (batchId=217)
          org.apache.hadoop.hive.ql.parse.TestReplicationScenariosAcrossInstances.testCreateFunctionWithFunctionBinaryJarsOnHDFS (batchId=217)
          org.apache.hive.hcatalog.api.TestHCatClient.testPartitionRegistrationWithCustomSchema (batchId=178)
          org.apache.hive.hcatalog.api.TestHCatClient.testPartitionSpecRegistrationWithCustomSchema (batchId=178)
          org.apache.hive.hcatalog.api.TestHCatClient.testTableSchemaPropagation (batchId=178)
          

          Test results: https://builds.apache.org/job/PreCommit-HIVE-Build/5762/testReport
          Console output: https://builds.apache.org/job/PreCommit-HIVE-Build/5762/console
          Test logs: http://104.198.109.242/logs/PreCommit-HIVE-Build-5762/

          Messages:

          Executing org.apache.hive.ptest.execution.TestCheckPhase
          Executing org.apache.hive.ptest.execution.PrepPhase
          Executing org.apache.hive.ptest.execution.ExecutionPhase
          Executing org.apache.hive.ptest.execution.ReportingPhase
          Tests exited with: TestsFailedException: 16 tests failed
          

          This message is automatically generated.

          ATTACHMENT ID: 12874345 - PreCommit-HIVE-Build

          ekoifman Eugene Koifman made changes -
          Link This issue requires HIVE-16964 [ HIVE-16964 ]

          sershe Sergey Shelukhin added a comment -

          +1 from me, some nits about comments can be fixed on commit. Pending [~owen.omalley]'s feedback if any is left
          ekoifman Eugene Koifman made changes -
          Link This issue blocks HIVE-17069 [ HIVE-17069 ]
          ekoifman Eugene Koifman made changes -
          Attachment HIVE-16177.17.patch [ 12876519 ]
          ekoifman Eugene Koifman made changes -
          Attachment HIVE-16177.17.patch [ 12876519 ]
          ekoifman Eugene Koifman made changes -
          Attachment HIVE-16177.17.patch [ 12876521 ]
          ekoifman Eugene Koifman made changes -
          Attachment HIVE-16177.18.patch [ 12876537 ]
          ekoifman Eugene Koifman added a comment -

          patch 18 (vs 16) adds a bunch of clarifying comments and contains very minor code changes: it creates a Comparator in AcidUtils to sort "original" files per Owen's suggestion, and makes "isLastFileForThisBucket" in OrcRawRecordMerger.OriginalReaderPair() easier to follow.
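          A minimal sketch of the "isLastFileForThisBucket" idea (illustrative only; the names here are hypothetical, not the actual OrcRawRecordMerger code): once the per-bucket originals are sorted consistently, a file is the last one for its bucket exactly when no later entry in the sorted list maps to the same bucket.

          ```java
          import java.util.List;

          // Hypothetical helper: decide whether a file (by index into the
          // consistently sorted list of "original" files) is the last file
          // for its bucket, accounting for 000001_0_copy_N style names.
          public class LastFileCheck {

            // Original bucket file names start with a six-digit bucket id,
            // e.g. 000001_0 and 000001_0_copy_1 both belong to bucket 1.
            static int bucketOf(String fileName) {
              return Integer.parseInt(fileName.substring(0, 6));
            }

            static boolean isLastFileForThisBucket(List<String> sortedOriginals, int idx) {
              int bucket = bucketOf(sortedOriginals.get(idx));
              for (int i = idx + 1; i < sortedOriginals.size(); i++) {
                if (bucketOf(sortedOriginals.get(i)) == bucket) {
                  return false;  // a later copy_N file still belongs to this bucket
                }
              }
              return true;
            }
          }
          ```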

          hiveqa Hive QA added a comment -

          Here are the results of testing the latest attachment:
          https://issues.apache.org/jira/secure/attachment/12876521/HIVE-16177.17.patch

          SUCCESS: +1 due to 9 test(s) being added or modified.

          ERROR: -1 due to 11 failed/errored test(s), 10838 tests executed
          Failed tests:

          org.apache.hadoop.hive.cli.TestBeeLineDriver.testCliDriver[insert_overwrite_local_directory_1] (batchId=237)
          org.apache.hadoop.hive.cli.TestMiniLlapCliDriver.testCliDriver[llap_smb] (batchId=143)
          org.apache.hadoop.hive.cli.TestMiniLlapLocalCliDriver.testCliDriver[vector_if_expr] (batchId=145)
          org.apache.hadoop.hive.cli.TestPerfCliDriver.testCliDriver[query14] (batchId=232)
          org.apache.hadoop.hive.ql.TestTxnCommands.testNonAcidToAcidConversion01 (batchId=282)
          org.apache.hadoop.hive.ql.TestTxnCommands2.testNonAcidToAcidConversion02 (batchId=269)
          org.apache.hadoop.hive.ql.TestTxnCommands2WithSplitUpdate.testNonAcidToAcidConversion02 (batchId=280)
          org.apache.hadoop.hive.ql.TestTxnCommands2WithSplitUpdateAndVectorization.testNonAcidToAcidConversion02 (batchId=277)
          org.apache.hive.hcatalog.api.TestHCatClient.testPartitionRegistrationWithCustomSchema (batchId=177)
          org.apache.hive.hcatalog.api.TestHCatClient.testPartitionSpecRegistrationWithCustomSchema (batchId=177)
          org.apache.hive.hcatalog.api.TestHCatClient.testTableSchemaPropagation (batchId=177)
          

          Test results: https://builds.apache.org/job/PreCommit-HIVE-Build/5943/testReport
          Console output: https://builds.apache.org/job/PreCommit-HIVE-Build/5943/console
          Test logs: http://104.198.109.242/logs/PreCommit-HIVE-Build-5943/

          Messages:

          Executing org.apache.hive.ptest.execution.TestCheckPhase
          Executing org.apache.hive.ptest.execution.PrepPhase
          Executing org.apache.hive.ptest.execution.ExecutionPhase
          Executing org.apache.hive.ptest.execution.ReportingPhase
          Tests exited with: TestsFailedException: 11 tests failed
          

          This message is automatically generated.

          ATTACHMENT ID: 12876521 - PreCommit-HIVE-Build

          hiveqa Hive QA added a comment -

          Here are the results of testing the latest attachment:
          https://issues.apache.org/jira/secure/attachment/12876537/HIVE-16177.18.patch

          SUCCESS: +1 due to 9 test(s) being added or modified.

          ERROR: -1 due to 9 failed/errored test(s), 10838 tests executed
          Failed tests:

          org.apache.hadoop.hive.cli.TestBeeLineDriver.testCliDriver[create_merge_compressed] (batchId=237)
          org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver[partition_wise_fileformat6] (batchId=7)
          org.apache.hadoop.hive.cli.TestMiniLlapCliDriver.testCliDriver[llap_smb] (batchId=143)
          org.apache.hadoop.hive.cli.TestMiniLlapLocalCliDriver.testCliDriver[vector_if_expr] (batchId=145)
          org.apache.hadoop.hive.cli.TestMiniTezCliDriver.testCliDriver[explainanalyze_2] (batchId=99)
          org.apache.hadoop.hive.cli.TestPerfCliDriver.testCliDriver[query23] (batchId=232)
          org.apache.hive.hcatalog.api.TestHCatClient.testPartitionRegistrationWithCustomSchema (batchId=177)
          org.apache.hive.hcatalog.api.TestHCatClient.testPartitionSpecRegistrationWithCustomSchema (batchId=177)
          org.apache.hive.hcatalog.api.TestHCatClient.testTableSchemaPropagation (batchId=177)
          

          Test results: https://builds.apache.org/job/PreCommit-HIVE-Build/5948/testReport
          Console output: https://builds.apache.org/job/PreCommit-HIVE-Build/5948/console
          Test logs: http://104.198.109.242/logs/PreCommit-HIVE-Build-5948/

          Messages:

          Executing org.apache.hive.ptest.execution.TestCheckPhase
          Executing org.apache.hive.ptest.execution.PrepPhase
          Executing org.apache.hive.ptest.execution.ExecutionPhase
          Executing org.apache.hive.ptest.execution.ReportingPhase
          Tests exited with: TestsFailedException: 9 tests failed
          

          This message is automatically generated.

          ATTACHMENT ID: 12876537 - PreCommit-HIVE-Build

          ekoifman Eugene Koifman added a comment -

          no related failures
          committed patch 18 to master (3.0)
          thanks Sergey, Owen for the review

          ekoifman Eugene Koifman made changes -
          Attachment HIVE-16177.18-branch-2.patch [ 12876662 ]
          hiveqa Hive QA added a comment -

          Here are the results of testing the latest attachment:
          https://issues.apache.org/jira/secure/attachment/12876662/HIVE-16177.18-branch-2.patch

          SUCCESS: +1 due to 9 test(s) being added or modified.

          ERROR: -1 due to 14 failed/errored test(s), 10584 tests executed
          Failed tests:

          org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver[comments] (batchId=35)
          org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver[explaindenpendencydiffengs] (batchId=38)
          org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver[insert_values_orig_table_use_metadata] (batchId=59)
          org.apache.hadoop.hive.cli.TestMiniLlapCliDriver.testCliDriver[llap_smb] (batchId=142)
          org.apache.hadoop.hive.cli.TestMiniLlapCliDriver.testCliDriver[orc_ppd_basic] (batchId=139)
          org.apache.hadoop.hive.cli.TestSparkCliDriver.testCliDriver[explaindenpendencydiffengs] (batchId=115)
          org.apache.hadoop.hive.cli.TestSparkCliDriver.testCliDriver[vectorized_ptf] (batchId=125)
          org.apache.hadoop.hive.ql.TestTxnCommands.testNonAcidToAcidConversion01 (batchId=278)
          org.apache.hadoop.hive.ql.TestTxnCommands2.testNonAcidToAcidConversion02 (batchId=266)
          org.apache.hadoop.hive.ql.TestTxnCommands2WithSplitUpdate.testNonAcidToAcidConversion02 (batchId=276)
          org.apache.hadoop.hive.ql.TestTxnCommands2WithSplitUpdateAndVectorization.testNonAcidToAcidConversion02 (batchId=273)
          org.apache.hadoop.hive.ql.security.TestExtendedAcls.testPartition (batchId=228)
          org.apache.hadoop.hive.ql.security.TestFolderPermissions.testPartition (batchId=217)
          org.apache.hive.hcatalog.api.TestHCatClient.testTransportFailure (batchId=176)
          

          Test results: https://builds.apache.org/job/PreCommit-HIVE-Build/5962/testReport
          Console output: https://builds.apache.org/job/PreCommit-HIVE-Build/5962/console
          Test logs: http://104.198.109.242/logs/PreCommit-HIVE-Build-5962/

          Messages:

          Executing org.apache.hive.ptest.execution.TestCheckPhase
          Executing org.apache.hive.ptest.execution.PrepPhase
          Executing org.apache.hive.ptest.execution.ExecutionPhase
          Executing org.apache.hive.ptest.execution.ReportingPhase
          Tests exited with: TestsFailedException: 14 tests failed
          

          This message is automatically generated.

          ATTACHMENT ID: 12876662 - PreCommit-HIVE-Build

          hiveqa Hive QA added a comment -
          ekoifman Eugene Koifman made changes -
          Attachment HIVE-16177.19-branch-2.patch [ 12876695 ]
          ekoifman Eugene Koifman made changes -
          Attachment HIVE-16177.20-branch-2.patch [ 12876712 ]
          hiveqa Hive QA added a comment -

          Here are the results of testing the latest attachment:
          https://issues.apache.org/jira/secure/attachment/12876712/HIVE-16177.20-branch-2.patch

          SUCCESS: +1 due to 9 test(s) being added or modified.

          ERROR: -1 due to 9 failed/errored test(s), 10584 tests executed
          Failed tests:

          org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver[comments] (batchId=35)
          org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver[explaindenpendencydiffengs] (batchId=38)
          org.apache.hadoop.hive.cli.TestMiniLlapCliDriver.testCliDriver[llap_smb] (batchId=142)
          org.apache.hadoop.hive.cli.TestMiniLlapCliDriver.testCliDriver[orc_ppd_basic] (batchId=139)
          org.apache.hadoop.hive.cli.TestSparkCliDriver.testCliDriver[explaindenpendencydiffengs] (batchId=115)
          org.apache.hadoop.hive.cli.TestSparkCliDriver.testCliDriver[vectorized_ptf] (batchId=125)
          org.apache.hadoop.hive.ql.security.TestExtendedAcls.testPartition (batchId=228)
          org.apache.hadoop.hive.ql.security.TestFolderPermissions.testPartition (batchId=217)
          org.apache.hive.hcatalog.api.TestHCatClient.testTransportFailure (batchId=176)
          

          Test results: https://builds.apache.org/job/PreCommit-HIVE-Build/5971/testReport
          Console output: https://builds.apache.org/job/PreCommit-HIVE-Build/5971/console
          Test logs: http://104.198.109.242/logs/PreCommit-HIVE-Build-5971/

          Messages:

          Executing org.apache.hive.ptest.execution.TestCheckPhase
          Executing org.apache.hive.ptest.execution.PrepPhase
          Executing org.apache.hive.ptest.execution.ExecutionPhase
          Executing org.apache.hive.ptest.execution.ReportingPhase
          Tests exited with: TestsFailedException: 9 tests failed
          

          This message is automatically generated.

          ATTACHMENT ID: 12876712 - PreCommit-HIVE-Build

          ekoifman Eugene Koifman added a comment -

          HIVE-16177.20-branch-2.patch committed to branch-2 (2.x)

          ekoifman Eugene Koifman made changes -
          Fix Version/s 3.0.0 [ 12340268 ]
          Fix Version/s 2.4.0 [ 12340338 ]
          Resolution Fixed [ 1 ]
          Status Patch Available [ 10002 ] Resolved [ 5 ]
          ekoifman Eugene Koifman made changes -
          Link This issue is related to HIVE-17526 [ HIVE-17526 ]
          vgarg Vineet Garg added a comment -

          Hive 3.0.0 has been released so closing this jira.

          vgarg Vineet Garg made changes -
          Status Resolved [ 5 ] Closed [ 6 ]
          dfoulks Drew Foulks made changes -
          Workflow no-reopen-closed, patch-avail [ 13265153 ] Hive - no-reopen-closed, patch-avail [ 14131622 ]

          People

            Assignee: ekoifman Eugene Koifman
            Reporter: ekoifman Eugene Koifman
            Votes: 0