Details

    • Type: Sub-task
    • Status: Resolved
    • Priority: Major
    • Resolution: Fixed
    • Affects Version/s: None
    • Fix Version/s: 0.13.0
    • Component/s: None
    • Labels:
      None

      Description

      We need to define some new APIs for how Hive interacts with the file formats since it needs to be much richer than the current RecordReader and RecordWriter.

      1. acid-io.patch
        218 kB
        Owen O'Malley
      2. h-5317.patch
        21 kB
        Owen O'Malley
      3. h-5317.patch
        6 kB
        Owen O'Malley
      4. h-5317.patch
        3 kB
        Owen O'Malley
      5. h-6060.patch
        330 kB
        Owen O'Malley
      6. h-6060.patch
        23 kB
        Owen O'Malley
      7. HIVE-6060.patch
        372 kB
        Owen O'Malley
      8. HIVE-6060.patch
        368 kB
        Owen O'Malley
      9. HIVE-6060.patch
        340 kB
        Owen O'Malley
      10. HIVE-6060.patch
        340 kB
        Owen O'Malley
      11. HIVE-6060.patch
        339 kB
        Owen O'Malley
      12. HIVE-6060.patch
        339 kB
        Owen O'Malley
      13. HIVE-6060.patch
        330 kB
        Owen O'Malley

        Activity

        Owen O'Malley added a comment -

        Here's a first cut of what the RecordUpdater looks like.
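A minimal sketch of the shape such a RecordUpdater might take, with a toy in-memory implementation to exercise it. All method names and signatures here are illustrative assumptions for discussion, not the API in the attached patch:

```java
import java.util.ArrayList;
import java.util.List;

// Illustrative sketch only: one write path for inserts, updates, and
// deletes within a transaction. Names are assumptions, not the patch's API.
interface RecordUpdater {
  void insert(long currentTransaction, Object row);
  void update(long currentTransaction, long originalTransaction, long rowId,
              Object row);
  void delete(long currentTransaction, long originalTransaction, long rowId);
  void flush();                // make buffered records visible to readers
  void close(boolean abort);   // abort discards uncommitted output
}

// Toy in-memory implementation that records the calls it receives.
class LoggingRecordUpdater implements RecordUpdater {
  final List<String> events = new ArrayList<>();
  public void insert(long txn, Object row) { events.add("insert@" + txn); }
  public void update(long txn, long origTxn, long rowId, Object row) {
    events.add("update@" + txn + ":" + origTxn + "/" + rowId);
  }
  public void delete(long txn, long origTxn, long rowId) {
    events.add("delete@" + txn + ":" + origTxn + "/" + rowId);
  }
  public void flush() { events.add("flush"); }
  public void close(boolean abort) { events.add(abort ? "abort" : "close"); }
}
```

The key design point is that updates and deletes name the row they replace by its original transaction and row id, while new data is tagged with the current transaction.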

        Owen O'Malley added a comment -

        Row ids are not unique across buckets, so the unique identifier is the triple (transaction id, bucket id, row id). Alan suggested offline that I add the bucket id to the API so that we aren't forced to maintain the current restriction of one HDFS file per bucket. I've also added my thoughts on what the reader would look like.

        I also need to look at what the API looks like for vectorization.
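The triple above can be modeled as a small comparable value class. The class and field names below follow the comment; the actual layout in the patch may differ:

```java
// Sketch of a composite row identifier: (transaction id, bucket id, row id).
// Names are illustrative; they follow the comment above, not the patch.
class RecordIdentifier implements Comparable<RecordIdentifier> {
  final long transactionId;
  final int bucketId;
  final long rowId;

  RecordIdentifier(long transactionId, int bucketId, long rowId) {
    this.transactionId = transactionId;
    this.bucketId = bucketId;
    this.rowId = rowId;
  }

  // Order by transaction, then bucket, then row, so identifiers from the
  // same transaction and bucket stay contiguous when sorted.
  public int compareTo(RecordIdentifier o) {
    int c = Long.compare(transactionId, o.transactionId);
    if (c != 0) return c;
    c = Integer.compare(bucketId, o.bucketId);
    if (c != 0) return c;
    return Long.compare(rowId, o.rowId);
  }
}
```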

        Owen O'Malley added a comment -

        This patch updates the input/output APIs:

        • Adds AcidInputFormat and AcidOutputFormat.
        • Adds RecordIdentifier.
        • Makes OrcInputFormat and OrcOutputFormat implement AcidInputFormat and AcidOutputFormat.
        • Adds some simple stubs to OrcOutputFormat so that I can see the calls to the RecordUpdater.
        Owen O'Malley added a comment -

        More refinements.

        Owen O'Malley added a comment -

        This is still a work in progress, but it shows the path:

        • Adds AcidInputFormat and AcidOutputFormat interfaces for input/output formats that can support the acid requirements.
        • Extends OrcInputFormat and OrcOutputFormat to implement the interfaces.
        • Adds AcidUtils that provides general routines to analyze the partition directory and figure out which base and deltas to use.
        • Doesn't change the behavior of the insert commands, which will still write the traditional Hive file layout. This will change later when we add command support.
        • In getSplits, the input format will detect whether a directory uses the new or the old style layout and read it appropriately.
        • Java clients can write the new layout by using the RecordUpdater interface.
        • There are raw interfaces for the compactor to use.
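The base/delta analysis that AcidUtils performs can be illustrated with a toy model: given the entries of a partition directory, keep the newest base and only the deltas created after it. The "base_N" / "delta_min_max" naming mirrors the convention described here, but the selection logic below is a deliberately simplified sketch, not the patch's implementation:

```java
import java.util.ArrayList;
import java.util.Collections;
import java.util.List;

// Simplified illustration of base/delta selection for a partition
// directory. Directory names are "base_N" (all data up to transaction N)
// and "delta_min_max" (changes for transactions min..max).
class DirectorySplit {
  static List<String> choose(List<String> entries) {
    // Pick the newest base; everything it covers can be ignored.
    long bestBase = -1;
    for (String e : entries) {
      if (e.startsWith("base_")) {
        bestBase = Math.max(bestBase, Long.parseLong(e.substring(5)));
      }
    }
    List<String> result = new ArrayList<>();
    if (bestBase >= 0) result.add("base_" + bestBase);
    // Keep only deltas whose transactions are newer than the base.
    List<String> deltas = new ArrayList<>();
    for (String e : entries) {
      if (e.startsWith("delta_")) {
        long min = Long.parseLong(e.substring(6, e.indexOf('_', 6)));
        if (min > bestBase) deltas.add(e);
      }
    }
    Collections.sort(deltas);
    result.addAll(deltas);
    return result;
  }
}
```

A reader would then merge the base with the surviving deltas; the compactor's job is to fold old deltas back into a new base so this list stays short.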
        Owen O'Malley added a comment -

        This patch puts everything together:

        • Defines AcidInputFormat and AcidOutputFormat.
        • Extends OrcInputFormat and OrcOutputFormat to implement them.
        • Creates AcidUtils to figure out which base and deltas need to be read.
        • Provides raw interfaces that the compactor uses to re-write small files.
        • Moves ValidTxnList and ValidTxnListImpl to common where they can be used by code in mapreduce tasks and the metastore.
        • Adds an interface to Orc Writers that provides callbacks when stripes are being written.
        • Adds a method to Orc Writers that allows the client to write the current stripe to disk and write a temporary footer before the writer continues to write new stripes.
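The visibility check that ValidTxnList enables for readers can be sketched as follows. This is a simplified model (a high watermark plus a sorted list of open/aborted exceptions), not the interface as moved in the patch:

```java
import java.util.Arrays;

// Simplified model of a valid-transaction list: a transaction is readable
// if it is at or below the high watermark and not among the open/aborted
// exceptions. Names are illustrative.
class SimpleValidTxnList {
  private final long highWatermark;
  private final long[] exceptions;  // open or aborted transaction ids

  SimpleValidTxnList(long highWatermark, long[] exceptions) {
    this.highWatermark = highWatermark;
    this.exceptions = exceptions.clone();
    Arrays.sort(this.exceptions);
  }

  boolean isTxnValid(long txnId) {
    if (txnId > highWatermark) return false;            // not yet committed
    return Arrays.binarySearch(exceptions, txnId) < 0;  // not open/aborted
  }
}
```

Moving this to common matters because both mapreduce tasks (readers filtering delta records) and the metastore need to agree on the same visibility rule.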
        Owen O'Malley added a comment -

        Re-uploading for jenkins.

        Hive QA added a comment -

        Overall: -1 at least one test failed

        Here are the results of testing the latest attachment:
        https://issues.apache.org/jira/secure/attachment/12632847/HIVE-6060.patch

        ERROR: -1 due to 34 failed/errored test(s), 5383 tests executed
        Failed tests:

        org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver_char_serde
        org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver_date_serde
        org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver_exim_04_evolved_parts
        org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver_exim_11_managed_external
        org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver_exim_12_external_location
        org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver_exim_13_managed_location
        org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver_exim_14_managed_location_over_existing
        org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver_orc_create
        org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver_orc_dictionary_threshold
        org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver_orc_empty_strings
        org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver_orc_ends_with_nulls
        org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver_orc_ppd_char
        org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver_orc_ppd_date
        org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver_orc_ppd_decimal
        org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver_orc_ppd_varchar
        org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver_orc_split_elimination
        org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver_orc_vectorization_ppd
        org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver_skewjoin
        org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver_varchar_serde
        org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver_vector_decimal_mapjoin
        org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver_vector_left_outer_join
        org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver_vectorized_context
        org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver_vectorized_date_funcs
        org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver_vectorized_mapjoin
        org.apache.hcatalog.pig.TestOrcHCatLoader.testReadPartitionedBasic
        org.apache.hcatalog.pig.TestOrcHCatStorer.testStoreBasicTable
        org.apache.hcatalog.pig.TestOrcHCatStorer.testStorePartitionedTable
        org.apache.hcatalog.pig.TestOrcHCatStorer.testStoreTableMulti
        org.apache.hive.beeline.TestSchemaTool.testSchemaInit
        org.apache.hive.beeline.TestSchemaTool.testSchemaUpgrade
        org.apache.hive.hcatalog.pig.TestOrcHCatLoader.testReadPartitionedBasic
        org.apache.hive.hcatalog.pig.TestOrcHCatStorer.testStoreBasicTable
        org.apache.hive.hcatalog.pig.TestOrcHCatStorer.testStorePartitionedTable
        org.apache.hive.hcatalog.pig.TestOrcHCatStorer.testStoreTableMulti
        

        Test results: http://bigtop01.cloudera.org:8080/job/PreCommit-HIVE-Build/1636/testReport
        Console output: http://bigtop01.cloudera.org:8080/job/PreCommit-HIVE-Build/1636/console

        Messages:

        Executing org.apache.hive.ptest.execution.PrepPhase
        Executing org.apache.hive.ptest.execution.ExecutionPhase
        Executing org.apache.hive.ptest.execution.ReportingPhase
        Tests exited with: TestsFailedException: 34 tests failed
        

        This message is automatically generated.

        ATTACHMENT ID: 12632847

        Sergey Shelukhin added a comment -

        Is it possible to post an RB link?

        Owen O'Malley added a comment -

        I'm not sure why it didn't link, but here:

        https://reviews.apache.org/r/18810/diff/

        Sergey Shelukhin added a comment -

        I am doing a review slowly; I've gotten about 40% of the way through so far. Note that I am not familiar with the surrounding code, so my comments on RB are mostly low level.
        Is there a design doc somewhere that describes the changes?
        Do I understand correctly that the "originals" are files from the current table structure, and that the patch requires a filesystem structure change?

        Sergey Shelukhin added a comment -

        Left comments on RB. I only skimmed the test changes; the previous questions above still remain.

        Prasanth Jayachandran added a comment -

        Owen O'Malley, HIVE-6578 added partialscan and noscan support to the analyze statement for ORC files. When an analyze command with partialscan or noscan is executed, each partition directory is iterated, creating ORC readers for the files under each directory. Basic statistics such as number of rows, file size, and raw data size are computed by reading the stats from the ORC file footer. How do the HIVE-5317 and HIVE-6060 changes affect HIVE-6578's way of stats gathering?
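The footer-based stats gathering described above amounts to summing per-file footer statistics within each partition; the open question is which files (base, deltas, originals) should contribute. A toy model of the aggregation step, with illustrative names and no ORC dependency:

```java
import java.util.List;

// Toy model of footer-based stats gathering: each file contributes the
// counts recorded in its footer, and per-partition stats are the sums.
// Class and field names are illustrative, not Hive's.
class FooterStats {
  final long rows;
  final long rawDataSize;

  FooterStats(long rows, long rawDataSize) {
    this.rows = rows;
    this.rawDataSize = rawDataSize;
  }

  static FooterStats aggregate(List<FooterStats> files) {
    long rows = 0, size = 0;
    for (FooterStats f : files) {
      rows += f.rows;
      size += f.rawDataSize;
    }
    return new FooterStats(rows, size);
  }
}
```

With the acid layout, a simple sum over all files would double-count rows that deltas update or delete, which is presumably why the question is raised.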

        Owen O'Malley added a comment -

        Addressed Sergey's comments.

        Hive QA added a comment -

        Overall: -1 no tests executed

        Here are the results of testing the latest attachment:
        https://issues.apache.org/jira/secure/attachment/12634341/HIVE-6060.patch

        Test results: http://bigtop01.cloudera.org:8080/job/PreCommit-HIVE-Build/1761/testReport
        Console output: http://bigtop01.cloudera.org:8080/job/PreCommit-HIVE-Build/1761/console

        Messages:

        **** This message was trimmed, see log for full details ****
        Decision can match input such as "LPAREN KW_NULL AMPERSAND" using multiple alternatives: 1, 2
        
        As a result, alternative(s) 2 were disabled for that input
        warning(200): IdentifiersParser.g:68:4: 
        Decision can match input such as "LPAREN LPAREN KW_ARRAY" using multiple alternatives: 1, 2
        
        As a result, alternative(s) 2 were disabled for that input
        warning(200): IdentifiersParser.g:68:4: 
        Decision can match input such as "LPAREN KW_NOT {MINUS, PLUS, TILDE}" using multiple alternatives: 1, 2
        
        As a result, alternative(s) 2 were disabled for that input
        warning(200): IdentifiersParser.g:68:4: 
        Decision can match input such as "LPAREN KW_CASE {MINUS, PLUS, TILDE}" using multiple alternatives: 1, 2
        
        As a result, alternative(s) 2 were disabled for that input
        warning(200): IdentifiersParser.g:68:4: 
        Decision can match input such as "LPAREN LPAREN {MINUS, PLUS, TILDE}" using multiple alternatives: 1, 2
        
        As a result, alternative(s) 2 were disabled for that input
        warning(200): IdentifiersParser.g:115:5: 
        Decision can match input such as "KW_CLUSTER KW_BY LPAREN" using multiple alternatives: 1, 2
        
        As a result, alternative(s) 2 were disabled for that input
        warning(200): IdentifiersParser.g:127:5: 
        Decision can match input such as "KW_PARTITION KW_BY LPAREN" using multiple alternatives: 1, 2
        
        As a result, alternative(s) 2 were disabled for that input
        warning(200): IdentifiersParser.g:138:5: 
        Decision can match input such as "KW_DISTRIBUTE KW_BY LPAREN" using multiple alternatives: 1, 2
        
        As a result, alternative(s) 2 were disabled for that input
        warning(200): IdentifiersParser.g:149:5: 
        Decision can match input such as "KW_SORT KW_BY LPAREN" using multiple alternatives: 1, 2
        
        As a result, alternative(s) 2 were disabled for that input
        warning(200): IdentifiersParser.g:166:7: 
        Decision can match input such as "STAR" using multiple alternatives: 1, 2
        
        As a result, alternative(s) 2 were disabled for that input
        warning(200): IdentifiersParser.g:179:5: 
        Decision can match input such as "KW_UNIONTYPE" using multiple alternatives: 5, 6
        
        As a result, alternative(s) 6 were disabled for that input
        warning(200): IdentifiersParser.g:179:5: 
        Decision can match input such as "KW_STRUCT" using multiple alternatives: 4, 6
        
        As a result, alternative(s) 6 were disabled for that input
        warning(200): IdentifiersParser.g:179:5: 
        Decision can match input such as "KW_ARRAY" using multiple alternatives: 2, 6
        
        As a result, alternative(s) 6 were disabled for that input
        warning(200): IdentifiersParser.g:261:5: 
        Decision can match input such as "KW_DATE StringLiteral" using multiple alternatives: 2, 3
        
        As a result, alternative(s) 3 were disabled for that input
        warning(200): IdentifiersParser.g:261:5: 
        Decision can match input such as "KW_NULL" using multiple alternatives: 1, 8
        
        As a result, alternative(s) 8 were disabled for that input
        warning(200): IdentifiersParser.g:261:5: 
        Decision can match input such as "KW_TRUE" using multiple alternatives: 3, 8
        
        As a result, alternative(s) 8 were disabled for that input
        warning(200): IdentifiersParser.g:261:5: 
        Decision can match input such as "KW_FALSE" using multiple alternatives: 3, 8
        
        As a result, alternative(s) 8 were disabled for that input
        warning(200): IdentifiersParser.g:393:5: 
        Decision can match input such as "{KW_LIKE, KW_REGEXP, KW_RLIKE} KW_GROUP KW_BY" using multiple alternatives: 2, 9
        
        As a result, alternative(s) 9 were disabled for that input
        warning(200): IdentifiersParser.g:393:5: 
        Decision can match input such as "KW_BETWEEN KW_MAP LPAREN" using multiple alternatives: 8, 9
        
        As a result, alternative(s) 9 were disabled for that input
        warning(200): IdentifiersParser.g:393:5: 
        Decision can match input such as "{KW_LIKE, KW_REGEXP, KW_RLIKE} KW_DISTRIBUTE KW_BY" using multiple alternatives: 2, 9
        
        As a result, alternative(s) 9 were disabled for that input
        warning(200): IdentifiersParser.g:393:5: 
        Decision can match input such as "{KW_LIKE, KW_REGEXP, KW_RLIKE} KW_ORDER KW_BY" using multiple alternatives: 2, 9
        
        As a result, alternative(s) 9 were disabled for that input
        warning(200): IdentifiersParser.g:393:5: 
        Decision can match input such as "{KW_LIKE, KW_REGEXP, KW_RLIKE} KW_INSERT KW_OVERWRITE" using multiple alternatives: 2, 9
        
        As a result, alternative(s) 9 were disabled for that input
        warning(200): IdentifiersParser.g:393:5: 
        Decision can match input such as "{KW_LIKE, KW_REGEXP, KW_RLIKE} KW_SORT KW_BY" using multiple alternatives: 2, 9
        
        As a result, alternative(s) 9 were disabled for that input
        warning(200): IdentifiersParser.g:393:5: 
        Decision can match input such as "{KW_LIKE, KW_REGEXP, KW_RLIKE} KW_INSERT KW_INTO" using multiple alternatives: 2, 9
        
        As a result, alternative(s) 9 were disabled for that input
        warning(200): IdentifiersParser.g:393:5: 
        Decision can match input such as "{KW_LIKE, KW_REGEXP, KW_RLIKE} KW_MAP LPAREN" using multiple alternatives: 2, 9
        
        As a result, alternative(s) 9 were disabled for that input
        warning(200): IdentifiersParser.g:393:5: 
        Decision can match input such as "{KW_LIKE, KW_REGEXP, KW_RLIKE} KW_LATERAL KW_VIEW" using multiple alternatives: 2, 9
        
        As a result, alternative(s) 9 were disabled for that input
        warning(200): IdentifiersParser.g:393:5: 
        Decision can match input such as "{KW_LIKE, KW_REGEXP, KW_RLIKE} KW_CLUSTER KW_BY" using multiple alternatives: 2, 9
        
        As a result, alternative(s) 9 were disabled for that input
        warning(200): IdentifiersParser.g:518:5: 
        Decision can match input such as "{AMPERSAND..BITWISEXOR, DIV..DIVIDE, EQUAL..EQUAL_NS, GREATERTHAN..GREATERTHANOREQUALTO, KW_AND, KW_ARRAY, KW_BETWEEN..KW_BOOLEAN, KW_CASE, KW_DOUBLE, KW_FLOAT, KW_IF, KW_IN, KW_INT, KW_LIKE, KW_MAP, KW_NOT, KW_OR, KW_REGEXP, KW_RLIKE, KW_SMALLINT, KW_STRING..KW_STRUCT, KW_TINYINT, KW_UNIONTYPE, KW_WHEN, LESSTHAN..LESSTHANOREQUALTO, MINUS..NOTEQUAL, PLUS, STAR, TILDE}" using multiple alternatives: 1, 3
        
        As a result, alternative(s) 3 were disabled for that input
        [INFO] 
        [INFO] --- maven-remote-resources-plugin:1.5:process (default) @ hive-exec ---
        [INFO] 
        [INFO] --- maven-resources-plugin:2.6:resources (default-resources) @ hive-exec ---
        [INFO] Using 'UTF-8' encoding to copy filtered resources.
        [INFO] Copying 1 resource
        [INFO] Copying 3 resources
        [INFO] 
        [INFO] --- maven-antrun-plugin:1.7:run (define-classpath) @ hive-exec ---
        [INFO] Executing tasks
        
        main:
        [INFO] Executed tasks
        [INFO] 
        [INFO] --- maven-compiler-plugin:3.1:compile (default-compile) @ hive-exec ---
        [INFO] Compiling 1654 source files to /data/hive-ptest/working/apache-svn-trunk-source/ql/target/classes
        [INFO] -------------------------------------------------------------
        [WARNING] COMPILATION WARNING : 
        [INFO] -------------------------------------------------------------
        [WARNING] Note: Some input files use or override a deprecated API.
        [WARNING] Note: Recompile with -Xlint:deprecation for details.
        [WARNING] Note: Some input files use unchecked or unsafe operations.
        [WARNING] Note: Recompile with -Xlint:unchecked for details.
        [INFO] 4 warnings 
        [INFO] -------------------------------------------------------------
        [INFO] -------------------------------------------------------------
        [ERROR] COMPILATION ERROR : 
        [INFO] -------------------------------------------------------------
        [ERROR] /data/hive-ptest/working/apache-svn-trunk-source/ql/src/java/org/apache/hadoop/hive/ql/io/HiveInputFormat.java:[285,13] cannot find symbol
        symbol  : variable ShimLoader
        location: class org.apache.hadoop.hive.ql.io.HiveInputFormat<K,V>
        [INFO] 1 error
        [INFO] -------------------------------------------------------------
        [INFO] ------------------------------------------------------------------------
        [INFO] Reactor Summary:
        [INFO] 
        [INFO] Hive .............................................. SUCCESS [10.764s]
        [INFO] Hive Ant Utilities ................................ SUCCESS [7.949s]
        [INFO] Hive Shims Common ................................. SUCCESS [4.325s]
        [INFO] Hive Shims 0.20 ................................... SUCCESS [3.010s]
        [INFO] Hive Shims Secure Common .......................... SUCCESS [4.790s]
        [INFO] Hive Shims 0.20S .................................. SUCCESS [3.355s]
        [INFO] Hive Shims 0.23 ................................... SUCCESS [8.466s]
        [INFO] Hive Shims ........................................ SUCCESS [0.964s]
        [INFO] Hive Common ....................................... SUCCESS [34.341s]
        [INFO] Hive Serde ........................................ SUCCESS [12.707s]
        [INFO] Hive Metastore .................................... SUCCESS [33.360s]
        [INFO] Hive Query Language ............................... FAILURE [51.521s]
        [INFO] Hive Service ...................................... SKIPPED
        [INFO] Hive JDBC ......................................... SKIPPED
        [INFO] Hive Beeline ...................................... SKIPPED
        [INFO] Hive CLI .......................................... SKIPPED
        [INFO] Hive Contrib ...................................... SKIPPED
        [INFO] Hive HBase Handler ................................ SKIPPED
        [INFO] Hive HCatalog ..................................... SKIPPED
        [INFO] Hive HCatalog Core ................................ SKIPPED
        [INFO] Hive HCatalog Pig Adapter ......................... SKIPPED
        [INFO] Hive HCatalog Server Extensions ................... SKIPPED
        [INFO] Hive HCatalog Webhcat Java Client ................. SKIPPED
        [INFO] Hive HCatalog Webhcat ............................. SKIPPED
        [INFO] Hive HCatalog HBase Storage Handler ............... SKIPPED
        [INFO] Hive HWI .......................................... SKIPPED
        [INFO] Hive ODBC ......................................... SKIPPED
        [INFO] Hive Shims Aggregator ............................. SKIPPED
        [INFO] Hive TestUtils .................................... SKIPPED
        [INFO] Hive Packaging .................................... SKIPPED
        [INFO] ------------------------------------------------------------------------
        [INFO] BUILD FAILURE
        [INFO] ------------------------------------------------------------------------
        [INFO] Total time: 3:00.452s
        [INFO] Finished at: Thu Mar 13 03:15:35 EDT 2014
        [INFO] Final Memory: 72M/496M
        [INFO] ------------------------------------------------------------------------
        [ERROR] Failed to execute goal org.apache.maven.plugins:maven-compiler-plugin:3.1:compile (default-compile) on project hive-exec: Compilation failure
        [ERROR] /data/hive-ptest/working/apache-svn-trunk-source/ql/src/java/org/apache/hadoop/hive/ql/io/HiveInputFormat.java:[285,13] cannot find symbol
        [ERROR] symbol  : variable ShimLoader
        [ERROR] location: class org.apache.hadoop.hive.ql.io.HiveInputFormat<K,V>
        [ERROR] -> [Help 1]
        [ERROR] 
        [ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
        [ERROR] Re-run Maven using the -X switch to enable full debug logging.
        [ERROR] 
        [ERROR] For more information about the errors and possible solutions, please read the following articles:
        [ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
        [ERROR] 
        [ERROR] After correcting the problems, you can resume the build with the command
        [ERROR]   mvn <goals> -rf :hive-exec
        + exit 1
        '
        

        This message is automatically generated.

        ATTACHMENT ID: 12634341

KW_IF, KW_IN, KW_INT, KW_LIKE, KW_MAP, KW_NOT, KW_OR, KW_REGEXP, KW_RLIKE, KW_SMALLINT, KW_STRING..KW_STRUCT, KW_TINYINT, KW_UNIONTYPE, KW_WHEN, LESSTHAN..LESSTHANOREQUALTO, MINUS..NOTEQUAL, PLUS, STAR, TILDE}" using multiple alternatives: 1, 3 As a result, alternative(s) 3 were disabled for that input [INFO] [INFO] --- maven-remote-resources-plugin:1.5:process (default) @ hive-exec --- [INFO] [INFO] --- maven-resources-plugin:2.6:resources (default-resources) @ hive-exec --- [INFO] Using 'UTF-8' encoding to copy filtered resources. [INFO] Copying 1 resource [INFO] Copying 3 resources [INFO] [INFO] --- maven-antrun-plugin:1.7:run (define-classpath) @ hive-exec --- [INFO] Executing tasks main: [INFO] Executed tasks [INFO] [INFO] --- maven-compiler-plugin:3.1:compile (default-compile) @ hive-exec --- [INFO] Compiling 1654 source files to /data/hive-ptest/working/apache-svn-trunk-source/ql/target/classes [INFO] ------------------------------------------------------------- [WARNING] COMPILATION WARNING : [INFO] ------------------------------------------------------------- [WARNING] Note: Some input files use or override a deprecated API. [WARNING] Note: Recompile with -Xlint:deprecation for details. [WARNING] Note: Some input files use unchecked or unsafe operations. [WARNING] Note: Recompile with -Xlint:unchecked for details. 
[INFO] 4 warnings [INFO] ------------------------------------------------------------- [INFO] ------------------------------------------------------------- [ERROR] COMPILATION ERROR : [INFO] ------------------------------------------------------------- [ERROR] /data/hive-ptest/working/apache-svn-trunk-source/ql/src/java/org/apache/hadoop/hive/ql/io/HiveInputFormat.java:[285,13] cannot find symbol symbol : variable ShimLoader location: class org.apache.hadoop.hive.ql.io.HiveInputFormat<K,V> [INFO] 1 error [INFO] ------------------------------------------------------------- [INFO] ------------------------------------------------------------------------ [INFO] Reactor Summary: [INFO] [INFO] Hive .............................................. SUCCESS [10.764s] [INFO] Hive Ant Utilities ................................ SUCCESS [7.949s] [INFO] Hive Shims Common ................................. SUCCESS [4.325s] [INFO] Hive Shims 0.20 ................................... SUCCESS [3.010s] [INFO] Hive Shims Secure Common .......................... SUCCESS [4.790s] [INFO] Hive Shims 0.20S .................................. SUCCESS [3.355s] [INFO] Hive Shims 0.23 ................................... SUCCESS [8.466s] [INFO] Hive Shims ........................................ SUCCESS [0.964s] [INFO] Hive Common ....................................... SUCCESS [34.341s] [INFO] Hive Serde ........................................ SUCCESS [12.707s] [INFO] Hive Metastore .................................... SUCCESS [33.360s] [INFO] Hive Query Language ............................... FAILURE [51.521s] [INFO] Hive Service ...................................... SKIPPED [INFO] Hive JDBC ......................................... SKIPPED [INFO] Hive Beeline ...................................... SKIPPED [INFO] Hive CLI .......................................... SKIPPED [INFO] Hive Contrib ...................................... 
SKIPPED [INFO] Hive HBase Handler ................................ SKIPPED [INFO] Hive HCatalog ..................................... SKIPPED [INFO] Hive HCatalog Core ................................ SKIPPED [INFO] Hive HCatalog Pig Adapter ......................... SKIPPED [INFO] Hive HCatalog Server Extensions ................... SKIPPED [INFO] Hive HCatalog Webhcat Java Client ................. SKIPPED [INFO] Hive HCatalog Webhcat ............................. SKIPPED [INFO] Hive HCatalog HBase Storage Handler ............... SKIPPED [INFO] Hive HWI .......................................... SKIPPED [INFO] Hive ODBC ......................................... SKIPPED [INFO] Hive Shims Aggregator ............................. SKIPPED [INFO] Hive TestUtils .................................... SKIPPED [INFO] Hive Packaging .................................... SKIPPED [INFO] ------------------------------------------------------------------------ [INFO] BUILD FAILURE [INFO] ------------------------------------------------------------------------ [INFO] Total time: 3:00.452s [INFO] Finished at: Thu Mar 13 03:15:35 EDT 2014 [INFO] Final Memory: 72M/496M [INFO] ------------------------------------------------------------------------ [ERROR] Failed to execute goal org.apache.maven.plugins:maven-compiler-plugin:3.1:compile (default-compile) on project hive-exec: Compilation failure [ERROR] /data/hive-ptest/working/apache-svn-trunk-source/ql/src/java/org/apache/hadoop/hive/ql/io/HiveInputFormat.java:[285,13] cannot find symbol [ERROR] symbol : variable ShimLoader [ERROR] location: class org.apache.hadoop.hive.ql.io.HiveInputFormat<K,V> [ERROR] -> [Help 1] [ERROR] [ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch. [ERROR] Re-run Maven using the -X switch to enable full debug logging. 
[ERROR] [ERROR] For more information about the errors and possible solutions, please read the following articles: [ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException [ERROR] [ERROR] After correcting the problems, you can resume the build with the command [ERROR] mvn <goals> -rf :hive-exec + exit 1 ' This message is automatically generated. ATTACHMENT ID: 12634341
        Owen O'Malley added a comment -

        Fix compilation conflict with HIVE-6572 that just got committed.

        Owen O'Malley added a comment -

        Fixes an NPE when there are no rows in the input.

        Owen O'Malley added a comment -

        Re-upload for Jenkins.

        Hive QA added a comment -

        Overall: +1 all checks pass

        Here are the results of testing the latest attachment:
        https://issues.apache.org/jira/secure/attachment/12634577/HIVE-6060.patch

        SUCCESS: +1 5392 tests passed

        Test results: http://bigtop01.cloudera.org:8080/job/PreCommit-HIVE-Build/1775/testReport
        Console output: http://bigtop01.cloudera.org:8080/job/PreCommit-HIVE-Build/1775/console

        Messages:

        Executing org.apache.hive.ptest.execution.PrepPhase
        Executing org.apache.hive.ptest.execution.ExecutionPhase
        Executing org.apache.hive.ptest.execution.ReportingPhase
        

        This message is automatically generated.

        ATTACHMENT ID: 12634577

        Sergey Shelukhin added a comment -

        Some minor comments on RB. My main concern is that the feature is not well integrated. Vectorized code is not supported; will it fall back now, error out, or produce incorrect results? From the discussion I understood it produces incorrect results. Is it possible for it to fall back to non-vectorized?
        The other thing is that there's no way to go back. I'm not sure how bad that is... if it can only be used in narrow, explicit scenarios, I guess that should be OK.

        Sergey Shelukhin added a comment -

        A JIRA is probably needed for the future conversion tool, which would be necessary for wider usage.

        Hive QA added a comment -

        Overall: -1 no tests executed

        Here are the results of testing the latest attachment:
        https://issues.apache.org/jira/secure/attachment/12634783/HIVE-6060.patch

        Test results: http://bigtop01.cloudera.org:8080/job/PreCommit-HIVE-Build/1793/testReport
        Console output: http://bigtop01.cloudera.org:8080/job/PreCommit-HIVE-Build/1793/console

        Messages:

        **** This message was trimmed, see log for full details ****
        [INFO] Using 'UTF-8' encoding to copy filtered resources.
        [INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-trunk-source/hwi/src/test/resources
        [INFO] Copying 3 resources
        [INFO] 
        [INFO] --- maven-antrun-plugin:1.7:run (setup-test-dirs) @ hive-hwi ---
        [INFO] Executing tasks
        
        main:
            [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/hwi/target/tmp
            [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/hwi/target/warehouse
            [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/hwi/target/tmp/conf
             [copy] Copying 5 files to /data/hive-ptest/working/apache-svn-trunk-source/hwi/target/tmp/conf
        [INFO] Executed tasks
        [INFO] 
        [INFO] --- maven-compiler-plugin:3.1:testCompile (default-testCompile) @ hive-hwi ---
        [INFO] Compiling 2 source files to /data/hive-ptest/working/apache-svn-trunk-source/hwi/target/test-classes
        [INFO] 
        [INFO] --- maven-surefire-plugin:2.16:test (default-test) @ hive-hwi ---
        [INFO] Tests are skipped.
        [INFO] 
        [INFO] --- maven-jar-plugin:2.2:jar (default-jar) @ hive-hwi ---
        [INFO] Building jar: /data/hive-ptest/working/apache-svn-trunk-source/hwi/target/hive-hwi-0.14.0-SNAPSHOT.jar
        [INFO] 
        [INFO] --- maven-site-plugin:3.3:attach-descriptor (attach-descriptor) @ hive-hwi ---
        [INFO] 
        [INFO] --- maven-install-plugin:2.4:install (default-install) @ hive-hwi ---
        [INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/hwi/target/hive-hwi-0.14.0-SNAPSHOT.jar to /data/hive-ptest/working/maven/org/apache/hive/hive-hwi/0.14.0-SNAPSHOT/hive-hwi-0.14.0-SNAPSHOT.jar
        [INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/hwi/pom.xml to /data/hive-ptest/working/maven/org/apache/hive/hive-hwi/0.14.0-SNAPSHOT/hive-hwi-0.14.0-SNAPSHOT.pom
        [INFO]                                                                         
        [INFO] ------------------------------------------------------------------------
        [INFO] Building Hive ODBC 0.14.0-SNAPSHOT
        [INFO] ------------------------------------------------------------------------
        [INFO] 
        [INFO] --- maven-clean-plugin:2.5:clean (default-clean) @ hive-odbc ---
        [INFO] Deleting /data/hive-ptest/working/apache-svn-trunk-source/odbc (includes = [datanucleus.log, derby.log], excludes = [])
        [INFO] 
        [INFO] --- maven-remote-resources-plugin:1.5:process (default) @ hive-odbc ---
        [INFO] 
        [INFO] --- maven-antrun-plugin:1.7:run (define-classpath) @ hive-odbc ---
        [INFO] Executing tasks
        
        main:
        [INFO] Executed tasks
        [INFO] 
        [INFO] --- maven-antrun-plugin:1.7:run (setup-test-dirs) @ hive-odbc ---
        [INFO] Executing tasks
        
        main:
            [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/odbc/target/tmp
            [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/odbc/target/warehouse
            [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/odbc/target/tmp/conf
             [copy] Copying 5 files to /data/hive-ptest/working/apache-svn-trunk-source/odbc/target/tmp/conf
        [INFO] Executed tasks
        [INFO] 
        [INFO] --- maven-site-plugin:3.3:attach-descriptor (attach-descriptor) @ hive-odbc ---
        [INFO] 
        [INFO] --- maven-install-plugin:2.4:install (default-install) @ hive-odbc ---
        [INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/odbc/pom.xml to /data/hive-ptest/working/maven/org/apache/hive/hive-odbc/0.14.0-SNAPSHOT/hive-odbc-0.14.0-SNAPSHOT.pom
        [INFO]                                                                         
        [INFO] ------------------------------------------------------------------------
        [INFO] Building Hive Shims Aggregator 0.14.0-SNAPSHOT
        [INFO] ------------------------------------------------------------------------
        [INFO] 
        [INFO] --- maven-clean-plugin:2.5:clean (default-clean) @ hive-shims-aggregator ---
        [INFO] Deleting /data/hive-ptest/working/apache-svn-trunk-source/shims (includes = [datanucleus.log, derby.log], excludes = [])
        [INFO] 
        [INFO] --- maven-remote-resources-plugin:1.5:process (default) @ hive-shims-aggregator ---
        [INFO] 
        [INFO] --- maven-antrun-plugin:1.7:run (define-classpath) @ hive-shims-aggregator ---
        [INFO] Executing tasks
        
        main:
        [INFO] Executed tasks
        [INFO] 
        [INFO] --- maven-antrun-plugin:1.7:run (setup-test-dirs) @ hive-shims-aggregator ---
        [INFO] Executing tasks
        
        main:
            [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/shims/target/tmp
            [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/shims/target/warehouse
            [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/shims/target/tmp/conf
             [copy] Copying 5 files to /data/hive-ptest/working/apache-svn-trunk-source/shims/target/tmp/conf
        [INFO] Executed tasks
        [INFO] 
        [INFO] --- maven-site-plugin:3.3:attach-descriptor (attach-descriptor) @ hive-shims-aggregator ---
        [INFO] 
        [INFO] --- maven-install-plugin:2.4:install (default-install) @ hive-shims-aggregator ---
        [INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/shims/pom.xml to /data/hive-ptest/working/maven/org/apache/hive/hive-shims-aggregator/0.14.0-SNAPSHOT/hive-shims-aggregator-0.14.0-SNAPSHOT.pom
        [INFO]                                                                         
        [INFO] ------------------------------------------------------------------------
        [INFO] Building Hive TestUtils 0.14.0-SNAPSHOT
        [INFO] ------------------------------------------------------------------------
        [INFO] 
        [INFO] --- maven-clean-plugin:2.5:clean (default-clean) @ hive-testutils ---
        [INFO] Deleting /data/hive-ptest/working/apache-svn-trunk-source/testutils (includes = [datanucleus.log, derby.log], excludes = [])
        [INFO] 
        [INFO] --- maven-remote-resources-plugin:1.5:process (default) @ hive-testutils ---
        [INFO] 
        [INFO] --- maven-resources-plugin:2.6:resources (default-resources) @ hive-testutils ---
        [INFO] Using 'UTF-8' encoding to copy filtered resources.
        [INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-trunk-source/testutils/src/main/resources
        [INFO] Copying 3 resources
        [INFO] 
        [INFO] --- maven-antrun-plugin:1.7:run (define-classpath) @ hive-testutils ---
        [INFO] Executing tasks
        
        main:
        [INFO] Executed tasks
        [INFO] 
        [INFO] --- maven-compiler-plugin:3.1:compile (default-compile) @ hive-testutils ---
        [INFO] Compiling 2 source files to /data/hive-ptest/working/apache-svn-trunk-source/testutils/target/classes
        [INFO] 
        [INFO] --- maven-resources-plugin:2.6:testResources (default-testResources) @ hive-testutils ---
        [INFO] Using 'UTF-8' encoding to copy filtered resources.
        [INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-trunk-source/testutils/src/test/resources
        [INFO] Copying 3 resources
        [INFO] 
        [INFO] --- maven-antrun-plugin:1.7:run (setup-test-dirs) @ hive-testutils ---
        [INFO] Executing tasks
        
        main:
            [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/testutils/target/tmp
            [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/testutils/target/warehouse
            [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/testutils/target/tmp/conf
             [copy] Copying 5 files to /data/hive-ptest/working/apache-svn-trunk-source/testutils/target/tmp/conf
        [INFO] Executed tasks
        [INFO] 
        [INFO] --- maven-compiler-plugin:3.1:testCompile (default-testCompile) @ hive-testutils ---
        [INFO] No sources to compile
        [INFO] 
        [INFO] --- maven-surefire-plugin:2.16:test (default-test) @ hive-testutils ---
        [INFO] Tests are skipped.
        [INFO] 
        [INFO] --- maven-jar-plugin:2.2:jar (default-jar) @ hive-testutils ---
        [INFO] Building jar: /data/hive-ptest/working/apache-svn-trunk-source/testutils/target/hive-testutils-0.14.0-SNAPSHOT.jar
        [INFO] 
        [INFO] --- maven-site-plugin:3.3:attach-descriptor (attach-descriptor) @ hive-testutils ---
        [INFO] 
        [INFO] --- maven-install-plugin:2.4:install (default-install) @ hive-testutils ---
        [INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/testutils/target/hive-testutils-0.14.0-SNAPSHOT.jar to /data/hive-ptest/working/maven/org/apache/hive/hive-testutils/0.14.0-SNAPSHOT/hive-testutils-0.14.0-SNAPSHOT.jar
        [INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/testutils/pom.xml to /data/hive-ptest/working/maven/org/apache/hive/hive-testutils/0.14.0-SNAPSHOT/hive-testutils-0.14.0-SNAPSHOT.pom
        [INFO]                                                                         
        [INFO] ------------------------------------------------------------------------
        [INFO] Building Hive Packaging 0.14.0-SNAPSHOT
        [INFO] ------------------------------------------------------------------------
        Downloading: http://repository.apache.org/snapshots/org/apache/hive/hcatalog/hive-hcatalog-hbase-storage-handler/0.14.0-SNAPSHOT/maven-metadata.xml
        Downloading: http://repository.apache.org/snapshots/org/apache/hive/hcatalog/hive-hcatalog-hbase-storage-handler/0.14.0-SNAPSHOT/hive-hcatalog-hbase-storage-handler-0.14.0-SNAPSHOT.pom
        [WARNING] The POM for org.apache.hive.hcatalog:hive-hcatalog-hbase-storage-handler:jar:0.14.0-SNAPSHOT is missing, no dependency information available
        Downloading: http://repository.apache.org/snapshots/org/apache/hive/hcatalog/hive-hcatalog-hbase-storage-handler/0.14.0-SNAPSHOT/hive-hcatalog-hbase-storage-handler-0.14.0-SNAPSHOT.jar
        [INFO] ------------------------------------------------------------------------
        [INFO] Reactor Summary:
        [INFO] 
        [INFO] Hive .............................................. SUCCESS [8.738s]
        [INFO] Hive Ant Utilities ................................ SUCCESS [5.485s]
        [INFO] Hive Shims Common ................................. SUCCESS [3.616s]
        [INFO] Hive Shims 0.20 ................................... SUCCESS [2.557s]
        [INFO] Hive Shims Secure Common .......................... SUCCESS [4.265s]
        [INFO] Hive Shims 0.20S .................................. SUCCESS [2.994s]
        [INFO] Hive Shims 0.23 ................................... SUCCESS [7.812s]
        [INFO] Hive Shims ........................................ SUCCESS [0.890s]
        [INFO] Hive Common ....................................... SUCCESS [6.844s]
        [INFO] Hive Serde ........................................ SUCCESS [12.393s]
        [INFO] Hive Metastore .................................... SUCCESS [32.374s]
        [INFO] Hive Query Language ............................... SUCCESS [1:29.604s]
        [INFO] Hive Service ...................................... SUCCESS [9.606s]
        [INFO] Hive JDBC ......................................... SUCCESS [3.045s]
        [INFO] Hive Beeline ...................................... SUCCESS [3.072s]
        [INFO] Hive CLI .......................................... SUCCESS [1.919s]
        [INFO] Hive Contrib ...................................... SUCCESS [2.425s]
        [INFO] Hive HBase Handler ................................ SUCCESS [2.796s]
        [INFO] Hive HCatalog ..................................... SUCCESS [0.656s]
        [INFO] Hive HCatalog Core ................................ SUCCESS [2.371s]
        [INFO] Hive HCatalog Pig Adapter ......................... SUCCESS [2.297s]
        [INFO] Hive HCatalog Server Extensions ................... SUCCESS [2.005s]
        [INFO] Hive HCatalog Webhcat Java Client ................. SUCCESS [1.515s]
        [INFO] Hive HCatalog Webhcat ............................. SUCCESS [10.225s]
        [INFO] Hive HWI .......................................... SUCCESS [1.298s]
        [INFO] Hive ODBC ......................................... SUCCESS [0.746s]
        [INFO] Hive Shims Aggregator ............................. SUCCESS [0.257s]
        [INFO] Hive TestUtils .................................... SUCCESS [0.658s]
        [INFO] Hive Packaging .................................... FAILURE [0.809s]
        [INFO] ------------------------------------------------------------------------
        [INFO] BUILD FAILURE
        [INFO] ------------------------------------------------------------------------
        [INFO] Total time: 3:47.992s
        [INFO] Finished at: Sat Mar 15 03:16:53 EDT 2014
        [INFO] Final Memory: 74M/520M
        [INFO] ------------------------------------------------------------------------
        [ERROR] Failed to execute goal on project hive-packaging: Could not resolve dependencies for project org.apache.hive:hive-packaging:pom:0.14.0-SNAPSHOT: Could not find artifact org.apache.hive.hcatalog:hive-hcatalog-hbase-storage-handler:jar:0.14.0-SNAPSHOT in apache.snapshots (http://repository.apache.org/snapshots) -> [Help 1]
        [ERROR] 
        [ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
        [ERROR] Re-run Maven using the -X switch to enable full debug logging.
        [ERROR] 
        [ERROR] For more information about the errors and possible solutions, please read the following articles:
        [ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/DependencyResolutionException
        [ERROR] 
        [ERROR] After correcting the problems, you can resume the build with the command
        [ERROR]   mvn <goals> -rf :hive-packaging
        + exit 1
        '
        

        This message is automatically generated.

        ATTACHMENT ID: 12634783

SUCCESS [1.919s] [INFO] Hive Contrib ...................................... SUCCESS [2.425s] [INFO] Hive HBase Handler ................................ SUCCESS [2.796s] [INFO] Hive HCatalog ..................................... SUCCESS [0.656s] [INFO] Hive HCatalog Core ................................ SUCCESS [2.371s] [INFO] Hive HCatalog Pig Adapter ......................... SUCCESS [2.297s] [INFO] Hive HCatalog Server Extensions ................... SUCCESS [2.005s] [INFO] Hive HCatalog Webhcat Java Client ................. SUCCESS [1.515s] [INFO] Hive HCatalog Webhcat ............................. SUCCESS [10.225s] [INFO] Hive HWI .......................................... SUCCESS [1.298s] [INFO] Hive ODBC ......................................... SUCCESS [0.746s] [INFO] Hive Shims Aggregator ............................. SUCCESS [0.257s] [INFO] Hive TestUtils .................................... SUCCESS [0.658s] [INFO] Hive Packaging .................................... FAILURE [0.809s] [INFO] ------------------------------------------------------------------------ [INFO] BUILD FAILURE [INFO] ------------------------------------------------------------------------ [INFO] Total time: 3:47.992s [INFO] Finished at: Sat Mar 15 03:16:53 EDT 2014 [INFO] Final Memory: 74M/520M [INFO] ------------------------------------------------------------------------ [ERROR] Failed to execute goal on project hive-packaging: Could not resolve dependencies for project org.apache.hive:hive-packaging:pom:0.14.0-SNAPSHOT: Could not find artifact org.apache.hive.hcatalog:hive-hcatalog-hbase-storage-handler:jar:0.14.0-SNAPSHOT in apache.snapshots (http://repository.apache.org/snapshots) -> [Help 1] [ERROR] [ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch. [ERROR] Re-run Maven using the -X switch to enable full debug logging. 
[ERROR] [ERROR] For more information about the errors and possible solutions, please read the following articles: [ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/DependencyResolutionException [ERROR] [ERROR] After correcting the problems, you can resume the build with the command [ERROR] mvn <goals> -rf :hive-packaging + exit 1 ' This message is automatically generated. ATTACHMENT ID: 12634783
        Owen O'Malley added a comment -

        Addressed Sergey's final comments and changed the semantics for CombineHiveInputFormat and vectorization so that both will throw exceptions if they are applied to ACID tables.

        I'll file a follow-up JIRA to fix vectorization for Hive 0.13.
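
        The "throw if applied to ACID tables" semantics described here can be sketched as a simple guard. This is a hypothetical simplification, not the actual Hive patch — `AcidGuard` and `checkAcidUnsupported` are illustrative names:

```java
// Hypothetical sketch of rejecting ACID tables in code paths that cannot
// handle them yet (combined splits, vectorized reads). Names are assumptions.
public class AcidGuard {
    // Throws when a feature that cannot yet handle ACID tables is used on one.
    public static void checkAcidUnsupported(boolean isAcidTable, String feature) {
        if (isAcidTable) {
            throw new IllegalArgumentException(
                feature + " is not supported for ACID tables");
        }
    }

    public static void main(String[] args) {
        checkAcidUnsupported(false, "CombineHiveInputFormat"); // non-ACID table: passes
        try {
            checkAcidUnsupported(true, "vectorization");        // ACID table: throws
        } catch (IllegalArgumentException expected) {
            System.out.println(expected.getMessage());
        }
    }
}
```

        Failing fast at plan time, rather than silently producing wrong results, is the design choice the comment describes.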

        Sergey Shelukhin added a comment -

        Tiny comment on RB; it can be fixed on commit. Otherwise +1.

        Jitendra Nath Pandey added a comment -

        OrcInputFormat#getRecordReader must check for vectorized mode before returning any reader. It seems this patch has moved the check down, which introduces a scenario where a non-vectorized record reader is returned in vectorized mode, causing the query to fail.
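
        The ordering concern can be sketched as follows. This is an illustrative simplification — `ReaderSelector` and the string results are assumptions, not the actual OrcInputFormat code:

```java
// Hypothetical sketch: the vectorized-mode check must come first, before
// any non-vectorized reader can be returned. Names are illustrative.
public class ReaderSelector {
    public static String getRecordReader(boolean vectorizedMode, boolean originalFile) {
        // Check vectorization before anything else; if this check sits lower
        // in the method, a non-vectorized reader can be returned in vectorized
        // mode and the query fails later, at execution time.
        if (vectorizedMode) {
            return "VectorizedOrcRecordReader";
        }
        return originalFile ? "OrcRecordReader" : "AcidRecordReader";
    }
}
```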

        Sergey Shelukhin added a comment -

        and the above bugfix...

        Hive QA added a comment -

        Overall: -1 at least one tests failed

        Here are the results of testing the latest attachment:
        https://issues.apache.org/jira/secure/attachment/12635646/HIVE-6060.patch

        ERROR: -1 due to 46 failed/errored test(s), 5470 tests executed
        Failed tests:

        org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver_vector_coalesce
        org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver_vector_decimal_aggregate
        org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver_vector_decimal_cast
        org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver_vector_decimal_expressions
        org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver_vector_decimal_mapjoin
        org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver_vector_decimal_math_funcs
        org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver_vector_left_outer_join
        org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver_vectorization_0
        org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver_vectorization_1
        org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver_vectorization_10
        org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver_vectorization_11
        org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver_vectorization_12
        org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver_vectorization_13
        org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver_vectorization_14
        org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver_vectorization_15
        org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver_vectorization_16
        org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver_vectorization_2
        org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver_vectorization_3
        org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver_vectorization_4
        org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver_vectorization_5
        org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver_vectorization_6
        org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver_vectorization_7
        org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver_vectorization_8
        org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver_vectorization_9
        org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver_vectorization_decimal_date
        org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver_vectorization_div0
        org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver_vectorization_limit
        org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver_vectorization_nested_udf
        org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver_vectorization_not
        org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver_vectorization_part
        org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver_vectorization_part_project
        org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver_vectorization_pushdown
        org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver_vectorization_short_regress
        org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver_vectorized_case
        org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver_vectorized_casts
        org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver_vectorized_context
        org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver_vectorized_date_funcs
        org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver_vectorized_mapjoin
        org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver_vectorized_math_funcs
        org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver_vectorized_shufflejoin
        org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver_vectorized_string_funcs
        org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver_vectorized_timestamp_funcs
        org.apache.hadoop.hive.cli.TestCompareCliDriver.testCompareCliDriver_vectorized_math_funcs
        org.apache.hadoop.hive.cli.TestMinimrCliDriver.testCliDriver_auto_sortmerge_join_16
        org.apache.hadoop.hive.cli.TestMinimrCliDriver.testCliDriver_bucketmapjoin6
        org.apache.hadoop.hive.cli.TestMinimrCliDriver.testCliDriver_leftsemijoin_mr
        

        Test results: http://bigtop01.cloudera.org:8080/job/PreCommit-HIVE-Build/1910/testReport
        Console output: http://bigtop01.cloudera.org:8080/job/PreCommit-HIVE-Build/1910/console

        Messages:

        Executing org.apache.hive.ptest.execution.PrepPhase
        Executing org.apache.hive.ptest.execution.ExecutionPhase
        Executing org.apache.hive.ptest.execution.ReportingPhase
        Tests exited with: TestsFailedException: 46 tests failed
        

        This message is automatically generated.

        ATTACHMENT ID: 12635646

        Owen O'Malley added a comment -

        Fixed the vectorization regression and made the change Sergey asked for.

        Alan Gates added a comment -

        Ran the tests locally and didn't see any failures.

        Hive QA added a comment -

        Overall: -1 at least one tests failed

        Here are the results of testing the latest attachment:
        https://issues.apache.org/jira/secure/attachment/12636790/HIVE-6060.patch

        ERROR: -1 due to 1 failed/errored test(s), 5487 tests executed
        Failed tests:

        org.apache.hadoop.hive.cli.TestMinimrCliDriver.testCliDriver_root_dir_external_table
        

        Test results: http://bigtop01.cloudera.org:8080/job/PreCommit-HIVE-Build/1963/testReport
        Console output: http://bigtop01.cloudera.org:8080/job/PreCommit-HIVE-Build/1963/console

        Messages:

        Executing org.apache.hive.ptest.execution.PrepPhase
        Executing org.apache.hive.ptest.execution.ExecutionPhase
        Executing org.apache.hive.ptest.execution.ReportingPhase
        Tests exited with: TestsFailedException: 1 tests failed
        

        This message is automatically generated.

        ATTACHMENT ID: 12636790

        Jitendra Nath Pandey added a comment -

        +1 for the latest patch.

        Owen O'Malley added a comment -

        Thanks for the reviews, Jitendra and Sergey! I just committed this to 0.13 and trunk.


          People

          • Assignee: Owen O'Malley
          • Reporter: Owen O'Malley
          • Votes: 0
          • Watchers: 7