Hive / HIVE-3286

Explicit skew join on user provided condition

    Details

    • Type: Improvement
    • Status: Patch Available
    • Priority: Minor
    • Resolution: Unresolved
    • Affects Version/s: None
    • Fix Version/s: None
    • Component/s: Query Processor
    • Labels:
      None

      Description

      A join on a table with skewed data spends most of its execution time handling the skewed keys. But in most cases we already know about the skew, and even know what the skewed keys look like.

      If we could explicitly assign reducer slots for the skewed keys, the total execution time could be greatly shortened.

      As a start, I've extended the join grammar to something like this:

      select * from src a join src b on a.key=b.key skew on (a.key+1 < 50, a.key+1 < 100, a.key < 150);
      

      which means that if the above query is executed with 20 reducers, one reducer handles a.key+1 < 50, one handles 50 <= a.key+1 < 100, one handles 99 <= a.key < 150, and the remaining 17 handle all other keys (this could later be extended to assign more than one reducer per group)

      This can only be used with common inner equi-joins, and the skew condition should be composed of join keys only.
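The group-assignment rule above can be sketched as follows. This is an illustrative model only, not the Hive implementation: conditions are tried in declaration order, and the first match decides the group.

```python
# Illustrative sketch (not Hive code): skew conditions are evaluated
# sequentially; the first condition a row's key satisfies decides its
# skew group, and unmatched rows fall through to the default group (-1).

def skew_group(key, conditions):
    for group, cond in enumerate(conditions):
        if cond(key):
            return group
    return -1  # not skewed: handled by the remaining reducers

# Conditions from the example query, in declaration order.
conditions = [
    lambda k: k + 1 < 50,
    lambda k: k + 1 < 100,
    lambda k: k < 150,
]

print(skew_group(10, conditions))   # first condition matches
print(skew_group(60, conditions))   # effectively 50 <= k+1 < 100
print(skew_group(120, conditions))  # effectively 99 <= k < 150
print(skew_group(200, conditions))  # no match: default group
```

Because evaluation stops at the first match, the second condition effectively covers only 50 <= a.key+1 < 100, as the description states.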

      The work done so far will be posted shortly, after code cleanup.

      ----------------------------

      All expressions in the clause "SKEW ON (expr1, expr2, ...)" are called the skew condition. It consists of skew expressions*, each a simple boolean expression for a group, plus an optional CLUSTER/DISTRIBUTE expression. Skew expressions are evaluated sequentially at runtime, deciding the skew group for a row. Each skew group has reserved partition slot(s), to which all rows in the group are assigned.

      The number of partition slots reserved for each skew group is decided by the cluster expression. Before submitting the MR job, Hive calculates the size of each skew group. If a skew group is "CLUSTER BY 20 PERCENT" and the total number of partition slots (= number of reducers) is, say, 20, the group reserves 4 partition slots, and so on.

      The optional "DISTRIBUTE BY" decides how the rows in a skew group are dispersed across the range of reserved slots (if there is only one slot for a group, this is meaningless). Currently, three distribution policies are available: RANDOM, KEYS, and <expression>.
      1. RANDOM : rows from the driver alias** are dispersed randomly, and rows of other aliases are multicast to all slots (default behavior)
      2. KEYS : rows from the driver alias are dispersed by the hash of the join keys
      3. expression : determined by the evaluation result of a user-provided expression
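The three policies can be sketched like this. The helper names are hypothetical, not from the patch; the point is the driver/non-driver asymmetry and the hash-based dispersal.

```python
import random

# Illustrative sketch of the three distribution policies for a skew
# group's reserved slots. `slots` holds the reserved partition numbers.
# All function names here are hypothetical, not from the patch.

def assign_random(is_driver, slots, rng=random):
    # RANDOM: driver-alias rows go to one random slot; rows from the
    # other aliases are multicast (duplicated) to every slot of the group.
    if is_driver:
        return [rng.choice(slots)]
    return list(slots)

def assign_by_keys(join_key, slots):
    # KEYS: driver-alias rows are dispersed by the hash of the join key,
    # so equal keys always meet in the same slot.
    return [slots[hash(join_key) % len(slots)]]

def assign_by_expr(expr_value, slots):
    # <expression>: like KEYS, but hashing a user-provided expression's value.
    return [slots[hash(expr_value) % len(slots)]]

slots = [8, 9, 10, 11]
# A non-driver row under RANDOM is duplicated to all four slots.
print(assign_random(False, slots))
# The same key always lands on the same slot under KEYS.
print(assign_by_keys(50, slots) == assign_by_keys(50, slots))
```

The duplication in RANDOM is what makes the join still correct: a randomly placed driver row can meet its match in any slot, so the other side must be present everywhere.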

      This is only possible with inner, equi, common joins. Join tree merging and vectorization are not yet supported.

      • Might be used by other RS users such as "SORT BY" or "GROUP BY" (not yet)
      • If column statistics on the skewness of the key are available, this could possibly be applied automatically (not yet)

      For example, if 20 reducers are used for the query below,

      select count(*) from src a join src b on a.key=b.key skew on (
         a.key = '0' CLUSTER BY 10 PERCENT,
         b.key < '100' CLUSTER BY 20 PERCENT DISTRIBUTE BY upper(b.key),
         cast(a.key as int) > 300 CLUSTER BY 40 PERCENT DISTRIBUTE BY KEYS);
      

      Skew group-0 would reserve 2 slots (#6~#7); group-1, 4 slots (#8~#11); group-2, 8 slots (#12~#19); and all other keys would use the remaining 6 slots (#0~#5).
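The slot arithmetic of this example can be checked with a small sketch (illustrative only; the layout, default slots first and then each group's range in declaration order, is assumed from the example):

```python
# Illustrative sketch: compute reserved slot ranges for each skew group
# from its CLUSTER BY percentage, using the example's 20 reducers.
# Integer percentages; each group reserves floor(total * pct / 100) slots.

def slot_ranges(total_slots, percents):
    reserved = [total_slots * p // 100 for p in percents]
    default = total_slots - sum(reserved)  # slots left for non-skewed keys
    ranges, start = [], default
    for n in reserved:
        ranges.append(list(range(start, start + n)))
        start += n
    return default, ranges

default, ranges = slot_ranges(20, [10, 20, 40])
print(default)  # number of slots (0..default-1) for non-skewed keys
print(ranges)   # reserved slots for group-0, group-1, group-2
```

With 20 reducers and CLUSTER BY 10/20/40 PERCENT this reproduces the ranges above: 6 default slots (#0~#5), then #6~#7, #8~#11, and #12~#19.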

      For key='0' from alias a (the driver alias), the row is assigned to one slot of group-0: 6 or 7.
      For key='0' from alias b (a non-driver alias), the row is multicast to all slots of group-0: 6 and 7.

      For key='50' from alias a (a non-driver alias), the row is multicast to all slots of group-1: 8, 9, 10 and 11.
      For key='50' from alias b (the driver alias), the row is assigned to one slot of group-1: 8, 9, 10 or 11.

      For key='500' from alias a (the driver alias), the row is assigned to a slot of group-2 by the hashcode of the DISTRIBUTE expression, modulo the slot count.
      For key='500' from alias b (a non-driver alias), the row is multicast to all slots of group-2: #12~#19.

      Key='200' does not belong to any skew group and is processed normally in the range of partition slots 0~5.

      *skew expression :
      1. All expressions should be made of expressions in the join condition; that is, if the join condition is "a.key=b.key", the user can make any expression with "a.key" or "b.key". But if the join condition is "a.key+1=b.key", the user cannot make an expression with "a.key" alone (the expression must use "a.key+1").
      2. All expressions should reference one and only one side of the join. For example, simple constant expressions or expressions referencing both sides of the join condition ("a.key+b.key<100") are not allowed.
      3. All functions in an expression should be deterministic and stateless.
      4. The DISTRIBUTE expression should use the same alias as its skew expression.

      **driver alias :
      1. The driver alias is the sole alias referenced by a skew expression, which matters for RANDOM distribution: rows from the driver alias are assigned to one slot, while rows from all other aliases are multicast to every slot of the group.

      1. HIVE-3286.D4287.9.patch
        107 kB
        Phabricator
      2. HIVE-3286.D4287.8.patch
        81 kB
        Phabricator
      3. HIVE-3286.D4287.7.patch
        82 kB
        Phabricator
      4. HIVE-3286.D4287.6.patch
        60 kB
        Phabricator
      5. HIVE-3286.D4287.5.patch
        61 kB
        Phabricator
      6. HIVE-3286.D4287.10.patch
        103 kB
        Phabricator
      7. HIVE-3286.19.patch.txt
        103 kB
        Navis
      8. HIVE-3286.18.patch.txt
        101 kB
        Navis
      9. HIVE-3286.17.patch.txt
        108 kB
        Navis
      10. HIVE-3286.16.patch.txt
        108 kB
        Navis
      11. HIVE-3286.15.patch.txt
        108 kB
        Navis
      12. HIVE-3286.14.patch.txt
        108 kB
        Navis
      13. HIVE-3286.13.patch.txt
        108 kB
        Navis
      14. HIVE-3286.12.patch.txt
        108 kB
        Navis
      15. D4287.11.patch
        105 kB
        Phabricator

        Activity

        Namit Jain added a comment -

        Navis, Nadeem is already working on this with a different approach:
        https://cwiki.apache.org/Hive/skewed-join-optimization.html

        I am not sure if there is a JIRA yet, but I know he is pretty close to getting one out.

        Navis added a comment -

        The idea for this issue was conceived a couple of months ago, and I saw that document later. I like the systematic approach in it.

        I considered using that approach but decided to implement this one, because it seemed to allow more freedom in the schema (without list bucketing).
        If this is not appropriate for Hive, I can keep it for internal use only.

        Nadeem Moidu added a comment -

        Here is the other JIRA: https://issues.apache.org/jira/browse/HIVE-3086 .
        I'm not sure whether using the list bucketing schema restricts you (the phrase "list bucketed by" has been removed from it). List bucketing was in any case solving a problem caused by skew, so there was no point expecting the user to give the skew information more than once.

        This seems to be solving a slightly different problem; e.g., my approach doesn't allow ranges.

        Navis added a comment -

        https://reviews.facebook.net/D4287

        I've slightly upgraded the grammar; for example:

        select * from src a join src b on a.key=b.key skew on
          (a.key+1 < 50 SKEWED BY 30 PERCENT DISTRIBUTE BY a.key-1,
           a.key+1 < 100 SKEWED BY 20 PERCENT,
           a.key < 150);
        
        Namit Jain added a comment -

        @Navis, can you explain the semantics of the above grammar?
        What do SKEWED BY and DISTRIBUTE BY imply?

        Also, in the base case:

        select * from src a join src b on a.key=b.key skew on (a.key+1 < 50, a.key+1 < 100, a.key < 150);

        are you expecting skewed keys for key <= 49?
        Is it true that the skewed keys will only be handled by reducers?
        If yes, why would it reduce the execution time? The main advantage should be that the reducer won't get any other key, so
        it won't be burdened. Is that the idea?

        Navis added a comment -

        This is for assigning a number of reducers exclusively to a key (or group of keys).

        "SKEWED BY 30 PERCENT" means that if the total number of reducers for the MR job is 20, Hive assigns 20*0.3=6 reducers to the group. If not specified, one reducer is assigned to the group. For the above example, the resulting partition numbers of group 1 are distributed in the range 12~17, group 2 gets 18, group 3 gets 19, and the remaining keys are distributed in the range 0~11.

        "DISTRIBUTE BY a.key-1" means that if the partition range is larger than 1 (like group 1), distribution within the range (12~17) is based on the hash of the value evaluated by the expression a.key-1. I think this is not yet enough for real usage.

        Namit Jain added a comment -

        // distributes randomly, disperses non-driving aliases to all partitions in the skew group
        public static final int SKEW_RULE_RANDOM = 0;

        Why is this needed?

        I mean, won't it be very expensive?

        2. KEYS : determined by hash value of keys (same as before)
        3. expression : determined by hash of the object evaluated by the user-provided expression

        Won't the above two always lead to the same expression?

        Basically, why is DISTRIBUTE BY needed at all? Can't we always use the KEYS semantics?
        This seems too confusing.

        Namit Jain added a comment -

        Otherwise, I think this is generic and is useful for Hive.

        Navis added a comment -

        Most of our queries are driven by a big table (500G~) joined with small tables (~10G), and the big table is heavily skewed on one or a few keys (more than 60%). In this case RANDOM distribution would be very useful in spite of the additional cost of duplication. And for the small tables, there is not that much data for those keys, which minimizes the overhead of duplication. I'll post test results later if possible.

        KEYS is distribution by the join keys. An EXPRESSION can be different from that, though it should still be composed of join keys. I also think this is not such a useful option and even removed it once, but I added it back in the final version just in case.

        Namit Jain added a comment -

        But if you know the skewed keys, can't you create a group for each of the skewed keys?

        Navis added a comment -

        I don't understand exactly what you mean by 'create a group for each key'. If it's something like what is described in https://cwiki.apache.org/Hive/skewed-join-optimization.html, I should say it's too difficult for developers who translate SQL for an RDBMS into HQL (which is whom I'm supporting).

        Namit Jain added a comment -

        I am sorry, I was not clear; what I meant was the following:

        If you know keys 10, 25 and 40 are skewed, accounting for nearly 5%, 6% and 7% of the data respectively,
        can't you issue the following?

        select count(*) from src a join src b on a.key=b.key skew on (
        a.key = '10' CLUSTER BY 5 PERCENT,
        a.key = '25' CLUSTER BY 6 PERCENT,
        a.key = '40' CLUSTER BY 7 PERCENT);

        I am not clear on why you need DISTRIBUTE BY.

        KEYS and EXPRESSIONS should lead to the same distribution.
        Isn't that right?

        I am sorry, can you give a clear example of where you see the benefit of using DISTRIBUTE BY?

        Navis added a comment -

        The default is random, which duplicates rows of the small tables.

        When the skew for each key is not so severe, distributing by the join key can be enough, without duplicating rows. For example, if the logs of the last two hours are a multiple of the others, the user can make a group for them and distribute them again by the join key.

        select ~~ from logs join errors on logs.hour=errors.hour AND logs.error_seq = errors.error_seq
            skew on (last_two_hour(logs.hour) CLUSTER BY 60 PERCENT DISTRIBUTE BY KEYS)


        DISTRIBUTE BY <expression> can provide more control over key distribution. In the above case, "DISTRIBUTE BY logs.error_seq" can be used if it would result in a better distribution.

        After writing this, I've found it's not so useful. Should I remove it?

        Namit Jain added a comment -

        Ya, why don't you remove it?

        Namit Jain added a comment - edited

        Comments on phabricator

        Phabricator added a comment -

        navis updated the revision "HIVE-3286 [jira] Explicit skew join on user provided condition".
        Reviewers: JIRA

        Addressed comments

        REVISION DETAIL
        https://reviews.facebook.net/D4287

        AFFECTED FILES
        ql/src/java/org/apache/hadoop/hive/ql/exec/Operator.java
        ql/src/java/org/apache/hadoop/hive/ql/exec/ReduceSinkOperator.java
        ql/src/java/org/apache/hadoop/hive/ql/io/HiveKey.java
        ql/src/java/org/apache/hadoop/hive/ql/io/SkewedKeyPartitioner.java
        ql/src/java/org/apache/hadoop/hive/ql/optimizer/ReduceSinkDeDuplication.java
        ql/src/java/org/apache/hadoop/hive/ql/parse/Hive.g
        ql/src/java/org/apache/hadoop/hive/ql/parse/QBJoinTree.java
        ql/src/java/org/apache/hadoop/hive/ql/parse/SemanticAnalyzer.java
        ql/src/java/org/apache/hadoop/hive/ql/plan/ReduceSinkDesc.java
        ql/src/java/org/apache/hadoop/hive/ql/plan/SkewContext.java
        ql/src/test/queries/clientpositive/skewjoin_explict.q
        ql/src/test/results/clientpositive/skewjoin_explict.q.out

        To: JIRA, navis
        Cc: njain

        Phabricator added a comment -

        njain has commented on the revision "HIVE-3286 [jira] Explicit skew join on user provided condition".

        1. Shouldn't you give an error for outer joins?
        2. I think there used to be an optimization in place where a group-by followed by a join on the same key
        did not require an extra reducer (reduce.dedup, or something like that). Can you add a test for
        that and make sure it works in that case?

        REVISION DETAIL
        https://reviews.facebook.net/D4287

        To: JIRA, navis
        Cc: njain

        Namit Jain added a comment -

        Comments on phabricator -

        Navis, can you refresh, do some cleanups, and address the comments?
        This would be really useful.

        Remove the syntax:

        a.key = 0 CLUSTER BY 2 PARTITIONS,

        What does the above mean? It is not clear.

        Also, can you restructure the code in such a way that, in the future, if histogram
        data is available for a table (like skewed data), we are able to convert the
        join to use this? I mean, this data, instead of coming from the query, can come from
        the table metadata.

        Namit Jain added a comment -

        Gang Tim Liu, it would be useful if you can come up with a way to store the histogram data for a table.
        The skew join should be automatically able to use that.

        Phabricator added a comment -

        navis has commented on the revision "HIVE-3286 [jira] Explicit skew join on user provided condition".

        1. I think this is a kind of join hint, so it is simply disabled when it's not possible (outer join, invalid expression, etc.).
        2. RS dedup is not applied when the child RS is for GBY or JOIN. A test for the JOIN+SORTBY case will be added.

        REVISION DETAIL
        https://reviews.facebook.net/D4287

        To: JIRA, navis
        Cc: njain

        Phabricator added a comment -

        navis has commented on the revision "HIVE-3286 [jira] Explicit skew join on user provided condition".

        RS dedup is not applied to the JOIN-RS case either. Then, could it be enough to remove explicit partition-number assignment ("a.key = 0 CLUSTER BY 2 PARTITIONS", as you mentioned)?

        And creating an optimizer for skew join would be a really good thing (I also intended to do it). I think the current code base could simply be copied into the optimizer; it seems not so hard.

        REVISION DETAIL
        https://reviews.facebook.net/D4287

        To: JIRA, navis
        Cc: njain

        Phabricator added a comment -

        navis updated the revision "HIVE-3286 [jira] Explicit skew join on user provided condition".
        Reviewers: JIRA

        Rebased to trunk
        Removed explicit assigning

        REVISION DETAIL
        https://reviews.facebook.net/D4287

        AFFECTED FILES
        ql/src/java/org/apache/hadoop/hive/ql/exec/Operator.java
        ql/src/java/org/apache/hadoop/hive/ql/exec/ReduceSinkOperator.java
        ql/src/java/org/apache/hadoop/hive/ql/io/HiveKey.java
        ql/src/java/org/apache/hadoop/hive/ql/io/SkewedKeyPartitioner.java
        ql/src/java/org/apache/hadoop/hive/ql/optimizer/ReduceSinkDeDuplication.java
        ql/src/java/org/apache/hadoop/hive/ql/parse/Hive.g
        ql/src/java/org/apache/hadoop/hive/ql/parse/QBJoinTree.java
        ql/src/java/org/apache/hadoop/hive/ql/parse/SemanticAnalyzer.java
        ql/src/java/org/apache/hadoop/hive/ql/plan/ReduceSinkDesc.java
        ql/src/java/org/apache/hadoop/hive/ql/plan/SkewContext.java
        ql/src/test/queries/clientpositive/skewjoin_explict.q
        ql/src/test/results/clientpositive/skewjoin_explict.q.out

        To: JIRA, navis
        Cc: njain

        Phabricator added a comment -

        njain has commented on the revision "HIVE-3286 [jira] Explicit skew join on user provided condition".

        Cool, can you move it to an optimization step?
        That way, we can also drive it from the table metadata.

        INLINE COMMENTS
        ql/src/test/queries/clientpositive/skewjoin_explict.q:4 The user should not be setting the partitioner.

        For the SKEWED ON syntax, the partitioner should be chosen automatically.
        ql/src/test/queries/clientpositive/skewjoin_explict.q:63 Add some sub-queries in the tests

        REVISION DETAIL
        https://reviews.facebook.net/D4287

        To: JIRA, navis
        Cc: njain

        Phabricator added a comment -

        navis updated the revision "HIVE-3286 [jira] Explicit skew join on user provided condition".
        Reviewers: JIRA

        1. Added a stub for the optimizer (not completed, because I cannot imagine what the histogram will look like)
        2. The skew partitioner is applied automatically
        3. Added a test case

        REVISION DETAIL
        https://reviews.facebook.net/D4287

        AFFECTED FILES
        ql/src/java/org/apache/hadoop/hive/ql/exec/ExecDriver.java
        ql/src/java/org/apache/hadoop/hive/ql/exec/Operator.java
        ql/src/java/org/apache/hadoop/hive/ql/exec/ReduceSinkOperator.java
        ql/src/java/org/apache/hadoop/hive/ql/io/HiveKey.java
        ql/src/java/org/apache/hadoop/hive/ql/io/SkewedKeyPartitioner.java
        ql/src/java/org/apache/hadoop/hive/ql/optimizer/InlineSkewJoinOptimizer.java
        ql/src/java/org/apache/hadoop/hive/ql/optimizer/JoinReorder.java
        ql/src/java/org/apache/hadoop/hive/ql/optimizer/ReduceSinkDeDuplication.java
        ql/src/java/org/apache/hadoop/hive/ql/parse/Hive.g
        ql/src/java/org/apache/hadoop/hive/ql/parse/QBJoinTree.java
        ql/src/java/org/apache/hadoop/hive/ql/parse/SemanticAnalyzer.java
        ql/src/java/org/apache/hadoop/hive/ql/plan/ExprNodeDescUtils.java
        ql/src/java/org/apache/hadoop/hive/ql/plan/MapredWork.java
        ql/src/java/org/apache/hadoop/hive/ql/plan/ReduceSinkDesc.java
        ql/src/java/org/apache/hadoop/hive/ql/plan/SkewContext.java
        ql/src/test/queries/clientpositive/skewjoin_explict.q
        ql/src/test/results/clientpositive/skewjoin_explict.q.out

        To: JIRA, navis
        Cc: njain

        Phabricator added a comment -

        navis updated the revision "HIVE-3286 [jira] Explicit skew join on user provided condition".
        Reviewers: JIRA

        fix mixed merge. my bad.

        REVISION DETAIL
        https://reviews.facebook.net/D4287

        AFFECTED FILES
        ql/src/java/org/apache/hadoop/hive/ql/exec/ExecDriver.java
        ql/src/java/org/apache/hadoop/hive/ql/exec/Operator.java
        ql/src/java/org/apache/hadoop/hive/ql/exec/ReduceSinkOperator.java
        ql/src/java/org/apache/hadoop/hive/ql/io/HiveKey.java
        ql/src/java/org/apache/hadoop/hive/ql/io/SkewedKeyPartitioner.java
        ql/src/java/org/apache/hadoop/hive/ql/optimizer/InlineSkewJoinOptimizer.java
        ql/src/java/org/apache/hadoop/hive/ql/optimizer/JoinReorder.java
        ql/src/java/org/apache/hadoop/hive/ql/optimizer/ReduceSinkDeDuplication.java
        ql/src/java/org/apache/hadoop/hive/ql/parse/Hive.g
        ql/src/java/org/apache/hadoop/hive/ql/parse/QBJoinTree.java
        ql/src/java/org/apache/hadoop/hive/ql/parse/SemanticAnalyzer.java
        ql/src/java/org/apache/hadoop/hive/ql/plan/ExprNodeDescUtils.java
        ql/src/java/org/apache/hadoop/hive/ql/plan/MapredWork.java
        ql/src/java/org/apache/hadoop/hive/ql/plan/ReduceSinkDesc.java
        ql/src/java/org/apache/hadoop/hive/ql/plan/SkewContext.java
        ql/src/test/queries/clientpositive/skewjoin_explict.q
        ql/src/test/results/clientpositive/skewjoin_explict.q.out

        To: JIRA, navis
        Cc: njain

        Phabricator added a comment -

        njain has commented on the revision "HIVE-3286 [jira] Explicit skew join on user provided condition".

        1. Added stub for optimizer (not completed, because I cannot imagine what the histogram would look like)

        >> call the optimizer. Come up with any definition. This may change over time, but if you don't
        do this now, it will never get integrated with the table metadata.

        INLINE COMMENTS
        ql/src/test/queries/clientpositive/skewjoin_explict.q:34 For all the negative cases, it would be much simpler if an error were thrown instead of
        the hint being silently ignored. That makes it much easier to debug and enforce in a production
        environment.
        ql/src/test/queries/clientpositive/skewjoin_explict.q:24 Can you add some tests that select specific columns:

        select .. from
        (subq1 involving skewed join) s1
        join
        (subq2 involving skewed join) s2
        on join;

        Add some tests with auto-convert join set to true (both where the map-join is picked and
        where it is not). Ideally, for a map-join, the skew should matter.
        ql/src/java/org/apache/hadoop/hive/ql/optimizer/InlineSkewJoinOptimizer.java:51 Is this even being called?
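
        The nested-subquery test being requested might look like the following sketch. Table, column, and skew-condition choices here are illustrative only, reusing the SKEW ON grammar from the issue description:

        ```sql
        -- Hypothetical test: both subqueries perform an explicit skew join,
        -- and the outer query joins their results and selects specific columns.
        select s1.key, s2.value
        from (select a.key, a.value from src a join src b on a.key = b.key
              skew on (a.key < 100)) s1
        join (select a.key, a.value from src a join src b on a.key = b.key
              skew on (a.key < 100)) s2
        on s1.key = s2.key;
        ```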

        REVISION DETAIL
        https://reviews.facebook.net/D4287

        To: JIRA, navis
        Cc: njain

        Namit Jain added a comment -

        comments

        Phabricator added a comment -

        navis updated the revision "HIVE-3286 [jira] Explicit skew join on user provided condition".
        Reviewers: JIRA

        Addressed comments

        REVISION DETAIL
        https://reviews.facebook.net/D4287

        AFFECTED FILES
        common/src/java/org/apache/hadoop/hive/conf/HiveConf.java
        ql/src/java/org/apache/hadoop/hive/ql/ErrorMsg.java
        ql/src/java/org/apache/hadoop/hive/ql/exec/ExecDriver.java
        ql/src/java/org/apache/hadoop/hive/ql/exec/Operator.java
        ql/src/java/org/apache/hadoop/hive/ql/exec/ReduceSinkOperator.java
        ql/src/java/org/apache/hadoop/hive/ql/io/HiveKey.java
        ql/src/java/org/apache/hadoop/hive/ql/io/SkewedKeyPartitioner.java
        ql/src/java/org/apache/hadoop/hive/ql/optimizer/InlineSkewJoinOptimizer.java
        ql/src/java/org/apache/hadoop/hive/ql/optimizer/JoinReorder.java
        ql/src/java/org/apache/hadoop/hive/ql/optimizer/Optimizer.java
        ql/src/java/org/apache/hadoop/hive/ql/optimizer/ReduceSinkDeDuplication.java
        ql/src/java/org/apache/hadoop/hive/ql/parse/Hive.g
        ql/src/java/org/apache/hadoop/hive/ql/parse/QBJoinTree.java
        ql/src/java/org/apache/hadoop/hive/ql/parse/SemanticAnalyzer.java
        ql/src/java/org/apache/hadoop/hive/ql/plan/ExprNodeDescUtils.java
        ql/src/java/org/apache/hadoop/hive/ql/plan/MapredWork.java
        ql/src/java/org/apache/hadoop/hive/ql/plan/ReduceSinkDesc.java
        ql/src/java/org/apache/hadoop/hive/ql/plan/SkewContext.java
        ql/src/java/org/apache/hadoop/hive/ql/udf/generic/NumericHistogram.java
        ql/src/test/queries/clientnegative/skewjoin_explicit_invalid1.q
        ql/src/test/queries/clientnegative/skewjoin_explicit_invalid2.q
        ql/src/test/queries/clientnegative/skewjoin_explicit_invalid3.q
        ql/src/test/queries/clientnegative/skewjoin_explicit_invalid4.q
        ql/src/test/queries/clientpositive/skewjoin_explict.q
        ql/src/test/results/clientnegative/skewjoin_explicit_invalid1.q.out
        ql/src/test/results/clientnegative/skewjoin_explicit_invalid2.q.out
        ql/src/test/results/clientnegative/skewjoin_explicit_invalid3.q.out
        ql/src/test/results/clientnegative/skewjoin_explicit_invalid4.q.out
        ql/src/test/results/clientpositive/skewjoin_explict.q.out

        To: JIRA, navis
        Cc: njain

        Shreepadma Venugopalan added a comment -

        HIVE-3526 covers the task of computing and persisting histograms on numeric columns in Hive tables and partitions.
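
        Column histograms like these would supply the per-group size estimates behind the percent-based slot reservation described in the issue description (e.g. a group clustered by 20 percent of 20 reducers reserves 4 partition slots). A hypothetical query using that grammar, with the exact placement of the CLUSTER BY clause being an assumption, might read:

        ```sql
        -- Hypothetical: reserve 20% of the reducer slots for rows with a.key < 100.
        select a.key, b.value
        from src a join src b on a.key = b.key
        skew on (a.key < 100 cluster by 20 percent);
        ```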

        Phabricator added a comment -

        navis updated the revision "HIVE-3286 [jira] Explicit skew join on user provided condition".

        Rebased to trunk

        Reviewers: JIRA

        REVISION DETAIL
        https://reviews.facebook.net/D4287

        CHANGE SINCE LAST DIFF
        https://reviews.facebook.net/D4287?vs=25041&id=38511#toc

        AFFECTED FILES
        common/src/java/org/apache/hadoop/hive/conf/HiveConf.java
        ql/src/java/org/apache/hadoop/hive/ql/ErrorMsg.java
        ql/src/java/org/apache/hadoop/hive/ql/exec/Operator.java
        ql/src/java/org/apache/hadoop/hive/ql/exec/ReduceSinkOperator.java
        ql/src/java/org/apache/hadoop/hive/ql/exec/mr/ExecDriver.java
        ql/src/java/org/apache/hadoop/hive/ql/io/HiveKey.java
        ql/src/java/org/apache/hadoop/hive/ql/io/SkewedKeyPartitioner.java
        ql/src/java/org/apache/hadoop/hive/ql/optimizer/InlineSkewJoinOptimizer.java
        ql/src/java/org/apache/hadoop/hive/ql/optimizer/JoinReorder.java
        ql/src/java/org/apache/hadoop/hive/ql/optimizer/Optimizer.java
        ql/src/java/org/apache/hadoop/hive/ql/parse/FromClauseParser.g
        ql/src/java/org/apache/hadoop/hive/ql/parse/HiveParser.g
        ql/src/java/org/apache/hadoop/hive/ql/parse/QBJoinTree.java
        ql/src/java/org/apache/hadoop/hive/ql/parse/SemanticAnalyzer.java
        ql/src/java/org/apache/hadoop/hive/ql/plan/MapWork.java
        ql/src/java/org/apache/hadoop/hive/ql/plan/MapredWork.java
        ql/src/java/org/apache/hadoop/hive/ql/plan/ReduceSinkDesc.java
        ql/src/java/org/apache/hadoop/hive/ql/plan/SkewContext.java
        ql/src/java/org/apache/hadoop/hive/ql/udf/generic/NumericHistogram.java
        ql/src/test/queries/clientnegative/skewjoin_explicit_invalid1.q
        ql/src/test/queries/clientnegative/skewjoin_explicit_invalid2.q
        ql/src/test/queries/clientnegative/skewjoin_explicit_invalid3.q
        ql/src/test/queries/clientnegative/skewjoin_explicit_invalid4.q
        ql/src/test/queries/clientpositive/skewjoin_explict.q
        ql/src/test/results/clientnegative/skewjoin_explicit_invalid1.q.out
        ql/src/test/results/clientnegative/skewjoin_explicit_invalid2.q.out
        ql/src/test/results/clientnegative/skewjoin_explicit_invalid3.q.out
        ql/src/test/results/clientnegative/skewjoin_explicit_invalid4.q.out
        ql/src/test/results/clientpositive/skewjoin_explict.q.out

        To: JIRA, navis
        Cc: njain

        Phabricator added a comment -

        navis updated the revision "HIVE-3286 [jira] Explicit skew join on user provided condition".

        Rebased to trunk

        Reviewers: JIRA

        REVISION DETAIL
        https://reviews.facebook.net/D4287

        CHANGE SINCE LAST DIFF
        https://reviews.facebook.net/D4287?vs=38511&id=44265#toc

        AFFECTED FILES
        common/src/java/org/apache/hadoop/hive/conf/HiveConf.java
        ql/src/java/org/apache/hadoop/hive/ql/ErrorMsg.java
        ql/src/java/org/apache/hadoop/hive/ql/exec/Operator.java
        ql/src/java/org/apache/hadoop/hive/ql/exec/ReduceSinkOperator.java
        ql/src/java/org/apache/hadoop/hive/ql/exec/mr/ExecDriver.java
        ql/src/java/org/apache/hadoop/hive/ql/exec/vector/VectorReduceSinkOperator.java
        ql/src/java/org/apache/hadoop/hive/ql/io/HiveKey.java
        ql/src/java/org/apache/hadoop/hive/ql/io/SkewedKeyPartitioner.java
        ql/src/java/org/apache/hadoop/hive/ql/optimizer/InlineSkewJoinOptimizer.java
        ql/src/java/org/apache/hadoop/hive/ql/optimizer/JoinReorder.java
        ql/src/java/org/apache/hadoop/hive/ql/optimizer/Optimizer.java
        ql/src/java/org/apache/hadoop/hive/ql/parse/FromClauseParser.g
        ql/src/java/org/apache/hadoop/hive/ql/parse/HiveParser.g
        ql/src/java/org/apache/hadoop/hive/ql/parse/QBJoinTree.java
        ql/src/java/org/apache/hadoop/hive/ql/parse/SemanticAnalyzer.java
        ql/src/java/org/apache/hadoop/hive/ql/plan/MapWork.java
        ql/src/java/org/apache/hadoop/hive/ql/plan/ReduceSinkDesc.java
        ql/src/java/org/apache/hadoop/hive/ql/plan/SkewContext.java
        ql/src/java/org/apache/hadoop/hive/ql/udf/generic/NumericHistogram.java
        ql/src/test/queries/clientnegative/skewjoin_explicit_invalid1.q
        ql/src/test/queries/clientnegative/skewjoin_explicit_invalid2.q
        ql/src/test/queries/clientnegative/skewjoin_explicit_invalid3.q
        ql/src/test/queries/clientnegative/skewjoin_explicit_invalid4.q
        ql/src/test/queries/clientpositive/skewjoin_explicit.q
        ql/src/test/results/clientnegative/skewjoin_explicit_invalid1.q.out
        ql/src/test/results/clientnegative/skewjoin_explicit_invalid2.q.out
        ql/src/test/results/clientnegative/skewjoin_explicit_invalid3.q.out
        ql/src/test/results/clientnegative/skewjoin_explicit_invalid4.q.out
        ql/src/test/results/clientpositive/skewjoin_explicit.q.out

        To: JIRA, navis
        Cc: njain

        Hive QA added a comment -

        Overall: -1 no tests executed

        Here are the results of testing the latest attachment:
        https://issues.apache.org/jira/secure/attachment/12615542/D4287.11.patch

        Test results: http://bigtop01.cloudera.org:8080/job/PreCommit-HIVE-Build/434/testReport
        Console output: http://bigtop01.cloudera.org:8080/job/PreCommit-HIVE-Build/434/console

        Messages:

        Executing org.apache.hive.ptest.execution.PrepPhase
        Tests failed with: NonZeroExitCodeException: Command 'bash /data/hive-ptest/working/scratch/source-prep.sh' failed with exit status 1 and output '+ [[ -n '' ]]
        + export 'ANT_OPTS=-Xmx1g -XX:MaxPermSize=256m -Dhttp.proxyHost=localhost -Dhttp.proxyPort=3128'
        + ANT_OPTS='-Xmx1g -XX:MaxPermSize=256m -Dhttp.proxyHost=localhost -Dhttp.proxyPort=3128'
        + export 'M2_OPTS=-Xmx1g -XX:MaxPermSize=256m -Dhttp.proxyHost=localhost -Dhttp.proxyPort=3128'
        + M2_OPTS='-Xmx1g -XX:MaxPermSize=256m -Dhttp.proxyHost=localhost -Dhttp.proxyPort=3128'
        + cd /data/hive-ptest/working/
        + tee /data/hive-ptest/logs/PreCommit-HIVE-Build-434/source-prep.txt
        + [[ false == \t\r\u\e ]]
        + mkdir -p maven ivy
        + [[ svn = \s\v\n ]]
        + [[ -n '' ]]
        + [[ -d apache-svn-trunk-source ]]
        + [[ ! -d apache-svn-trunk-source/.svn ]]
        + [[ ! -d apache-svn-trunk-source ]]
        + cd apache-svn-trunk-source
        + svn revert -R .
        Reverted 'common/src/java/org/apache/hadoop/hive/conf/HiveConf.java'
        ++ awk '{print $2}'
        ++ egrep -v '^X|^Performing status on external'
        ++ svn status --no-ignore
        + rm -rf target datanucleus.log ant/target shims/target shims/0.20/target shims/assembly/target shims/0.20S/target shims/0.23/target shims/common/target shims/common-secure/target packaging/target hbase-handler/target testutils/target jdbc/target metastore/target itests/target itests/hcatalog-unit/target itests/test-serde/target itests/qtest/target itests/hive-unit/target itests/custom-serde/target itests/util/target hcatalog/target hcatalog/storage-handlers/hbase/target hcatalog/server-extensions/target hcatalog/core/target hcatalog/webhcat/svr/target hcatalog/webhcat/java-client/target hcatalog/hcatalog-pig-adapter/target hwi/target common/target common/src/gen common/src/test/org/apache/hadoop/hive/conf/TestHiveConfRestrictList.java service/target contrib/target serde/target beeline/target odbc/target cli/target ql/dependency-reduced-pom.xml ql/target
        + svn update
        
        Fetching external item into 'hcatalog/src/test/e2e/harness'
        External at revision 1545233.
        
        At revision 1545233.
        + patchCommandPath=/data/hive-ptest/working/scratch/smart-apply-patch.sh
        + patchFilePath=/data/hive-ptest/working/scratch/build.patch
        + [[ -f /data/hive-ptest/working/scratch/build.patch ]]
        + chmod +x /data/hive-ptest/working/scratch/smart-apply-patch.sh
        + /data/hive-ptest/working/scratch/smart-apply-patch.sh /data/hive-ptest/working/scratch/build.patch
        Going to apply patch with: patch -p0
        patching file common/src/java/org/apache/hadoop/hive/conf/HiveConf.java
        patching file ql/src/java/org/apache/hadoop/hive/ql/ErrorMsg.java
        patching file ql/src/java/org/apache/hadoop/hive/ql/exec/Operator.java
        patching file ql/src/java/org/apache/hadoop/hive/ql/exec/ReduceSinkOperator.java
        patching file ql/src/java/org/apache/hadoop/hive/ql/exec/mr/ExecDriver.java
        patching file ql/src/java/org/apache/hadoop/hive/ql/exec/vector/VectorReduceSinkOperator.java
        patching file ql/src/java/org/apache/hadoop/hive/ql/io/HiveKey.java
        patching file ql/src/java/org/apache/hadoop/hive/ql/io/SkewedKeyPartitioner.java
        patching file ql/src/java/org/apache/hadoop/hive/ql/optimizer/InlineSkewJoinOptimizer.java
        patching file ql/src/java/org/apache/hadoop/hive/ql/optimizer/JoinReorder.java
        patching file ql/src/java/org/apache/hadoop/hive/ql/optimizer/Optimizer.java
        patching file ql/src/java/org/apache/hadoop/hive/ql/parse/FromClauseParser.g
        patching file ql/src/java/org/apache/hadoop/hive/ql/parse/HiveParser.g
        patching file ql/src/java/org/apache/hadoop/hive/ql/parse/QBJoinTree.java
        patching file ql/src/java/org/apache/hadoop/hive/ql/parse/SemanticAnalyzer.java
        patching file ql/src/java/org/apache/hadoop/hive/ql/plan/MapWork.java
        patching file ql/src/java/org/apache/hadoop/hive/ql/plan/ReduceSinkDesc.java
        patching file ql/src/java/org/apache/hadoop/hive/ql/plan/SkewContext.java
        patching file ql/src/java/org/apache/hadoop/hive/ql/udf/generic/NumericHistogram.java
        patching file ql/src/test/queries/clientnegative/skewjoin_explicit_invalid1.q
        patching file ql/src/test/queries/clientnegative/skewjoin_explicit_invalid2.q
        patching file ql/src/test/queries/clientnegative/skewjoin_explicit_invalid3.q
        patching file ql/src/test/queries/clientnegative/skewjoin_explicit_invalid4.q
        patching file ql/src/test/queries/clientpositive/skewjoin_explicit.q
        patching file ql/src/test/results/clientnegative/skewjoin_explicit_invalid1.q.out
        patching file ql/src/test/results/clientnegative/skewjoin_explicit_invalid2.q.out
        patching file ql/src/test/results/clientnegative/skewjoin_explicit_invalid3.q.out
        patching file ql/src/test/results/clientnegative/skewjoin_explicit_invalid4.q.out
        patching file ql/src/test/results/clientpositive/skewjoin_explicit.q.out
        + [[ maven == \m\a\v\e\n ]]
        + rm -rf /data/hive-ptest/working/maven/org/apache/hive
        + mvn -B clean install -DskipTests -Dmaven.repo.local=/data/hive-ptest/working/maven
        [INFO] Scanning for projects...
        [INFO] ------------------------------------------------------------------------
        [INFO] Reactor Build Order:
        [INFO] 
        [INFO] Hive
        [INFO] Hive Ant Utilities
        [INFO] Hive Shims Common
        [INFO] Hive Shims 0.20
        [INFO] Hive Shims Secure Common
        [INFO] Hive Shims 0.20S
        [INFO] Hive Shims 0.23
        [INFO] Hive Shims
        [INFO] Hive Common
        [INFO] Hive Serde
        [INFO] Hive Metastore
        [INFO] Hive Query Language
        [INFO] Hive Service
        [INFO] Hive JDBC
        [INFO] Hive Beeline
        [INFO] Hive CLI
        [INFO] Hive Contrib
        [INFO] Hive HBase Handler
        [INFO] Hive HCatalog
        [INFO] Hive HCatalog Core
        [INFO] Hive HCatalog Pig Adapter
        [INFO] Hive HCatalog Server Extensions
        [INFO] Hive HCatalog Webhcat Java Client
        [INFO] Hive HCatalog Webhcat
        [INFO] Hive HCatalog HBase Storage Handler
        [INFO] Hive HWI
        [INFO] Hive ODBC
        [INFO] Hive Shims Aggregator
        [INFO] Hive TestUtils
        [INFO] Hive Packaging
        [INFO]                                                                         
        [INFO] ------------------------------------------------------------------------
        [INFO] Building Hive 0.13.0-SNAPSHOT
        [INFO] ------------------------------------------------------------------------
        [INFO] 
        [INFO] --- maven-clean-plugin:2.5:clean (default-clean) @ hive ---
        [INFO] Deleting /data/hive-ptest/working/apache-svn-trunk-source (includes = [datanucleus.log, derby.log], excludes = [])
        [INFO] 
        [INFO] --- maven-antrun-plugin:1.7:run (define-classpath) @ hive ---
        [INFO] Executing tasks
        
        main:
        [INFO] Executed tasks
        [INFO] 
        [INFO] --- maven-antrun-plugin:1.7:run (setup-test-dirs) @ hive ---
        [INFO] Executing tasks
        
        main:
            [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/target/tmp
            [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/target/warehouse
            [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/target/tmp/conf
             [copy] Copying 4 files to /data/hive-ptest/working/apache-svn-trunk-source/target/tmp/conf
        [INFO] Executed tasks
        [INFO] 
        [INFO] --- maven-install-plugin:2.4:install (default-install) @ hive ---
        [INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/pom.xml to /data/hive-ptest/working/maven/org/apache/hive/hive/0.13.0-SNAPSHOT/hive-0.13.0-SNAPSHOT.pom
        [INFO]                                                                         
        [INFO] ------------------------------------------------------------------------
        [INFO] Building Hive Ant Utilities 0.13.0-SNAPSHOT
        [INFO] ------------------------------------------------------------------------
        [INFO] 
        [INFO] --- maven-clean-plugin:2.5:clean (default-clean) @ hive-ant ---
        [INFO] Deleting /data/hive-ptest/working/apache-svn-trunk-source/ant (includes = [datanucleus.log, derby.log], excludes = [])
        [INFO] 
        [INFO] --- maven-resources-plugin:2.5:resources (default-resources) @ hive-ant ---
        [debug] execute contextualize
        [INFO] Using 'UTF-8' encoding to copy filtered resources.
        [INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-trunk-source/ant/src/main/resources
        [INFO] 
        [INFO] --- maven-antrun-plugin:1.7:run (define-classpath) @ hive-ant ---
        [INFO] Executing tasks
        
        main:
        [INFO] Executed tasks
        [INFO] 
        [INFO] --- maven-compiler-plugin:3.1:compile (default-compile) @ hive-ant ---
        [INFO] Compiling 5 source files to /data/hive-ptest/working/apache-svn-trunk-source/ant/target/classes
        [WARNING] Note: /data/hive-ptest/working/apache-svn-trunk-source/ant/src/org/apache/hadoop/hive/ant/QTestGenTask.java uses or overrides a deprecated API.
        [WARNING] Note: Recompile with -Xlint:deprecation for details.
        [WARNING] Note: /data/hive-ptest/working/apache-svn-trunk-source/ant/src/org/apache/hadoop/hive/ant/DistinctElementsClassPath.java uses unchecked or unsafe operations.
        [WARNING] Note: Recompile with -Xlint:unchecked for details.
        [INFO] 
        [INFO] --- maven-resources-plugin:2.5:testResources (default-testResources) @ hive-ant ---
        [debug] execute contextualize
        [INFO] Using 'UTF-8' encoding to copy filtered resources.
        [INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-trunk-source/ant/src/test/resources
        [INFO] 
        [INFO] --- maven-antrun-plugin:1.7:run (setup-test-dirs) @ hive-ant ---
        [INFO] Executing tasks
        
        main:
            [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/ant/target/tmp
            [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/ant/target/warehouse
            [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/ant/target/tmp/conf
             [copy] Copying 4 files to /data/hive-ptest/working/apache-svn-trunk-source/ant/target/tmp/conf
        [INFO] Executed tasks
        [INFO] 
        [INFO] --- maven-compiler-plugin:3.1:testCompile (default-testCompile) @ hive-ant ---
        [INFO] No sources to compile
        [INFO] 
        [INFO] --- maven-surefire-plugin:2.16:test (default-test) @ hive-ant ---
        [INFO] Tests are skipped.
        [INFO] 
        [INFO] --- maven-jar-plugin:2.2:jar (default-jar) @ hive-ant ---
        [INFO] Building jar: /data/hive-ptest/working/apache-svn-trunk-source/ant/target/hive-ant-0.13.0-SNAPSHOT.jar
        [INFO] 
        [INFO] --- maven-install-plugin:2.4:install (default-install) @ hive-ant ---
        [INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/ant/target/hive-ant-0.13.0-SNAPSHOT.jar to /data/hive-ptest/working/maven/org/apache/hive/hive-ant/0.13.0-SNAPSHOT/hive-ant-0.13.0-SNAPSHOT.jar
        [INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/ant/pom.xml to /data/hive-ptest/working/maven/org/apache/hive/hive-ant/0.13.0-SNAPSHOT/hive-ant-0.13.0-SNAPSHOT.pom
        [INFO]                                                                         
        [INFO] ------------------------------------------------------------------------
        [INFO] Building Hive Shims Common 0.13.0-SNAPSHOT
        [INFO] ------------------------------------------------------------------------
        [INFO] 
        [INFO] --- maven-clean-plugin:2.5:clean (default-clean) @ hive-shims-common ---
        [INFO] Deleting /data/hive-ptest/working/apache-svn-trunk-source/shims/common (includes = [datanucleus.log, derby.log], excludes = [])
        [INFO] 
        [INFO] --- maven-resources-plugin:2.5:resources (default-resources) @ hive-shims-common ---
        [debug] execute contextualize
        [INFO] Using 'UTF-8' encoding to copy filtered resources.
        [INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-trunk-source/shims/common/src/main/resources
        [INFO] 
        [INFO] --- maven-antrun-plugin:1.7:run (define-classpath) @ hive-shims-common ---
        [INFO] Executing tasks
        
        main:
        [INFO] Executed tasks
        [INFO] 
        [INFO] --- maven-compiler-plugin:3.1:compile (default-compile) @ hive-shims-common ---
        [INFO] Compiling 15 source files to /data/hive-ptest/working/apache-svn-trunk-source/shims/common/target/classes
        [WARNING] Note: Some input files use or override a deprecated API.
        [WARNING] Note: Recompile with -Xlint:deprecation for details.
        [INFO] 
        [INFO] --- maven-resources-plugin:2.5:testResources (default-testResources) @ hive-shims-common ---
        [debug] execute contextualize
        [INFO] Using 'UTF-8' encoding to copy filtered resources.
        [INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-trunk-source/shims/common/src/test/resources
        [INFO] 
        [INFO] --- maven-antrun-plugin:1.7:run (setup-test-dirs) @ hive-shims-common ---
        [INFO] Executing tasks
        
        main:
            [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/shims/common/target/tmp
            [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/shims/common/target/warehouse
            [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/shims/common/target/tmp/conf
             [copy] Copying 4 files to /data/hive-ptest/working/apache-svn-trunk-source/shims/common/target/tmp/conf
        [INFO] Executed tasks
        [INFO] 
        [INFO] --- maven-compiler-plugin:3.1:testCompile (default-testCompile) @ hive-shims-common ---
        [INFO] No sources to compile
        [INFO] 
        [INFO] --- maven-surefire-plugin:2.16:test (default-test) @ hive-shims-common ---
        [INFO] Tests are skipped.
        [INFO] 
        [INFO] --- maven-jar-plugin:2.2:jar (default-jar) @ hive-shims-common ---
        [INFO] Building jar: /data/hive-ptest/working/apache-svn-trunk-source/shims/common/target/hive-shims-common-0.13.0-SNAPSHOT.jar
        [INFO] 
        [INFO] --- maven-install-plugin:2.4:install (default-install) @ hive-shims-common ---
        [INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/shims/common/target/hive-shims-common-0.13.0-SNAPSHOT.jar to /data/hive-ptest/working/maven/org/apache/hive/shims/hive-shims-common/0.13.0-SNAPSHOT/hive-shims-common-0.13.0-SNAPSHOT.jar
        [INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/shims/common/pom.xml to /data/hive-ptest/working/maven/org/apache/hive/shims/hive-shims-common/0.13.0-SNAPSHOT/hive-shims-common-0.13.0-SNAPSHOT.pom
        [INFO]                                                                         
        [INFO] ------------------------------------------------------------------------
        [INFO] Building Hive Shims 0.20 0.13.0-SNAPSHOT
        [INFO] ------------------------------------------------------------------------
        [INFO] 
        [INFO] --- maven-clean-plugin:2.5:clean (default-clean) @ hive-shims-0.20 ---
        [INFO] Deleting /data/hive-ptest/working/apache-svn-trunk-source/shims/0.20 (includes = [datanucleus.log, derby.log], excludes = [])
        [INFO] 
        [INFO] --- maven-resources-plugin:2.5:resources (default-resources) @ hive-shims-0.20 ---
        [debug] execute contextualize
        [INFO] Using 'UTF-8' encoding to copy filtered resources.
        [INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-trunk-source/shims/0.20/src/main/resources
        [INFO] 
        [INFO] --- maven-antrun-plugin:1.7:run (define-classpath) @ hive-shims-0.20 ---
        [INFO] Executing tasks
        
        main:
        [INFO] Executed tasks
        [INFO] 
        [INFO] --- maven-compiler-plugin:3.1:compile (default-compile) @ hive-shims-0.20 ---
        [INFO] Compiling 2 source files to /data/hive-ptest/working/apache-svn-trunk-source/shims/0.20/target/classes
        [WARNING] Note: /data/hive-ptest/working/apache-svn-trunk-source/shims/0.20/src/main/java/org/apache/hadoop/hive/shims/Hadoop20Shims.java uses or overrides a deprecated API.
        [WARNING] Note: Recompile with -Xlint:deprecation for details.
        [WARNING] Note: /data/hive-ptest/working/apache-svn-trunk-source/shims/0.20/src/main/java/org/apache/hadoop/hive/shims/Hadoop20Shims.java uses unchecked or unsafe operations.
        [WARNING] Note: Recompile with -Xlint:unchecked for details.
        [INFO] 
        [INFO] --- maven-resources-plugin:2.5:testResources (default-testResources) @ hive-shims-0.20 ---
        [debug] execute contextualize
        [INFO] Using 'UTF-8' encoding to copy filtered resources.
        [INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-trunk-source/shims/0.20/src/test/resources
        [INFO] 
        [INFO] --- maven-antrun-plugin:1.7:run (setup-test-dirs) @ hive-shims-0.20 ---
        [INFO] Executing tasks
        
        main:
            [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/shims/0.20/target/tmp
            [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/shims/0.20/target/warehouse
            [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/shims/0.20/target/tmp/conf
             [copy] Copying 4 files to /data/hive-ptest/working/apache-svn-trunk-source/shims/0.20/target/tmp/conf
        [INFO] Executed tasks
        [INFO] 
        [INFO] --- maven-compiler-plugin:3.1:testCompile (default-testCompile) @ hive-shims-0.20 ---
        [INFO] No sources to compile
        [INFO] 
        [INFO] --- maven-surefire-plugin:2.16:test (default-test) @ hive-shims-0.20 ---
        [INFO] Tests are skipped.
        [INFO] 
        [INFO] --- maven-jar-plugin:2.2:jar (default-jar) @ hive-shims-0.20 ---
        [INFO] Building jar: /data/hive-ptest/working/apache-svn-trunk-source/shims/0.20/target/hive-shims-0.20-0.13.0-SNAPSHOT.jar
        [INFO] 
        [INFO] --- maven-install-plugin:2.4:install (default-install) @ hive-shims-0.20 ---
        [INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/shims/0.20/target/hive-shims-0.20-0.13.0-SNAPSHOT.jar to /data/hive-ptest/working/maven/org/apache/hive/shims/hive-shims-0.20/0.13.0-SNAPSHOT/hive-shims-0.20-0.13.0-SNAPSHOT.jar
        [INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/shims/0.20/pom.xml to /data/hive-ptest/working/maven/org/apache/hive/shims/hive-shims-0.20/0.13.0-SNAPSHOT/hive-shims-0.20-0.13.0-SNAPSHOT.pom
        [INFO]                                                                         
        [INFO] ------------------------------------------------------------------------
        [INFO] Building Hive Shims Secure Common 0.13.0-SNAPSHOT
        [INFO] ------------------------------------------------------------------------
        [INFO] 
        [INFO] --- maven-clean-plugin:2.5:clean (default-clean) @ hive-shims-common-secure ---
        [INFO] Deleting /data/hive-ptest/working/apache-svn-trunk-source/shims/common-secure (includes = [datanucleus.log, derby.log], excludes = [])
        [INFO] 
        [INFO] --- maven-resources-plugin:2.5:resources (default-resources) @ hive-shims-common-secure ---
        [debug] execute contextualize
        [INFO] Using 'UTF-8' encoding to copy filtered resources.
        [INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-trunk-source/shims/common-secure/src/main/resources
        [INFO] 
        [INFO] --- maven-antrun-plugin:1.7:run (define-classpath) @ hive-shims-common-secure ---
        [INFO] Executing tasks
        
        main:
        [INFO] Executed tasks
        [INFO] 
        [INFO] --- maven-compiler-plugin:3.1:compile (default-compile) @ hive-shims-common-secure ---
        [INFO] Compiling 12 source files to /data/hive-ptest/working/apache-svn-trunk-source/shims/common-secure/target/classes
        [WARNING] Note: /data/hive-ptest/working/apache-svn-trunk-source/shims/common-secure/src/main/java/org/apache/hadoop/hive/shims/HadoopShimsSecure.java uses or overrides a deprecated API.
        [WARNING] Note: Recompile with -Xlint:deprecation for details.
        [WARNING] Note: Some input files use unchecked or unsafe operations.
        [WARNING] Note: Recompile with -Xlint:unchecked for details.
        [INFO] 
        [INFO] --- maven-resources-plugin:2.5:testResources (default-testResources) @ hive-shims-common-secure ---
        [debug] execute contextualize
        [INFO] Using 'UTF-8' encoding to copy filtered resources.
        [INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-trunk-source/shims/common-secure/src/test/resources
        [INFO] 
        [INFO] --- maven-antrun-plugin:1.7:run (setup-test-dirs) @ hive-shims-common-secure ---
        [INFO] Executing tasks
        
        main:
            [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/shims/common-secure/target/tmp
            [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/shims/common-secure/target/warehouse
            [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/shims/common-secure/target/tmp/conf
             [copy] Copying 4 files to /data/hive-ptest/working/apache-svn-trunk-source/shims/common-secure/target/tmp/conf
        [INFO] Executed tasks
        [INFO] 
        [INFO] --- maven-compiler-plugin:3.1:testCompile (default-testCompile) @ hive-shims-common-secure ---
        [INFO] No sources to compile
        [INFO] 
        [INFO] --- maven-surefire-plugin:2.16:test (default-test) @ hive-shims-common-secure ---
        [INFO] Tests are skipped.
        [INFO] 
        [INFO] --- maven-jar-plugin:2.2:jar (default-jar) @ hive-shims-common-secure ---
        [INFO] Building jar: /data/hive-ptest/working/apache-svn-trunk-source/shims/common-secure/target/hive-shims-common-secure-0.13.0-SNAPSHOT.jar
        [INFO] 
        [INFO] --- maven-install-plugin:2.4:install (default-install) @ hive-shims-common-secure ---
        [INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/shims/common-secure/target/hive-shims-common-secure-0.13.0-SNAPSHOT.jar to /data/hive-ptest/working/maven/org/apache/hive/shims/hive-shims-common-secure/0.13.0-SNAPSHOT/hive-shims-common-secure-0.13.0-SNAPSHOT.jar
        [INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/shims/common-secure/pom.xml to /data/hive-ptest/working/maven/org/apache/hive/shims/hive-shims-common-secure/0.13.0-SNAPSHOT/hive-shims-common-secure-0.13.0-SNAPSHOT.pom
        [INFO]                                                                         
        [INFO] ------------------------------------------------------------------------
        [INFO] Building Hive Shims 0.20S 0.13.0-SNAPSHOT
        [INFO] ------------------------------------------------------------------------
        [INFO] 
        [INFO] --- maven-clean-plugin:2.5:clean (default-clean) @ hive-shims-0.20S ---
        [INFO] Deleting /data/hive-ptest/working/apache-svn-trunk-source/shims/0.20S (includes = [datanucleus.log, derby.log], excludes = [])
        [INFO] 
        [INFO] --- maven-resources-plugin:2.5:resources (default-resources) @ hive-shims-0.20S ---
        [debug] execute contextualize
        [INFO] Using 'UTF-8' encoding to copy filtered resources.
        [INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-trunk-source/shims/0.20S/src/main/resources
        [INFO] 
        [INFO] --- maven-antrun-plugin:1.7:run (define-classpath) @ hive-shims-0.20S ---
        [INFO] Executing tasks
        
        main:
        [INFO] Executed tasks
        [INFO] 
        [INFO] --- maven-compiler-plugin:3.1:compile (default-compile) @ hive-shims-0.20S ---
        [INFO] Compiling 3 source files to /data/hive-ptest/working/apache-svn-trunk-source/shims/0.20S/target/classes
        [WARNING] Note: /data/hive-ptest/working/apache-svn-trunk-source/shims/0.20S/src/main/java/org/apache/hadoop/hive/shims/Hadoop20SShims.java uses or overrides a deprecated API.
        [WARNING] Note: Recompile with -Xlint:deprecation for details.
        [INFO] 
        [INFO] --- maven-resources-plugin:2.5:testResources (default-testResources) @ hive-shims-0.20S ---
        [debug] execute contextualize
        [INFO] Using 'UTF-8' encoding to copy filtered resources.
        [INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-trunk-source/shims/0.20S/src/test/resources
        [INFO] 
        [INFO] --- maven-antrun-plugin:1.7:run (setup-test-dirs) @ hive-shims-0.20S ---
        [INFO] Executing tasks
        
        main:
            [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/shims/0.20S/target/tmp
            [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/shims/0.20S/target/warehouse
            [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/shims/0.20S/target/tmp/conf
             [copy] Copying 4 files to /data/hive-ptest/working/apache-svn-trunk-source/shims/0.20S/target/tmp/conf
        [INFO] Executed tasks
        [INFO] 
        [INFO] --- maven-compiler-plugin:3.1:testCompile (default-testCompile) @ hive-shims-0.20S ---
        [INFO] No sources to compile
        [INFO] 
        [INFO] --- maven-surefire-plugin:2.16:test (default-test) @ hive-shims-0.20S ---
        [INFO] Tests are skipped.
        [INFO] 
        [INFO] --- maven-jar-plugin:2.2:jar (default-jar) @ hive-shims-0.20S ---
        [INFO] Building jar: /data/hive-ptest/working/apache-svn-trunk-source/shims/0.20S/target/hive-shims-0.20S-0.13.0-SNAPSHOT.jar
        [INFO] 
        [INFO] --- maven-install-plugin:2.4:install (default-install) @ hive-shims-0.20S ---
        [INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/shims/0.20S/target/hive-shims-0.20S-0.13.0-SNAPSHOT.jar to /data/hive-ptest/working/maven/org/apache/hive/shims/hive-shims-0.20S/0.13.0-SNAPSHOT/hive-shims-0.20S-0.13.0-SNAPSHOT.jar
        [INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/shims/0.20S/pom.xml to /data/hive-ptest/working/maven/org/apache/hive/shims/hive-shims-0.20S/0.13.0-SNAPSHOT/hive-shims-0.20S-0.13.0-SNAPSHOT.pom
        [INFO]                                                                         
        [INFO] ------------------------------------------------------------------------
        [INFO] Building Hive Shims 0.23 0.13.0-SNAPSHOT
        [INFO] ------------------------------------------------------------------------
        [INFO] 
        [INFO] --- maven-clean-plugin:2.5:clean (default-clean) @ hive-shims-0.23 ---
        [INFO] Deleting /data/hive-ptest/working/apache-svn-trunk-source/shims/0.23 (includes = [datanucleus.log, derby.log], excludes = [])
        [INFO] 
        [INFO] --- maven-resources-plugin:2.5:resources (default-resources) @ hive-shims-0.23 ---
        [debug] execute contextualize
        [INFO] Using 'UTF-8' encoding to copy filtered resources.
        [INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-trunk-source/shims/0.23/src/main/resources
        [INFO] 
        [INFO] --- maven-antrun-plugin:1.7:run (define-classpath) @ hive-shims-0.23 ---
        [INFO] Executing tasks
        
        main:
        [INFO] Executed tasks
        [INFO] 
        [INFO] --- maven-compiler-plugin:3.1:compile (default-compile) @ hive-shims-0.23 ---
        [INFO] Compiling 3 source files to /data/hive-ptest/working/apache-svn-trunk-source/shims/0.23/target/classes
        [WARNING] Note: /data/hive-ptest/working/apache-svn-trunk-source/shims/0.23/src/main/java/org/apache/hadoop/hive/shims/Hadoop23Shims.java uses or overrides a deprecated API.
        [WARNING] Note: Recompile with -Xlint:deprecation for details.
        [INFO] 
        [INFO] --- maven-resources-plugin:2.5:testResources (default-testResources) @ hive-shims-0.23 ---
        [debug] execute contextualize
        [INFO] Using 'UTF-8' encoding to copy filtered resources.
        [INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-trunk-source/shims/0.23/src/test/resources
        [INFO] 
        [INFO] --- maven-antrun-plugin:1.7:run (setup-test-dirs) @ hive-shims-0.23 ---
        [INFO] Executing tasks
        
        main:
            [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/shims/0.23/target/tmp
            [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/shims/0.23/target/warehouse
            [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/shims/0.23/target/tmp/conf
             [copy] Copying 4 files to /data/hive-ptest/working/apache-svn-trunk-source/shims/0.23/target/tmp/conf
        [INFO] Executed tasks
        [INFO] 
        [INFO] --- maven-compiler-plugin:3.1:testCompile (default-testCompile) @ hive-shims-0.23 ---
        [INFO] No sources to compile
        [INFO] 
        [INFO] --- maven-surefire-plugin:2.16:test (default-test) @ hive-shims-0.23 ---
        [INFO] Tests are skipped.
        [INFO] 
        [INFO] --- maven-jar-plugin:2.2:jar (default-jar) @ hive-shims-0.23 ---
        [INFO] Building jar: /data/hive-ptest/working/apache-svn-trunk-source/shims/0.23/target/hive-shims-0.23-0.13.0-SNAPSHOT.jar
        [INFO] 
        [INFO] --- maven-install-plugin:2.4:install (default-install) @ hive-shims-0.23 ---
        [INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/shims/0.23/target/hive-shims-0.23-0.13.0-SNAPSHOT.jar to /data/hive-ptest/working/maven/org/apache/hive/shims/hive-shims-0.23/0.13.0-SNAPSHOT/hive-shims-0.23-0.13.0-SNAPSHOT.jar
        [INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/shims/0.23/pom.xml to /data/hive-ptest/working/maven/org/apache/hive/shims/hive-shims-0.23/0.13.0-SNAPSHOT/hive-shims-0.23-0.13.0-SNAPSHOT.pom
        [INFO]                                                                         
        [INFO] ------------------------------------------------------------------------
        [INFO] Building Hive Shims 0.13.0-SNAPSHOT
        [INFO] ------------------------------------------------------------------------
        [INFO] 
        [INFO] --- maven-clean-plugin:2.5:clean (default-clean) @ hive-shims ---
        [INFO] Deleting /data/hive-ptest/working/apache-svn-trunk-source/shims/assembly (includes = [datanucleus.log, derby.log], excludes = [])
        [INFO] 
        [INFO] --- maven-resources-plugin:2.5:resources (default-resources) @ hive-shims ---
        [debug] execute contextualize
        [INFO] Using 'UTF-8' encoding to copy filtered resources.
        [INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-trunk-source/shims/assembly/src/main/resources
        [INFO] 
        [INFO] --- maven-antrun-plugin:1.7:run (define-classpath) @ hive-shims ---
        [INFO] Executing tasks
        
        main:
        [INFO] Executed tasks
        [INFO] 
        [INFO] --- maven-compiler-plugin:3.1:compile (default-compile) @ hive-shims ---
        [INFO] No sources to compile
        [INFO] 
        [INFO] --- maven-resources-plugin:2.5:testResources (default-testResources) @ hive-shims ---
        [debug] execute contextualize
        [INFO] Using 'UTF-8' encoding to copy filtered resources.
        [INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-trunk-source/shims/assembly/src/test/resources
        [INFO] 
        [INFO] --- maven-antrun-plugin:1.7:run (setup-test-dirs) @ hive-shims ---
        [INFO] Executing tasks
        
        main:
            [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/shims/assembly/target/tmp
            [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/shims/assembly/target/warehouse
            [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/shims/assembly/target/tmp/conf
             [copy] Copying 4 files to /data/hive-ptest/working/apache-svn-trunk-source/shims/assembly/target/tmp/conf
        [INFO] Executed tasks
        [INFO] 
        [INFO] --- maven-compiler-plugin:3.1:testCompile (default-testCompile) @ hive-shims ---
        [INFO] No sources to compile
        [INFO] 
        [INFO] --- maven-surefire-plugin:2.16:test (default-test) @ hive-shims ---
        [INFO] Tests are skipped.
        [INFO] 
        [INFO] --- maven-jar-plugin:2.2:jar (default-jar) @ hive-shims ---
        [WARNING] JAR will be empty - no content was marked for inclusion!
        [INFO] Building jar: /data/hive-ptest/working/apache-svn-trunk-source/shims/assembly/target/hive-shims-0.13.0-SNAPSHOT.jar
        [INFO] 
        [INFO] --- maven-assembly-plugin:2.3:single (uberjar) @ hive-shims ---
        [INFO] Reading assembly descriptor: src/assemble/uberjar.xml
        [WARNING] Artifact: org.apache.hive:hive-shims:jar:0.13.0-SNAPSHOT references the same file as the assembly destination file. Moving it to a temporary location for inclusion.
        [INFO] META-INF/MANIFEST.MF already added, skipping
        [INFO] META-INF/MANIFEST.MF already added, skipping
        [INFO] META-INF/MANIFEST.MF already added, skipping
        [INFO] META-INF/MANIFEST.MF already added, skipping
        [INFO] META-INF/MANIFEST.MF already added, skipping
        [INFO] Building jar: /data/hive-ptest/working/apache-svn-trunk-source/shims/assembly/target/hive-shims-0.13.0-SNAPSHOT.jar
        [INFO] META-INF/MANIFEST.MF already added, skipping
        [INFO] META-INF/MANIFEST.MF already added, skipping
        [INFO] META-INF/MANIFEST.MF already added, skipping
        [INFO] META-INF/MANIFEST.MF already added, skipping
        [INFO] META-INF/MANIFEST.MF already added, skipping
        [WARNING] Configuration options: 'appendAssemblyId' is set to false, and 'classifier' is missing.
        Instead of attaching the assembly file: /data/hive-ptest/working/apache-svn-trunk-source/shims/assembly/target/hive-shims-0.13.0-SNAPSHOT.jar, it will become the file for main project artifact.
        NOTE: If multiple descriptors or descriptor-formats are provided for this project, the value of this file will be non-deterministic!
        [WARNING] Replacing pre-existing project main-artifact file: /data/hive-ptest/working/apache-svn-trunk-source/shims/assembly/target/archive-tmp/hive-shims-0.13.0-SNAPSHOT.jar
        with assembly file: /data/hive-ptest/working/apache-svn-trunk-source/shims/assembly/target/hive-shims-0.13.0-SNAPSHOT.jar
        [INFO] 
        [INFO] --- maven-install-plugin:2.4:install (default-install) @ hive-shims ---
        [INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/shims/assembly/target/hive-shims-0.13.0-SNAPSHOT.jar to /data/hive-ptest/working/maven/org/apache/hive/hive-shims/0.13.0-SNAPSHOT/hive-shims-0.13.0-SNAPSHOT.jar
        [INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/shims/assembly/pom.xml to /data/hive-ptest/working/maven/org/apache/hive/hive-shims/0.13.0-SNAPSHOT/hive-shims-0.13.0-SNAPSHOT.pom
        [INFO]                                                                         
        [INFO] ------------------------------------------------------------------------
        [INFO] Building Hive Common 0.13.0-SNAPSHOT
        [INFO] ------------------------------------------------------------------------
        [INFO] 
        [INFO] --- maven-clean-plugin:2.5:clean (default-clean) @ hive-common ---
        [INFO] Deleting /data/hive-ptest/working/apache-svn-trunk-source/common (includes = [datanucleus.log, derby.log], excludes = [])
        [INFO] 
        [INFO] --- maven-antrun-plugin:1.7:run (generate-version-annotation) @ hive-common ---
        [INFO] Executing tasks
        
        main:
        [INFO] Executed tasks
        [INFO] 
        [INFO] --- build-helper-maven-plugin:1.8:add-source (add-source) @ hive-common ---
        [INFO] Source directory: /data/hive-ptest/working/apache-svn-trunk-source/common/src/gen added.
        [INFO] 
        [INFO] --- maven-resources-plugin:2.5:resources (default-resources) @ hive-common ---
        [debug] execute contextualize
        [INFO] Using 'UTF-8' encoding to copy filtered resources.
        [INFO] Copying 1 resource
        [INFO] 
        [INFO] --- maven-antrun-plugin:1.7:run (define-classpath) @ hive-common ---
        [INFO] Executing tasks
        
        main:
        [INFO] Executed tasks
        [INFO] 
        [INFO] --- maven-compiler-plugin:3.1:compile (default-compile) @ hive-common ---
        [INFO] Compiling 31 source files to /data/hive-ptest/working/apache-svn-trunk-source/common/target/classes
        [WARNING] Note: /data/hive-ptest/working/apache-svn-trunk-source/common/src/java/org/apache/hadoop/hive/common/ObjectPair.java uses unchecked or unsafe operations.
        [WARNING] Note: Recompile with -Xlint:unchecked for details.
        [INFO] 
        [INFO] --- maven-resources-plugin:2.5:testResources (default-testResources) @ hive-common ---
        [debug] execute contextualize
        [INFO] Using 'UTF-8' encoding to copy filtered resources.
        [INFO] Copying 4 resources
        [INFO] 
        [INFO] --- maven-antrun-plugin:1.7:run (setup-test-dirs) @ hive-common ---
        [INFO] Executing tasks
        
        main:
            [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/common/target/tmp
            [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/common/target/warehouse
            [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/common/target/tmp/conf
             [copy] Copying 4 files to /data/hive-ptest/working/apache-svn-trunk-source/common/target/tmp/conf
        [INFO] Executed tasks
        [INFO] 
        [INFO] --- maven-compiler-plugin:3.1:testCompile (default-testCompile) @ hive-common ---
        [INFO] Compiling 8 source files to /data/hive-ptest/working/apache-svn-trunk-source/common/target/test-classes
        [INFO] 
        [INFO] --- maven-surefire-plugin:2.16:test (default-test) @ hive-common ---
        [INFO] Tests are skipped.
        [INFO] 
        [INFO] --- maven-jar-plugin:2.2:jar (default-jar) @ hive-common ---
        [INFO] Building jar: /data/hive-ptest/working/apache-svn-trunk-source/common/target/hive-common-0.13.0-SNAPSHOT.jar
        [INFO] 
        [INFO] --- maven-install-plugin:2.4:install (default-install) @ hive-common ---
        [INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/common/target/hive-common-0.13.0-SNAPSHOT.jar to /data/hive-ptest/working/maven/org/apache/hive/hive-common/0.13.0-SNAPSHOT/hive-common-0.13.0-SNAPSHOT.jar
        [INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/common/pom.xml to /data/hive-ptest/working/maven/org/apache/hive/hive-common/0.13.0-SNAPSHOT/hive-common-0.13.0-SNAPSHOT.pom
        [INFO]                                                                         
        [INFO] ------------------------------------------------------------------------
        [INFO] Building Hive Serde 0.13.0-SNAPSHOT
        [INFO] ------------------------------------------------------------------------
        [INFO] 
        [INFO] --- maven-clean-plugin:2.5:clean (default-clean) @ hive-serde ---
        [INFO] Deleting /data/hive-ptest/working/apache-svn-trunk-source/serde (includes = [datanucleus.log, derby.log], excludes = [])
        [INFO] 
        [INFO] --- build-helper-maven-plugin:1.8:add-source (add-source) @ hive-serde ---
        [INFO] Source directory: /data/hive-ptest/working/apache-svn-trunk-source/serde/src/gen/protobuf/gen-java added.
        [INFO] Source directory: /data/hive-ptest/working/apache-svn-trunk-source/serde/src/gen/thrift/gen-javabean added.
        [INFO] 
        [INFO] --- maven-resources-plugin:2.5:resources (default-resources) @ hive-serde ---
        [debug] execute contextualize
        [INFO] Using 'UTF-8' encoding to copy filtered resources.
        [INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-trunk-source/serde/src/main/resources
        [INFO] 
        [INFO] --- maven-antrun-plugin:1.7:run (define-classpath) @ hive-serde ---
        [INFO] Executing tasks
        
        main:
        [INFO] Executed tasks
        [INFO] 
        [INFO] --- maven-compiler-plugin:3.1:compile (default-compile) @ hive-serde ---
        [INFO] Compiling 351 source files to /data/hive-ptest/working/apache-svn-trunk-source/serde/target/classes
        [WARNING] Note: Some input files use or override a deprecated API.
        [WARNING] Note: Recompile with -Xlint:deprecation for details.
        [WARNING] Note: Some input files use unchecked or unsafe operations.
        [WARNING] Note: Recompile with -Xlint:unchecked for details.
        [INFO] 
        [INFO] --- maven-resources-plugin:2.5:testResources (default-testResources) @ hive-serde ---
        [debug] execute contextualize
        [INFO] Using 'UTF-8' encoding to copy filtered resources.
        [INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-trunk-source/serde/src/test/resources
        [INFO] 
        [INFO] --- maven-antrun-plugin:1.7:run (setup-test-dirs) @ hive-serde ---
        [INFO] Executing tasks
        
        main:
            [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/serde/target/tmp
            [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/serde/target/warehouse
            [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/serde/target/tmp/conf
             [copy] Copying 4 files to /data/hive-ptest/working/apache-svn-trunk-source/serde/target/tmp/conf
        [INFO] Executed tasks
        [INFO] 
        [INFO] --- maven-compiler-plugin:3.1:testCompile (default-testCompile) @ hive-serde ---
        [INFO] Compiling 41 source files to /data/hive-ptest/working/apache-svn-trunk-source/serde/target/test-classes
        [WARNING] Note: Some input files use or override a deprecated API.
        [WARNING] Note: Recompile with -Xlint:deprecation for details.
        [WARNING] Note: Some input files use unchecked or unsafe operations.
        [WARNING] Note: Recompile with -Xlint:unchecked for details.
        [INFO] 
        [INFO] --- maven-surefire-plugin:2.16:test (default-test) @ hive-serde ---
        [INFO] Tests are skipped.
        [INFO] 
        [INFO] --- maven-jar-plugin:2.2:jar (default-jar) @ hive-serde ---
        [INFO] Building jar: /data/hive-ptest/working/apache-svn-trunk-source/serde/target/hive-serde-0.13.0-SNAPSHOT.jar
        [INFO] 
        [INFO] --- maven-install-plugin:2.4:install (default-install) @ hive-serde ---
        [INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/serde/target/hive-serde-0.13.0-SNAPSHOT.jar to /data/hive-ptest/working/maven/org/apache/hive/hive-serde/0.13.0-SNAPSHOT/hive-serde-0.13.0-SNAPSHOT.jar
        [INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/serde/pom.xml to /data/hive-ptest/working/maven/org/apache/hive/hive-serde/0.13.0-SNAPSHOT/hive-serde-0.13.0-SNAPSHOT.pom
        [INFO]                                                                         
        [INFO] ------------------------------------------------------------------------
        [INFO] Building Hive Metastore 0.13.0-SNAPSHOT
        [INFO] ------------------------------------------------------------------------
        [INFO] 
        [INFO] --- maven-clean-plugin:2.5:clean (default-clean) @ hive-metastore ---
        [INFO] Deleting /data/hive-ptest/working/apache-svn-trunk-source/metastore (includes = [datanucleus.log, derby.log], excludes = [])
        [INFO] 
        [INFO] --- build-helper-maven-plugin:1.8:add-source (add-source) @ hive-metastore ---
        [INFO] Source directory: /data/hive-ptest/working/apache-svn-trunk-source/metastore/src/model added.
        [INFO] Source directory: /data/hive-ptest/working/apache-svn-trunk-source/metastore/src/gen/thrift/gen-javabean added.
        [INFO] 
        [INFO] --- antlr3-maven-plugin:3.4:antlr (default) @ hive-metastore ---
        [INFO] ANTLR: Processing source directory /data/hive-ptest/working/apache-svn-trunk-source/metastore/src/java
        ANTLR Parser Generator  Version 3.4
        org/apache/hadoop/hive/metastore/parser/Filter.g
        [INFO] 
        [INFO] --- maven-resources-plugin:2.5:resources (default-resources) @ hive-metastore ---
        [debug] execute contextualize
        [INFO] Using 'UTF-8' encoding to copy filtered resources.
        [INFO] Copying 1 resource
        [INFO] 
        [INFO] --- maven-antrun-plugin:1.7:run (define-classpath) @ hive-metastore ---
        [INFO] Executing tasks
        
        main:
        [INFO] Executed tasks
        [INFO] 
        [INFO] --- maven-compiler-plugin:3.1:compile (default-compile) @ hive-metastore ---
        [INFO] Compiling 132 source files to /data/hive-ptest/working/apache-svn-trunk-source/metastore/target/classes
        [WARNING] Note: Some input files use or override a deprecated API.
        [WARNING] Note: Recompile with -Xlint:deprecation for details.
        [WARNING] Note: Some input files use unchecked or unsafe operations.
        [WARNING] Note: Recompile with -Xlint:unchecked for details.
        [INFO] 
        [INFO] --- datanucleus-maven-plugin:3.3.0-release:enhance (default) @ hive-metastore ---
        [INFO] DataNucleus Enhancer (version 3.2.2) for API "JDO" using JRE "1.6"
        DataNucleus Enhancer : Classpath
        >>  /data/hive-ptest/working/maven/org/datanucleus/datanucleus-maven-plugin/3.3.0-release/datanucleus-maven-plugin-3.3.0-release.jar
        >>  /data/hive-ptest/working/maven/org/datanucleus/datanucleus-core/3.2.2/datanucleus-core-3.2.2.jar
        >>  /data/hive-ptest/working/maven/org/codehaus/plexus/plexus-utils/3.0.8/plexus-utils-3.0.8.jar
        >>  /data/hive-ptest/working/maven/org/codehaus/plexus/plexus-component-annotations/1.5.5/plexus-component-annotations-1.5.5.jar
        >>  /data/hive-ptest/working/maven/org/sonatype/sisu/sisu-inject-bean/2.3.0/sisu-inject-bean-2.3.0.jar
        >>  /data/hive-ptest/working/maven/org/sonatype/sisu/sisu-guice/3.1.0/sisu-guice-3.1.0-no_aop.jar
        >>  /data/hive-ptest/working/maven/org/sonatype/sisu/sisu-guava/0.9.9/sisu-guava-0.9.9.jar
        >>  /data/hive-ptest/working/maven/org/apache/xbean/xbean-reflect/3.4/xbean-reflect-3.4.jar
        >>  /data/hive-ptest/working/maven/log4j/log4j/1.2.12/log4j-1.2.12.jar
        >>  /data/hive-ptest/working/maven/commons-logging/commons-logging-api/1.1/commons-logging-api-1.1.jar
        >>  /data/hive-ptest/working/maven/com/google/collections/google-collections/1.0/google-collections-1.0.jar
        >>  /data/hive-ptest/working/maven/junit/junit/3.8.2/junit-3.8.2.jar
        >>  /data/hive-ptest/working/apache-svn-trunk-source/metastore/target/classes
        >>  /data/hive-ptest/working/apache-svn-trunk-source/serde/target/hive-serde-0.13.0-SNAPSHOT.jar
        >>  /data/hive-ptest/working/apache-svn-trunk-source/common/target/hive-common-0.13.0-SNAPSHOT.jar
        >>  /data/hive-ptest/working/maven/org/apache/commons/commons-compress/1.4.1/commons-compress-1.4.1.jar
        >>  /data/hive-ptest/working/maven/org/tukaani/xz/1.0/xz-1.0.jar
        >>  /data/hive-ptest/working/maven/commons-codec/commons-codec/1.4/commons-codec-1.4.jar
        >>  /data/hive-ptest/working/maven/org/apache/avro/avro/1.7.5/avro-1.7.5.jar
        >>  /data/hive-ptest/working/maven/org/codehaus/jackson/jackson-core-asl/1.9.2/jackson-core-asl-1.9.2.jar
        >>  /data/hive-ptest/working/maven/com/thoughtworks/paranamer/paranamer/2.3/paranamer-2.3.jar
        >>  /data/hive-ptest/working/maven/org/xerial/snappy/snappy-java/1.0.5/snappy-java-1.0.5.jar
        >>  /data/hive-ptest/working/apache-svn-trunk-source/shims/assembly/target/hive-shims-0.13.0-SNAPSHOT.jar
        >>  /data/hive-ptest/working/maven/com/google/guava/guava/11.0.2/guava-11.0.2.jar
        >>  /data/hive-ptest/working/maven/com/google/code/findbugs/jsr305/1.3.9/jsr305-1.3.9.jar
        >>  /data/hive-ptest/working/maven/commons-cli/commons-cli/1.2/commons-cli-1.2.jar
        >>  /data/hive-ptest/working/maven/commons-lang/commons-lang/2.4/commons-lang-2.4.jar
        >>  /data/hive-ptest/working/maven/commons-logging/commons-logging/1.1.3/commons-logging-1.1.3.jar
        >>  /data/hive-ptest/working/maven/org/apache/derby/derby/10.4.2.0/derby-10.4.2.0.jar
        >>  /data/hive-ptest/working/maven/org/datanucleus/datanucleus-api-jdo/3.2.1/datanucleus-api-jdo-3.2.1.jar
        >>  /data/hive-ptest/working/maven/org/datanucleus/datanucleus-rdbms/3.2.1/datanucleus-rdbms-3.2.1.jar
        >>  /data/hive-ptest/working/maven/javax/jdo/jdo-api/3.0.1/jdo-api-3.0.1.jar
        >>  /data/hive-ptest/working/maven/javax/transaction/jta/1.1/jta-1.1.jar
        >>  /data/hive-ptest/working/maven/org/antlr/antlr-runtime/3.4/antlr-runtime-3.4.jar
        >>  /data/hive-ptest/working/maven/org/antlr/stringtemplate/3.2.1/stringtemplate-3.2.1.jar
        >>  /data/hive-ptest/working/maven/antlr/antlr/2.7.7/antlr-2.7.7.jar
        >>  /data/hive-ptest/working/maven/org/apache/thrift/libfb303/0.9.0/libfb303-0.9.0.jar
        >>  /data/hive-ptest/working/maven/org/apache/thrift/libthrift/0.9.0/libthrift-0.9.0.jar
        >>  /data/hive-ptest/working/maven/org/apache/httpcomponents/httpclient/4.2.5/httpclient-4.2.5.jar
        >>  /data/hive-ptest/working/maven/org/apache/httpcomponents/httpcore/4.2.4/httpcore-4.2.4.jar
        >>  /data/hive-ptest/working/maven/org/apache/hadoop/hadoop-core/1.2.1/hadoop-core-1.2.1.jar
        >>  /data/hive-ptest/working/maven/xmlenc/xmlenc/0.52/xmlenc-0.52.jar
        >>  /data/hive-ptest/working/maven/com/sun/jersey/jersey-core/1.14/jersey-core-1.14.jar
        >>  /data/hive-ptest/working/maven/com/sun/jersey/jersey-json/1.14/jersey-json-1.14.jar
        >>  /data/hive-ptest/working/maven/org/codehaus/jettison/jettison/1.1/jettison-1.1.jar
        >>  /data/hive-ptest/working/maven/stax/stax-api/1.0.1/stax-api-1.0.1.jar
        >>  /data/hive-ptest/working/maven/com/sun/xml/bind/jaxb-impl/2.2.3-1/jaxb-impl-2.2.3-1.jar
        >>  /data/hive-ptest/working/maven/javax/xml/bind/jaxb-api/2.2.2/jaxb-api-2.2.2.jar
        >>  /data/hive-ptest/working/maven/javax/xml/stream/stax-api/1.0-2/stax-api-1.0-2.jar
        >>  /data/hive-ptest/working/maven/javax/activation/activation/1.1/activation-1.1.jar
        >>  /data/hive-ptest/working/maven/org/codehaus/jackson/jackson-jaxrs/1.9.2/jackson-jaxrs-1.9.2.jar
        >>  /data/hive-ptest/working/maven/org/codehaus/jackson/jackson-xc/1.9.2/jackson-xc-1.9.2.jar
        >>  /data/hive-ptest/working/maven/com/sun/jersey/jersey-server/1.14/jersey-server-1.14.jar
        >>  /data/hive-ptest/working/maven/asm/asm/3.1/asm-3.1.jar
        >>  /data/hive-ptest/working/maven/commons-io/commons-io/2.4/commons-io-2.4.jar
        >>  /data/hive-ptest/working/maven/commons-httpclient/commons-httpclient/3.0.1/commons-httpclient-3.0.1.jar
        >>  /data/hive-ptest/working/maven/org/apache/commons/commons-math/2.1/commons-math-2.1.jar
        >>  /data/hive-ptest/working/maven/commons-configuration/commons-configuration/1.6/commons-configuration-1.6.jar
        >>  /data/hive-ptest/working/maven/commons-collections/commons-collections/3.2.1/commons-collections-3.2.1.jar
        >>  /data/hive-ptest/working/maven/commons-digester/commons-digester/1.8/commons-digester-1.8.jar
        >>  /data/hive-ptest/working/maven/commons-beanutils/commons-beanutils/1.7.0/commons-beanutils-1.7.0.jar
        >>  /data/hive-ptest/working/maven/commons-beanutils/commons-beanutils-core/1.8.0/commons-beanutils-core-1.8.0.jar
        >>  /data/hive-ptest/working/maven/commons-net/commons-net/1.4.1/commons-net-1.4.1.jar
        >>  /data/hive-ptest/working/maven/org/mortbay/jetty/jetty/6.1.26/jetty-6.1.26.jar
        >>  /data/hive-ptest/working/maven/org/mortbay/jetty/servlet-api/2.5-20081211/servlet-api-2.5-20081211.jar
        >>  /data/hive-ptest/working/maven/org/mortbay/jetty/jetty-util/6.1.26/jetty-util-6.1.26.jar
        >>  /data/hive-ptest/working/maven/tomcat/jasper-runtime/5.5.12/jasper-runtime-5.5.12.jar
        >>  /data/hive-ptest/working/maven/tomcat/jasper-compiler/5.5.12/jasper-compiler-5.5.12.jar
        >>  /data/hive-ptest/working/maven/org/mortbay/jetty/jsp-api-2.1/6.1.14/jsp-api-2.1-6.1.14.jar
        >>  /data/hive-ptest/working/maven/org/mortbay/jetty/servlet-api-2.5/6.1.14/servlet-api-2.5-6.1.14.jar
        >>  /data/hive-ptest/working/maven/org/mortbay/jetty/jsp-2.1/6.1.14/jsp-2.1-6.1.14.jar
        >>  /data/hive-ptest/working/maven/ant/ant/1.6.5/ant-1.6.5.jar
        >>  /data/hive-ptest/working/maven/commons-el/commons-el/1.0/commons-el-1.0.jar
        >>  /data/hive-ptest/working/maven/net/java/dev/jets3t/jets3t/0.6.1/jets3t-0.6.1.jar
        >>  /data/hive-ptest/working/maven/hsqldb/hsqldb/1.8.0.10/hsqldb-1.8.0.10.jar
        >>  /data/hive-ptest/working/maven/oro/oro/2.0.8/oro-2.0.8.jar
        >>  /data/hive-ptest/working/maven/org/eclipse/jdt/core/3.1.1/core-3.1.1.jar
        >>  /data/hive-ptest/working/maven/org/codehaus/jackson/jackson-mapper-asl/1.9.2/jackson-mapper-asl-1.9.2.jar
        >>  /data/hive-ptest/working/maven/org/slf4j/slf4j-api/1.7.5/slf4j-api-1.7.5.jar
        >>  /data/hive-ptest/working/maven/org/slf4j/slf4j-log4j12/1.7.5/slf4j-log4j12-1.7.5.jar
        >>  /data/hive-ptest/working/maven/log4j/log4j/1.2.16/log4j-1.2.16.jar
        ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MDatabase
        ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MFieldSchema
        ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MType
        ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MTable
        ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MSerDeInfo
        ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MOrder
        ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MColumnDescriptor
        ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MStringList
        ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MStorageDescriptor
        ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MPartition
        ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MIndex
        ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MRole
        ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MRoleMap
        ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MGlobalPrivilege
        ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MDBPrivilege
        ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MTablePrivilege
        ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MPartitionPrivilege
        ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MTableColumnPrivilege
        ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MPartitionColumnPrivilege
        ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MPartitionEvent
        ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MMasterKey
        ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MDelegationToken
        ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MTableColumnStatistics
        ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MPartitionColumnStatistics
        ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MVersionTable
        DataNucleus Enhancer completed with success for 25 classes. Timings : input=589 ms, enhance=931 ms, total=1520 ms. Consult the log for full details
        
        [INFO] 
        [INFO] --- maven-resources-plugin:2.5:testResources (default-testResources) @ hive-metastore ---
        [debug] execute contextualize
        [INFO] Using 'UTF-8' encoding to copy filtered resources.
        [INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-trunk-source/metastore/src/test/resources
        [INFO] 
        [INFO] --- maven-antrun-plugin:1.7:run (setup-test-dirs) @ hive-metastore ---
        [INFO] Executing tasks
        
        main:
            [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/metastore/target/tmp
            [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/metastore/target/warehouse
            [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/metastore/target/tmp/conf
             [copy] Copying 4 files to /data/hive-ptest/working/apache-svn-trunk-source/metastore/target/tmp/conf
        [INFO] Executed tasks
        [INFO] 
        [INFO] --- maven-compiler-plugin:3.1:testCompile (default-testCompile) @ hive-metastore ---
        [INFO] Compiling 10 source files to /data/hive-ptest/working/apache-svn-trunk-source/metastore/target/test-classes
        [INFO] 
        [INFO] --- maven-surefire-plugin:2.16:test (default-test) @ hive-metastore ---
        [INFO] Tests are skipped.
        [INFO] 
        [INFO] --- maven-jar-plugin:2.2:jar (default-jar) @ hive-metastore ---
        [INFO] Building jar: /data/hive-ptest/working/apache-svn-trunk-source/metastore/target/hive-metastore-0.13.0-SNAPSHOT.jar
        [INFO] 
        [INFO] --- maven-jar-plugin:2.2:test-jar (default) @ hive-metastore ---
        [INFO] Building jar: /data/hive-ptest/working/apache-svn-trunk-source/metastore/target/hive-metastore-0.13.0-SNAPSHOT-tests.jar
        [INFO] 
        [INFO] --- maven-install-plugin:2.4:install (default-install) @ hive-metastore ---
        [INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/metastore/target/hive-metastore-0.13.0-SNAPSHOT.jar to /data/hive-ptest/working/maven/org/apache/hive/hive-metastore/0.13.0-SNAPSHOT/hive-metastore-0.13.0-SNAPSHOT.jar
        [INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/metastore/pom.xml to /data/hive-ptest/working/maven/org/apache/hive/hive-metastore/0.13.0-SNAPSHOT/hive-metastore-0.13.0-SNAPSHOT.pom
        [INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/metastore/target/hive-metastore-0.13.0-SNAPSHOT-tests.jar to /data/hive-ptest/working/maven/org/apache/hive/hive-metastore/0.13.0-SNAPSHOT/hive-metastore-0.13.0-SNAPSHOT-tests.jar
        [INFO]                                                                         
        [INFO] ------------------------------------------------------------------------
        [INFO] Building Hive Query Language 0.13.0-SNAPSHOT
        [INFO] ------------------------------------------------------------------------
        [INFO] 
        [INFO] --- maven-clean-plugin:2.5:clean (default-clean) @ hive-exec ---
        [INFO] Deleting /data/hive-ptest/working/apache-svn-trunk-source/ql (includes = [datanucleus.log, derby.log], excludes = [])
        [INFO] 
        [INFO] --- maven-antrun-plugin:1.7:run (generate-sources) @ hive-exec ---
        [INFO] Executing tasks
        
        main:
            [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/ql/target/generated-sources/java/org/apache/hadoop/hive/ql/exec/vector/expressions/gen
            [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/ql/target/generated-sources/java/org/apache/hadoop/hive/ql/exec/vector/expressions/aggregates/gen
            [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/ql/target/generated-test-sources/java/org/apache/hadoop/hive/ql/exec/vector/expressions/gen
        Generating vector expression code
        Generating vector expression test code
        [INFO] Executed tasks
        [INFO] 
        [INFO] --- build-helper-maven-plugin:1.8:add-source (add-source) @ hive-exec ---
        [INFO] Source directory: /data/hive-ptest/working/apache-svn-trunk-source/ql/src/gen/protobuf/gen-java added.
        [INFO] Source directory: /data/hive-ptest/working/apache-svn-trunk-source/ql/src/gen/thrift/gen-javabean added.
        [INFO] Source directory: /data/hive-ptest/working/apache-svn-trunk-source/ql/target/generated-sources/java added.
        [INFO] 
        [INFO] --- antlr3-maven-plugin:3.4:antlr (default) @ hive-exec ---
        [INFO] ANTLR: Processing source directory /data/hive-ptest/working/apache-svn-trunk-source/ql/src/java
        ANTLR Parser Generator  Version 3.4
        org/apache/hadoop/hive/ql/parse/HiveLexer.g
        org/apache/hadoop/hive/ql/parse/HiveParser.g
        warning(200): org/apache/hadoop/hive/ql/parse/HiveParser.g:874:5: 
        Decision can match input such as "Identifier KW_RENAME KW_TO" using multiple alternatives: 1, 10
        
        As a result, alternative(s) 10 were disabled for that input
        warning(200): org/apache/hadoop/hive/ql/parse/HiveParser.g:1179:5: 
        Decision can match input such as "KW_SEQUENCEFILE" using multiple alternatives: 1, 6
        
        As a result, alternative(s) 6 were disabled for that input
        warning(200): org/apache/hadoop/hive/ql/parse/HiveParser.g:1179:5: 
        Decision can match input such as "KW_ORCFILE" using multiple alternatives: 4, 6
        
        As a result, alternative(s) 6 were disabled for that input
        warning(200): org/apache/hadoop/hive/ql/parse/HiveParser.g:1179:5: 
        Decision can match input such as "KW_TEXTFILE" using multiple alternatives: 2, 6
        
        As a result, alternative(s) 6 were disabled for that input
        warning(200): org/apache/hadoop/hive/ql/parse/HiveParser.g:1179:5: 
        Decision can match input such as "KW_RCFILE" using multiple alternatives: 3, 6
        
        As a result, alternative(s) 6 were disabled for that input
        warning(200): org/apache/hadoop/hive/ql/parse/HiveParser.g:1192:23: 
        Decision can match input such as "KW_KEY_TYPE" using multiple alternatives: 2, 4
        
        As a result, alternative(s) 4 were disabled for that input
        warning(200): org/apache/hadoop/hive/ql/parse/HiveParser.g:1192:23: 
        Decision can match input such as "KW_ELEM_TYPE" using multiple alternatives: 1, 4
        
        As a result, alternative(s) 4 were disabled for that input
        warning(200): org/apache/hadoop/hive/ql/parse/HiveParser.g:1192:23: 
        Decision can match input such as "KW_VALUE_TYPE" using multiple alternatives: 3, 4
        
        As a result, alternative(s) 4 were disabled for that input
        warning(200): org/apache/hadoop/hive/ql/parse/HiveParser.g:1199:23: 
        Decision can match input such as "KW_KEY_TYPE" using multiple alternatives: 2, 4
        
        As a result, alternative(s) 4 were disabled for that input
        warning(200): org/apache/hadoop/hive/ql/parse/HiveParser.g:1199:23: 
        Decision can match input such as "KW_ELEM_TYPE" using multiple alternatives: 1, 4
        
        As a result, alternative(s) 4 were disabled for that input
        warning(200): org/apache/hadoop/hive/ql/parse/HiveParser.g:1199:23: 
        Decision can match input such as "KW_VALUE_TYPE" using multiple alternatives: 3, 4
        
        As a result, alternative(s) 4 were disabled for that input
        warning(200): org/apache/hadoop/hive/ql/parse/HiveParser.g:1217:29: 
        Decision can match input such as "KW_FORMATTED KW_PARTITION" using multiple alternatives: 1, 4
        
        As a result, alternative(s) 4 were disabled for that input
        warning(200): org/apache/hadoop/hive/ql/parse/HiveParser.g:1217:29: 
        Decision can match input such as "KW_PRETTY KW_PARTITION" using multiple alternatives: 3, 4
        
        As a result, alternative(s) 4 were disabled for that input
        warning(200): org/apache/hadoop/hive/ql/parse/HiveParser.g:1217:29: 
        Decision can match input such as "KW_PRETTY {KW_ADD..KW_AFTER, KW_ALTER..KW_ANALYZE, KW_ARCHIVE..KW_CASCADE, KW_CHANGE, KW_CLUSTER..KW_COLLECTION, KW_COLUMNS..KW_CREATE, KW_CUBE, KW_CURSOR..KW_DATA, KW_DATABASES..KW_DISABLE, KW_DISTRIBUTE..KW_ELEM_TYPE, KW_ENABLE, KW_ESCAPED, KW_EXCLUSIVE..KW_EXPORT, KW_EXTERNAL..KW_FLOAT, KW_FOR..KW_FORMATTED, KW_FULL, KW_FUNCTIONS..KW_GROUPING, KW_HOLD_DDLTIME..KW_IDXPROPERTIES, KW_IGNORE..KW_ITEMS, KW_KEYS..KW_LEFT, KW_LIKE..KW_LONG, KW_MAPJOIN..KW_MINUS, KW_MSCK..KW_NOSCAN, KW_NO_DROP..KW_OFFLINE, KW_OPTION, KW_ORCFILE..KW_OUTPUTFORMAT, KW_OVERWRITE, KW_PARTITIONED..KW_PLUS, KW_PRETTY..KW_RECORDWRITER, KW_REGEXP..KW_SCHEMAS, KW_SEMI..KW_TABLES, KW_TBLPROPERTIES..KW_TEXTFILE, KW_TIMESTAMP..KW_TOUCH, KW_TRIGGER..KW_UNARCHIVE, KW_UNDO..KW_UNIONTYPE, KW_UNLOCK..KW_VALUE_TYPE, KW_VIEW, KW_WHILE, KW_WITH}" using multiple alternatives: 3, 4
        
        As a result, alternative(s) 4 were disabled for that input
        warning(200): org/apache/hadoop/hive/ql/parse/HiveParser.g:1217:29: 
        Decision can match input such as "KW_PRETTY Identifier" using multiple alternatives: 3, 4
        
        As a result, alternative(s) 4 were disabled for that input
        warning(200): org/apache/hadoop/hive/ql/parse/HiveParser.g:1217:29: 
        Decision can match input such as "KW_FORMATTED Identifier" using multiple alternatives: 1, 4
        
        As a result, alternative(s) 4 were disabled for that input
        warning(200): org/apache/hadoop/hive/ql/parse/HiveParser.g:1217:29: 
        Decision can match input such as "KW_FORMATTED {KW_ADD..KW_AFTER, KW_ALTER..KW_ANALYZE, KW_ARCHIVE..KW_CASCADE, KW_CHANGE, KW_CLUSTER..KW_COLLECTION, KW_COLUMNS..KW_CREATE, KW_CUBE, KW_CURSOR..KW_DATA, KW_DATABASES..KW_DISABLE, KW_DISTRIBUTE..KW_ELEM_TYPE, KW_ENABLE, KW_ESCAPED, KW_EXCLUSIVE..KW_EXPORT, KW_EXTERNAL..KW_FLOAT, KW_FOR..KW_FORMATTED, KW_FULL, KW_FUNCTIONS..KW_GROUPING, KW_HOLD_DDLTIME..KW_IDXPROPERTIES, KW_IGNORE..KW_ITEMS, KW_KEYS..KW_LEFT, KW_LIKE..KW_LONG, KW_MAPJOIN..KW_MINUS, KW_MSCK..KW_NOSCAN, KW_NO_DROP..KW_OFFLINE, KW_OPTION, KW_ORCFILE..KW_OUTPUTFORMAT, KW_OVERWRITE, KW_PARTITIONED..KW_PLUS, KW_PRETTY..KW_RECORDWRITER, KW_REGEXP..KW_SCHEMAS, KW_SEMI..KW_TABLES, KW_TBLPROPERTIES..KW_TEXTFILE, KW_TIMESTAMP..KW_TOUCH, KW_TRIGGER..KW_UNARCHIVE, KW_UNDO..KW_UNIONTYPE, KW_UNLOCK..KW_VALUE_TYPE, KW_VIEW, KW_WHILE, KW_WITH}" using multiple alternatives: 1, 4
        
        As a result, alternative(s) 4 were disabled for that input
        warning(200): org/apache/hadoop/hive/ql/parse/HiveParser.g:1488:116: 
        Decision can match input such as "KW_STORED KW_AS KW_DIRECTORIES" using multiple alternatives: 1, 2
        
        As a result, alternative(s) 2 were disabled for that input
        warning(200): org/apache/hadoop/hive/ql/parse/HiveParser.g:1611:5: 
        Decision can match input such as "KW_STORED KW_AS KW_TEXTFILE" using multiple alternatives: 2, 7
        
        As a result, alternative(s) 7 were disabled for that input
        warning(200): org/apache/hadoop/hive/ql/parse/HiveParser.g:1611:5: 
        Decision can match input such as "KW_STORED KW_AS KW_INPUTFORMAT" using multiple alternatives: 5, 7
        
        As a result, alternative(s) 7 were disabled for that input
        warning(200): org/apache/hadoop/hive/ql/parse/HiveParser.g:1611:5: 
        Decision can match input such as "KW_STORED KW_AS KW_RCFILE" using multiple alternatives: 3, 7
        
        As a result, alternative(s) 7 were disabled for that input
        warning(200): org/apache/hadoop/hive/ql/parse/HiveParser.g:1611:5: 
        Decision can match input such as "KW_STORED KW_AS KW_ORCFILE" using multiple alternatives: 4, 7
        
        As a result, alternative(s) 7 were disabled for that input
        warning(200): org/apache/hadoop/hive/ql/parse/HiveParser.g:1611:5: 
        Decision can match input such as "KW_STORED KW_AS KW_SEQUENCEFILE" using multiple alternatives: 1, 7
        
        As a result, alternative(s) 7 were disabled for that input
        warning(200): SelectClauseParser.g:149:5: 
        Decision can match input such as "KW_NULL DOT Identifier" using multiple alternatives: 1, 2
        
        As a result, alternative(s) 2 were disabled for that input
        warning(200): SelectClauseParser.g:149:5: 
        Decision can match input such as "KW_NULL DOT {KW_ADD..KW_AFTER, KW_ALTER..KW_ANALYZE, KW_ARCHIVE..KW_CASCADE, KW_CHANGE, KW_CLUSTER..KW_COLLECTION, KW_COLUMNS..KW_CREATE, KW_CUBE, KW_CURSOR..KW_DATA, KW_DATABASES..KW_DISABLE, KW_DISTRIBUTE..KW_ELEM_TYPE, KW_ENABLE, KW_ESCAPED, KW_EXCLUSIVE..KW_EXPORT, KW_EXTERNAL..KW_FLOAT, KW_FOR..KW_FORMATTED, KW_FULL, KW_FUNCTIONS..KW_GROUPING, KW_HOLD_DDLTIME..KW_IDXPROPERTIES, KW_IGNORE..KW_ITEMS, KW_KEYS..KW_LEFT, KW_LIKE..KW_LONG, KW_MAPJOIN..KW_MINUS, KW_MSCK..KW_NOSCAN, KW_NO_DROP..KW_OFFLINE, KW_OPTION, KW_ORCFILE..KW_OUTPUTFORMAT, KW_OVERWRITE, KW_PARTITION..KW_PLUS, KW_PRETTY..KW_RECORDWRITER, KW_REGEXP..KW_SCHEMAS, KW_SEMI..KW_TABLES, KW_TBLPROPERTIES..KW_TEXTFILE, KW_TIMESTAMP..KW_TOUCH, KW_TRIGGER..KW_UNARCHIVE, KW_UNDO..KW_UNIONTYPE, KW_UNLOCK..KW_VALUE_TYPE, KW_VIEW, KW_WHILE, KW_WITH}" using multiple alternatives: 1, 2
        
        As a result, alternative(s) 2 were disabled for that input
        warning(200): FromClauseParser.g:147:2: 
        Decision can match input such as "KW_LATERAL KW_VIEW KW_OUTER" using multiple alternatives: 1, 2
        
        As a result, alternative(s) 2 were disabled for that input
        warning(200): FromClauseParser.g:199:25: 
        Decision can match input such as "LPAREN StringLiteral COMMA" using multiple alternatives: 1, 2
        
        As a result, alternative(s) 2 were disabled for that input
        warning(200): FromClauseParser.g:199:25: 
        Decision can match input such as "LPAREN StringLiteral RPAREN" using multiple alternatives: 1, 2
        
        As a result, alternative(s) 2 were disabled for that input
        warning(200): FromClauseParser.g:199:25: 
        Decision can match input such as "LPAREN StringLiteral EQUAL" using multiple alternatives: 1, 2
        
        As a result, alternative(s) 2 were disabled for that input
        warning(200): FromClauseParser.g:199:68: 
        Decision can match input such as "Identifier LPAREN BigintLiteral" using multiple alternatives: 1, 2
        
        As a result, alternative(s) 2 were disabled for that input
        warning(200): FromClauseParser.g:199:68: 
        Decision can match input such as "Identifier LPAREN StringLiteral" using multiple alternatives: 1, 2
        
        As a result, alternative(s) 2 were disabled for that input
        warning(200): FromClauseParser.g:199:68: 
        Decision can match input such as "Identifier LPAREN KW_EXISTS" using multiple alternatives: 1, 2
        
        As a result, alternative(s) 2 were disabled for that input
        warning(200): FromClauseParser.g:199:68: 
        Decision can match input such as "Identifier LPAREN KW_IF" using multiple alternatives: 1, 2
        
        As a result, alternative(s) 2 were disabled for that input
        warning(200): FromClauseParser.g:199:68: 
        Decision can match input such as "Identifier LPAREN SmallintLiteral" using multiple alternatives: 1, 2
        
        As a result, alternative(s) 2 were disabled for that input
        warning(200): FromClauseParser.g:199:68: 
        Decision can match input such as "Identifier LPAREN KW_UNIONTYPE" using multiple alternatives: 1, 2
        
        As a result, alternative(s) 2 were disabled for that input
        warning(200): FromClauseParser.g:199:68: 
        Decision can match input such as "Identifier LPAREN {MINUS, PLUS, TILDE}" using multiple alternatives: 1, 2
        
        As a result, alternative(s) 2 were disabled for that input
        warning(200): FromClauseParser.g:199:68: 
        Decision can match input such as "Identifier LPAREN {KW_ADD..KW_AFTER, KW_ALTER..KW_ANALYZE, KW_ARCHIVE, KW_AS..KW_CASCADE, KW_CHANGE, KW_CLUSTER..KW_COLLECTION, KW_COLUMNS..KW_CREATE, KW_CUBE, KW_CURSOR..KW_DATA, KW_DATABASES, KW_DATETIME..KW_DISABLE, KW_DISTRIBUTE..KW_ELEM_TYPE, KW_ENABLE, KW_ESCAPED, KW_EXCLUSIVE, KW_EXPLAIN..KW_EXPORT, KW_EXTERNAL, KW_FETCH..KW_FLOAT, KW_FOR..KW_FORMATTED, KW_FULL, KW_FUNCTIONS..KW_GROUPING, KW_HOLD_DDLTIME..KW_IDXPROPERTIES, KW_IGNORE..KW_ITEMS, KW_KEYS..KW_LEFT, KW_LIKE..KW_LONG, KW_MAPJOIN..KW_MINUS, KW_MSCK..KW_NOSCAN, KW_NO_DROP, KW_OF..KW_OFFLINE, KW_OPTION, KW_ORCFILE..KW_OUTPUTFORMAT, KW_OVERWRITE, KW_PARTITION..KW_PLUS, KW_PRETTY..KW_RECORDWRITER, KW_REGEXP..KW_SCHEMAS, KW_SEMI..KW_STRING, KW_TABLE..KW_TABLES, KW_TBLPROPERTIES..KW_TEXTFILE, KW_TIMESTAMP..KW_TOUCH, KW_TRIGGER, KW_TRUNCATE..KW_UNARCHIVE, KW_UNDO..KW_UNION, KW_UNLOCK..KW_VALUE_TYPE, KW_VIEW, KW_WHILE, KW_WITH}" using multiple alternatives: 1, 2
        
        As a result, alternative(s) 2 were disabled for that input
        warning(200): FromClauseParser.g:199:68: 
        Decision can match input such as "Identifier LPAREN X" using multiple alternatives: 1, 2,
        where X is any of: TinyintLiteral, KW_CASE, KW_NOT, KW_NULL, DecimalLiteral, CharSetName,
        Number, KW_CAST, KW_STRUCT, LPAREN, KW_FALSE, KW_ARRAY, KW_DATE, Identifier, KW_MAP, KW_TRUE
        
        As a result, alternative(s) 2 were disabled for each of those inputs
        warning(200): FromClauseParser.g:257:16: 
        Decision can match input such as "Identifier LPAREN X" using multiple alternatives: 1, 2,
        where X is any of: BigintLiteral, StringLiteral, KW_EXISTS, KW_IF, SmallintLiteral,
        KW_UNIONTYPE, {MINUS, PLUS, TILDE}, {KW_ADD..KW_AFTER, KW_ALTER..KW_ANALYZE, KW_ARCHIVE, KW_AS..KW_CASCADE, KW_CHANGE, KW_CLUSTER..KW_COLLECTION, KW_COLUMNS..KW_CREATE, KW_CUBE, KW_CURSOR..KW_DATA, KW_DATABASES, KW_DATETIME..KW_DISABLE, KW_DISTRIBUTE..KW_ELEM_TYPE, KW_ENABLE, KW_ESCAPED, KW_EXCLUSIVE, KW_EXPLAIN..KW_EXPORT, KW_EXTERNAL, KW_FETCH..KW_FLOAT, KW_FOR..KW_FORMATTED, KW_FULL, KW_FUNCTIONS..KW_GROUPING, KW_HOLD_DDLTIME..KW_IDXPROPERTIES, KW_IGNORE..KW_ITEMS, KW_KEYS..KW_LEFT, KW_LIKE..KW_LONG, KW_MAPJOIN..KW_MINUS, KW_MSCK..KW_NOSCAN, KW_NO_DROP, KW_OF..KW_OFFLINE, KW_OPTION, KW_ORCFILE..KW_OUTPUTFORMAT, KW_OVERWRITE, KW_PARTITION..KW_PLUS, KW_PRETTY..KW_RECORDWRITER, KW_REGEXP..KW_SCHEMAS, KW_SEMI..KW_STRING, KW_TABLE..KW_TABLES, KW_TBLPROPERTIES..KW_TEXTFILE, KW_TIMESTAMP..KW_TOUCH, KW_TRIGGER, KW_TRUNCATE..KW_UNARCHIVE, KW_UNDO..KW_UNION, KW_UNLOCK..KW_VALUE_TYPE, KW_VIEW, KW_WHILE, KW_WITH},
        TinyintLiteral, KW_CASE, KW_NOT, KW_NULL, DecimalLiteral, CharSetName, Number, KW_CAST,
        KW_STRUCT, LPAREN, KW_FALSE, KW_ARRAY, KW_DATE, Identifier, KW_MAP, KW_TRUE
        
        As a result, alternative(s) 2 were disabled for each of those inputs
        warning(200): IdentifiersParser.g:68:4: 
        Decision can match inputs of the form "LPAREN Y Z" using multiple alternatives: 1, 2;
        alternative(s) 2 were disabled for each of the inputs below. Here {KEYWORDS} stands for the
        token set {KW_ADD..KW_AFTER, KW_ALTER..KW_ANALYZE, KW_ARCHIVE, KW_AS..KW_CASCADE, KW_CHANGE, KW_CLUSTER..KW_COLLECTION, KW_COLUMNS..KW_CREATE, KW_CUBE, KW_CURSOR..KW_DATA, KW_DATABASES, KW_DATETIME..KW_DISABLE, KW_DISTRIBUTE..KW_ELEM_TYPE, KW_ENABLE, KW_ESCAPED, KW_EXCLUSIVE, KW_EXPLAIN..KW_EXPORT, KW_EXTERNAL, KW_FETCH..KW_FLOAT, KW_FOR..KW_FORMATTED, KW_FULL, KW_FUNCTIONS..KW_GROUPING, KW_HOLD_DDLTIME..KW_IDXPROPERTIES, KW_IGNORE..KW_ITEMS, KW_KEYS..KW_LEFT, KW_LIKE..KW_LONG, KW_MAPJOIN..KW_MINUS, KW_MSCK..KW_NOSCAN, KW_NO_DROP, KW_OF..KW_OFFLINE, KW_OPTION, KW_ORCFILE..KW_OUTPUTFORMAT, KW_OVERWRITE, KW_PARTITION..KW_PLUS, KW_PRETTY..KW_RECORDWRITER, KW_REGEXP..KW_SCHEMAS, KW_SEMI..KW_STRING, KW_TABLE..KW_TABLES, KW_TBLPROPERTIES..KW_TEXTFILE, KW_TIMESTAMP..KW_TOUCH, KW_TRIGGER, KW_TRUNCATE..KW_UNARCHIVE, KW_UNDO..KW_UNION, KW_UNLOCK..KW_VALUE_TYPE, KW_VIEW, KW_WHILE, KW_WITH}
        
        Y = KW_NULL, Z in: KW_OR, KW_AND, {MINUS, PLUS}, LPAREN, KW_NOT, {DIV..DIVIDE, MOD, STAR},
        BITWISEOR, DOT, RPAREN, LSQUARE, KW_IN, KW_IS, KW_BETWEEN, {KW_LIKE, KW_REGEXP, KW_RLIKE},
        EQUAL, EQUAL_NS, NOTEQUAL, LESSTHANOREQUALTO, AMPERSAND, BITWISEXOR, LESSTHAN,
        GREATERTHANOREQUALTO, GREATERTHAN
        
        Y = LPAREN, Z in: Number, KW_FALSE, KW_STRUCT, KW_IF, KW_CASE, KW_ARRAY, KW_TRUE,
        KW_UNIONTYPE, KW_MAP, {MINUS, PLUS, TILDE}, StringLiteral, DecimalLiteral, KW_NOT,
        KW_EXISTS, KW_CAST, BigintLiteral, SmallintLiteral, LPAREN, TinyintLiteral, Identifier,
        CharSetName, KW_NULL, {KEYWORDS}, KW_DATE
        
        Y = KW_NOT, Z in: {KEYWORDS}, KW_IF, Identifier, KW_CASE, KW_MAP, {MINUS, PLUS, TILDE},
        KW_DATE, KW_EXISTS, KW_TRUE, StringLiteral, KW_NOT, KW_CAST, LPAREN, BigintLiteral,
        KW_FALSE, Number, CharSetName, KW_UNIONTYPE, KW_NULL, KW_STRUCT, SmallintLiteral,
        KW_ARRAY, TinyintLiteral, DecimalLiteral
        
        Y = KW_CASE, Z in: KW_EXISTS, KW_IF, KW_CASE, KW_WHEN, KW_MAP, {MINUS, PLUS, TILDE},
        Identifier, {KEYWORDS}, KW_NOT, KW_CAST, LPAREN, KW_ARRAY, KW_STRUCT, KW_UNIONTYPE,
        CharSetName, KW_FALSE, Number, KW_TRUE, BigintLiteral, SmallintLiteral, KW_NULL, KW_DATE,
        StringLiteral, TinyintLiteral, DecimalLiteral
        
        Other inputs: "LPAREN KW_EXISTS LPAREN", "LPAREN CharSetName CharSetLiteral",
        "LPAREN KW_CAST LPAREN", "LPAREN StringLiteral StringLiteral", "LPAREN KW_DATE StringLiteral"
        warning(200): IdentifiersParser.g:108:5: 
        Decision can match input such as "KW_ORDER KW_BY LPAREN" using multiple alternatives: 1, 2
        
        As a result, alternative(s) 2 were disabled for that input
        warning(200): IdentifiersParser.g:121:5: 
        Decision can match input such as "KW_CLUSTER KW_BY LPAREN" using multiple alternatives: 1, 2
        
        As a result, alternative(s) 2 were disabled for that input
        warning(200): IdentifiersParser.g:133:5: 
        Decision can match input such as "KW_PARTITION KW_BY LPAREN" using multiple alternatives: 1, 2
        
        As a result, alternative(s) 2 were disabled for that input
        warning(200): IdentifiersParser.g:144:5: 
        Decision can match input such as "KW_DISTRIBUTE KW_BY LPAREN" using multiple alternatives: 1, 2
        
        As a result, alternative(s) 2 were disabled for that input
        warning(200): IdentifiersParser.g:155:5: 
        Decision can match input such as "KW_SORT KW_BY LPAREN" using multiple alternatives: 1, 2
        
        As a result, alternative(s) 2 were disabled for that input
        warning(200): IdentifiersParser.g:172:7: 
        Decision can match input such as "STAR" using multiple alternatives: 1, 2
        
        As a result, alternative(s) 2 were disabled for that input
        warning(200): IdentifiersParser.g:185:5: 
        Decision can match input such as "KW_ARRAY" using multiple alternatives: 2, 6
        
        As a result, alternative(s) 6 were disabled for that input
        warning(200): IdentifiersParser.g:185:5: 
        Decision can match input such as "KW_UNIONTYPE" using multiple alternatives: 5, 6
        
        As a result, alternative(s) 6 were disabled for that input
        warning(200): IdentifiersParser.g:185:5: 
        Decision can match input such as "KW_STRUCT" using multiple alternatives: 4, 6
        
        As a result, alternative(s) 6 were disabled for that input
        warning(200): IdentifiersParser.g:267:5: 
        Decision can match input such as "KW_NULL" using multiple alternatives: 1, 8
        
        As a result, alternative(s) 8 were disabled for that input
        warning(200): IdentifiersParser.g:267:5: 
        Decision can match input such as "KW_FALSE" using multiple alternatives: 3, 8
        
        As a result, alternative(s) 8 were disabled for that input
        warning(200): IdentifiersParser.g:267:5: 
        Decision can match input such as "KW_TRUE" using multiple alternatives: 3, 8
        
        As a result, alternative(s) 8 were disabled for that input
        warning(200): IdentifiersParser.g:267:5: 
        Decision can match input such as "KW_DATE StringLiteral" using multiple alternatives: 2, 3
        
        As a result, alternative(s) 3 were disabled for that input
        warning(200): IdentifiersParser.g:399:5: 
        Decision can match input such as "{KW_LIKE, KW_REGEXP, KW_RLIKE} KW_SORT KW_BY" using multiple alternatives: 2, 9
        
        As a result, alternative(s) 9 were disabled for that input
        warning(200): IdentifiersParser.g:399:5: 
        Decision can match input such as "{KW_LIKE, KW_REGEXP, KW_RLIKE} KW_GROUP KW_BY" using multiple alternatives: 2, 9
        
        As a result, alternative(s) 9 were disabled for that input
        warning(200): IdentifiersParser.g:399:5: 
        Decision can match input such as "{KW_LIKE, KW_REGEXP, KW_RLIKE} KW_INSERT KW_OVERWRITE" using multiple alternatives: 2, 9
        
        As a result, alternative(s) 9 were disabled for that input
        warning(200): IdentifiersParser.g:399:5: 
        Decision can match input such as "{KW_LIKE, KW_REGEXP, KW_RLIKE} KW_MAP LPAREN" using multiple alternatives: 2, 9
        
        As a result, alternative(s) 9 were disabled for that input
        warning(200): IdentifiersParser.g:399:5: 
        Decision can match input such as "{KW_LIKE, KW_REGEXP, KW_RLIKE} KW_ORDER KW_BY" using multiple alternatives: 2, 9
        
        As a result, alternative(s) 9 were disabled for that input
        warning(200): IdentifiersParser.g:399:5: 
        Decision can match input such as "KW_BETWEEN KW_MAP LPAREN" using multiple alternatives: 8, 9
        
        As a result, alternative(s) 9 were disabled for that input
        warning(200): IdentifiersParser.g:399:5: 
        Decision can match input such as "{KW_LIKE, KW_REGEXP, KW_RLIKE} KW_LATERAL KW_VIEW" using multiple alternatives: 2, 9
        
        As a result, alternative(s) 9 were disabled for that input
        warning(200): IdentifiersParser.g:399:5: 
        Decision can match input such as "{KW_LIKE, KW_REGEXP, KW_RLIKE} KW_CLUSTER KW_BY" using multiple alternatives: 2, 9
        
        As a result, alternative(s) 9 were disabled for that input
        warning(200): IdentifiersParser.g:399:5: 
        Decision can match input such as "{KW_LIKE, KW_REGEXP, KW_RLIKE} KW_INSERT KW_INTO" using multiple alternatives: 2, 9
        
        As a result, alternative(s) 9 were disabled for that input
        warning(200): IdentifiersParser.g:399:5: 
        Decision can match input such as "{KW_LIKE, KW_REGEXP, KW_RLIKE} KW_DISTRIBUTE KW_BY" using multiple alternatives: 2, 9
        
        As a result, alternative(s) 9 were disabled for that input
        warning(200): IdentifiersParser.g:524:5: 
        Decision can match input such as "{AMPERSAND..BITWISEXOR, DIV..DIVIDE, EQUAL..EQUAL_NS, GREATERTHAN..GREATERTHANOREQUALTO, KW_AND, KW_ARRAY, KW_BETWEEN..KW_BOOLEAN, KW_CASE, KW_DOUBLE, KW_FLOAT, KW_IF, KW_IN, KW_INT, KW_LIKE, KW_MAP, KW_NOT, KW_OR, KW_REGEXP, KW_RLIKE, KW_SMALLINT, KW_STRING..KW_STRUCT, KW_TINYINT, KW_UNIONTYPE, KW_WHEN, LESSTHAN..LESSTHANOREQUALTO, MINUS..NOTEQUAL, PLUS, STAR, TILDE}" using multiple alternatives: 1, 3
        
        As a result, alternative(s) 3 were disabled for that input
        [INFO] 
        [INFO] --- maven-resources-plugin:2.5:resources (default-resources) @ hive-exec ---
        [debug] execute contextualize
        [INFO] Using 'UTF-8' encoding to copy filtered resources.
        [INFO] Copying 1 resource
        [INFO] 
        [INFO] --- maven-antrun-plugin:1.7:run (define-classpath) @ hive-exec ---
        [INFO] Executing tasks
        
        main:
        [INFO] Executed tasks
        [INFO] 
        [INFO] --- maven-compiler-plugin:3.1:compile (default-compile) @ hive-exec ---
        [INFO] Compiling 1398 source files to /data/hive-ptest/working/apache-svn-trunk-source/ql/target/classes
        [INFO] -------------------------------------------------------------
        [WARNING] COMPILATION WARNING : 
        [INFO] -------------------------------------------------------------
        [WARNING] Note: Some input files use or override a deprecated API.
        [WARNING] Note: Recompile with -Xlint:deprecation for details.
        [WARNING] Note: Some input files use unchecked or unsafe operations.
        [WARNING] Note: Recompile with -Xlint:unchecked for details.
        [INFO] 4 warnings 
        [INFO] -------------------------------------------------------------
        [INFO] -------------------------------------------------------------
        [ERROR] COMPILATION ERROR : 
        [INFO] -------------------------------------------------------------
        [ERROR] /data/hive-ptest/working/apache-svn-trunk-source/ql/src/java/org/apache/hadoop/hive/ql/plan/SkewContext.java:[118,49] cannot find symbol
        symbol  : method getRandom()
        location: class org.apache.hadoop.hive.ql.io.HiveKey
        [INFO] 1 error
        [INFO] -------------------------------------------------------------
        [INFO] ------------------------------------------------------------------------
        [INFO] Reactor Summary:
        [INFO] 
        [INFO] Hive .............................................. SUCCESS [4.600s]
        [INFO] Hive Ant Utilities ................................ SUCCESS [6.658s]
        [INFO] Hive Shims Common ................................. SUCCESS [3.301s]
        [INFO] Hive Shims 0.20 ................................... SUCCESS [2.425s]
        [INFO] Hive Shims Secure Common .......................... SUCCESS [2.869s]
        [INFO] Hive Shims 0.20S .................................. SUCCESS [1.419s]
        [INFO] Hive Shims 0.23 ................................... SUCCESS [3.040s]
        [INFO] Hive Shims ........................................ SUCCESS [3.607s]
        [INFO] Hive Common ....................................... SUCCESS [6.736s]
        [INFO] Hive Serde ........................................ SUCCESS [15.748s]
        [INFO] Hive Metastore .................................... SUCCESS [25.108s]
        [INFO] Hive Query Language ............................... FAILURE [28.689s]
        [INFO] Hive Service ...................................... SKIPPED
        [INFO] Hive JDBC ......................................... SKIPPED
        [INFO] Hive Beeline ...................................... SKIPPED
        [INFO] Hive CLI .......................................... SKIPPED
        [INFO] Hive Contrib ...................................... SKIPPED
        [INFO] Hive HBase Handler ................................ SKIPPED
        [INFO] Hive HCatalog ..................................... SKIPPED
        [INFO] Hive HCatalog Core ................................ SKIPPED
        [INFO] Hive HCatalog Pig Adapter ......................... SKIPPED
        [INFO] Hive HCatalog Server Extensions ................... SKIPPED
        [INFO] Hive HCatalog Webhcat Java Client ................. SKIPPED
        [INFO] Hive HCatalog Webhcat ............................. SKIPPED
        [INFO] Hive HCatalog HBase Storage Handler ............... SKIPPED
        [INFO] Hive HWI .......................................... SKIPPED
        [INFO] Hive ODBC ......................................... SKIPPED
        [INFO] Hive Shims Aggregator ............................. SKIPPED
        [INFO] Hive TestUtils .................................... SKIPPED
        [INFO] Hive Packaging .................................... SKIPPED
        [INFO] ------------------------------------------------------------------------
        [INFO] BUILD FAILURE
        [INFO] ------------------------------------------------------------------------
        [INFO] Total time: 1:47.048s
        [INFO] Finished at: Mon Nov 25 06:41:27 EST 2013
        [INFO] Final Memory: 52M/379M
        [INFO] ------------------------------------------------------------------------
        [ERROR] Failed to execute goal org.apache.maven.plugins:maven-compiler-plugin:3.1:compile (default-compile) on project hive-exec: Compilation failure
        [ERROR] /data/hive-ptest/working/apache-svn-trunk-source/ql/src/java/org/apache/hadoop/hive/ql/plan/SkewContext.java:[118,49] cannot find symbol
        [ERROR] symbol  : method getRandom()
        [ERROR] location: class org.apache.hadoop.hive.ql.io.HiveKey
        [ERROR] -> [Help 1]
        [ERROR] 
        [ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
        [ERROR] Re-run Maven using the -X switch to enable full debug logging.
        [ERROR] 
        [ERROR] For more information about the errors and possible solutions, please read the following articles:
        [ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
        [ERROR] 
        [ERROR] After correcting the problems, you can resume the build with the command
        [ERROR]   mvn <goals> -rf :hive-exec
        + exit 1
        '
        

        This message is automatically generated.

        ATTACHMENT ID: 12615542
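The compile failure above comes from the patch calling a `getRandom()` accessor on `org.apache.hadoop.hive.ql.io.HiveKey` that does not exist in the trunk being built. Setting that missing symbol aside, the routing idea the patch implements (one reserved reducer slot per skew group, first matching skew expression wins, remaining keys hashed over the leftover reducers) can be sketched roughly as below. Class and method names here are illustrative only, not the patch's actual `SkewedKeyPartitioner` code, and the predicates mirror the `skew on (a.key+1 < 50, a.key+1 < 100, a.key < 150)` example from the description:

```java
import java.util.function.IntPredicate;

// Hypothetical sketch of skew-aware partitioning; not the patch's real classes.
public class SkewPartitionSketch {
    // One predicate per skew group, evaluated in order (first match wins),
    // as the SKEW ON clause describes.
    static final IntPredicate[] SKEW_GROUPS = {
        k -> k + 1 < 50,
        k -> k + 1 < 100,
        k -> k < 150
    };

    static int partitionFor(int key, int numReducers) {
        for (int g = 0; g < SKEW_GROUPS.length; g++) {
            if (SKEW_GROUPS[g].test(key)) {
                return g; // reserved slot for this skew group
            }
        }
        // Non-skewed keys share the remaining reducers
        // (17 of 20 in the description's example).
        int remaining = numReducers - SKEW_GROUPS.length;
        return SKEW_GROUPS.length + Math.floorMod(Integer.hashCode(key), remaining);
    }

    public static void main(String[] args) {
        System.out.println(partitionFor(10, 20));    // skew group 0
        System.out.println(partitionFor(75, 20));    // skew group 1
        System.out.println(partitionFor(120, 20));   // skew group 2
        System.out.println(partitionFor(10000, 20)); // one of the shared slots
    }
}
```

In the real operator this decision would run per row at the ReduceSinkOperator, replacing the plain hash partition for tables known to be skewed.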

        Hive QA added a comment - Overall: -1 no tests executed

        Here are the results of testing the latest attachment:
        https://issues.apache.org/jira/secure/attachment/12615542/D4287.11.patch

        Test results: http://bigtop01.cloudera.org:8080/job/PreCommit-HIVE-Build/434/testReport
        Console output: http://bigtop01.cloudera.org:8080/job/PreCommit-HIVE-Build/434/console

        Messages:
        Executing org.apache.hive.ptest.execution.PrepPhase
        Tests failed with: NonZeroExitCodeException: Command 'bash /data/hive-ptest/working/scratch/source-prep.sh' failed with exit status 1
[INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-trunk-source/shims/0.20S/src/test/resources [INFO] [INFO] --- maven-antrun-plugin:1.7:run (setup-test-dirs) @ hive-shims-0.20S --- [INFO] Executing tasks main: [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/shims/0.20S/target/tmp [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/shims/0.20S/target/warehouse [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/shims/0.20S/target/tmp/conf [copy] Copying 4 files to /data/hive-ptest/working/apache-svn-trunk-source/shims/0.20S/target/tmp/conf [INFO] Executed tasks [INFO] [INFO] --- maven-compiler-plugin:3.1:testCompile (default-testCompile) @ hive-shims-0.20S --- [INFO] No sources to compile [INFO] [INFO] --- maven-surefire-plugin:2.16:test (default-test) @ hive-shims-0.20S --- [INFO] Tests are skipped. [INFO] [INFO] --- maven-jar-plugin:2.2:jar (default-jar) @ hive-shims-0.20S --- [INFO] Building jar: /data/hive-ptest/working/apache-svn-trunk-source/shims/0.20S/target/hive-shims-0.20S-0.13.0-SNAPSHOT.jar [INFO] [INFO] --- maven-install-plugin:2.4:install (default-install) @ hive-shims-0.20S --- [INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/shims/0.20S/target/hive-shims-0.20S-0.13.0-SNAPSHOT.jar to /data/hive-ptest/working/maven/org/apache/hive/shims/hive-shims-0.20S/0.13.0-SNAPSHOT/hive-shims-0.20S-0.13.0-SNAPSHOT.jar [INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/shims/0.20S/pom.xml to /data/hive-ptest/working/maven/org/apache/hive/shims/hive-shims-0.20S/0.13.0-SNAPSHOT/hive-shims-0.20S-0.13.0-SNAPSHOT.pom [INFO] [INFO] ------------------------------------------------------------------------ [INFO] Building Hive Shims 0.23 0.13.0-SNAPSHOT [INFO] ------------------------------------------------------------------------ [INFO] [INFO] --- maven-clean-plugin:2.5:clean (default-clean) @ hive-shims-0.23 --- [INFO] Deleting 
/data/hive-ptest/working/apache-svn-trunk-source/shims/0.23 (includes = [datanucleus.log, derby.log], excludes = []) [INFO] [INFO] --- maven-resources-plugin:2.5:resources (default-resources) @ hive-shims-0.23 --- [debug] execute contextualize [INFO] Using 'UTF-8' encoding to copy filtered resources. [INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-trunk-source/shims/0.23/src/main/resources [INFO] [INFO] --- maven-antrun-plugin:1.7:run (define-classpath) @ hive-shims-0.23 --- [INFO] Executing tasks main: [INFO] Executed tasks [INFO] [INFO] --- maven-compiler-plugin:3.1:compile (default-compile) @ hive-shims-0.23 --- [INFO] Compiling 3 source files to /data/hive-ptest/working/apache-svn-trunk-source/shims/0.23/target/classes [WARNING] Note: /data/hive-ptest/working/apache-svn-trunk-source/shims/0.23/src/main/java/org/apache/hadoop/hive/shims/Hadoop23Shims.java uses or overrides a deprecated API. [WARNING] Note: Recompile with -Xlint:deprecation for details. [INFO] [INFO] --- maven-resources-plugin:2.5:testResources (default-testResources) @ hive-shims-0.23 --- [debug] execute contextualize [INFO] Using 'UTF-8' encoding to copy filtered resources. 
[INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-trunk-source/shims/0.23/src/test/resources [INFO] [INFO] --- maven-antrun-plugin:1.7:run (setup-test-dirs) @ hive-shims-0.23 --- [INFO] Executing tasks main: [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/shims/0.23/target/tmp [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/shims/0.23/target/warehouse [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/shims/0.23/target/tmp/conf [copy] Copying 4 files to /data/hive-ptest/working/apache-svn-trunk-source/shims/0.23/target/tmp/conf [INFO] Executed tasks [INFO] [INFO] --- maven-compiler-plugin:3.1:testCompile (default-testCompile) @ hive-shims-0.23 --- [INFO] No sources to compile [INFO] [INFO] --- maven-surefire-plugin:2.16:test (default-test) @ hive-shims-0.23 --- [INFO] Tests are skipped. [INFO] [INFO] --- maven-jar-plugin:2.2:jar (default-jar) @ hive-shims-0.23 --- [INFO] Building jar: /data/hive-ptest/working/apache-svn-trunk-source/shims/0.23/target/hive-shims-0.23-0.13.0-SNAPSHOT.jar [INFO] [INFO] --- maven-install-plugin:2.4:install (default-install) @ hive-shims-0.23 --- [INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/shims/0.23/target/hive-shims-0.23-0.13.0-SNAPSHOT.jar to /data/hive-ptest/working/maven/org/apache/hive/shims/hive-shims-0.23/0.13.0-SNAPSHOT/hive-shims-0.23-0.13.0-SNAPSHOT.jar [INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/shims/0.23/pom.xml to /data/hive-ptest/working/maven/org/apache/hive/shims/hive-shims-0.23/0.13.0-SNAPSHOT/hive-shims-0.23-0.13.0-SNAPSHOT.pom [INFO] [INFO] ------------------------------------------------------------------------ [INFO] Building Hive Shims 0.13.0-SNAPSHOT [INFO] ------------------------------------------------------------------------ [INFO] [INFO] --- maven-clean-plugin:2.5:clean (default-clean) @ hive-shims --- [INFO] Deleting 
/data/hive-ptest/working/apache-svn-trunk-source/shims/assembly (includes = [datanucleus.log, derby.log], excludes = []) [INFO] [INFO] --- maven-resources-plugin:2.5:resources (default-resources) @ hive-shims --- [debug] execute contextualize [INFO] Using 'UTF-8' encoding to copy filtered resources. [INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-trunk-source/shims/assembly/src/main/resources [INFO] [INFO] --- maven-antrun-plugin:1.7:run (define-classpath) @ hive-shims --- [INFO] Executing tasks main: [INFO] Executed tasks [INFO] [INFO] --- maven-compiler-plugin:3.1:compile (default-compile) @ hive-shims --- [INFO] No sources to compile [INFO] [INFO] --- maven-resources-plugin:2.5:testResources (default-testResources) @ hive-shims --- [debug] execute contextualize [INFO] Using 'UTF-8' encoding to copy filtered resources. [INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-trunk-source/shims/assembly/src/test/resources [INFO] [INFO] --- maven-antrun-plugin:1.7:run (setup-test-dirs) @ hive-shims --- [INFO] Executing tasks main: [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/shims/assembly/target/tmp [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/shims/assembly/target/warehouse [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/shims/assembly/target/tmp/conf [copy] Copying 4 files to /data/hive-ptest/working/apache-svn-trunk-source/shims/assembly/target/tmp/conf [INFO] Executed tasks [INFO] [INFO] --- maven-compiler-plugin:3.1:testCompile (default-testCompile) @ hive-shims --- [INFO] No sources to compile [INFO] [INFO] --- maven-surefire-plugin:2.16:test (default-test) @ hive-shims --- [INFO] Tests are skipped. [INFO] [INFO] --- maven-jar-plugin:2.2:jar (default-jar) @ hive-shims --- [WARNING] JAR will be empty - no content was marked for inclusion! 
[INFO] Building jar: /data/hive-ptest/working/apache-svn-trunk-source/shims/assembly/target/hive-shims-0.13.0-SNAPSHOT.jar [INFO] [INFO] --- maven-assembly-plugin:2.3:single (uberjar) @ hive-shims --- [INFO] Reading assembly descriptor: src/assemble/uberjar.xml [WARNING] Artifact: org.apache.hive:hive-shims:jar:0.13.0-SNAPSHOT references the same file as the assembly destination file. Moving it to a temporary location for inclusion. [INFO] META-INF/MANIFEST.MF already added, skipping [INFO] META-INF/MANIFEST.MF already added, skipping [INFO] META-INF/MANIFEST.MF already added, skipping [INFO] META-INF/MANIFEST.MF already added, skipping [INFO] META-INF/MANIFEST.MF already added, skipping [INFO] Building jar: /data/hive-ptest/working/apache-svn-trunk-source/shims/assembly/target/hive-shims-0.13.0-SNAPSHOT.jar [INFO] META-INF/MANIFEST.MF already added, skipping [INFO] META-INF/MANIFEST.MF already added, skipping [INFO] META-INF/MANIFEST.MF already added, skipping [INFO] META-INF/MANIFEST.MF already added, skipping [INFO] META-INF/MANIFEST.MF already added, skipping [WARNING] Configuration options: 'appendAssemblyId' is set to false, and 'classifier' is missing. Instead of attaching the assembly file: /data/hive-ptest/working/apache-svn-trunk-source/shims/assembly/target/hive-shims-0.13.0-SNAPSHOT.jar, it will become the file for main project artifact. NOTE: If multiple descriptors or descriptor-formats are provided for this project, the value of this file will be non-deterministic! 
[WARNING] Replacing pre-existing project main-artifact file: /data/hive-ptest/working/apache-svn-trunk-source/shims/assembly/target/archive-tmp/hive-shims-0.13.0-SNAPSHOT.jar with assembly file: /data/hive-ptest/working/apache-svn-trunk-source/shims/assembly/target/hive-shims-0.13.0-SNAPSHOT.jar [INFO] [INFO] --- maven-install-plugin:2.4:install (default-install) @ hive-shims --- [INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/shims/assembly/target/hive-shims-0.13.0-SNAPSHOT.jar to /data/hive-ptest/working/maven/org/apache/hive/hive-shims/0.13.0-SNAPSHOT/hive-shims-0.13.0-SNAPSHOT.jar [INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/shims/assembly/pom.xml to /data/hive-ptest/working/maven/org/apache/hive/hive-shims/0.13.0-SNAPSHOT/hive-shims-0.13.0-SNAPSHOT.pom [INFO] [INFO] ------------------------------------------------------------------------ [INFO] Building Hive Common 0.13.0-SNAPSHOT [INFO] ------------------------------------------------------------------------ [INFO] [INFO] --- maven-clean-plugin:2.5:clean (default-clean) @ hive-common --- [INFO] Deleting /data/hive-ptest/working/apache-svn-trunk-source/common (includes = [datanucleus.log, derby.log], excludes = []) [INFO] [INFO] --- maven-antrun-plugin:1.7:run (generate-version-annotation) @ hive-common --- [INFO] Executing tasks main: [INFO] Executed tasks [INFO] [INFO] --- build-helper-maven-plugin:1.8:add-source (add-source) @ hive-common --- [INFO] Source directory: /data/hive-ptest/working/apache-svn-trunk-source/common/src/gen added. [INFO] [INFO] --- maven-resources-plugin:2.5:resources (default-resources) @ hive-common --- [debug] execute contextualize [INFO] Using 'UTF-8' encoding to copy filtered resources. 
[INFO] Copying 1 resource [INFO] [INFO] --- maven-antrun-plugin:1.7:run (define-classpath) @ hive-common --- [INFO] Executing tasks main: [INFO] Executed tasks [INFO] [INFO] --- maven-compiler-plugin:3.1:compile (default-compile) @ hive-common --- [INFO] Compiling 31 source files to /data/hive-ptest/working/apache-svn-trunk-source/common/target/classes [WARNING] Note: /data/hive-ptest/working/apache-svn-trunk-source/common/src/java/org/apache/hadoop/hive/common/ObjectPair.java uses unchecked or unsafe operations. [WARNING] Note: Recompile with -Xlint:unchecked for details. [INFO] [INFO] --- maven-resources-plugin:2.5:testResources (default-testResources) @ hive-common --- [debug] execute contextualize [INFO] Using 'UTF-8' encoding to copy filtered resources. [INFO] Copying 4 resources [INFO] [INFO] --- maven-antrun-plugin:1.7:run (setup-test-dirs) @ hive-common --- [INFO] Executing tasks main: [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/common/target/tmp [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/common/target/warehouse [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/common/target/tmp/conf [copy] Copying 4 files to /data/hive-ptest/working/apache-svn-trunk-source/common/target/tmp/conf [INFO] Executed tasks [INFO] [INFO] --- maven-compiler-plugin:3.1:testCompile (default-testCompile) @ hive-common --- [INFO] Compiling 8 source files to /data/hive-ptest/working/apache-svn-trunk-source/common/target/test-classes [INFO] [INFO] --- maven-surefire-plugin:2.16:test (default-test) @ hive-common --- [INFO] Tests are skipped. 
[INFO] [INFO] --- maven-jar-plugin:2.2:jar (default-jar) @ hive-common --- [INFO] Building jar: /data/hive-ptest/working/apache-svn-trunk-source/common/target/hive-common-0.13.0-SNAPSHOT.jar [INFO] [INFO] --- maven-install-plugin:2.4:install (default-install) @ hive-common --- [INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/common/target/hive-common-0.13.0-SNAPSHOT.jar to /data/hive-ptest/working/maven/org/apache/hive/hive-common/0.13.0-SNAPSHOT/hive-common-0.13.0-SNAPSHOT.jar [INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/common/pom.xml to /data/hive-ptest/working/maven/org/apache/hive/hive-common/0.13.0-SNAPSHOT/hive-common-0.13.0-SNAPSHOT.pom [INFO] [INFO] ------------------------------------------------------------------------ [INFO] Building Hive Serde 0.13.0-SNAPSHOT [INFO] ------------------------------------------------------------------------ [INFO] [INFO] --- maven-clean-plugin:2.5:clean (default-clean) @ hive-serde --- [INFO] Deleting /data/hive-ptest/working/apache-svn-trunk-source/serde (includes = [datanucleus.log, derby.log], excludes = []) [INFO] [INFO] --- build-helper-maven-plugin:1.8:add-source (add-source) @ hive-serde --- [INFO] Source directory: /data/hive-ptest/working/apache-svn-trunk-source/serde/src/gen/protobuf/gen-java added. [INFO] Source directory: /data/hive-ptest/working/apache-svn-trunk-source/serde/src/gen/thrift/gen-javabean added. [INFO] [INFO] --- maven-resources-plugin:2.5:resources (default-resources) @ hive-serde --- [debug] execute contextualize [INFO] Using 'UTF-8' encoding to copy filtered resources. 
[INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-trunk-source/serde/src/main/resources [INFO] [INFO] --- maven-antrun-plugin:1.7:run (define-classpath) @ hive-serde --- [INFO] Executing tasks main: [INFO] Executed tasks [INFO] [INFO] --- maven-compiler-plugin:3.1:compile (default-compile) @ hive-serde --- [INFO] Compiling 351 source files to /data/hive-ptest/working/apache-svn-trunk-source/serde/target/classes [WARNING] Note: Some input files use or override a deprecated API. [WARNING] Note: Recompile with -Xlint:deprecation for details. [WARNING] Note: Some input files use unchecked or unsafe operations. [WARNING] Note: Recompile with -Xlint:unchecked for details. [INFO] [INFO] --- maven-resources-plugin:2.5:testResources (default-testResources) @ hive-serde --- [debug] execute contextualize [INFO] Using 'UTF-8' encoding to copy filtered resources. [INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-trunk-source/serde/src/test/resources [INFO] [INFO] --- maven-antrun-plugin:1.7:run (setup-test-dirs) @ hive-serde --- [INFO] Executing tasks main: [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/serde/target/tmp [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/serde/target/warehouse [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/serde/target/tmp/conf [copy] Copying 4 files to /data/hive-ptest/working/apache-svn-trunk-source/serde/target/tmp/conf [INFO] Executed tasks [INFO] [INFO] --- maven-compiler-plugin:3.1:testCompile (default-testCompile) @ hive-serde --- [INFO] Compiling 41 source files to /data/hive-ptest/working/apache-svn-trunk-source/serde/target/test-classes [WARNING] Note: Some input files use or override a deprecated API. [WARNING] Note: Recompile with -Xlint:deprecation for details. [WARNING] Note: Some input files use unchecked or unsafe operations. [WARNING] Note: Recompile with -Xlint:unchecked for details. 
[INFO] [INFO] --- maven-surefire-plugin:2.16:test (default-test) @ hive-serde --- [INFO] Tests are skipped. [INFO] [INFO] --- maven-jar-plugin:2.2:jar (default-jar) @ hive-serde --- [INFO] Building jar: /data/hive-ptest/working/apache-svn-trunk-source/serde/target/hive-serde-0.13.0-SNAPSHOT.jar [INFO] [INFO] --- maven-install-plugin:2.4:install (default-install) @ hive-serde --- [INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/serde/target/hive-serde-0.13.0-SNAPSHOT.jar to /data/hive-ptest/working/maven/org/apache/hive/hive-serde/0.13.0-SNAPSHOT/hive-serde-0.13.0-SNAPSHOT.jar [INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/serde/pom.xml to /data/hive-ptest/working/maven/org/apache/hive/hive-serde/0.13.0-SNAPSHOT/hive-serde-0.13.0-SNAPSHOT.pom [INFO] [INFO] ------------------------------------------------------------------------ [INFO] Building Hive Metastore 0.13.0-SNAPSHOT [INFO] ------------------------------------------------------------------------ [INFO] [INFO] --- maven-clean-plugin:2.5:clean (default-clean) @ hive-metastore --- [INFO] Deleting /data/hive-ptest/working/apache-svn-trunk-source/metastore (includes = [datanucleus.log, derby.log], excludes = []) [INFO] [INFO] --- build-helper-maven-plugin:1.8:add-source (add-source) @ hive-metastore --- [INFO] Source directory: /data/hive-ptest/working/apache-svn-trunk-source/metastore/src/model added. [INFO] Source directory: /data/hive-ptest/working/apache-svn-trunk-source/metastore/src/gen/thrift/gen-javabean added. [INFO] [INFO] --- antlr3-maven-plugin:3.4:antlr (default) @ hive-metastore --- [INFO] ANTLR: Processing source directory /data/hive-ptest/working/apache-svn-trunk-source/metastore/src/java ANTLR Parser Generator Version 3.4 org/apache/hadoop/hive/metastore/parser/Filter.g [INFO] [INFO] --- maven-resources-plugin:2.5:resources (default-resources) @ hive-metastore --- [debug] execute contextualize [INFO] Using 'UTF-8' encoding to copy filtered resources. 
[INFO] Copying 1 resource [INFO] [INFO] --- maven-antrun-plugin:1.7:run (define-classpath) @ hive-metastore --- [INFO] Executing tasks main: [INFO] Executed tasks [INFO] [INFO] --- maven-compiler-plugin:3.1:compile (default-compile) @ hive-metastore --- [INFO] Compiling 132 source files to /data/hive-ptest/working/apache-svn-trunk-source/metastore/target/classes [WARNING] Note: Some input files use or override a deprecated API. [WARNING] Note: Recompile with -Xlint:deprecation for details. [WARNING] Note: Some input files use unchecked or unsafe operations. [WARNING] Note: Recompile with -Xlint:unchecked for details. [INFO] [INFO] --- datanucleus-maven-plugin:3.3.0-release:enhance (default) @ hive-metastore --- [INFO] DataNucleus Enhancer (version 3.2.2) for API "JDO" using JRE "1.6" DataNucleus Enhancer : Classpath >> /data/hive-ptest/working/maven/org/datanucleus/datanucleus-maven-plugin/3.3.0-release/datanucleus-maven-plugin-3.3.0-release.jar >> /data/hive-ptest/working/maven/org/datanucleus/datanucleus-core/3.2.2/datanucleus-core-3.2.2.jar >> /data/hive-ptest/working/maven/org/codehaus/plexus/plexus-utils/3.0.8/plexus-utils-3.0.8.jar >> /data/hive-ptest/working/maven/org/codehaus/plexus/plexus-component-annotations/1.5.5/plexus-component-annotations-1.5.5.jar >> /data/hive-ptest/working/maven/org/sonatype/sisu/sisu-inject-bean/2.3.0/sisu-inject-bean-2.3.0.jar >> /data/hive-ptest/working/maven/org/sonatype/sisu/sisu-guice/3.1.0/sisu-guice-3.1.0-no_aop.jar >> /data/hive-ptest/working/maven/org/sonatype/sisu/sisu-guava/0.9.9/sisu-guava-0.9.9.jar >> /data/hive-ptest/working/maven/org/apache/xbean/xbean-reflect/3.4/xbean-reflect-3.4.jar >> /data/hive-ptest/working/maven/log4j/log4j/1.2.12/log4j-1.2.12.jar >> /data/hive-ptest/working/maven/commons-logging/commons-logging-api/1.1/commons-logging-api-1.1.jar >> /data/hive-ptest/working/maven/com/google/collections/google-collections/1.0/google-collections-1.0.jar >> 
/data/hive-ptest/working/maven/junit/junit/3.8.2/junit-3.8.2.jar >> /data/hive-ptest/working/apache-svn-trunk-source/metastore/target/classes >> /data/hive-ptest/working/apache-svn-trunk-source/serde/target/hive-serde-0.13.0-SNAPSHOT.jar >> /data/hive-ptest/working/apache-svn-trunk-source/common/target/hive-common-0.13.0-SNAPSHOT.jar >> /data/hive-ptest/working/maven/org/apache/commons/commons-compress/1.4.1/commons-compress-1.4.1.jar >> /data/hive-ptest/working/maven/org/tukaani/xz/1.0/xz-1.0.jar >> /data/hive-ptest/working/maven/commons-codec/commons-codec/1.4/commons-codec-1.4.jar >> /data/hive-ptest/working/maven/org/apache/avro/avro/1.7.5/avro-1.7.5.jar >> /data/hive-ptest/working/maven/org/codehaus/jackson/jackson-core-asl/1.9.2/jackson-core-asl-1.9.2.jar >> /data/hive-ptest/working/maven/com/thoughtworks/paranamer/paranamer/2.3/paranamer-2.3.jar >> /data/hive-ptest/working/maven/org/xerial/snappy/snappy-java/1.0.5/snappy-java-1.0.5.jar >> /data/hive-ptest/working/apache-svn-trunk-source/shims/assembly/target/hive-shims-0.13.0-SNAPSHOT.jar >> /data/hive-ptest/working/maven/com/google/guava/guava/11.0.2/guava-11.0.2.jar >> /data/hive-ptest/working/maven/com/google/code/findbugs/jsr305/1.3.9/jsr305-1.3.9.jar >> /data/hive-ptest/working/maven/commons-cli/commons-cli/1.2/commons-cli-1.2.jar >> /data/hive-ptest/working/maven/commons-lang/commons-lang/2.4/commons-lang-2.4.jar >> /data/hive-ptest/working/maven/commons-logging/commons-logging/1.1.3/commons-logging-1.1.3.jar >> /data/hive-ptest/working/maven/org/apache/derby/derby/10.4.2.0/derby-10.4.2.0.jar >> /data/hive-ptest/working/maven/org/datanucleus/datanucleus-api-jdo/3.2.1/datanucleus-api-jdo-3.2.1.jar >> /data/hive-ptest/working/maven/org/datanucleus/datanucleus-rdbms/3.2.1/datanucleus-rdbms-3.2.1.jar >> /data/hive-ptest/working/maven/javax/jdo/jdo-api/3.0.1/jdo-api-3.0.1.jar >> /data/hive-ptest/working/maven/javax/transaction/jta/1.1/jta-1.1.jar >> 
/data/hive-ptest/working/maven/org/antlr/antlr-runtime/3.4/antlr-runtime-3.4.jar >> /data/hive-ptest/working/maven/org/antlr/stringtemplate/3.2.1/stringtemplate-3.2.1.jar >> /data/hive-ptest/working/maven/antlr/antlr/2.7.7/antlr-2.7.7.jar >> /data/hive-ptest/working/maven/org/apache/thrift/libfb303/0.9.0/libfb303-0.9.0.jar >> /data/hive-ptest/working/maven/org/apache/thrift/libthrift/0.9.0/libthrift-0.9.0.jar >> /data/hive-ptest/working/maven/org/apache/httpcomponents/httpclient/4.2.5/httpclient-4.2.5.jar >> /data/hive-ptest/working/maven/org/apache/httpcomponents/httpcore/4.2.4/httpcore-4.2.4.jar >> /data/hive-ptest/working/maven/org/apache/hadoop/hadoop-core/1.2.1/hadoop-core-1.2.1.jar >> /data/hive-ptest/working/maven/xmlenc/xmlenc/0.52/xmlenc-0.52.jar >> /data/hive-ptest/working/maven/com/sun/jersey/jersey-core/1.14/jersey-core-1.14.jar >> /data/hive-ptest/working/maven/com/sun/jersey/jersey-json/1.14/jersey-json-1.14.jar >> /data/hive-ptest/working/maven/org/codehaus/jettison/jettison/1.1/jettison-1.1.jar >> /data/hive-ptest/working/maven/stax/stax-api/1.0.1/stax-api-1.0.1.jar >> /data/hive-ptest/working/maven/com/sun/xml/bind/jaxb-impl/2.2.3-1/jaxb-impl-2.2.3-1.jar >> /data/hive-ptest/working/maven/javax/xml/bind/jaxb-api/2.2.2/jaxb-api-2.2.2.jar >> /data/hive-ptest/working/maven/javax/xml/stream/stax-api/1.0-2/stax-api-1.0-2.jar >> /data/hive-ptest/working/maven/javax/activation/activation/1.1/activation-1.1.jar >> /data/hive-ptest/working/maven/org/codehaus/jackson/jackson-jaxrs/1.9.2/jackson-jaxrs-1.9.2.jar >> /data/hive-ptest/working/maven/org/codehaus/jackson/jackson-xc/1.9.2/jackson-xc-1.9.2.jar >> /data/hive-ptest/working/maven/com/sun/jersey/jersey-server/1.14/jersey-server-1.14.jar >> /data/hive-ptest/working/maven/asm/asm/3.1/asm-3.1.jar >> /data/hive-ptest/working/maven/commons-io/commons-io/2.4/commons-io-2.4.jar >> /data/hive-ptest/working/maven/commons-httpclient/commons-httpclient/3.0.1/commons-httpclient-3.0.1.jar >> 
/data/hive-ptest/working/maven/org/apache/commons/commons-math/2.1/commons-math-2.1.jar >> /data/hive-ptest/working/maven/commons-configuration/commons-configuration/1.6/commons-configuration-1.6.jar >> /data/hive-ptest/working/maven/commons-collections/commons-collections/3.2.1/commons-collections-3.2.1.jar >> /data/hive-ptest/working/maven/commons-digester/commons-digester/1.8/commons-digester-1.8.jar >> /data/hive-ptest/working/maven/commons-beanutils/commons-beanutils/1.7.0/commons-beanutils-1.7.0.jar >> /data/hive-ptest/working/maven/commons-beanutils/commons-beanutils-core/1.8.0/commons-beanutils-core-1.8.0.jar >> /data/hive-ptest/working/maven/commons-net/commons-net/1.4.1/commons-net-1.4.1.jar >> /data/hive-ptest/working/maven/org/mortbay/jetty/jetty/6.1.26/jetty-6.1.26.jar >> /data/hive-ptest/working/maven/org/mortbay/jetty/servlet-api/2.5-20081211/servlet-api-2.5-20081211.jar >> /data/hive-ptest/working/maven/org/mortbay/jetty/jetty-util/6.1.26/jetty-util-6.1.26.jar >> /data/hive-ptest/working/maven/tomcat/jasper-runtime/5.5.12/jasper-runtime-5.5.12.jar >> /data/hive-ptest/working/maven/tomcat/jasper-compiler/5.5.12/jasper-compiler-5.5.12.jar >> /data/hive-ptest/working/maven/org/mortbay/jetty/jsp-api-2.1/6.1.14/jsp-api-2.1-6.1.14.jar >> /data/hive-ptest/working/maven/org/mortbay/jetty/servlet-api-2.5/6.1.14/servlet-api-2.5-6.1.14.jar >> /data/hive-ptest/working/maven/org/mortbay/jetty/jsp-2.1/6.1.14/jsp-2.1-6.1.14.jar >> /data/hive-ptest/working/maven/ant/ant/1.6.5/ant-1.6.5.jar >> /data/hive-ptest/working/maven/commons-el/commons-el/1.0/commons-el-1.0.jar >> /data/hive-ptest/working/maven/net/java/dev/jets3t/jets3t/0.6.1/jets3t-0.6.1.jar >> /data/hive-ptest/working/maven/hsqldb/hsqldb/1.8.0.10/hsqldb-1.8.0.10.jar >> /data/hive-ptest/working/maven/oro/oro/2.0.8/oro-2.0.8.jar >> /data/hive-ptest/working/maven/org/eclipse/jdt/core/3.1.1/core-3.1.1.jar >> 
[Hive QA build log excerpt, truncated: Maven build of hive-metastore 0.13.0-SNAPSHOT and hive-exec. DataNucleus Enhancer completed successfully for 25 metastore model classes; tests skipped; jars built and installed. ANTLR 3.4 parser generation for HiveLexer.g, HiveParser.g, SelectClauseParser.g, FromClauseParser.g, and IdentifiersParser.g emitted numerous warning(200) grammar-ambiguity warnings ("Decision can match input ... using multiple alternatives; alternative(s) disabled for that input").]
alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): IdentifiersParser.g:68:4: Decision can match input such as "LPAREN KW_NOT KW_FALSE" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): IdentifiersParser.g:68:4: Decision can match input such as "LPAREN KW_NOT Number" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): IdentifiersParser.g:68:4: Decision can match input such as "LPAREN KW_CASE KW_STRUCT" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): IdentifiersParser.g:68:4: Decision can match input such as "LPAREN StringLiteral StringLiteral" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): IdentifiersParser.g:68:4: Decision can match input such as "LPAREN KW_CASE KW_UNIONTYPE" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): IdentifiersParser.g:68:4: Decision can match input such as "LPAREN LPAREN Identifier" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): IdentifiersParser.g:68:4: Decision can match input such as "LPAREN KW_NULL KW_IS" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): IdentifiersParser.g:68:4: Decision can match input such as "LPAREN KW_NOT CharSetName" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): IdentifiersParser.g:68:4: Decision can match input such as "LPAREN KW_CASE CharSetName" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): IdentifiersParser.g:68:4: Decision can match input such as "LPAREN LPAREN CharSetName" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled 
for that input warning(200): IdentifiersParser.g:68:4: Decision can match input such as "LPAREN KW_NULL KW_BETWEEN" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): IdentifiersParser.g:68:4: Decision can match input such as "LPAREN LPAREN KW_NULL" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): IdentifiersParser.g:68:4: Decision can match input such as "LPAREN LPAREN {KW_ADD..KW_AFTER, KW_ALTER..KW_ANALYZE, KW_ARCHIVE, KW_AS..KW_CASCADE, KW_CHANGE, KW_CLUSTER..KW_COLLECTION, KW_COLUMNS..KW_CREATE, KW_CUBE, KW_CURSOR..KW_DATA, KW_DATABASES, KW_DATETIME..KW_DISABLE, KW_DISTRIBUTE..KW_ELEM_TYPE, KW_ENABLE, KW_ESCAPED, KW_EXCLUSIVE, KW_EXPLAIN..KW_EXPORT, KW_EXTERNAL, KW_FETCH..KW_FLOAT, KW_FOR..KW_FORMATTED, KW_FULL, KW_FUNCTIONS..KW_GROUPING, KW_HOLD_DDLTIME..KW_IDXPROPERTIES, KW_IGNORE..KW_ITEMS, KW_KEYS..KW_LEFT, KW_LIKE..KW_LONG, KW_MAPJOIN..KW_MINUS, KW_MSCK..KW_NOSCAN, KW_NO_DROP, KW_OF..KW_OFFLINE, KW_OPTION, KW_ORCFILE..KW_OUTPUTFORMAT, KW_OVERWRITE, KW_PARTITION..KW_PLUS, KW_PRETTY..KW_RECORDWRITER, KW_REGEXP..KW_SCHEMAS, KW_SEMI..KW_STRING, KW_TABLE..KW_TABLES, KW_TBLPROPERTIES..KW_TEXTFILE, KW_TIMESTAMP..KW_TOUCH, KW_TRIGGER, KW_TRUNCATE..KW_UNARCHIVE, KW_UNDO..KW_UNION, KW_UNLOCK..KW_VALUE_TYPE, KW_VIEW, KW_WHILE, KW_WITH}" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): IdentifiersParser.g:68:4: Decision can match input such as "LPAREN KW_NULL {KW_LIKE, KW_REGEXP, KW_RLIKE}" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): IdentifiersParser.g:68:4: Decision can match input such as "LPAREN KW_NULL EQUAL" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): IdentifiersParser.g:68:4: Decision can match input such as "LPAREN KW_NULL EQUAL_NS" using multiple alternatives: 1, 2 
As a result, alternative(s) 2 were disabled for that input warning(200): IdentifiersParser.g:68:4: Decision can match input such as "LPAREN KW_NOT KW_UNIONTYPE" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): IdentifiersParser.g:68:4: Decision can match input such as "LPAREN KW_NOT KW_NULL" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): IdentifiersParser.g:68:4: Decision can match input such as "LPAREN KW_NULL NOTEQUAL" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): IdentifiersParser.g:68:4: Decision can match input such as "LPAREN KW_NOT KW_STRUCT" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): IdentifiersParser.g:68:4: Decision can match input such as "LPAREN KW_NULL LESSTHANOREQUALTO" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): IdentifiersParser.g:68:4: Decision can match input such as "LPAREN KW_CASE KW_FALSE" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): IdentifiersParser.g:68:4: Decision can match input such as "LPAREN KW_CASE Number" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): IdentifiersParser.g:68:4: Decision can match input such as "LPAREN KW_CASE KW_TRUE" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): IdentifiersParser.g:68:4: Decision can match input such as "LPAREN KW_CASE BigintLiteral" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): IdentifiersParser.g:68:4: Decision can match input such as "LPAREN KW_CASE SmallintLiteral" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input 
warning(200): IdentifiersParser.g:68:4: Decision can match input such as "LPAREN KW_CASE KW_NULL" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): IdentifiersParser.g:68:4: Decision can match input such as "LPAREN KW_NULL AMPERSAND" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): IdentifiersParser.g:68:4: Decision can match input such as "LPAREN KW_CASE KW_DATE" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): IdentifiersParser.g:68:4: Decision can match input such as "LPAREN LPAREN KW_DATE" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): IdentifiersParser.g:68:4: Decision can match input such as "LPAREN KW_NULL BITWISEXOR" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): IdentifiersParser.g:68:4: Decision can match input such as "LPAREN KW_NULL LESSTHAN" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): IdentifiersParser.g:68:4: Decision can match input such as "LPAREN KW_NOT SmallintLiteral" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): IdentifiersParser.g:68:4: Decision can match input such as "LPAREN KW_NOT KW_ARRAY" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): IdentifiersParser.g:68:4: Decision can match input such as "LPAREN KW_CASE StringLiteral" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): IdentifiersParser.g:68:4: Decision can match input such as "LPAREN KW_DATE StringLiteral" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): IdentifiersParser.g:68:4: Decision can match input 
such as "LPAREN KW_NULL GREATERTHANOREQUALTO" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): IdentifiersParser.g:68:4: Decision can match input such as "LPAREN KW_NOT TinyintLiteral" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): IdentifiersParser.g:68:4: Decision can match input such as "LPAREN KW_CASE TinyintLiteral" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): IdentifiersParser.g:68:4: Decision can match input such as "LPAREN KW_NULL GREATERTHAN" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): IdentifiersParser.g:68:4: Decision can match input such as "LPAREN KW_NOT DecimalLiteral" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): IdentifiersParser.g:68:4: Decision can match input such as "LPAREN KW_CASE DecimalLiteral" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): IdentifiersParser.g:108:5: Decision can match input such as "KW_ORDER KW_BY LPAREN" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): IdentifiersParser.g:121:5: Decision can match input such as "KW_CLUSTER KW_BY LPAREN" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): IdentifiersParser.g:133:5: Decision can match input such as "KW_PARTITION KW_BY LPAREN" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): IdentifiersParser.g:144:5: Decision can match input such as "KW_DISTRIBUTE KW_BY LPAREN" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): IdentifiersParser.g:155:5: Decision can match input such as "KW_SORT KW_BY LPAREN" 
using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): IdentifiersParser.g:172:7: Decision can match input such as "STAR" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): IdentifiersParser.g:185:5: Decision can match input such as "KW_ARRAY" using multiple alternatives: 2, 6 As a result, alternative(s) 6 were disabled for that input warning(200): IdentifiersParser.g:185:5: Decision can match input such as "KW_UNIONTYPE" using multiple alternatives: 5, 6 As a result, alternative(s) 6 were disabled for that input warning(200): IdentifiersParser.g:185:5: Decision can match input such as "KW_STRUCT" using multiple alternatives: 4, 6 As a result, alternative(s) 6 were disabled for that input warning(200): IdentifiersParser.g:267:5: Decision can match input such as "KW_NULL" using multiple alternatives: 1, 8 As a result, alternative(s) 8 were disabled for that input warning(200): IdentifiersParser.g:267:5: Decision can match input such as "KW_FALSE" using multiple alternatives: 3, 8 As a result, alternative(s) 8 were disabled for that input warning(200): IdentifiersParser.g:267:5: Decision can match input such as "KW_TRUE" using multiple alternatives: 3, 8 As a result, alternative(s) 8 were disabled for that input warning(200): IdentifiersParser.g:267:5: Decision can match input such as "KW_DATE StringLiteral" using multiple alternatives: 2, 3 As a result, alternative(s) 3 were disabled for that input warning(200): IdentifiersParser.g:399:5: Decision can match input such as "{KW_LIKE, KW_REGEXP, KW_RLIKE} KW_SORT KW_BY" using multiple alternatives: 2, 9 As a result, alternative(s) 9 were disabled for that input warning(200): IdentifiersParser.g:399:5: Decision can match input such as "{KW_LIKE, KW_REGEXP, KW_RLIKE} KW_GROUP KW_BY" using multiple alternatives: 2, 9 As a result, alternative(s) 9 were disabled for that input warning(200): IdentifiersParser.g:399:5: 
Decision can match input such as "{KW_LIKE, KW_REGEXP, KW_RLIKE} KW_INSERT KW_OVERWRITE" using multiple alternatives: 2, 9 As a result, alternative(s) 9 were disabled for that input warning(200): IdentifiersParser.g:399:5: Decision can match input such as "{KW_LIKE, KW_REGEXP, KW_RLIKE} KW_MAP LPAREN" using multiple alternatives: 2, 9 As a result, alternative(s) 9 were disabled for that input warning(200): IdentifiersParser.g:399:5: Decision can match input such as "{KW_LIKE, KW_REGEXP, KW_RLIKE} KW_ORDER KW_BY" using multiple alternatives: 2, 9 As a result, alternative(s) 9 were disabled for that input warning(200): IdentifiersParser.g:399:5: Decision can match input such as "KW_BETWEEN KW_MAP LPAREN" using multiple alternatives: 8, 9 As a result, alternative(s) 9 were disabled for that input warning(200): IdentifiersParser.g:399:5: Decision can match input such as "{KW_LIKE, KW_REGEXP, KW_RLIKE} KW_LATERAL KW_VIEW" using multiple alternatives: 2, 9 As a result, alternative(s) 9 were disabled for that input warning(200): IdentifiersParser.g:399:5: Decision can match input such as "{KW_LIKE, KW_REGEXP, KW_RLIKE} KW_CLUSTER KW_BY" using multiple alternatives: 2, 9 As a result, alternative(s) 9 were disabled for that input warning(200): IdentifiersParser.g:399:5: Decision can match input such as "{KW_LIKE, KW_REGEXP, KW_RLIKE} KW_INSERT KW_INTO" using multiple alternatives: 2, 9 As a result, alternative(s) 9 were disabled for that input warning(200): IdentifiersParser.g:399:5: Decision can match input such as "{KW_LIKE, KW_REGEXP, KW_RLIKE} KW_DISTRIBUTE KW_BY" using multiple alternatives: 2, 9 As a result, alternative(s) 9 were disabled for that input warning(200): IdentifiersParser.g:524:5: Decision can match input such as "{AMPERSAND..BITWISEXOR, DIV..DIVIDE, EQUAL..EQUAL_NS, GREATERTHAN..GREATERTHANOREQUALTO, KW_AND, KW_ARRAY, KW_BETWEEN..KW_BOOLEAN, KW_CASE, KW_DOUBLE, KW_FLOAT, KW_IF, KW_IN, KW_INT, KW_LIKE, KW_MAP, KW_NOT, KW_OR, KW_REGEXP, KW_RLIKE, 
KW_SMALLINT, KW_STRING..KW_STRUCT, KW_TINYINT, KW_UNIONTYPE, KW_WHEN, LESSTHAN..LESSTHANOREQUALTO, MINUS..NOTEQUAL, PLUS, STAR, TILDE}" using multiple alternatives: 1, 3 As a result, alternative(s) 3 were disabled for that input [INFO] [INFO] --- maven-resources-plugin:2.5:resources (default-resources) @ hive-exec --- [debug] execute contextualize [INFO] Using 'UTF-8' encoding to copy filtered resources. [INFO] Copying 1 resource [INFO] [INFO] --- maven-antrun-plugin:1.7:run (define-classpath) @ hive-exec --- [INFO] Executing tasks main: [INFO] Executed tasks [INFO] [INFO] --- maven-compiler-plugin:3.1:compile (default-compile) @ hive-exec --- [INFO] Compiling 1398 source files to /data/hive-ptest/working/apache-svn-trunk-source/ql/target/classes [INFO] ------------------------------------------------------------- [WARNING] COMPILATION WARNING : [INFO] ------------------------------------------------------------- [WARNING] Note: Some input files use or override a deprecated API. [WARNING] Note: Recompile with -Xlint:deprecation for details. [WARNING] Note: Some input files use unchecked or unsafe operations. [WARNING] Note: Recompile with -Xlint:unchecked for details. [INFO] 4 warnings [INFO] ------------------------------------------------------------- [INFO] ------------------------------------------------------------- [ERROR] COMPILATION ERROR : [INFO] ------------------------------------------------------------- [ERROR] /data/hive-ptest/working/apache-svn-trunk-source/ql/src/java/org/apache/hadoop/hive/ql/plan/SkewContext.java:[118,49] cannot find symbol symbol : method getRandom() location: class org.apache.hadoop.hive.ql.io.HiveKey [INFO] 1 error [INFO] ------------------------------------------------------------- [INFO] ------------------------------------------------------------------------ [INFO] Reactor Summary: [INFO] [INFO] Hive .............................................. SUCCESS [4.600s] [INFO] Hive Ant Utilities ................................ 
SUCCESS [6.658s] [INFO] Hive Shims Common ................................. SUCCESS [3.301s] [INFO] Hive Shims 0.20 ................................... SUCCESS [2.425s] [INFO] Hive Shims Secure Common .......................... SUCCESS [2.869s] [INFO] Hive Shims 0.20S .................................. SUCCESS [1.419s] [INFO] Hive Shims 0.23 ................................... SUCCESS [3.040s] [INFO] Hive Shims ........................................ SUCCESS [3.607s] [INFO] Hive Common ....................................... SUCCESS [6.736s] [INFO] Hive Serde ........................................ SUCCESS [15.748s] [INFO] Hive Metastore .................................... SUCCESS [25.108s] [INFO] Hive Query Language ............................... FAILURE [28.689s] [INFO] Hive Service ...................................... SKIPPED [INFO] Hive JDBC ......................................... SKIPPED [INFO] Hive Beeline ...................................... SKIPPED [INFO] Hive CLI .......................................... SKIPPED [INFO] Hive Contrib ...................................... SKIPPED [INFO] Hive HBase Handler ................................ SKIPPED [INFO] Hive HCatalog ..................................... SKIPPED [INFO] Hive HCatalog Core ................................ SKIPPED [INFO] Hive HCatalog Pig Adapter ......................... SKIPPED [INFO] Hive HCatalog Server Extensions ................... SKIPPED [INFO] Hive HCatalog Webhcat Java Client ................. SKIPPED [INFO] Hive HCatalog Webhcat ............................. SKIPPED [INFO] Hive HCatalog HBase Storage Handler ............... SKIPPED [INFO] Hive HWI .......................................... SKIPPED [INFO] Hive ODBC ......................................... SKIPPED [INFO] Hive Shims Aggregator ............................. SKIPPED [INFO] Hive TestUtils .................................... SKIPPED [INFO] Hive Packaging .................................... 
SKIPPED [INFO] ------------------------------------------------------------------------ [INFO] BUILD FAILURE [INFO] ------------------------------------------------------------------------ [INFO] Total time: 1:47.048s [INFO] Finished at: Mon Nov 25 06:41:27 EST 2013 [INFO] Final Memory: 52M/379M [INFO] ------------------------------------------------------------------------ [ERROR] Failed to execute goal org.apache.maven.plugins:maven-compiler-plugin:3.1:compile (default-compile) on project hive-exec: Compilation failure [ERROR] /data/hive-ptest/working/apache-svn-trunk-source/ql/src/java/org/apache/hadoop/hive/ql/plan/SkewContext.java:[118,49] cannot find symbol [ERROR] symbol : method getRandom() [ERROR] location: class org.apache.hadoop.hive.ql.io.HiveKey [ERROR] -> [Help 1] [ERROR] [ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch. [ERROR] Re-run Maven using the -X switch to enable full debug logging. [ERROR] [ERROR] For more information about the errors and possible solutions, please read the following articles: [ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException [ERROR] [ERROR] After correcting the problems, you can resume the build with the command [ERROR] mvn <goals> -rf :hive-exec + exit 1 ' This message is automatically generated. ATTACHMENT ID: 12615542
        Navis added a comment -

        Resubmitting to run test.

        Hive QA added a comment -

        Overall: -1 no tests executed

        Here are the results of testing the latest attachment:
        https://issues.apache.org/jira/secure/attachment/12615766/HIVE-3286.12.patch.txt

        Test results: http://bigtop01.cloudera.org:8080/job/PreCommit-HIVE-Build/449/testReport
        Console output: http://bigtop01.cloudera.org:8080/job/PreCommit-HIVE-Build/449/console

        Messages:

        Executing org.apache.hive.ptest.execution.PrepPhase
        Tests failed with: NonZeroExitCodeException: Command 'bash /data/hive-ptest/working/scratch/source-prep.sh' failed with exit status 1 and output '+ [[ -n '' ]]
        + export 'ANT_OPTS=-Xmx1g -XX:MaxPermSize=256m -Dhttp.proxyHost=localhost -Dhttp.proxyPort=3128'
        + ANT_OPTS='-Xmx1g -XX:MaxPermSize=256m -Dhttp.proxyHost=localhost -Dhttp.proxyPort=3128'
        + export 'M2_OPTS=-Xmx1g -XX:MaxPermSize=256m -Dhttp.proxyHost=localhost -Dhttp.proxyPort=3128'
        + M2_OPTS='-Xmx1g -XX:MaxPermSize=256m -Dhttp.proxyHost=localhost -Dhttp.proxyPort=3128'
        + cd /data/hive-ptest/working/
        + tee /data/hive-ptest/logs/PreCommit-HIVE-Build-449/source-prep.txt
        + [[ false == \t\r\u\e ]]
        + mkdir -p maven ivy
        + [[ svn = \s\v\n ]]
        + [[ -n '' ]]
        + [[ -d apache-svn-trunk-source ]]
        + [[ ! -d apache-svn-trunk-source/.svn ]]
        + [[ ! -d apache-svn-trunk-source ]]
        + cd apache-svn-trunk-source
        + svn revert -R .
        ++ awk '{print $2}'
        ++ egrep -v '^X|^Performing status on external'
        ++ svn status --no-ignore
        + rm -rf
        + svn update
        A    common/src/test/org/apache/hadoop/hive/conf/TestHiveConfRestrictList.java
        U    common/src/java/org/apache/hadoop/hive/conf/HiveConf.java
        
        Fetching external item into 'hcatalog/src/test/e2e/harness'
        Updated external to revision 1545762.
        
        Updated to revision 1545762.
        + patchCommandPath=/data/hive-ptest/working/scratch/smart-apply-patch.sh
        + patchFilePath=/data/hive-ptest/working/scratch/build.patch
        + [[ -f /data/hive-ptest/working/scratch/build.patch ]]
        + chmod +x /data/hive-ptest/working/scratch/smart-apply-patch.sh
        + /data/hive-ptest/working/scratch/smart-apply-patch.sh /data/hive-ptest/working/scratch/build.patch
        Going to apply patch with: patch -p0
        patching file common/src/java/org/apache/hadoop/hive/conf/HiveConf.java
        patching file ql/src/java/org/apache/hadoop/hive/ql/ErrorMsg.java
        patching file ql/src/java/org/apache/hadoop/hive/ql/exec/Operator.java
        patching file ql/src/java/org/apache/hadoop/hive/ql/exec/ReduceSinkOperator.java
        patching file ql/src/java/org/apache/hadoop/hive/ql/exec/mr/ExecDriver.java
        patching file ql/src/java/org/apache/hadoop/hive/ql/exec/vector/VectorReduceSinkOperator.java
        patching file ql/src/java/org/apache/hadoop/hive/ql/io/HiveKey.java
        patching file ql/src/java/org/apache/hadoop/hive/ql/io/SkewedKeyPartitioner.java
        patching file ql/src/java/org/apache/hadoop/hive/ql/optimizer/InlineSkewJoinOptimizer.java
        patching file ql/src/java/org/apache/hadoop/hive/ql/optimizer/JoinReorder.java
        patching file ql/src/java/org/apache/hadoop/hive/ql/optimizer/Optimizer.java
        patching file ql/src/java/org/apache/hadoop/hive/ql/parse/FromClauseParser.g
        patching file ql/src/java/org/apache/hadoop/hive/ql/parse/HiveParser.g
        patching file ql/src/java/org/apache/hadoop/hive/ql/parse/QBJoinTree.java
        patching file ql/src/java/org/apache/hadoop/hive/ql/parse/SemanticAnalyzer.java
        patching file ql/src/java/org/apache/hadoop/hive/ql/plan/MapWork.java
        patching file ql/src/java/org/apache/hadoop/hive/ql/plan/ReduceSinkDesc.java
        patching file ql/src/java/org/apache/hadoop/hive/ql/plan/SkewContext.java
        patching file ql/src/java/org/apache/hadoop/hive/ql/udf/generic/NumericHistogram.java
        patching file ql/src/test/queries/clientnegative/skewjoin_explicit_invalid1.q
        patching file ql/src/test/queries/clientnegative/skewjoin_explicit_invalid2.q
        patching file ql/src/test/queries/clientnegative/skewjoin_explicit_invalid3.q
        patching file ql/src/test/queries/clientnegative/skewjoin_explicit_invalid4.q
        patching file ql/src/test/queries/clientpositive/skewjoin_explicit.q
        patching file ql/src/test/results/clientnegative/skewjoin_explicit_invalid1.q.out
        patching file ql/src/test/results/clientnegative/skewjoin_explicit_invalid2.q.out
        patching file ql/src/test/results/clientnegative/skewjoin_explicit_invalid3.q.out
        patching file ql/src/test/results/clientnegative/skewjoin_explicit_invalid4.q.out
        patching file ql/src/test/results/clientpositive/skewjoin_explicit.q.out
        + [[ maven == \m\a\v\e\n ]]
        + rm -rf /data/hive-ptest/working/maven/org/apache/hive
        + mvn -B clean install -DskipTests -Dmaven.repo.local=/data/hive-ptest/working/maven
        [INFO] Scanning for projects...
        [INFO] ------------------------------------------------------------------------
        [INFO] Reactor Build Order:
        [INFO] 
        [INFO] Hive
        [INFO] Hive Ant Utilities
        [INFO] Hive Shims Common
        [INFO] Hive Shims 0.20
        [INFO] Hive Shims Secure Common
        [INFO] Hive Shims 0.20S
        [INFO] Hive Shims 0.23
        [INFO] Hive Shims
        [INFO] Hive Common
        [INFO] Hive Serde
        [INFO] Hive Metastore
        [INFO] Hive Query Language
        [INFO] Hive Service
        [INFO] Hive JDBC
        [INFO] Hive Beeline
        [INFO] Hive CLI
        [INFO] Hive Contrib
        [INFO] Hive HBase Handler
        [INFO] Hive HCatalog
        [INFO] Hive HCatalog Core
        [INFO] Hive HCatalog Pig Adapter
        [INFO] Hive HCatalog Server Extensions
        [INFO] Hive HCatalog Webhcat Java Client
        [INFO] Hive HCatalog Webhcat
        [INFO] Hive HCatalog HBase Storage Handler
        [INFO] Hive HWI
        [INFO] Hive ODBC
        [INFO] Hive Shims Aggregator
        [INFO] Hive TestUtils
        [INFO] Hive Packaging
        [INFO]                                                                         
        [INFO] ------------------------------------------------------------------------
        [INFO] Building Hive 0.13.0-SNAPSHOT
        [INFO] ------------------------------------------------------------------------
        [INFO] 
        [INFO] --- maven-clean-plugin:2.5:clean (default-clean) @ hive ---
        [INFO] Deleting /data/hive-ptest/working/apache-svn-trunk-source (includes = [datanucleus.log, derby.log], excludes = [])
        [INFO] 
        [INFO] --- maven-antrun-plugin:1.7:run (define-classpath) @ hive ---
        [INFO] Executing tasks
        
        main:
        [INFO] Executed tasks
        [INFO] 
        [INFO] --- maven-antrun-plugin:1.7:run (setup-test-dirs) @ hive ---
        [INFO] Executing tasks
        
        main:
            [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/target/tmp
            [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/target/warehouse
            [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/target/tmp/conf
             [copy] Copying 4 files to /data/hive-ptest/working/apache-svn-trunk-source/target/tmp/conf
        [INFO] Executed tasks
        [INFO] 
        [INFO] --- maven-install-plugin:2.4:install (default-install) @ hive ---
        [INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/pom.xml to /data/hive-ptest/working/maven/org/apache/hive/hive/0.13.0-SNAPSHOT/hive-0.13.0-SNAPSHOT.pom
        [INFO]                                                                         
        [INFO] ------------------------------------------------------------------------
        [INFO] Building Hive Ant Utilities 0.13.0-SNAPSHOT
        [INFO] ------------------------------------------------------------------------
        [INFO] 
        [INFO] --- maven-clean-plugin:2.5:clean (default-clean) @ hive-ant ---
        [INFO] Deleting /data/hive-ptest/working/apache-svn-trunk-source/ant (includes = [datanucleus.log, derby.log], excludes = [])
        [INFO] 
        [INFO] --- maven-resources-plugin:2.5:resources (default-resources) @ hive-ant ---
        [debug] execute contextualize
        [INFO] Using 'UTF-8' encoding to copy filtered resources.
        [INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-trunk-source/ant/src/main/resources
        [INFO] 
        [INFO] --- maven-antrun-plugin:1.7:run (define-classpath) @ hive-ant ---
        [INFO] Executing tasks
        
        main:
        [INFO] Executed tasks
        [INFO] 
        [INFO] --- maven-compiler-plugin:3.1:compile (default-compile) @ hive-ant ---
        [INFO] Compiling 5 source files to /data/hive-ptest/working/apache-svn-trunk-source/ant/target/classes
        [WARNING] Note: /data/hive-ptest/working/apache-svn-trunk-source/ant/src/org/apache/hadoop/hive/ant/QTestGenTask.java uses or overrides a deprecated API.
        [WARNING] Note: Recompile with -Xlint:deprecation for details.
        [WARNING] Note: /data/hive-ptest/working/apache-svn-trunk-source/ant/src/org/apache/hadoop/hive/ant/DistinctElementsClassPath.java uses unchecked or unsafe operations.
        [WARNING] Note: Recompile with -Xlint:unchecked for details.
        [INFO] 
        [INFO] --- maven-resources-plugin:2.5:testResources (default-testResources) @ hive-ant ---
        [debug] execute contextualize
        [INFO] Using 'UTF-8' encoding to copy filtered resources.
        [INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-trunk-source/ant/src/test/resources
        [INFO] 
        [INFO] --- maven-antrun-plugin:1.7:run (setup-test-dirs) @ hive-ant ---
        [INFO] Executing tasks
        
        main:
            [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/ant/target/tmp
            [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/ant/target/warehouse
            [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/ant/target/tmp/conf
             [copy] Copying 4 files to /data/hive-ptest/working/apache-svn-trunk-source/ant/target/tmp/conf
        [INFO] Executed tasks
        [INFO] 
        [INFO] --- maven-compiler-plugin:3.1:testCompile (default-testCompile) @ hive-ant ---
        [INFO] No sources to compile
        [INFO] 
        [INFO] --- maven-surefire-plugin:2.16:test (default-test) @ hive-ant ---
        [INFO] Tests are skipped.
        [INFO] 
        [INFO] --- maven-jar-plugin:2.2:jar (default-jar) @ hive-ant ---
        [INFO] Building jar: /data/hive-ptest/working/apache-svn-trunk-source/ant/target/hive-ant-0.13.0-SNAPSHOT.jar
        [INFO] 
        [INFO] --- maven-install-plugin:2.4:install (default-install) @ hive-ant ---
        [INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/ant/target/hive-ant-0.13.0-SNAPSHOT.jar to /data/hive-ptest/working/maven/org/apache/hive/hive-ant/0.13.0-SNAPSHOT/hive-ant-0.13.0-SNAPSHOT.jar
        [INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/ant/pom.xml to /data/hive-ptest/working/maven/org/apache/hive/hive-ant/0.13.0-SNAPSHOT/hive-ant-0.13.0-SNAPSHOT.pom
        [INFO]                                                                         
        [INFO] ------------------------------------------------------------------------
        [INFO] Building Hive Shims Common 0.13.0-SNAPSHOT
        [INFO] ------------------------------------------------------------------------
        [INFO] 
        [INFO] --- maven-clean-plugin:2.5:clean (default-clean) @ hive-shims-common ---
        [INFO] Deleting /data/hive-ptest/working/apache-svn-trunk-source/shims/common (includes = [datanucleus.log, derby.log], excludes = [])
        [INFO] 
        [INFO] --- maven-resources-plugin:2.5:resources (default-resources) @ hive-shims-common ---
        [debug] execute contextualize
        [INFO] Using 'UTF-8' encoding to copy filtered resources.
        [INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-trunk-source/shims/common/src/main/resources
        [INFO] 
        [INFO] --- maven-antrun-plugin:1.7:run (define-classpath) @ hive-shims-common ---
        [INFO] Executing tasks
        
        main:
        [INFO] Executed tasks
        [INFO] 
        [INFO] --- maven-compiler-plugin:3.1:compile (default-compile) @ hive-shims-common ---
        [INFO] Compiling 15 source files to /data/hive-ptest/working/apache-svn-trunk-source/shims/common/target/classes
        [WARNING] Note: Some input files use or override a deprecated API.
        [WARNING] Note: Recompile with -Xlint:deprecation for details.
        [INFO] 
        [INFO] --- maven-resources-plugin:2.5:testResources (default-testResources) @ hive-shims-common ---
        [debug] execute contextualize
        [INFO] Using 'UTF-8' encoding to copy filtered resources.
        [INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-trunk-source/shims/common/src/test/resources
        [INFO] 
        [INFO] --- maven-antrun-plugin:1.7:run (setup-test-dirs) @ hive-shims-common ---
        [INFO] Executing tasks
        
        main:
            [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/shims/common/target/tmp
            [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/shims/common/target/warehouse
            [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/shims/common/target/tmp/conf
             [copy] Copying 4 files to /data/hive-ptest/working/apache-svn-trunk-source/shims/common/target/tmp/conf
        [INFO] Executed tasks
        [INFO] 
        [INFO] --- maven-compiler-plugin:3.1:testCompile (default-testCompile) @ hive-shims-common ---
        [INFO] No sources to compile
        [INFO] 
        [INFO] --- maven-surefire-plugin:2.16:test (default-test) @ hive-shims-common ---
        [INFO] Tests are skipped.
        [INFO] 
        [INFO] --- maven-jar-plugin:2.2:jar (default-jar) @ hive-shims-common ---
        [INFO] Building jar: /data/hive-ptest/working/apache-svn-trunk-source/shims/common/target/hive-shims-common-0.13.0-SNAPSHOT.jar
        [INFO] 
        [INFO] --- maven-install-plugin:2.4:install (default-install) @ hive-shims-common ---
        [INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/shims/common/target/hive-shims-common-0.13.0-SNAPSHOT.jar to /data/hive-ptest/working/maven/org/apache/hive/shims/hive-shims-common/0.13.0-SNAPSHOT/hive-shims-common-0.13.0-SNAPSHOT.jar
        [INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/shims/common/pom.xml to /data/hive-ptest/working/maven/org/apache/hive/shims/hive-shims-common/0.13.0-SNAPSHOT/hive-shims-common-0.13.0-SNAPSHOT.pom
        [INFO]                                                                         
        [INFO] ------------------------------------------------------------------------
        [INFO] Building Hive Shims 0.20 0.13.0-SNAPSHOT
        [INFO] ------------------------------------------------------------------------
        [INFO] 
        [INFO] --- maven-clean-plugin:2.5:clean (default-clean) @ hive-shims-0.20 ---
        [INFO] Deleting /data/hive-ptest/working/apache-svn-trunk-source/shims/0.20 (includes = [datanucleus.log, derby.log], excludes = [])
        [INFO] 
        [INFO] --- maven-resources-plugin:2.5:resources (default-resources) @ hive-shims-0.20 ---
        [debug] execute contextualize
        [INFO] Using 'UTF-8' encoding to copy filtered resources.
        [INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-trunk-source/shims/0.20/src/main/resources
        [INFO] 
        [INFO] --- maven-antrun-plugin:1.7:run (define-classpath) @ hive-shims-0.20 ---
        [INFO] Executing tasks
        
        main:
        [INFO] Executed tasks
        [INFO] 
        [INFO] --- maven-compiler-plugin:3.1:compile (default-compile) @ hive-shims-0.20 ---
        [INFO] Compiling 2 source files to /data/hive-ptest/working/apache-svn-trunk-source/shims/0.20/target/classes
        [WARNING] Note: /data/hive-ptest/working/apache-svn-trunk-source/shims/0.20/src/main/java/org/apache/hadoop/hive/shims/Hadoop20Shims.java uses or overrides a deprecated API.
        [WARNING] Note: Recompile with -Xlint:deprecation for details.
        [WARNING] Note: /data/hive-ptest/working/apache-svn-trunk-source/shims/0.20/src/main/java/org/apache/hadoop/hive/shims/Hadoop20Shims.java uses unchecked or unsafe operations.
        [WARNING] Note: Recompile with -Xlint:unchecked for details.
        [INFO] 
        [INFO] --- maven-resources-plugin:2.5:testResources (default-testResources) @ hive-shims-0.20 ---
        [debug] execute contextualize
        [INFO] Using 'UTF-8' encoding to copy filtered resources.
        [INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-trunk-source/shims/0.20/src/test/resources
        [INFO] 
        [INFO] --- maven-antrun-plugin:1.7:run (setup-test-dirs) @ hive-shims-0.20 ---
        [INFO] Executing tasks
        
        main:
            [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/shims/0.20/target/tmp
            [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/shims/0.20/target/warehouse
            [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/shims/0.20/target/tmp/conf
             [copy] Copying 4 files to /data/hive-ptest/working/apache-svn-trunk-source/shims/0.20/target/tmp/conf
        [INFO] Executed tasks
        [INFO] 
        [INFO] --- maven-compiler-plugin:3.1:testCompile (default-testCompile) @ hive-shims-0.20 ---
        [INFO] No sources to compile
        [INFO] 
        [INFO] --- maven-surefire-plugin:2.16:test (default-test) @ hive-shims-0.20 ---
        [INFO] Tests are skipped.
        [INFO] 
        [INFO] --- maven-jar-plugin:2.2:jar (default-jar) @ hive-shims-0.20 ---
        [INFO] Building jar: /data/hive-ptest/working/apache-svn-trunk-source/shims/0.20/target/hive-shims-0.20-0.13.0-SNAPSHOT.jar
        [INFO] 
        [INFO] --- maven-install-plugin:2.4:install (default-install) @ hive-shims-0.20 ---
        [INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/shims/0.20/target/hive-shims-0.20-0.13.0-SNAPSHOT.jar to /data/hive-ptest/working/maven/org/apache/hive/shims/hive-shims-0.20/0.13.0-SNAPSHOT/hive-shims-0.20-0.13.0-SNAPSHOT.jar
        [INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/shims/0.20/pom.xml to /data/hive-ptest/working/maven/org/apache/hive/shims/hive-shims-0.20/0.13.0-SNAPSHOT/hive-shims-0.20-0.13.0-SNAPSHOT.pom
        [INFO]                                                                         
        [INFO] ------------------------------------------------------------------------
        [INFO] Building Hive Shims Secure Common 0.13.0-SNAPSHOT
        [INFO] ------------------------------------------------------------------------
        [INFO] 
        [INFO] --- maven-clean-plugin:2.5:clean (default-clean) @ hive-shims-common-secure ---
        [INFO] Deleting /data/hive-ptest/working/apache-svn-trunk-source/shims/common-secure (includes = [datanucleus.log, derby.log], excludes = [])
        [INFO] 
        [INFO] --- maven-resources-plugin:2.5:resources (default-resources) @ hive-shims-common-secure ---
        [debug] execute contextualize
        [INFO] Using 'UTF-8' encoding to copy filtered resources.
        [INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-trunk-source/shims/common-secure/src/main/resources
        [INFO] 
        [INFO] --- maven-antrun-plugin:1.7:run (define-classpath) @ hive-shims-common-secure ---
        [INFO] Executing tasks
        
        main:
        [INFO] Executed tasks
        [INFO] 
        [INFO] --- maven-compiler-plugin:3.1:compile (default-compile) @ hive-shims-common-secure ---
        [INFO] Compiling 12 source files to /data/hive-ptest/working/apache-svn-trunk-source/shims/common-secure/target/classes
        [WARNING] Note: /data/hive-ptest/working/apache-svn-trunk-source/shims/common-secure/src/main/java/org/apache/hadoop/hive/shims/HadoopShimsSecure.java uses or overrides a deprecated API.
        [WARNING] Note: Recompile with -Xlint:deprecation for details.
        [WARNING] Note: Some input files use unchecked or unsafe operations.
        [WARNING] Note: Recompile with -Xlint:unchecked for details.
        [INFO] 
        [INFO] --- maven-resources-plugin:2.5:testResources (default-testResources) @ hive-shims-common-secure ---
        [debug] execute contextualize
        [INFO] Using 'UTF-8' encoding to copy filtered resources.
        [INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-trunk-source/shims/common-secure/src/test/resources
        [INFO] 
        [INFO] --- maven-antrun-plugin:1.7:run (setup-test-dirs) @ hive-shims-common-secure ---
        [INFO] Executing tasks
        
        main:
            [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/shims/common-secure/target/tmp
            [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/shims/common-secure/target/warehouse
            [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/shims/common-secure/target/tmp/conf
             [copy] Copying 4 files to /data/hive-ptest/working/apache-svn-trunk-source/shims/common-secure/target/tmp/conf
        [INFO] Executed tasks
        [INFO] 
        [INFO] --- maven-compiler-plugin:3.1:testCompile (default-testCompile) @ hive-shims-common-secure ---
        [INFO] No sources to compile
        [INFO] 
        [INFO] --- maven-surefire-plugin:2.16:test (default-test) @ hive-shims-common-secure ---
        [INFO] Tests are skipped.
        [INFO] 
        [INFO] --- maven-jar-plugin:2.2:jar (default-jar) @ hive-shims-common-secure ---
        [INFO] Building jar: /data/hive-ptest/working/apache-svn-trunk-source/shims/common-secure/target/hive-shims-common-secure-0.13.0-SNAPSHOT.jar
        [INFO] 
        [INFO] --- maven-install-plugin:2.4:install (default-install) @ hive-shims-common-secure ---
        [INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/shims/common-secure/target/hive-shims-common-secure-0.13.0-SNAPSHOT.jar to /data/hive-ptest/working/maven/org/apache/hive/shims/hive-shims-common-secure/0.13.0-SNAPSHOT/hive-shims-common-secure-0.13.0-SNAPSHOT.jar
        [INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/shims/common-secure/pom.xml to /data/hive-ptest/working/maven/org/apache/hive/shims/hive-shims-common-secure/0.13.0-SNAPSHOT/hive-shims-common-secure-0.13.0-SNAPSHOT.pom
        [INFO]                                                                         
        [INFO] ------------------------------------------------------------------------
        [INFO] Building Hive Shims 0.20S 0.13.0-SNAPSHOT
        [INFO] ------------------------------------------------------------------------
        [INFO] 
        [INFO] --- maven-clean-plugin:2.5:clean (default-clean) @ hive-shims-0.20S ---
        [INFO] Deleting /data/hive-ptest/working/apache-svn-trunk-source/shims/0.20S (includes = [datanucleus.log, derby.log], excludes = [])
        [INFO] 
        [INFO] --- maven-resources-plugin:2.5:resources (default-resources) @ hive-shims-0.20S ---
        [debug] execute contextualize
        [INFO] Using 'UTF-8' encoding to copy filtered resources.
        [INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-trunk-source/shims/0.20S/src/main/resources
        [INFO] 
        [INFO] --- maven-antrun-plugin:1.7:run (define-classpath) @ hive-shims-0.20S ---
        [INFO] Executing tasks
        
        main:
        [INFO] Executed tasks
        [INFO] 
        [INFO] --- maven-compiler-plugin:3.1:compile (default-compile) @ hive-shims-0.20S ---
        [INFO] Compiling 3 source files to /data/hive-ptest/working/apache-svn-trunk-source/shims/0.20S/target/classes
        [WARNING] Note: /data/hive-ptest/working/apache-svn-trunk-source/shims/0.20S/src/main/java/org/apache/hadoop/hive/shims/Hadoop20SShims.java uses or overrides a deprecated API.
        [WARNING] Note: Recompile with -Xlint:deprecation for details.
        [INFO] 
        [INFO] --- maven-resources-plugin:2.5:testResources (default-testResources) @ hive-shims-0.20S ---
        [debug] execute contextualize
        [INFO] Using 'UTF-8' encoding to copy filtered resources.
        [INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-trunk-source/shims/0.20S/src/test/resources
        [INFO] 
        [INFO] --- maven-antrun-plugin:1.7:run (setup-test-dirs) @ hive-shims-0.20S ---
        [INFO] Executing tasks
        
        main:
            [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/shims/0.20S/target/tmp
            [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/shims/0.20S/target/warehouse
            [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/shims/0.20S/target/tmp/conf
             [copy] Copying 4 files to /data/hive-ptest/working/apache-svn-trunk-source/shims/0.20S/target/tmp/conf
        [INFO] Executed tasks
        [INFO] 
        [INFO] --- maven-compiler-plugin:3.1:testCompile (default-testCompile) @ hive-shims-0.20S ---
        [INFO] No sources to compile
        [INFO] 
        [INFO] --- maven-surefire-plugin:2.16:test (default-test) @ hive-shims-0.20S ---
        [INFO] Tests are skipped.
        [INFO] 
        [INFO] --- maven-jar-plugin:2.2:jar (default-jar) @ hive-shims-0.20S ---
        [INFO] Building jar: /data/hive-ptest/working/apache-svn-trunk-source/shims/0.20S/target/hive-shims-0.20S-0.13.0-SNAPSHOT.jar
        [INFO] 
        [INFO] --- maven-install-plugin:2.4:install (default-install) @ hive-shims-0.20S ---
        [INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/shims/0.20S/target/hive-shims-0.20S-0.13.0-SNAPSHOT.jar to /data/hive-ptest/working/maven/org/apache/hive/shims/hive-shims-0.20S/0.13.0-SNAPSHOT/hive-shims-0.20S-0.13.0-SNAPSHOT.jar
        [INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/shims/0.20S/pom.xml to /data/hive-ptest/working/maven/org/apache/hive/shims/hive-shims-0.20S/0.13.0-SNAPSHOT/hive-shims-0.20S-0.13.0-SNAPSHOT.pom
        [INFO]                                                                         
        [INFO] ------------------------------------------------------------------------
        [INFO] Building Hive Shims 0.23 0.13.0-SNAPSHOT
        [INFO] ------------------------------------------------------------------------
        [INFO] 
        [INFO] --- maven-clean-plugin:2.5:clean (default-clean) @ hive-shims-0.23 ---
        [INFO] Deleting /data/hive-ptest/working/apache-svn-trunk-source/shims/0.23 (includes = [datanucleus.log, derby.log], excludes = [])
        [INFO] 
        [INFO] --- maven-resources-plugin:2.5:resources (default-resources) @ hive-shims-0.23 ---
        [debug] execute contextualize
        [INFO] Using 'UTF-8' encoding to copy filtered resources.
        [INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-trunk-source/shims/0.23/src/main/resources
        [INFO] 
        [INFO] --- maven-antrun-plugin:1.7:run (define-classpath) @ hive-shims-0.23 ---
        [INFO] Executing tasks
        
        main:
        [INFO] Executed tasks
        [INFO] 
        [INFO] --- maven-compiler-plugin:3.1:compile (default-compile) @ hive-shims-0.23 ---
        [INFO] Compiling 3 source files to /data/hive-ptest/working/apache-svn-trunk-source/shims/0.23/target/classes
        [WARNING] Note: /data/hive-ptest/working/apache-svn-trunk-source/shims/0.23/src/main/java/org/apache/hadoop/hive/shims/Hadoop23Shims.java uses or overrides a deprecated API.
        [WARNING] Note: Recompile with -Xlint:deprecation for details.
        [INFO] 
        [INFO] --- maven-resources-plugin:2.5:testResources (default-testResources) @ hive-shims-0.23 ---
        [debug] execute contextualize
        [INFO] Using 'UTF-8' encoding to copy filtered resources.
        [INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-trunk-source/shims/0.23/src/test/resources
        [INFO] 
        [INFO] --- maven-antrun-plugin:1.7:run (setup-test-dirs) @ hive-shims-0.23 ---
        [INFO] Executing tasks
        
        main:
            [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/shims/0.23/target/tmp
            [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/shims/0.23/target/warehouse
            [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/shims/0.23/target/tmp/conf
             [copy] Copying 4 files to /data/hive-ptest/working/apache-svn-trunk-source/shims/0.23/target/tmp/conf
        [INFO] Executed tasks
        [INFO] 
        [INFO] --- maven-compiler-plugin:3.1:testCompile (default-testCompile) @ hive-shims-0.23 ---
        [INFO] No sources to compile
        [INFO] 
        [INFO] --- maven-surefire-plugin:2.16:test (default-test) @ hive-shims-0.23 ---
        [INFO] Tests are skipped.
        [INFO] 
        [INFO] --- maven-jar-plugin:2.2:jar (default-jar) @ hive-shims-0.23 ---
        [INFO] Building jar: /data/hive-ptest/working/apache-svn-trunk-source/shims/0.23/target/hive-shims-0.23-0.13.0-SNAPSHOT.jar
        [INFO] 
        [INFO] --- maven-install-plugin:2.4:install (default-install) @ hive-shims-0.23 ---
        [INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/shims/0.23/target/hive-shims-0.23-0.13.0-SNAPSHOT.jar to /data/hive-ptest/working/maven/org/apache/hive/shims/hive-shims-0.23/0.13.0-SNAPSHOT/hive-shims-0.23-0.13.0-SNAPSHOT.jar
        [INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/shims/0.23/pom.xml to /data/hive-ptest/working/maven/org/apache/hive/shims/hive-shims-0.23/0.13.0-SNAPSHOT/hive-shims-0.23-0.13.0-SNAPSHOT.pom
        [INFO]                                                                         
        [INFO] ------------------------------------------------------------------------
        [INFO] Building Hive Shims 0.13.0-SNAPSHOT
        [INFO] ------------------------------------------------------------------------
        [INFO] 
        [INFO] --- maven-clean-plugin:2.5:clean (default-clean) @ hive-shims ---
        [INFO] Deleting /data/hive-ptest/working/apache-svn-trunk-source/shims/assembly (includes = [datanucleus.log, derby.log], excludes = [])
        [INFO] 
        [INFO] --- maven-resources-plugin:2.5:resources (default-resources) @ hive-shims ---
        [debug] execute contextualize
        [INFO] Using 'UTF-8' encoding to copy filtered resources.
        [INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-trunk-source/shims/assembly/src/main/resources
        [INFO] 
        [INFO] --- maven-antrun-plugin:1.7:run (define-classpath) @ hive-shims ---
        [INFO] Executing tasks
        
        main:
        [INFO] Executed tasks
        [INFO] 
        [INFO] --- maven-compiler-plugin:3.1:compile (default-compile) @ hive-shims ---
        [INFO] No sources to compile
        [INFO] 
        [INFO] --- maven-resources-plugin:2.5:testResources (default-testResources) @ hive-shims ---
        [debug] execute contextualize
        [INFO] Using 'UTF-8' encoding to copy filtered resources.
        [INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-trunk-source/shims/assembly/src/test/resources
        [INFO] 
        [INFO] --- maven-antrun-plugin:1.7:run (setup-test-dirs) @ hive-shims ---
        [INFO] Executing tasks
        
        main:
            [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/shims/assembly/target/tmp
            [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/shims/assembly/target/warehouse
            [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/shims/assembly/target/tmp/conf
             [copy] Copying 4 files to /data/hive-ptest/working/apache-svn-trunk-source/shims/assembly/target/tmp/conf
        [INFO] Executed tasks
        [INFO] 
        [INFO] --- maven-compiler-plugin:3.1:testCompile (default-testCompile) @ hive-shims ---
        [INFO] No sources to compile
        [INFO] 
        [INFO] --- maven-surefire-plugin:2.16:test (default-test) @ hive-shims ---
        [INFO] Tests are skipped.
        [INFO] 
        [INFO] --- maven-jar-plugin:2.2:jar (default-jar) @ hive-shims ---
        [WARNING] JAR will be empty - no content was marked for inclusion!
        [INFO] Building jar: /data/hive-ptest/working/apache-svn-trunk-source/shims/assembly/target/hive-shims-0.13.0-SNAPSHOT.jar
        [INFO] 
        [INFO] --- maven-assembly-plugin:2.3:single (uberjar) @ hive-shims ---
        [INFO] Reading assembly descriptor: src/assemble/uberjar.xml
        [WARNING] Artifact: org.apache.hive:hive-shims:jar:0.13.0-SNAPSHOT references the same file as the assembly destination file. Moving it to a temporary location for inclusion.
        [INFO] META-INF/MANIFEST.MF already added, skipping
        [INFO] META-INF/MANIFEST.MF already added, skipping
        [INFO] META-INF/MANIFEST.MF already added, skipping
        [INFO] META-INF/MANIFEST.MF already added, skipping
        [INFO] META-INF/MANIFEST.MF already added, skipping
        [INFO] Building jar: /data/hive-ptest/working/apache-svn-trunk-source/shims/assembly/target/hive-shims-0.13.0-SNAPSHOT.jar
        [INFO] META-INF/MANIFEST.MF already added, skipping
        [INFO] META-INF/MANIFEST.MF already added, skipping
        [INFO] META-INF/MANIFEST.MF already added, skipping
        [INFO] META-INF/MANIFEST.MF already added, skipping
        [INFO] META-INF/MANIFEST.MF already added, skipping
        [WARNING] Configuration options: 'appendAssemblyId' is set to false, and 'classifier' is missing.
        Instead of attaching the assembly file: /data/hive-ptest/working/apache-svn-trunk-source/shims/assembly/target/hive-shims-0.13.0-SNAPSHOT.jar, it will become the file for main project artifact.
        NOTE: If multiple descriptors or descriptor-formats are provided for this project, the value of this file will be non-deterministic!
        [WARNING] Replacing pre-existing project main-artifact file: /data/hive-ptest/working/apache-svn-trunk-source/shims/assembly/target/archive-tmp/hive-shims-0.13.0-SNAPSHOT.jar
        with assembly file: /data/hive-ptest/working/apache-svn-trunk-source/shims/assembly/target/hive-shims-0.13.0-SNAPSHOT.jar
        [INFO] 
        [INFO] --- maven-install-plugin:2.4:install (default-install) @ hive-shims ---
        [INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/shims/assembly/target/hive-shims-0.13.0-SNAPSHOT.jar to /data/hive-ptest/working/maven/org/apache/hive/hive-shims/0.13.0-SNAPSHOT/hive-shims-0.13.0-SNAPSHOT.jar
        [INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/shims/assembly/pom.xml to /data/hive-ptest/working/maven/org/apache/hive/hive-shims/0.13.0-SNAPSHOT/hive-shims-0.13.0-SNAPSHOT.pom
        [INFO]                                                                         
        [INFO] ------------------------------------------------------------------------
        [INFO] Building Hive Common 0.13.0-SNAPSHOT
        [INFO] ------------------------------------------------------------------------
        [INFO] 
        [INFO] --- maven-clean-plugin:2.5:clean (default-clean) @ hive-common ---
        [INFO] Deleting /data/hive-ptest/working/apache-svn-trunk-source/common (includes = [datanucleus.log, derby.log], excludes = [])
        [INFO] 
        [INFO] --- maven-antrun-plugin:1.7:run (generate-version-annotation) @ hive-common ---
        [INFO] Executing tasks
        
        main:
        [INFO] Executed tasks
        [INFO] 
        [INFO] --- build-helper-maven-plugin:1.8:add-source (add-source) @ hive-common ---
        [INFO] Source directory: /data/hive-ptest/working/apache-svn-trunk-source/common/src/gen added.
        [INFO] 
        [INFO] --- maven-resources-plugin:2.5:resources (default-resources) @ hive-common ---
        [debug] execute contextualize
        [INFO] Using 'UTF-8' encoding to copy filtered resources.
        [INFO] Copying 1 resource
        [INFO] 
        [INFO] --- maven-antrun-plugin:1.7:run (define-classpath) @ hive-common ---
        [INFO] Executing tasks
        
        main:
        [INFO] Executed tasks
        [INFO] 
        [INFO] --- maven-compiler-plugin:3.1:compile (default-compile) @ hive-common ---
        [INFO] Compiling 31 source files to /data/hive-ptest/working/apache-svn-trunk-source/common/target/classes
        [WARNING] Note: /data/hive-ptest/working/apache-svn-trunk-source/common/src/java/org/apache/hadoop/hive/common/ObjectPair.java uses unchecked or unsafe operations.
        [WARNING] Note: Recompile with -Xlint:unchecked for details.
        [INFO] 
        [INFO] --- maven-resources-plugin:2.5:testResources (default-testResources) @ hive-common ---
        [debug] execute contextualize
        [INFO] Using 'UTF-8' encoding to copy filtered resources.
        [INFO] Copying 4 resources
        [INFO] 
        [INFO] --- maven-antrun-plugin:1.7:run (setup-test-dirs) @ hive-common ---
        [INFO] Executing tasks
        
        main:
            [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/common/target/tmp
            [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/common/target/warehouse
            [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/common/target/tmp/conf
             [copy] Copying 4 files to /data/hive-ptest/working/apache-svn-trunk-source/common/target/tmp/conf
        [INFO] Executed tasks
        [INFO] 
        [INFO] --- maven-compiler-plugin:3.1:testCompile (default-testCompile) @ hive-common ---
        [INFO] Compiling 9 source files to /data/hive-ptest/working/apache-svn-trunk-source/common/target/test-classes
        [INFO] 
        [INFO] --- maven-surefire-plugin:2.16:test (default-test) @ hive-common ---
        [INFO] Tests are skipped.
        [INFO] 
        [INFO] --- maven-jar-plugin:2.2:jar (default-jar) @ hive-common ---
        [INFO] Building jar: /data/hive-ptest/working/apache-svn-trunk-source/common/target/hive-common-0.13.0-SNAPSHOT.jar
        [INFO] 
        [INFO] --- maven-install-plugin:2.4:install (default-install) @ hive-common ---
        [INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/common/target/hive-common-0.13.0-SNAPSHOT.jar to /data/hive-ptest/working/maven/org/apache/hive/hive-common/0.13.0-SNAPSHOT/hive-common-0.13.0-SNAPSHOT.jar
        [INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/common/pom.xml to /data/hive-ptest/working/maven/org/apache/hive/hive-common/0.13.0-SNAPSHOT/hive-common-0.13.0-SNAPSHOT.pom
        [INFO]                                                                         
        [INFO] ------------------------------------------------------------------------
        [INFO] Building Hive Serde 0.13.0-SNAPSHOT
        [INFO] ------------------------------------------------------------------------
        [INFO] 
        [INFO] --- maven-clean-plugin:2.5:clean (default-clean) @ hive-serde ---
        [INFO] Deleting /data/hive-ptest/working/apache-svn-trunk-source/serde (includes = [datanucleus.log, derby.log], excludes = [])
        [INFO] 
        [INFO] --- build-helper-maven-plugin:1.8:add-source (add-source) @ hive-serde ---
        [INFO] Source directory: /data/hive-ptest/working/apache-svn-trunk-source/serde/src/gen/protobuf/gen-java added.
        [INFO] Source directory: /data/hive-ptest/working/apache-svn-trunk-source/serde/src/gen/thrift/gen-javabean added.
        [INFO] 
        [INFO] --- maven-resources-plugin:2.5:resources (default-resources) @ hive-serde ---
        [debug] execute contextualize
        [INFO] Using 'UTF-8' encoding to copy filtered resources.
        [INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-trunk-source/serde/src/main/resources
        [INFO] 
        [INFO] --- maven-antrun-plugin:1.7:run (define-classpath) @ hive-serde ---
        [INFO] Executing tasks
        
        main:
        [INFO] Executed tasks
        [INFO] 
        [INFO] --- maven-compiler-plugin:3.1:compile (default-compile) @ hive-serde ---
        [INFO] Compiling 351 source files to /data/hive-ptest/working/apache-svn-trunk-source/serde/target/classes
        [WARNING] Note: Some input files use or override a deprecated API.
        [WARNING] Note: Recompile with -Xlint:deprecation for details.
        [WARNING] Note: Some input files use unchecked or unsafe operations.
        [WARNING] Note: Recompile with -Xlint:unchecked for details.
        [INFO] 
        [INFO] --- maven-resources-plugin:2.5:testResources (default-testResources) @ hive-serde ---
        [debug] execute contextualize
        [INFO] Using 'UTF-8' encoding to copy filtered resources.
        [INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-trunk-source/serde/src/test/resources
        [INFO] 
        [INFO] --- maven-antrun-plugin:1.7:run (setup-test-dirs) @ hive-serde ---
        [INFO] Executing tasks
        
        main:
            [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/serde/target/tmp
            [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/serde/target/warehouse
            [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/serde/target/tmp/conf
             [copy] Copying 4 files to /data/hive-ptest/working/apache-svn-trunk-source/serde/target/tmp/conf
        [INFO] Executed tasks
        [INFO] 
        [INFO] --- maven-compiler-plugin:3.1:testCompile (default-testCompile) @ hive-serde ---
        [INFO] Compiling 42 source files to /data/hive-ptest/working/apache-svn-trunk-source/serde/target/test-classes
        [WARNING] Note: Some input files use or override a deprecated API.
        [WARNING] Note: Recompile with -Xlint:deprecation for details.
        [WARNING] Note: Some input files use unchecked or unsafe operations.
        [WARNING] Note: Recompile with -Xlint:unchecked for details.
        [INFO] 
        [INFO] --- maven-surefire-plugin:2.16:test (default-test) @ hive-serde ---
        [INFO] Tests are skipped.
        [INFO] 
        [INFO] --- maven-jar-plugin:2.2:jar (default-jar) @ hive-serde ---
        [INFO] Building jar: /data/hive-ptest/working/apache-svn-trunk-source/serde/target/hive-serde-0.13.0-SNAPSHOT.jar
        [INFO] 
        [INFO] --- maven-install-plugin:2.4:install (default-install) @ hive-serde ---
        [INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/serde/target/hive-serde-0.13.0-SNAPSHOT.jar to /data/hive-ptest/working/maven/org/apache/hive/hive-serde/0.13.0-SNAPSHOT/hive-serde-0.13.0-SNAPSHOT.jar
        [INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/serde/pom.xml to /data/hive-ptest/working/maven/org/apache/hive/hive-serde/0.13.0-SNAPSHOT/hive-serde-0.13.0-SNAPSHOT.pom
        [INFO]                                                                         
        [INFO] ------------------------------------------------------------------------
        [INFO] Building Hive Metastore 0.13.0-SNAPSHOT
        [INFO] ------------------------------------------------------------------------
        [INFO] 
        [INFO] --- maven-clean-plugin:2.5:clean (default-clean) @ hive-metastore ---
        [INFO] Deleting /data/hive-ptest/working/apache-svn-trunk-source/metastore (includes = [datanucleus.log, derby.log], excludes = [])
        [INFO] 
        [INFO] --- build-helper-maven-plugin:1.8:add-source (add-source) @ hive-metastore ---
        [INFO] Source directory: /data/hive-ptest/working/apache-svn-trunk-source/metastore/src/model added.
        [INFO] Source directory: /data/hive-ptest/working/apache-svn-trunk-source/metastore/src/gen/thrift/gen-javabean added.
        [INFO] 
        [INFO] --- antlr3-maven-plugin:3.4:antlr (default) @ hive-metastore ---
        [INFO] ANTLR: Processing source directory /data/hive-ptest/working/apache-svn-trunk-source/metastore/src/java
        ANTLR Parser Generator  Version 3.4
        org/apache/hadoop/hive/metastore/parser/Filter.g
        [INFO] 
        [INFO] --- maven-resources-plugin:2.5:resources (default-resources) @ hive-metastore ---
        [debug] execute contextualize
        [INFO] Using 'UTF-8' encoding to copy filtered resources.
        [INFO] Copying 1 resource
        [INFO] 
        [INFO] --- maven-antrun-plugin:1.7:run (define-classpath) @ hive-metastore ---
        [INFO] Executing tasks
        
        main:
        [INFO] Executed tasks
        [INFO] 
        [INFO] --- maven-compiler-plugin:3.1:compile (default-compile) @ hive-metastore ---
        [INFO] Compiling 132 source files to /data/hive-ptest/working/apache-svn-trunk-source/metastore/target/classes
        [WARNING] Note: Some input files use or override a deprecated API.
        [WARNING] Note: Recompile with -Xlint:deprecation for details.
        [WARNING] Note: Some input files use unchecked or unsafe operations.
        [WARNING] Note: Recompile with -Xlint:unchecked for details.
        [INFO] 
        [INFO] --- datanucleus-maven-plugin:3.3.0-release:enhance (default) @ hive-metastore ---
        [INFO] DataNucleus Enhancer (version 3.2.2) for API "JDO" using JRE "1.6"
        DataNucleus Enhancer : Classpath
        >>  /data/hive-ptest/working/maven/org/datanucleus/datanucleus-maven-plugin/3.3.0-release/datanucleus-maven-plugin-3.3.0-release.jar
        >>  /data/hive-ptest/working/maven/org/datanucleus/datanucleus-core/3.2.2/datanucleus-core-3.2.2.jar
        >>  /data/hive-ptest/working/maven/org/codehaus/plexus/plexus-utils/3.0.8/plexus-utils-3.0.8.jar
        >>  /data/hive-ptest/working/maven/org/codehaus/plexus/plexus-component-annotations/1.5.5/plexus-component-annotations-1.5.5.jar
        >>  /data/hive-ptest/working/maven/org/sonatype/sisu/sisu-inject-bean/2.3.0/sisu-inject-bean-2.3.0.jar
        >>  /data/hive-ptest/working/maven/org/sonatype/sisu/sisu-guice/3.1.0/sisu-guice-3.1.0-no_aop.jar
        >>  /data/hive-ptest/working/maven/org/sonatype/sisu/sisu-guava/0.9.9/sisu-guava-0.9.9.jar
        >>  /data/hive-ptest/working/maven/org/apache/xbean/xbean-reflect/3.4/xbean-reflect-3.4.jar
        >>  /data/hive-ptest/working/maven/log4j/log4j/1.2.12/log4j-1.2.12.jar
        >>  /data/hive-ptest/working/maven/commons-logging/commons-logging-api/1.1/commons-logging-api-1.1.jar
        >>  /data/hive-ptest/working/maven/com/google/collections/google-collections/1.0/google-collections-1.0.jar
        >>  /data/hive-ptest/working/maven/junit/junit/3.8.2/junit-3.8.2.jar
        >>  /data/hive-ptest/working/apache-svn-trunk-source/metastore/target/classes
        >>  /data/hive-ptest/working/apache-svn-trunk-source/serde/target/hive-serde-0.13.0-SNAPSHOT.jar
        >>  /data/hive-ptest/working/apache-svn-trunk-source/common/target/hive-common-0.13.0-SNAPSHOT.jar
        >>  /data/hive-ptest/working/maven/org/apache/commons/commons-compress/1.4.1/commons-compress-1.4.1.jar
        >>  /data/hive-ptest/working/maven/org/tukaani/xz/1.0/xz-1.0.jar
        >>  /data/hive-ptest/working/maven/commons-codec/commons-codec/1.4/commons-codec-1.4.jar
        >>  /data/hive-ptest/working/maven/org/apache/avro/avro/1.7.5/avro-1.7.5.jar
        >>  /data/hive-ptest/working/maven/org/codehaus/jackson/jackson-core-asl/1.9.2/jackson-core-asl-1.9.2.jar
        >>  /data/hive-ptest/working/maven/com/thoughtworks/paranamer/paranamer/2.3/paranamer-2.3.jar
        >>  /data/hive-ptest/working/maven/org/xerial/snappy/snappy-java/1.0.5/snappy-java-1.0.5.jar
        >>  /data/hive-ptest/working/apache-svn-trunk-source/shims/assembly/target/hive-shims-0.13.0-SNAPSHOT.jar
        >>  /data/hive-ptest/working/maven/com/google/guava/guava/11.0.2/guava-11.0.2.jar
        >>  /data/hive-ptest/working/maven/com/google/code/findbugs/jsr305/1.3.9/jsr305-1.3.9.jar
        >>  /data/hive-ptest/working/maven/commons-cli/commons-cli/1.2/commons-cli-1.2.jar
        >>  /data/hive-ptest/working/maven/commons-lang/commons-lang/2.4/commons-lang-2.4.jar
        >>  /data/hive-ptest/working/maven/commons-logging/commons-logging/1.1.3/commons-logging-1.1.3.jar
        >>  /data/hive-ptest/working/maven/org/apache/derby/derby/10.4.2.0/derby-10.4.2.0.jar
        >>  /data/hive-ptest/working/maven/org/datanucleus/datanucleus-api-jdo/3.2.1/datanucleus-api-jdo-3.2.1.jar
        >>  /data/hive-ptest/working/maven/org/datanucleus/datanucleus-rdbms/3.2.1/datanucleus-rdbms-3.2.1.jar
        >>  /data/hive-ptest/working/maven/javax/jdo/jdo-api/3.0.1/jdo-api-3.0.1.jar
        >>  /data/hive-ptest/working/maven/javax/transaction/jta/1.1/jta-1.1.jar
        >>  /data/hive-ptest/working/maven/org/antlr/antlr-runtime/3.4/antlr-runtime-3.4.jar
        >>  /data/hive-ptest/working/maven/org/antlr/stringtemplate/3.2.1/stringtemplate-3.2.1.jar
        >>  /data/hive-ptest/working/maven/antlr/antlr/2.7.7/antlr-2.7.7.jar
        >>  /data/hive-ptest/working/maven/org/apache/thrift/libfb303/0.9.0/libfb303-0.9.0.jar
        >>  /data/hive-ptest/working/maven/org/apache/thrift/libthrift/0.9.0/libthrift-0.9.0.jar
        >>  /data/hive-ptest/working/maven/org/apache/httpcomponents/httpclient/4.2.5/httpclient-4.2.5.jar
        >>  /data/hive-ptest/working/maven/org/apache/httpcomponents/httpcore/4.2.4/httpcore-4.2.4.jar
        >>  /data/hive-ptest/working/maven/org/apache/hadoop/hadoop-core/1.2.1/hadoop-core-1.2.1.jar
        >>  /data/hive-ptest/working/maven/xmlenc/xmlenc/0.52/xmlenc-0.52.jar
        >>  /data/hive-ptest/working/maven/com/sun/jersey/jersey-core/1.14/jersey-core-1.14.jar
        >>  /data/hive-ptest/working/maven/com/sun/jersey/jersey-json/1.14/jersey-json-1.14.jar
        >>  /data/hive-ptest/working/maven/org/codehaus/jettison/jettison/1.1/jettison-1.1.jar
        >>  /data/hive-ptest/working/maven/stax/stax-api/1.0.1/stax-api-1.0.1.jar
        >>  /data/hive-ptest/working/maven/com/sun/xml/bind/jaxb-impl/2.2.3-1/jaxb-impl-2.2.3-1.jar
        >>  /data/hive-ptest/working/maven/javax/xml/bind/jaxb-api/2.2.2/jaxb-api-2.2.2.jar
        >>  /data/hive-ptest/working/maven/javax/xml/stream/stax-api/1.0-2/stax-api-1.0-2.jar
        >>  /data/hive-ptest/working/maven/javax/activation/activation/1.1/activation-1.1.jar
        >>  /data/hive-ptest/working/maven/org/codehaus/jackson/jackson-jaxrs/1.9.2/jackson-jaxrs-1.9.2.jar
        >>  /data/hive-ptest/working/maven/org/codehaus/jackson/jackson-xc/1.9.2/jackson-xc-1.9.2.jar
        >>  /data/hive-ptest/working/maven/com/sun/jersey/jersey-server/1.14/jersey-server-1.14.jar
        >>  /data/hive-ptest/working/maven/asm/asm/3.1/asm-3.1.jar
        >>  /data/hive-ptest/working/maven/commons-io/commons-io/2.4/commons-io-2.4.jar
        >>  /data/hive-ptest/working/maven/commons-httpclient/commons-httpclient/3.0.1/commons-httpclient-3.0.1.jar
        >>  /data/hive-ptest/working/maven/org/apache/commons/commons-math/2.1/commons-math-2.1.jar
        >>  /data/hive-ptest/working/maven/commons-configuration/commons-configuration/1.6/commons-configuration-1.6.jar
        >>  /data/hive-ptest/working/maven/commons-collections/commons-collections/3.2.1/commons-collections-3.2.1.jar
        >>  /data/hive-ptest/working/maven/commons-digester/commons-digester/1.8/commons-digester-1.8.jar
        >>  /data/hive-ptest/working/maven/commons-beanutils/commons-beanutils/1.7.0/commons-beanutils-1.7.0.jar
        >>  /data/hive-ptest/working/maven/commons-beanutils/commons-beanutils-core/1.8.0/commons-beanutils-core-1.8.0.jar
        >>  /data/hive-ptest/working/maven/commons-net/commons-net/1.4.1/commons-net-1.4.1.jar
        >>  /data/hive-ptest/working/maven/org/mortbay/jetty/jetty/6.1.26/jetty-6.1.26.jar
        >>  /data/hive-ptest/working/maven/org/mortbay/jetty/servlet-api/2.5-20081211/servlet-api-2.5-20081211.jar
        >>  /data/hive-ptest/working/maven/org/mortbay/jetty/jetty-util/6.1.26/jetty-util-6.1.26.jar
        >>  /data/hive-ptest/working/maven/tomcat/jasper-runtime/5.5.12/jasper-runtime-5.5.12.jar
        >>  /data/hive-ptest/working/maven/tomcat/jasper-compiler/5.5.12/jasper-compiler-5.5.12.jar
        >>  /data/hive-ptest/working/maven/org/mortbay/jetty/jsp-api-2.1/6.1.14/jsp-api-2.1-6.1.14.jar
        >>  /data/hive-ptest/working/maven/org/mortbay/jetty/servlet-api-2.5/6.1.14/servlet-api-2.5-6.1.14.jar
        >>  /data/hive-ptest/working/maven/org/mortbay/jetty/jsp-2.1/6.1.14/jsp-2.1-6.1.14.jar
        >>  /data/hive-ptest/working/maven/ant/ant/1.6.5/ant-1.6.5.jar
        >>  /data/hive-ptest/working/maven/commons-el/commons-el/1.0/commons-el-1.0.jar
        >>  /data/hive-ptest/working/maven/net/java/dev/jets3t/jets3t/0.6.1/jets3t-0.6.1.jar
        >>  /data/hive-ptest/working/maven/hsqldb/hsqldb/1.8.0.10/hsqldb-1.8.0.10.jar
        >>  /data/hive-ptest/working/maven/oro/oro/2.0.8/oro-2.0.8.jar
        >>  /data/hive-ptest/working/maven/org/eclipse/jdt/core/3.1.1/core-3.1.1.jar
        >>  /data/hive-ptest/working/maven/org/codehaus/jackson/jackson-mapper-asl/1.9.2/jackson-mapper-asl-1.9.2.jar
        >>  /data/hive-ptest/working/maven/org/slf4j/slf4j-api/1.7.5/slf4j-api-1.7.5.jar
        >>  /data/hive-ptest/working/maven/org/slf4j/slf4j-log4j12/1.7.5/slf4j-log4j12-1.7.5.jar
        >>  /data/hive-ptest/working/maven/log4j/log4j/1.2.16/log4j-1.2.16.jar
        ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MDatabase
        ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MFieldSchema
        ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MType
        ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MTable
        ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MSerDeInfo
        ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MOrder
        ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MColumnDescriptor
        ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MStringList
        ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MStorageDescriptor
        ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MPartition
        ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MIndex
        ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MRole
        ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MRoleMap
        ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MGlobalPrivilege
        ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MDBPrivilege
        ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MTablePrivilege
        ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MPartitionPrivilege
        ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MTableColumnPrivilege
        ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MPartitionColumnPrivilege
        ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MPartitionEvent
        ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MMasterKey
        ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MDelegationToken
        ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MTableColumnStatistics
        ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MPartitionColumnStatistics
        ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MVersionTable
        DataNucleus Enhancer completed with success for 25 classes. Timings : input=642 ms, enhance=961 ms, total=1603 ms. Consult the log for full details
        
        [INFO] 
        [INFO] --- maven-resources-plugin:2.5:testResources (default-testResources) @ hive-metastore ---
        [debug] execute contextualize
        [INFO] Using 'UTF-8' encoding to copy filtered resources.
        [INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-trunk-source/metastore/src/test/resources
        [INFO] 
        [INFO] --- maven-antrun-plugin:1.7:run (setup-test-dirs) @ hive-metastore ---
        [INFO] Executing tasks
        
        main:
            [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/metastore/target/tmp
            [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/metastore/target/warehouse
            [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/metastore/target/tmp/conf
             [copy] Copying 4 files to /data/hive-ptest/working/apache-svn-trunk-source/metastore/target/tmp/conf
        [INFO] Executed tasks
        [INFO] 
        [INFO] --- maven-compiler-plugin:3.1:testCompile (default-testCompile) @ hive-metastore ---
        [INFO] Compiling 10 source files to /data/hive-ptest/working/apache-svn-trunk-source/metastore/target/test-classes
        [INFO] 
        [INFO] --- maven-surefire-plugin:2.16:test (default-test) @ hive-metastore ---
        [INFO] Tests are skipped.
        [INFO] 
        [INFO] --- maven-jar-plugin:2.2:jar (default-jar) @ hive-metastore ---
        [INFO] Building jar: /data/hive-ptest/working/apache-svn-trunk-source/metastore/target/hive-metastore-0.13.0-SNAPSHOT.jar
        [INFO] 
        [INFO] --- maven-jar-plugin:2.2:test-jar (default) @ hive-metastore ---
        [INFO] Building jar: /data/hive-ptest/working/apache-svn-trunk-source/metastore/target/hive-metastore-0.13.0-SNAPSHOT-tests.jar
        [INFO] 
        [INFO] --- maven-install-plugin:2.4:install (default-install) @ hive-metastore ---
        [INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/metastore/target/hive-metastore-0.13.0-SNAPSHOT.jar to /data/hive-ptest/working/maven/org/apache/hive/hive-metastore/0.13.0-SNAPSHOT/hive-metastore-0.13.0-SNAPSHOT.jar
        [INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/metastore/pom.xml to /data/hive-ptest/working/maven/org/apache/hive/hive-metastore/0.13.0-SNAPSHOT/hive-metastore-0.13.0-SNAPSHOT.pom
        [INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/metastore/target/hive-metastore-0.13.0-SNAPSHOT-tests.jar to /data/hive-ptest/working/maven/org/apache/hive/hive-metastore/0.13.0-SNAPSHOT/hive-metastore-0.13.0-SNAPSHOT-tests.jar
        [INFO]                                                                         
        [INFO] ------------------------------------------------------------------------
        [INFO] Building Hive Query Language 0.13.0-SNAPSHOT
        [INFO] ------------------------------------------------------------------------
        [INFO] 
        [INFO] --- maven-clean-plugin:2.5:clean (default-clean) @ hive-exec ---
        [INFO] Deleting /data/hive-ptest/working/apache-svn-trunk-source/ql (includes = [datanucleus.log, derby.log], excludes = [])
        [INFO] 
        [INFO] --- maven-antrun-plugin:1.7:run (generate-sources) @ hive-exec ---
        [INFO] Executing tasks
        
        main:
            [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/ql/target/generated-sources/java/org/apache/hadoop/hive/ql/exec/vector/expressions/gen
            [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/ql/target/generated-sources/java/org/apache/hadoop/hive/ql/exec/vector/expressions/aggregates/gen
            [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/ql/target/generated-test-sources/java/org/apache/hadoop/hive/ql/exec/vector/expressions/gen
        Generating vector expression code
        Generating vector expression test code
        [INFO] Executed tasks
        [INFO] 
        [INFO] --- build-helper-maven-plugin:1.8:add-source (add-source) @ hive-exec ---
        [INFO] Source directory: /data/hive-ptest/working/apache-svn-trunk-source/ql/src/gen/protobuf/gen-java added.
        [INFO] Source directory: /data/hive-ptest/working/apache-svn-trunk-source/ql/src/gen/thrift/gen-javabean added.
        [INFO] Source directory: /data/hive-ptest/working/apache-svn-trunk-source/ql/target/generated-sources/java added.
        [INFO] 
        [INFO] --- antlr3-maven-plugin:3.4:antlr (default) @ hive-exec ---
        [INFO] ANTLR: Processing source directory /data/hive-ptest/working/apache-svn-trunk-source/ql/src/java
        ANTLR Parser Generator  Version 3.4
        org/apache/hadoop/hive/ql/parse/HiveLexer.g
        org/apache/hadoop/hive/ql/parse/HiveParser.g
        warning(200): org/apache/hadoop/hive/ql/parse/HiveParser.g:874:5: 
        Decision can match input such as "Identifier KW_RENAME KW_TO" using multiple alternatives: 1, 10
        
        As a result, alternative(s) 10 were disabled for that input
        warning(200): org/apache/hadoop/hive/ql/parse/HiveParser.g:1179:5: 
        Decision can match input such as "KW_SEQUENCEFILE" using multiple alternatives: 1, 6
        
        As a result, alternative(s) 6 were disabled for that input
        warning(200): org/apache/hadoop/hive/ql/parse/HiveParser.g:1179:5: 
        Decision can match input such as "KW_ORCFILE" using multiple alternatives: 4, 6
        
        As a result, alternative(s) 6 were disabled for that input
        warning(200): org/apache/hadoop/hive/ql/parse/HiveParser.g:1179:5: 
        Decision can match input such as "KW_TEXTFILE" using multiple alternatives: 2, 6
        
        As a result, alternative(s) 6 were disabled for that input
        warning(200): org/apache/hadoop/hive/ql/parse/HiveParser.g:1179:5: 
        Decision can match input such as "KW_RCFILE" using multiple alternatives: 3, 6
        
        As a result, alternative(s) 6 were disabled for that input
        warning(200): org/apache/hadoop/hive/ql/parse/HiveParser.g:1192:23: 
        Decision can match input such as "KW_KEY_TYPE" using multiple alternatives: 2, 4
        
        As a result, alternative(s) 4 were disabled for that input
        warning(200): org/apache/hadoop/hive/ql/parse/HiveParser.g:1192:23: 
        Decision can match input such as "KW_ELEM_TYPE" using multiple alternatives: 1, 4
        
        As a result, alternative(s) 4 were disabled for that input
        warning(200): org/apache/hadoop/hive/ql/parse/HiveParser.g:1192:23: 
        Decision can match input such as "KW_VALUE_TYPE" using multiple alternatives: 3, 4
        
        As a result, alternative(s) 4 were disabled for that input
        warning(200): org/apache/hadoop/hive/ql/parse/HiveParser.g:1199:23: 
        Decision can match input such as "KW_KEY_TYPE" using multiple alternatives: 2, 4
        
        As a result, alternative(s) 4 were disabled for that input
        warning(200): org/apache/hadoop/hive/ql/parse/HiveParser.g:1199:23: 
        Decision can match input such as "KW_ELEM_TYPE" using multiple alternatives: 1, 4
        
        As a result, alternative(s) 4 were disabled for that input
        warning(200): org/apache/hadoop/hive/ql/parse/HiveParser.g:1199:23: 
        Decision can match input such as "KW_VALUE_TYPE" using multiple alternatives: 3, 4
        
        As a result, alternative(s) 4 were disabled for that input
        warning(200): org/apache/hadoop/hive/ql/parse/HiveParser.g:1217:29: 
        Decision can match input such as "KW_FORMATTED KW_PARTITION" using multiple alternatives: 1, 4
        
        As a result, alternative(s) 4 were disabled for that input
        warning(200): org/apache/hadoop/hive/ql/parse/HiveParser.g:1217:29: 
        Decision can match input such as "KW_PRETTY KW_PARTITION" using multiple alternatives: 3, 4
        
        As a result, alternative(s) 4 were disabled for that input
        warning(200): org/apache/hadoop/hive/ql/parse/HiveParser.g:1217:29: 
        Decision can match input such as "KW_PRETTY {KW_ADD..KW_AFTER, KW_ALTER..KW_ANALYZE, KW_ARCHIVE..KW_CASCADE, KW_CHANGE, KW_CLUSTER..KW_COLLECTION, KW_COLUMNS..KW_CREATE, KW_CUBE, KW_CURSOR..KW_DATA, KW_DATABASES..KW_DISABLE, KW_DISTRIBUTE..KW_ELEM_TYPE, KW_ENABLE, KW_ESCAPED, KW_EXCLUSIVE..KW_EXPORT, KW_EXTERNAL..KW_FLOAT, KW_FOR..KW_FORMATTED, KW_FULL, KW_FUNCTIONS..KW_GROUPING, KW_HOLD_DDLTIME..KW_IDXPROPERTIES, KW_IGNORE..KW_ITEMS, KW_KEYS..KW_LEFT, KW_LIKE..KW_LONG, KW_MAPJOIN..KW_MINUS, KW_MSCK..KW_NOSCAN, KW_NO_DROP..KW_OFFLINE, KW_OPTION, KW_ORCFILE..KW_OUTPUTFORMAT, KW_OVERWRITE, KW_PARTITIONED..KW_PLUS, KW_PRETTY..KW_RECORDWRITER, KW_REGEXP..KW_SCHEMAS, KW_SEMI..KW_TABLES, KW_TBLPROPERTIES..KW_TEXTFILE, KW_TIMESTAMP..KW_TOUCH, KW_TRIGGER..KW_UNARCHIVE, KW_UNDO..KW_UNIONTYPE, KW_UNLOCK..KW_VALUE_TYPE, KW_VIEW, KW_WHILE, KW_WITH}" using multiple alternatives: 3, 4
        
        As a result, alternative(s) 4 were disabled for that input
        warning(200): org/apache/hadoop/hive/ql/parse/HiveParser.g:1217:29: 
        Decision can match input such as "KW_PRETTY Identifier" using multiple alternatives: 3, 4
        
        As a result, alternative(s) 4 were disabled for that input
        warning(200): org/apache/hadoop/hive/ql/parse/HiveParser.g:1217:29: 
        Decision can match input such as "KW_FORMATTED Identifier" using multiple alternatives: 1, 4
        
        As a result, alternative(s) 4 were disabled for that input
        warning(200): org/apache/hadoop/hive/ql/parse/HiveParser.g:1217:29: 
        Decision can match input such as "KW_FORMATTED {KW_ADD..KW_AFTER, KW_ALTER..KW_ANALYZE, KW_ARCHIVE..KW_CASCADE, KW_CHANGE, KW_CLUSTER..KW_COLLECTION, KW_COLUMNS..KW_CREATE, KW_CUBE, KW_CURSOR..KW_DATA, KW_DATABASES..KW_DISABLE, KW_DISTRIBUTE..KW_ELEM_TYPE, KW_ENABLE, KW_ESCAPED, KW_EXCLUSIVE..KW_EXPORT, KW_EXTERNAL..KW_FLOAT, KW_FOR..KW_FORMATTED, KW_FULL, KW_FUNCTIONS..KW_GROUPING, KW_HOLD_DDLTIME..KW_IDXPROPERTIES, KW_IGNORE..KW_ITEMS, KW_KEYS..KW_LEFT, KW_LIKE..KW_LONG, KW_MAPJOIN..KW_MINUS, KW_MSCK..KW_NOSCAN, KW_NO_DROP..KW_OFFLINE, KW_OPTION, KW_ORCFILE..KW_OUTPUTFORMAT, KW_OVERWRITE, KW_PARTITIONED..KW_PLUS, KW_PRETTY..KW_RECORDWRITER, KW_REGEXP..KW_SCHEMAS, KW_SEMI..KW_TABLES, KW_TBLPROPERTIES..KW_TEXTFILE, KW_TIMESTAMP..KW_TOUCH, KW_TRIGGER..KW_UNARCHIVE, KW_UNDO..KW_UNIONTYPE, KW_UNLOCK..KW_VALUE_TYPE, KW_VIEW, KW_WHILE, KW_WITH}" using multiple alternatives: 1, 4
        
        As a result, alternative(s) 4 were disabled for that input
        warning(200): org/apache/hadoop/hive/ql/parse/HiveParser.g:1488:116: 
        Decision can match input such as "KW_STORED KW_AS KW_DIRECTORIES" using multiple alternatives: 1, 2
        
        As a result, alternative(s) 2 were disabled for that input
        warning(200): org/apache/hadoop/hive/ql/parse/HiveParser.g:1611:5: 
        Decision can match input such as "KW_STORED KW_AS KW_TEXTFILE" using multiple alternatives: 2, 7
        
        As a result, alternative(s) 7 were disabled for that input
        warning(200): org/apache/hadoop/hive/ql/parse/HiveParser.g:1611:5: 
        Decision can match input such as "KW_STORED KW_AS KW_INPUTFORMAT" using multiple alternatives: 5, 7
        
        As a result, alternative(s) 7 were disabled for that input
        warning(200): org/apache/hadoop/hive/ql/parse/HiveParser.g:1611:5: 
        Decision can match input such as "KW_STORED KW_AS KW_RCFILE" using multiple alternatives: 3, 7
        
        As a result, alternative(s) 7 were disabled for that input
        warning(200): org/apache/hadoop/hive/ql/parse/HiveParser.g:1611:5: 
        Decision can match input such as "KW_STORED KW_AS KW_ORCFILE" using multiple alternatives: 4, 7
        
        As a result, alternative(s) 7 were disabled for that input
        warning(200): org/apache/hadoop/hive/ql/parse/HiveParser.g:1611:5: 
        Decision can match input such as "KW_STORED KW_AS KW_SEQUENCEFILE" using multiple alternatives: 1, 7
        
        As a result, alternative(s) 7 were disabled for that input
        warning(200): SelectClauseParser.g:149:5: 
        Decision can match input such as "KW_NULL DOT Identifier" using multiple alternatives: 1, 2
        
        As a result, alternative(s) 2 were disabled for that input
        warning(200): SelectClauseParser.g:149:5: 
        Decision can match input such as "KW_NULL DOT {KW_ADD..KW_AFTER, KW_ALTER..KW_ANALYZE, KW_ARCHIVE..KW_CASCADE, KW_CHANGE, KW_CLUSTER..KW_COLLECTION, KW_COLUMNS..KW_CREATE, KW_CUBE, KW_CURSOR..KW_DATA, KW_DATABASES..KW_DISABLE, KW_DISTRIBUTE..KW_ELEM_TYPE, KW_ENABLE, KW_ESCAPED, KW_EXCLUSIVE..KW_EXPORT, KW_EXTERNAL..KW_FLOAT, KW_FOR..KW_FORMATTED, KW_FULL, KW_FUNCTIONS..KW_GROUPING, KW_HOLD_DDLTIME..KW_IDXPROPERTIES, KW_IGNORE..KW_ITEMS, KW_KEYS..KW_LEFT, KW_LIKE..KW_LONG, KW_MAPJOIN..KW_MINUS, KW_MSCK..KW_NOSCAN, KW_NO_DROP..KW_OFFLINE, KW_OPTION, KW_ORCFILE..KW_OUTPUTFORMAT, KW_OVERWRITE, KW_PARTITION..KW_PLUS, KW_PRETTY..KW_RECORDWRITER, KW_REGEXP..KW_SCHEMAS, KW_SEMI..KW_TABLES, KW_TBLPROPERTIES..KW_TEXTFILE, KW_TIMESTAMP..KW_TOUCH, KW_TRIGGER..KW_UNARCHIVE, KW_UNDO..KW_UNIONTYPE, KW_UNLOCK..KW_VALUE_TYPE, KW_VIEW, KW_WHILE, KW_WITH}" using multiple alternatives: 1, 2
        
        As a result, alternative(s) 2 were disabled for that input
        warning(200): FromClauseParser.g:147:2: 
        Decision can match input such as "KW_LATERAL KW_VIEW KW_OUTER" using multiple alternatives: 1, 2
        
        As a result, alternative(s) 2 were disabled for that input
        warning(200): FromClauseParser.g:199:25: 
        Decision can match input such as "LPAREN StringLiteral COMMA" using multiple alternatives: 1, 2
        
        As a result, alternative(s) 2 were disabled for that input
        warning(200): FromClauseParser.g:199:25: 
        Decision can match input such as "LPAREN StringLiteral RPAREN" using multiple alternatives: 1, 2
        
        As a result, alternative(s) 2 were disabled for that input
        warning(200): FromClauseParser.g:199:25: 
        Decision can match input such as "LPAREN StringLiteral EQUAL" using multiple alternatives: 1, 2
        
        As a result, alternative(s) 2 were disabled for that input
        warning(200): FromClauseParser.g:199:68: 
        Decision can match input such as "Identifier LPAREN BigintLiteral" using multiple alternatives: 1, 2
        
        As a result, alternative(s) 2 were disabled for that input
        warning(200): FromClauseParser.g:199:68: 
        Decision can match input such as "Identifier LPAREN StringLiteral" using multiple alternatives: 1, 2
        
        As a result, alternative(s) 2 were disabled for that input
        warning(200): FromClauseParser.g:199:68: 
        Decision can match input such as "Identifier LPAREN KW_EXISTS" using multiple alternatives: 1, 2
        
        As a result, alternative(s) 2 were disabled for that input
        warning(200): FromClauseParser.g:199:68: 
        Decision can match input such as "Identifier LPAREN KW_IF" using multiple alternatives: 1, 2
        
        As a result, alternative(s) 2 were disabled for that input
        warning(200): FromClauseParser.g:199:68: 
        Decision can match input such as "Identifier LPAREN SmallintLiteral" using multiple alternatives: 1, 2
        
        As a result, alternative(s) 2 were disabled for that input
        warning(200): FromClauseParser.g:199:68: 
        Decision can match input such as "Identifier LPAREN KW_UNIONTYPE" using multiple alternatives: 1, 2
        
        As a result, alternative(s) 2 were disabled for that input
        warning(200): FromClauseParser.g:199:68: 
        Decision can match input such as "Identifier LPAREN {MINUS, PLUS, TILDE}" using multiple alternatives: 1, 2
        
        As a result, alternative(s) 2 were disabled for that input
        warning(200): FromClauseParser.g:199:68: 
        Decision can match input such as "Identifier LPAREN {KW_ADD..KW_AFTER, KW_ALTER..KW_ANALYZE, KW_ARCHIVE, KW_AS..KW_CASCADE, KW_CHANGE, KW_CLUSTER..KW_COLLECTION, KW_COLUMNS..KW_CREATE, KW_CUBE, KW_CURSOR..KW_DATA, KW_DATABASES, KW_DATETIME..KW_DISABLE, KW_DISTRIBUTE..KW_ELEM_TYPE, KW_ENABLE, KW_ESCAPED, KW_EXCLUSIVE, KW_EXPLAIN..KW_EXPORT, KW_EXTERNAL, KW_FETCH..KW_FLOAT, KW_FOR..KW_FORMATTED, KW_FULL, KW_FUNCTIONS..KW_GROUPING, KW_HOLD_DDLTIME..KW_IDXPROPERTIES, KW_IGNORE..KW_ITEMS, KW_KEYS..KW_LEFT, KW_LIKE..KW_LONG, KW_MAPJOIN..KW_MINUS, KW_MSCK..KW_NOSCAN, KW_NO_DROP, KW_OF..KW_OFFLINE, KW_OPTION, KW_ORCFILE..KW_OUTPUTFORMAT, KW_OVERWRITE, KW_PARTITION..KW_PLUS, KW_PRETTY..KW_RECORDWRITER, KW_REGEXP..KW_SCHEMAS, KW_SEMI..KW_STRING, KW_TABLE..KW_TABLES, KW_TBLPROPERTIES..KW_TEXTFILE, KW_TIMESTAMP..KW_TOUCH, KW_TRIGGER, KW_TRUNCATE..KW_UNARCHIVE, KW_UNDO..KW_UNION, KW_UNLOCK..KW_VALUE_TYPE, KW_VIEW, KW_WHILE, KW_WITH}" using multiple alternatives: 1, 2
        
        As a result, alternative(s) 2 were disabled for that input
        warning(200): FromClauseParser.g:199:68: 
        The same decision is ambiguous, and alternative(s) 2 likewise disabled, for "Identifier LPAREN" followed by any of: TinyintLiteral, KW_CASE, KW_NOT, KW_NULL, DecimalLiteral, CharSetName, Number, KW_CAST, KW_STRUCT, LPAREN, KW_FALSE, KW_ARRAY, KW_DATE, Identifier, KW_MAP, KW_TRUE
        warning(200): FromClauseParser.g:257:16: 
        Decision can match "Identifier LPAREN" followed by any of: BigintLiteral, StringLiteral, KW_EXISTS, KW_IF, SmallintLiteral, KW_UNIONTYPE, {MINUS, PLUS, TILDE}, the keyword set {KW_ADD..KW_WITH} listed in full in the first warning above, TinyintLiteral, KW_CASE, KW_NOT, KW_NULL, DecimalLiteral, CharSetName, Number, KW_CAST, KW_STRUCT, LPAREN, KW_FALSE, KW_ARRAY, KW_DATE, Identifier, KW_MAP, KW_TRUE, using multiple alternatives: 1, 2; alternative(s) 2 were disabled for each of those inputs
        warning(200): IdentifiersParser.g:68:4: 
        Decision can match "LPAREN" followed by each two-token sequence below using multiple alternatives: 1, 2; alternative(s) 2 were disabled for each ({KW-set} denotes the keyword set {KW_ADD..KW_WITH} listed in full in the first warning above):
        KW_NULL + any of: KW_OR, KW_AND, {MINUS, PLUS}, LPAREN, KW_NOT, {DIV..DIVIDE, MOD, STAR}, BITWISEOR, DOT, RPAREN, LSQUARE, KW_IN, KW_IS, KW_BETWEEN, {KW_LIKE, KW_REGEXP, KW_RLIKE}, EQUAL, EQUAL_NS, NOTEQUAL, LESSTHANOREQUALTO, AMPERSAND, BITWISEXOR, LESSTHAN, GREATERTHANOREQUALTO
        LPAREN + any of: Number, KW_FALSE, KW_STRUCT, KW_IF, KW_CASE, KW_ARRAY, KW_TRUE, KW_UNIONTYPE, KW_MAP, {MINUS, PLUS, TILDE}, StringLiteral, DecimalLiteral, KW_NOT, KW_EXISTS, KW_CAST, BigintLiteral, SmallintLiteral, LPAREN, TinyintLiteral, Identifier, CharSetName, KW_NULL, {KW-set}, KW_DATE
        KW_NOT + any of: {KW-set}, KW_IF, Identifier, KW_CASE, KW_MAP, {MINUS, PLUS, TILDE}, KW_DATE, KW_EXISTS, KW_TRUE, StringLiteral, KW_NOT, KW_CAST, LPAREN, BigintLiteral, KW_FALSE, Number, CharSetName, KW_UNIONTYPE, KW_NULL, KW_STRUCT, SmallintLiteral, KW_ARRAY, TinyintLiteral
        KW_CASE + any of: KW_EXISTS, KW_IF, KW_CASE, KW_WHEN, KW_MAP, {MINUS, PLUS, TILDE}, Identifier, {KW-set}, KW_NOT, KW_CAST, LPAREN, KW_ARRAY, KW_STRUCT, KW_UNIONTYPE, CharSetName, KW_FALSE, Number, KW_TRUE, BigintLiteral, SmallintLiteral, KW_NULL, KW_DATE, StringLiteral, TinyintLiteral
        KW_EXISTS + LPAREN; KW_CAST + LPAREN; CharSetName + CharSetLiteral; StringLiteral + StringLiteral; KW_DATE + StringLiteral
        warning(200): IdentifiersParser.g:68:4: 
        Decision can match input such as "LPAREN KW_NULL GREATERTHAN" using multiple alternatives: 1, 2
        
        As a result, alternative(s) 2 were disabled for that input
        warning(200): IdentifiersParser.g:68:4: 
        Decision can match input such as "LPAREN KW_NOT DecimalLiteral" using multiple alternatives: 1, 2
        
        As a result, alternative(s) 2 were disabled for that input
        warning(200): IdentifiersParser.g:68:4: 
        Decision can match input such as "LPAREN KW_CASE DecimalLiteral" using multiple alternatives: 1, 2
        
        As a result, alternative(s) 2 were disabled for that input
        warning(200): IdentifiersParser.g:108:5: 
        Decision can match input such as "KW_ORDER KW_BY LPAREN" using multiple alternatives: 1, 2
        
        As a result, alternative(s) 2 were disabled for that input
        warning(200): IdentifiersParser.g:121:5: 
        Decision can match input such as "KW_CLUSTER KW_BY LPAREN" using multiple alternatives: 1, 2
        
        As a result, alternative(s) 2 were disabled for that input
        warning(200): IdentifiersParser.g:133:5: 
        Decision can match input such as "KW_PARTITION KW_BY LPAREN" using multiple alternatives: 1, 2
        
        As a result, alternative(s) 2 were disabled for that input
        warning(200): IdentifiersParser.g:144:5: 
        Decision can match input such as "KW_DISTRIBUTE KW_BY LPAREN" using multiple alternatives: 1, 2
        
        As a result, alternative(s) 2 were disabled for that input
        warning(200): IdentifiersParser.g:155:5: 
        Decision can match input such as "KW_SORT KW_BY LPAREN" using multiple alternatives: 1, 2
        
        As a result, alternative(s) 2 were disabled for that input
        warning(200): IdentifiersParser.g:172:7: 
        Decision can match input such as "STAR" using multiple alternatives: 1, 2
        
        As a result, alternative(s) 2 were disabled for that input
        warning(200): IdentifiersParser.g:185:5: 
        Decision can match input such as "KW_ARRAY" using multiple alternatives: 2, 6
        
        As a result, alternative(s) 6 were disabled for that input
        warning(200): IdentifiersParser.g:185:5: 
        Decision can match input such as "KW_UNIONTYPE" using multiple alternatives: 5, 6
        
        As a result, alternative(s) 6 were disabled for that input
        warning(200): IdentifiersParser.g:185:5: 
        Decision can match input such as "KW_STRUCT" using multiple alternatives: 4, 6
        
        As a result, alternative(s) 6 were disabled for that input
        warning(200): IdentifiersParser.g:267:5: 
        Decision can match input such as "KW_NULL" using multiple alternatives: 1, 8
        
        As a result, alternative(s) 8 were disabled for that input
        warning(200): IdentifiersParser.g:267:5: 
        Decision can match input such as "KW_FALSE" using multiple alternatives: 3, 8
        
        As a result, alternative(s) 8 were disabled for that input
        warning(200): IdentifiersParser.g:267:5: 
        Decision can match input such as "KW_TRUE" using multiple alternatives: 3, 8
        
        As a result, alternative(s) 8 were disabled for that input
        warning(200): IdentifiersParser.g:267:5: 
        Decision can match input such as "KW_DATE StringLiteral" using multiple alternatives: 2, 3
        
        As a result, alternative(s) 3 were disabled for that input
        warning(200): IdentifiersParser.g:399:5: 
        Decision can match input such as "{KW_LIKE, KW_REGEXP, KW_RLIKE} KW_SORT KW_BY" using multiple alternatives: 2, 9
        
        As a result, alternative(s) 9 were disabled for that input
        warning(200): IdentifiersParser.g:399:5: 
        Decision can match input such as "{KW_LIKE, KW_REGEXP, KW_RLIKE} KW_GROUP KW_BY" using multiple alternatives: 2, 9
        
        As a result, alternative(s) 9 were disabled for that input
        warning(200): IdentifiersParser.g:399:5: 
        Decision can match input such as "{KW_LIKE, KW_REGEXP, KW_RLIKE} KW_INSERT KW_OVERWRITE" using multiple alternatives: 2, 9
        
        As a result, alternative(s) 9 were disabled for that input
        warning(200): IdentifiersParser.g:399:5: 
        Decision can match input such as "{KW_LIKE, KW_REGEXP, KW_RLIKE} KW_MAP LPAREN" using multiple alternatives: 2, 9
        
        As a result, alternative(s) 9 were disabled for that input
        warning(200): IdentifiersParser.g:399:5: 
        Decision can match input such as "{KW_LIKE, KW_REGEXP, KW_RLIKE} KW_ORDER KW_BY" using multiple alternatives: 2, 9
        
        As a result, alternative(s) 9 were disabled for that input
        warning(200): IdentifiersParser.g:399:5: 
        Decision can match input such as "KW_BETWEEN KW_MAP LPAREN" using multiple alternatives: 8, 9
        
        As a result, alternative(s) 9 were disabled for that input
        warning(200): IdentifiersParser.g:399:5: 
        Decision can match input such as "{KW_LIKE, KW_REGEXP, KW_RLIKE} KW_LATERAL KW_VIEW" using multiple alternatives: 2, 9
        
        As a result, alternative(s) 9 were disabled for that input
        warning(200): IdentifiersParser.g:399:5: 
        Decision can match input such as "{KW_LIKE, KW_REGEXP, KW_RLIKE} KW_CLUSTER KW_BY" using multiple alternatives: 2, 9
        
        As a result, alternative(s) 9 were disabled for that input
        warning(200): IdentifiersParser.g:399:5: 
        Decision can match input such as "{KW_LIKE, KW_REGEXP, KW_RLIKE} KW_INSERT KW_INTO" using multiple alternatives: 2, 9
        
        As a result, alternative(s) 9 were disabled for that input
        warning(200): IdentifiersParser.g:399:5: 
        Decision can match input such as "{KW_LIKE, KW_REGEXP, KW_RLIKE} KW_DISTRIBUTE KW_BY" using multiple alternatives: 2, 9
        
        As a result, alternative(s) 9 were disabled for that input
        warning(200): IdentifiersParser.g:524:5: 
        Decision can match input such as "{AMPERSAND..BITWISEXOR, DIV..DIVIDE, EQUAL..EQUAL_NS, GREATERTHAN..GREATERTHANOREQUALTO, KW_AND, KW_ARRAY, KW_BETWEEN..KW_BOOLEAN, KW_CASE, KW_DOUBLE, KW_FLOAT, KW_IF, KW_IN, KW_INT, KW_LIKE, KW_MAP, KW_NOT, KW_OR, KW_REGEXP, KW_RLIKE, KW_SMALLINT, KW_STRING..KW_STRUCT, KW_TINYINT, KW_UNIONTYPE, KW_WHEN, LESSTHAN..LESSTHANOREQUALTO, MINUS..NOTEQUAL, PLUS, STAR, TILDE}" using multiple alternatives: 1, 3
        
        As a result, alternative(s) 3 were disabled for that input
        [INFO] 
        [INFO] --- maven-resources-plugin:2.5:resources (default-resources) @ hive-exec ---
        [debug] execute contextualize
        [INFO] Using 'UTF-8' encoding to copy filtered resources.
        [INFO] Copying 1 resource
        [INFO] 
        [INFO] --- maven-antrun-plugin:1.7:run (define-classpath) @ hive-exec ---
        [INFO] Executing tasks
        
        main:
        [INFO] Executed tasks
        [INFO] 
        [INFO] --- maven-compiler-plugin:3.1:compile (default-compile) @ hive-exec ---
        [INFO] Compiling 1399 source files to /data/hive-ptest/working/apache-svn-trunk-source/ql/target/classes
        [INFO] -------------------------------------------------------------
        [WARNING] COMPILATION WARNING : 
        [INFO] -------------------------------------------------------------
        [WARNING] Note: Some input files use or override a deprecated API.
        [WARNING] Note: Recompile with -Xlint:deprecation for details.
        [WARNING] Note: Some input files use unchecked or unsafe operations.
        [WARNING] Note: Recompile with -Xlint:unchecked for details.
        [INFO] 4 warnings 
        [INFO] -------------------------------------------------------------
        [INFO] -------------------------------------------------------------
        [ERROR] COMPILATION ERROR : 
        [INFO] -------------------------------------------------------------
        [ERROR] /data/hive-ptest/working/apache-svn-trunk-source/ql/src/java/org/apache/hadoop/hive/ql/plan/SkewContext.java:[118,49] cannot find symbol
        symbol  : method getRandom()
        location: class org.apache.hadoop.hive.ql.io.HiveKey
        [INFO] 1 error
        [INFO] -------------------------------------------------------------
        [INFO] ------------------------------------------------------------------------
        [INFO] Reactor Summary:
        [INFO] 
        [INFO] Hive .............................................. SUCCESS [4.754s]
        [INFO] Hive Ant Utilities ................................ SUCCESS [7.617s]
        [INFO] Hive Shims Common ................................. SUCCESS [3.455s]
        [INFO] Hive Shims 0.20 ................................... SUCCESS [2.263s]
        [INFO] Hive Shims Secure Common .......................... SUCCESS [2.832s]
        [INFO] Hive Shims 0.20S .................................. SUCCESS [1.630s]
        [INFO] Hive Shims 0.23 ................................... SUCCESS [4.482s]
        [INFO] Hive Shims ........................................ SUCCESS [3.006s]
        [INFO] Hive Common ....................................... SUCCESS [5.342s]
        [INFO] Hive Serde ........................................ SUCCESS [11.590s]
        [INFO] Hive Metastore .................................... SUCCESS [26.085s]
        [INFO] Hive Query Language ............................... FAILURE [35.747s]
        [INFO] Hive Service ...................................... SKIPPED
        [INFO] Hive JDBC ......................................... SKIPPED
        [INFO] Hive Beeline ...................................... SKIPPED
        [INFO] Hive CLI .......................................... SKIPPED
        [INFO] Hive Contrib ...................................... SKIPPED
        [INFO] Hive HBase Handler ................................ SKIPPED
        [INFO] Hive HCatalog ..................................... SKIPPED
        [INFO] Hive HCatalog Core ................................ SKIPPED
        [INFO] Hive HCatalog Pig Adapter ......................... SKIPPED
        [INFO] Hive HCatalog Server Extensions ................... SKIPPED
        [INFO] Hive HCatalog Webhcat Java Client ................. SKIPPED
        [INFO] Hive HCatalog Webhcat ............................. SKIPPED
        [INFO] Hive HCatalog HBase Storage Handler ............... SKIPPED
        [INFO] Hive HWI .......................................... SKIPPED
        [INFO] Hive ODBC ......................................... SKIPPED
        [INFO] Hive Shims Aggregator ............................. SKIPPED
        [INFO] Hive TestUtils .................................... SKIPPED
        [INFO] Hive Packaging .................................... SKIPPED
        [INFO] ------------------------------------------------------------------------
        [INFO] BUILD FAILURE
        [INFO] ------------------------------------------------------------------------
        [INFO] Total time: 1:51.912s
        [INFO] Finished at: Tue Nov 26 13:27:52 EST 2013
        [INFO] Final Memory: 53M/369M
        [INFO] ------------------------------------------------------------------------
        [ERROR] Failed to execute goal org.apache.maven.plugins:maven-compiler-plugin:3.1:compile (default-compile) on project hive-exec: Compilation failure
        [ERROR] /data/hive-ptest/working/apache-svn-trunk-source/ql/src/java/org/apache/hadoop/hive/ql/plan/SkewContext.java:[118,49] cannot find symbol
        [ERROR] symbol  : method getRandom()
        [ERROR] location: class org.apache.hadoop.hive.ql.io.HiveKey
        [ERROR] -> [Help 1]
        [ERROR] 
        [ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
        [ERROR] Re-run Maven using the -X switch to enable full debug logging.
        [ERROR] 
        [ERROR] For more information about the errors and possible solutions, please read the following articles:
        [ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
        [ERROR] 
        [ERROR] After correcting the problems, you can resume the build with the command
        [ERROR]   mvn <goals> -rf :hive-exec
        + exit 1
        '
        

        This message is automatically generated.

        ATTACHMENT ID: 12615766

        Hive QA added a comment - Overall : -1 no tests executed Here are the results of testing the latest attachment: https://issues.apache.org/jira/secure/attachment/12615766/HIVE-3286.12.patch.txt Test results: http://bigtop01.cloudera.org:8080/job/PreCommit-HIVE-Build/449/testReport Console output: http://bigtop01.cloudera.org:8080/job/PreCommit-HIVE-Build/449/console Messages: Executing org.apache.hive.ptest.execution.PrepPhase Tests failed with: NonZeroExitCodeException: Command 'bash /data/hive-ptest/working/scratch/source-prep.sh' failed with exit status 1 and output '+ [[ -n '' ]] + export 'ANT_OPTS=-Xmx1g -XX:MaxPermSize=256m -Dhttp.proxyHost=localhost -Dhttp.proxyPort=3128' + ANT_OPTS='-Xmx1g -XX:MaxPermSize=256m -Dhttp.proxyHost=localhost -Dhttp.proxyPort=3128' + export 'M2_OPTS=-Xmx1g -XX:MaxPermSize=256m -Dhttp.proxyHost=localhost -Dhttp.proxyPort=3128' + M2_OPTS='-Xmx1g -XX:MaxPermSize=256m -Dhttp.proxyHost=localhost -Dhttp.proxyPort=3128' + cd /data/hive-ptest/working/ + tee /data/hive-ptest/logs/PreCommit-HIVE-Build-449/source-prep.txt + [[ false == \t\r\u\e ]] + mkdir -p maven ivy + [[ svn = \s\v\n ]] + [[ -n '' ]] + [[ -d apache-svn-trunk-source ]] + [[ ! -d apache-svn-trunk-source/.svn ]] + [[ ! -d apache-svn-trunk-source ]] + cd apache-svn-trunk-source + svn revert -R . ++ awk '{print $2}' ++ egrep -v '^X|^Performing status on external' ++ svn status --no-ignore + rm -rf + svn update A common/src/test/org/apache/hadoop/hive/conf/TestHiveConfRestrictList.java U common/src/java/org/apache/hadoop/hive/conf/HiveConf.java Fetching external item into 'hcatalog/src/test/e2e/harness' Updated external to revision 1545762. Updated to revision 1545762. 
+ patchCommandPath=/data/hive-ptest/working/scratch/smart-apply-patch.sh + patchFilePath=/data/hive-ptest/working/scratch/build.patch + [[ -f /data/hive-ptest/working/scratch/build.patch ]] + chmod +x /data/hive-ptest/working/scratch/smart-apply-patch.sh + /data/hive-ptest/working/scratch/smart-apply-patch.sh /data/hive-ptest/working/scratch/build.patch Going to apply patch with: patch -p0 patching file common/src/java/org/apache/hadoop/hive/conf/HiveConf.java patching file ql/src/java/org/apache/hadoop/hive/ql/ErrorMsg.java patching file ql/src/java/org/apache/hadoop/hive/ql/exec/Operator.java patching file ql/src/java/org/apache/hadoop/hive/ql/exec/ReduceSinkOperator.java patching file ql/src/java/org/apache/hadoop/hive/ql/exec/mr/ExecDriver.java patching file ql/src/java/org/apache/hadoop/hive/ql/exec/vector/VectorReduceSinkOperator.java patching file ql/src/java/org/apache/hadoop/hive/ql/io/HiveKey.java patching file ql/src/java/org/apache/hadoop/hive/ql/io/SkewedKeyPartitioner.java patching file ql/src/java/org/apache/hadoop/hive/ql/optimizer/InlineSkewJoinOptimizer.java patching file ql/src/java/org/apache/hadoop/hive/ql/optimizer/JoinReorder.java patching file ql/src/java/org/apache/hadoop/hive/ql/optimizer/Optimizer.java patching file ql/src/java/org/apache/hadoop/hive/ql/parse/FromClauseParser.g patching file ql/src/java/org/apache/hadoop/hive/ql/parse/HiveParser.g patching file ql/src/java/org/apache/hadoop/hive/ql/parse/QBJoinTree.java patching file ql/src/java/org/apache/hadoop/hive/ql/parse/SemanticAnalyzer.java patching file ql/src/java/org/apache/hadoop/hive/ql/plan/MapWork.java patching file ql/src/java/org/apache/hadoop/hive/ql/plan/ReduceSinkDesc.java patching file ql/src/java/org/apache/hadoop/hive/ql/plan/SkewContext.java patching file ql/src/java/org/apache/hadoop/hive/ql/udf/generic/NumericHistogram.java patching file ql/src/test/queries/clientnegative/skewjoin_explicit_invalid1.q patching file 
ql/src/test/queries/clientnegative/skewjoin_explicit_invalid2.q patching file ql/src/test/queries/clientnegative/skewjoin_explicit_invalid3.q patching file ql/src/test/queries/clientnegative/skewjoin_explicit_invalid4.q patching file ql/src/test/queries/clientpositive/skewjoin_explicit.q patching file ql/src/test/results/clientnegative/skewjoin_explicit_invalid1.q.out patching file ql/src/test/results/clientnegative/skewjoin_explicit_invalid2.q.out patching file ql/src/test/results/clientnegative/skewjoin_explicit_invalid3.q.out patching file ql/src/test/results/clientnegative/skewjoin_explicit_invalid4.q.out patching file ql/src/test/results/clientpositive/skewjoin_explicit.q.out + [[ maven == \m\a\v\e\n ]] + rm -rf /data/hive-ptest/working/maven/org/apache/hive + mvn -B clean install -DskipTests -Dmaven.repo.local=/data/hive-ptest/working/maven [INFO] Scanning for projects... [INFO] ------------------------------------------------------------------------ [INFO] Reactor Build Order: [INFO] [INFO] Hive [INFO] Hive Ant Utilities [INFO] Hive Shims Common [INFO] Hive Shims 0.20 [INFO] Hive Shims Secure Common [INFO] Hive Shims 0.20S [INFO] Hive Shims 0.23 [INFO] Hive Shims [INFO] Hive Common [INFO] Hive Serde [INFO] Hive Metastore [INFO] Hive Query Language [INFO] Hive Service [INFO] Hive JDBC [INFO] Hive Beeline [INFO] Hive CLI [INFO] Hive Contrib [INFO] Hive HBase Handler [INFO] Hive HCatalog [INFO] Hive HCatalog Core [INFO] Hive HCatalog Pig Adapter [INFO] Hive HCatalog Server Extensions [INFO] Hive HCatalog Webhcat Java Client [INFO] Hive HCatalog Webhcat [INFO] Hive HCatalog HBase Storage Handler [INFO] Hive HWI [INFO] Hive ODBC [INFO] Hive Shims Aggregator [INFO] Hive TestUtils [INFO] Hive Packaging [INFO] [INFO] ------------------------------------------------------------------------ [INFO] Building Hive 0.13.0-SNAPSHOT [INFO] ------------------------------------------------------------------------ [INFO] [INFO] --- maven-clean-plugin:2.5:clean (default-clean) 
@ hive --- [INFO] Deleting /data/hive-ptest/working/apache-svn-trunk-source (includes = [datanucleus.log, derby.log], excludes = []) [INFO] [INFO] --- maven-antrun-plugin:1.7:run (define-classpath) @ hive --- [INFO] Executing tasks main: [INFO] Executed tasks [INFO] [INFO] --- maven-antrun-plugin:1.7:run (setup-test-dirs) @ hive --- [INFO] Executing tasks main: [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/target/tmp [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/target/warehouse [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/target/tmp/conf [copy] Copying 4 files to /data/hive-ptest/working/apache-svn-trunk-source/target/tmp/conf [INFO] Executed tasks [INFO] [INFO] --- maven-install-plugin:2.4:install (default-install) @ hive --- [INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/pom.xml to /data/hive-ptest/working/maven/org/apache/hive/hive/0.13.0-SNAPSHOT/hive-0.13.0-SNAPSHOT.pom [INFO] [INFO] ------------------------------------------------------------------------ [INFO] Building Hive Ant Utilities 0.13.0-SNAPSHOT [INFO] ------------------------------------------------------------------------ [INFO] [INFO] --- maven-clean-plugin:2.5:clean (default-clean) @ hive-ant --- [INFO] Deleting /data/hive-ptest/working/apache-svn-trunk-source/ant (includes = [datanucleus.log, derby.log], excludes = []) [INFO] [INFO] --- maven-resources-plugin:2.5:resources (default-resources) @ hive-ant --- [debug] execute contextualize [INFO] Using 'UTF-8' encoding to copy filtered resources. 
[INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-trunk-source/ant/src/main/resources [INFO] [INFO] --- maven-antrun-plugin:1.7:run (define-classpath) @ hive-ant --- [INFO] Executing tasks main: [INFO] Executed tasks [INFO] [INFO] --- maven-compiler-plugin:3.1:compile (default-compile) @ hive-ant --- [INFO] Compiling 5 source files to /data/hive-ptest/working/apache-svn-trunk-source/ant/target/classes [WARNING] Note: /data/hive-ptest/working/apache-svn-trunk-source/ant/src/org/apache/hadoop/hive/ant/QTestGenTask.java uses or overrides a deprecated API. [WARNING] Note: Recompile with -Xlint:deprecation for details. [WARNING] Note: /data/hive-ptest/working/apache-svn-trunk-source/ant/src/org/apache/hadoop/hive/ant/DistinctElementsClassPath.java uses unchecked or unsafe operations. [WARNING] Note: Recompile with -Xlint:unchecked for details. [INFO] [INFO] --- maven-resources-plugin:2.5:testResources (default-testResources) @ hive-ant --- [debug] execute contextualize [INFO] Using 'UTF-8' encoding to copy filtered resources. [INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-trunk-source/ant/src/test/resources [INFO] [INFO] --- maven-antrun-plugin:1.7:run (setup-test-dirs) @ hive-ant --- [INFO] Executing tasks main: [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/ant/target/tmp [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/ant/target/warehouse [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/ant/target/tmp/conf [copy] Copying 4 files to /data/hive-ptest/working/apache-svn-trunk-source/ant/target/tmp/conf [INFO] Executed tasks [INFO] [INFO] --- maven-compiler-plugin:3.1:testCompile (default-testCompile) @ hive-ant --- [INFO] No sources to compile [INFO] [INFO] --- maven-surefire-plugin:2.16:test (default-test) @ hive-ant --- [INFO] Tests are skipped. 
[INFO] [INFO] --- maven-jar-plugin:2.2:jar (default-jar) @ hive-ant --- [INFO] Building jar: /data/hive-ptest/working/apache-svn-trunk-source/ant/target/hive-ant-0.13.0-SNAPSHOT.jar [INFO] [INFO] --- maven-install-plugin:2.4:install (default-install) @ hive-ant --- [INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/ant/target/hive-ant-0.13.0-SNAPSHOT.jar to /data/hive-ptest/working/maven/org/apache/hive/hive-ant/0.13.0-SNAPSHOT/hive-ant-0.13.0-SNAPSHOT.jar [INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/ant/pom.xml to /data/hive-ptest/working/maven/org/apache/hive/hive-ant/0.13.0-SNAPSHOT/hive-ant-0.13.0-SNAPSHOT.pom [INFO] [INFO] ------------------------------------------------------------------------ [INFO] Building Hive Shims Common 0.13.0-SNAPSHOT [INFO] ------------------------------------------------------------------------ [INFO] [INFO] --- maven-clean-plugin:2.5:clean (default-clean) @ hive-shims-common --- [INFO] Deleting /data/hive-ptest/working/apache-svn-trunk-source/shims/common (includes = [datanucleus.log, derby.log], excludes = []) [INFO] [INFO] --- maven-resources-plugin:2.5:resources (default-resources) @ hive-shims-common --- [debug] execute contextualize [INFO] Using 'UTF-8' encoding to copy filtered resources. [INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-trunk-source/shims/common/src/main/resources [INFO] [INFO] --- maven-antrun-plugin:1.7:run (define-classpath) @ hive-shims-common --- [INFO] Executing tasks main: [INFO] Executed tasks [INFO] [INFO] --- maven-compiler-plugin:3.1:compile (default-compile) @ hive-shims-common --- [INFO] Compiling 15 source files to /data/hive-ptest/working/apache-svn-trunk-source/shims/common/target/classes [WARNING] Note: Some input files use or override a deprecated API. [WARNING] Note: Recompile with -Xlint:deprecation for details. 
[INFO] [INFO] --- maven-resources-plugin:2.5:testResources (default-testResources) @ hive-shims-common --- [debug] execute contextualize [INFO] Using 'UTF-8' encoding to copy filtered resources. [INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-trunk-source/shims/common/src/test/resources [INFO] [INFO] --- maven-antrun-plugin:1.7:run (setup-test-dirs) @ hive-shims-common --- [INFO] Executing tasks main: [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/shims/common/target/tmp [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/shims/common/target/warehouse [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/shims/common/target/tmp/conf [copy] Copying 4 files to /data/hive-ptest/working/apache-svn-trunk-source/shims/common/target/tmp/conf [INFO] Executed tasks [INFO] [INFO] --- maven-compiler-plugin:3.1:testCompile (default-testCompile) @ hive-shims-common --- [INFO] No sources to compile [INFO] [INFO] --- maven-surefire-plugin:2.16:test (default-test) @ hive-shims-common --- [INFO] Tests are skipped. 
[INFO] [INFO] --- maven-jar-plugin:2.2:jar (default-jar) @ hive-shims-common --- [INFO] Building jar: /data/hive-ptest/working/apache-svn-trunk-source/shims/common/target/hive-shims-common-0.13.0-SNAPSHOT.jar [INFO] [INFO] --- maven-install-plugin:2.4:install (default-install) @ hive-shims-common --- [INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/shims/common/target/hive-shims-common-0.13.0-SNAPSHOT.jar to /data/hive-ptest/working/maven/org/apache/hive/shims/hive-shims-common/0.13.0-SNAPSHOT/hive-shims-common-0.13.0-SNAPSHOT.jar [INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/shims/common/pom.xml to /data/hive-ptest/working/maven/org/apache/hive/shims/hive-shims-common/0.13.0-SNAPSHOT/hive-shims-common-0.13.0-SNAPSHOT.pom [INFO] [INFO] ------------------------------------------------------------------------ [INFO] Building Hive Shims 0.20 0.13.0-SNAPSHOT [INFO] ------------------------------------------------------------------------ [INFO] [INFO] --- maven-clean-plugin:2.5:clean (default-clean) @ hive-shims-0.20 --- [INFO] Deleting /data/hive-ptest/working/apache-svn-trunk-source/shims/0.20 (includes = [datanucleus.log, derby.log], excludes = []) [INFO] [INFO] --- maven-resources-plugin:2.5:resources (default-resources) @ hive-shims-0.20 --- [debug] execute contextualize [INFO] Using 'UTF-8' encoding to copy filtered resources. 
[INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-trunk-source/shims/0.20/src/main/resources [INFO] [INFO] --- maven-antrun-plugin:1.7:run (define-classpath) @ hive-shims-0.20 --- [INFO] Executing tasks main: [INFO] Executed tasks [INFO] [INFO] --- maven-compiler-plugin:3.1:compile (default-compile) @ hive-shims-0.20 --- [INFO] Compiling 2 source files to /data/hive-ptest/working/apache-svn-trunk-source/shims/0.20/target/classes [WARNING] Note: /data/hive-ptest/working/apache-svn-trunk-source/shims/0.20/src/main/java/org/apache/hadoop/hive/shims/Hadoop20Shims.java uses or overrides a deprecated API. [WARNING] Note: Recompile with -Xlint:deprecation for details. [WARNING] Note: /data/hive-ptest/working/apache-svn-trunk-source/shims/0.20/src/main/java/org/apache/hadoop/hive/shims/Hadoop20Shims.java uses unchecked or unsafe operations. [WARNING] Note: Recompile with -Xlint:unchecked for details. [INFO] [INFO] --- maven-resources-plugin:2.5:testResources (default-testResources) @ hive-shims-0.20 --- [debug] execute contextualize [INFO] Using 'UTF-8' encoding to copy filtered resources. 
[INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-trunk-source/shims/0.20/src/test/resources [INFO] [INFO] --- maven-antrun-plugin:1.7:run (setup-test-dirs) @ hive-shims-0.20 --- [INFO] Executing tasks main: [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/shims/0.20/target/tmp [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/shims/0.20/target/warehouse [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/shims/0.20/target/tmp/conf [copy] Copying 4 files to /data/hive-ptest/working/apache-svn-trunk-source/shims/0.20/target/tmp/conf [INFO] Executed tasks [INFO] [INFO] --- maven-compiler-plugin:3.1:testCompile (default-testCompile) @ hive-shims-0.20 --- [INFO] No sources to compile [INFO] [INFO] --- maven-surefire-plugin:2.16:test (default-test) @ hive-shims-0.20 --- [INFO] Tests are skipped. [INFO] [INFO] --- maven-jar-plugin:2.2:jar (default-jar) @ hive-shims-0.20 --- [INFO] Building jar: /data/hive-ptest/working/apache-svn-trunk-source/shims/0.20/target/hive-shims-0.20-0.13.0-SNAPSHOT.jar [INFO] [INFO] --- maven-install-plugin:2.4:install (default-install) @ hive-shims-0.20 --- [INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/shims/0.20/target/hive-shims-0.20-0.13.0-SNAPSHOT.jar to /data/hive-ptest/working/maven/org/apache/hive/shims/hive-shims-0.20/0.13.0-SNAPSHOT/hive-shims-0.20-0.13.0-SNAPSHOT.jar [INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/shims/0.20/pom.xml to /data/hive-ptest/working/maven/org/apache/hive/shims/hive-shims-0.20/0.13.0-SNAPSHOT/hive-shims-0.20-0.13.0-SNAPSHOT.pom [INFO] [INFO] ------------------------------------------------------------------------ [INFO] Building Hive Shims Secure Common 0.13.0-SNAPSHOT [INFO] ------------------------------------------------------------------------ [INFO] [INFO] --- maven-clean-plugin:2.5:clean (default-clean) @ hive-shims-common-secure --- [INFO] Deleting 
/data/hive-ptest/working/apache-svn-trunk-source/shims/common-secure (includes = [datanucleus.log, derby.log], excludes = []) [INFO] [INFO] --- maven-resources-plugin:2.5:resources (default-resources) @ hive-shims-common-secure --- [debug] execute contextualize [INFO] Using 'UTF-8' encoding to copy filtered resources. [INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-trunk-source/shims/common-secure/src/main/resources [INFO] [INFO] --- maven-antrun-plugin:1.7:run (define-classpath) @ hive-shims-common-secure --- [INFO] Executing tasks main: [INFO] Executed tasks [INFO] [INFO] --- maven-compiler-plugin:3.1:compile (default-compile) @ hive-shims-common-secure --- [INFO] Compiling 12 source files to /data/hive-ptest/working/apache-svn-trunk-source/shims/common-secure/target/classes [WARNING] Note: /data/hive-ptest/working/apache-svn-trunk-source/shims/common-secure/src/main/java/org/apache/hadoop/hive/shims/HadoopShimsSecure.java uses or overrides a deprecated API. [WARNING] Note: Recompile with -Xlint:deprecation for details. [WARNING] Note: Some input files use unchecked or unsafe operations. [WARNING] Note: Recompile with -Xlint:unchecked for details. [INFO] [INFO] --- maven-resources-plugin:2.5:testResources (default-testResources) @ hive-shims-common-secure --- [debug] execute contextualize [INFO] Using 'UTF-8' encoding to copy filtered resources. 
[INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-trunk-source/shims/common-secure/src/test/resources
[INFO]
[INFO] --- maven-antrun-plugin:1.7:run (setup-test-dirs) @ hive-shims-common-secure ---
[INFO] Executing tasks
main:
[mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/shims/common-secure/target/tmp
[mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/shims/common-secure/target/warehouse
[mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/shims/common-secure/target/tmp/conf
[copy] Copying 4 files to /data/hive-ptest/working/apache-svn-trunk-source/shims/common-secure/target/tmp/conf
[INFO] Executed tasks
[INFO]
[INFO] --- maven-compiler-plugin:3.1:testCompile (default-testCompile) @ hive-shims-common-secure ---
[INFO] No sources to compile
[INFO]
[INFO] --- maven-surefire-plugin:2.16:test (default-test) @ hive-shims-common-secure ---
[INFO] Tests are skipped.
[INFO]
[INFO] --- maven-jar-plugin:2.2:jar (default-jar) @ hive-shims-common-secure ---
[INFO] Building jar: /data/hive-ptest/working/apache-svn-trunk-source/shims/common-secure/target/hive-shims-common-secure-0.13.0-SNAPSHOT.jar
[INFO]
[INFO] --- maven-install-plugin:2.4:install (default-install) @ hive-shims-common-secure ---
[INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/shims/common-secure/target/hive-shims-common-secure-0.13.0-SNAPSHOT.jar to /data/hive-ptest/working/maven/org/apache/hive/shims/hive-shims-common-secure/0.13.0-SNAPSHOT/hive-shims-common-secure-0.13.0-SNAPSHOT.jar
[INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/shims/common-secure/pom.xml to /data/hive-ptest/working/maven/org/apache/hive/shims/hive-shims-common-secure/0.13.0-SNAPSHOT/hive-shims-common-secure-0.13.0-SNAPSHOT.pom
[INFO]
[INFO] ------------------------------------------------------------------------
[INFO] Building Hive Shims 0.20S 0.13.0-SNAPSHOT
[INFO]
------------------------------------------------------------------------
[INFO]
[INFO] --- maven-clean-plugin:2.5:clean (default-clean) @ hive-shims-0.20S ---
[INFO] Deleting /data/hive-ptest/working/apache-svn-trunk-source/shims/0.20S (includes = [datanucleus.log, derby.log], excludes = [])
[INFO]
[INFO] --- maven-resources-plugin:2.5:resources (default-resources) @ hive-shims-0.20S ---
[debug] execute contextualize
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-trunk-source/shims/0.20S/src/main/resources
[INFO]
[INFO] --- maven-antrun-plugin:1.7:run (define-classpath) @ hive-shims-0.20S ---
[INFO] Executing tasks
main:
[INFO] Executed tasks
[INFO]
[INFO] --- maven-compiler-plugin:3.1:compile (default-compile) @ hive-shims-0.20S ---
[INFO] Compiling 3 source files to /data/hive-ptest/working/apache-svn-trunk-source/shims/0.20S/target/classes
[WARNING] Note: /data/hive-ptest/working/apache-svn-trunk-source/shims/0.20S/src/main/java/org/apache/hadoop/hive/shims/Hadoop20SShims.java uses or overrides a deprecated API.
[WARNING] Note: Recompile with -Xlint:deprecation for details.
[INFO]
[INFO] --- maven-resources-plugin:2.5:testResources (default-testResources) @ hive-shims-0.20S ---
[debug] execute contextualize
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-trunk-source/shims/0.20S/src/test/resources
[INFO]
[INFO] --- maven-antrun-plugin:1.7:run (setup-test-dirs) @ hive-shims-0.20S ---
[INFO] Executing tasks
main:
[mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/shims/0.20S/target/tmp
[mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/shims/0.20S/target/warehouse
[mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/shims/0.20S/target/tmp/conf
[copy] Copying 4 files to /data/hive-ptest/working/apache-svn-trunk-source/shims/0.20S/target/tmp/conf
[INFO] Executed tasks
[INFO]
[INFO] --- maven-compiler-plugin:3.1:testCompile (default-testCompile) @ hive-shims-0.20S ---
[INFO] No sources to compile
[INFO]
[INFO] --- maven-surefire-plugin:2.16:test (default-test) @ hive-shims-0.20S ---
[INFO] Tests are skipped.
[INFO]
[INFO] --- maven-jar-plugin:2.2:jar (default-jar) @ hive-shims-0.20S ---
[INFO] Building jar: /data/hive-ptest/working/apache-svn-trunk-source/shims/0.20S/target/hive-shims-0.20S-0.13.0-SNAPSHOT.jar
[INFO]
[INFO] --- maven-install-plugin:2.4:install (default-install) @ hive-shims-0.20S ---
[INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/shims/0.20S/target/hive-shims-0.20S-0.13.0-SNAPSHOT.jar to /data/hive-ptest/working/maven/org/apache/hive/shims/hive-shims-0.20S/0.13.0-SNAPSHOT/hive-shims-0.20S-0.13.0-SNAPSHOT.jar
[INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/shims/0.20S/pom.xml to /data/hive-ptest/working/maven/org/apache/hive/shims/hive-shims-0.20S/0.13.0-SNAPSHOT/hive-shims-0.20S-0.13.0-SNAPSHOT.pom
[INFO]
[INFO] ------------------------------------------------------------------------
[INFO] Building Hive Shims 0.23 0.13.0-SNAPSHOT
[INFO] ------------------------------------------------------------------------
[INFO]
[INFO] --- maven-clean-plugin:2.5:clean (default-clean) @ hive-shims-0.23 ---
[INFO] Deleting
/data/hive-ptest/working/apache-svn-trunk-source/shims/0.23 (includes = [datanucleus.log, derby.log], excludes = [])
[INFO]
[INFO] --- maven-resources-plugin:2.5:resources (default-resources) @ hive-shims-0.23 ---
[debug] execute contextualize
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-trunk-source/shims/0.23/src/main/resources
[INFO]
[INFO] --- maven-antrun-plugin:1.7:run (define-classpath) @ hive-shims-0.23 ---
[INFO] Executing tasks
main:
[INFO] Executed tasks
[INFO]
[INFO] --- maven-compiler-plugin:3.1:compile (default-compile) @ hive-shims-0.23 ---
[INFO] Compiling 3 source files to /data/hive-ptest/working/apache-svn-trunk-source/shims/0.23/target/classes
[WARNING] Note: /data/hive-ptest/working/apache-svn-trunk-source/shims/0.23/src/main/java/org/apache/hadoop/hive/shims/Hadoop23Shims.java uses or overrides a deprecated API.
[WARNING] Note: Recompile with -Xlint:deprecation for details.
[INFO]
[INFO] --- maven-resources-plugin:2.5:testResources (default-testResources) @ hive-shims-0.23 ---
[debug] execute contextualize
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-trunk-source/shims/0.23/src/test/resources
[INFO]
[INFO] --- maven-antrun-plugin:1.7:run (setup-test-dirs) @ hive-shims-0.23 ---
[INFO] Executing tasks
main:
[mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/shims/0.23/target/tmp
[mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/shims/0.23/target/warehouse
[mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/shims/0.23/target/tmp/conf
[copy] Copying 4 files to /data/hive-ptest/working/apache-svn-trunk-source/shims/0.23/target/tmp/conf
[INFO] Executed tasks
[INFO]
[INFO] --- maven-compiler-plugin:3.1:testCompile (default-testCompile) @ hive-shims-0.23 ---
[INFO] No sources to compile
[INFO]
[INFO] --- maven-surefire-plugin:2.16:test (default-test) @ hive-shims-0.23 ---
[INFO] Tests are skipped.
[INFO]
[INFO] --- maven-jar-plugin:2.2:jar (default-jar) @ hive-shims-0.23 ---
[INFO] Building jar: /data/hive-ptest/working/apache-svn-trunk-source/shims/0.23/target/hive-shims-0.23-0.13.0-SNAPSHOT.jar
[INFO]
[INFO] --- maven-install-plugin:2.4:install (default-install) @ hive-shims-0.23 ---
[INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/shims/0.23/target/hive-shims-0.23-0.13.0-SNAPSHOT.jar to /data/hive-ptest/working/maven/org/apache/hive/shims/hive-shims-0.23/0.13.0-SNAPSHOT/hive-shims-0.23-0.13.0-SNAPSHOT.jar
[INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/shims/0.23/pom.xml to /data/hive-ptest/working/maven/org/apache/hive/shims/hive-shims-0.23/0.13.0-SNAPSHOT/hive-shims-0.23-0.13.0-SNAPSHOT.pom
[INFO]
[INFO] ------------------------------------------------------------------------
[INFO] Building Hive Shims 0.13.0-SNAPSHOT
[INFO] ------------------------------------------------------------------------
[INFO]
[INFO] --- maven-clean-plugin:2.5:clean (default-clean) @ hive-shims ---
[INFO] Deleting
/data/hive-ptest/working/apache-svn-trunk-source/shims/assembly (includes = [datanucleus.log, derby.log], excludes = [])
[INFO]
[INFO] --- maven-resources-plugin:2.5:resources (default-resources) @ hive-shims ---
[debug] execute contextualize
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-trunk-source/shims/assembly/src/main/resources
[INFO]
[INFO] --- maven-antrun-plugin:1.7:run (define-classpath) @ hive-shims ---
[INFO] Executing tasks
main:
[INFO] Executed tasks
[INFO]
[INFO] --- maven-compiler-plugin:3.1:compile (default-compile) @ hive-shims ---
[INFO] No sources to compile
[INFO]
[INFO] --- maven-resources-plugin:2.5:testResources (default-testResources) @ hive-shims ---
[debug] execute contextualize
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-trunk-source/shims/assembly/src/test/resources
[INFO]
[INFO] --- maven-antrun-plugin:1.7:run (setup-test-dirs) @ hive-shims ---
[INFO] Executing tasks
main:
[mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/shims/assembly/target/tmp
[mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/shims/assembly/target/warehouse
[mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/shims/assembly/target/tmp/conf
[copy] Copying 4 files to /data/hive-ptest/working/apache-svn-trunk-source/shims/assembly/target/tmp/conf
[INFO] Executed tasks
[INFO]
[INFO] --- maven-compiler-plugin:3.1:testCompile (default-testCompile) @ hive-shims ---
[INFO] No sources to compile
[INFO]
[INFO] --- maven-surefire-plugin:2.16:test (default-test) @ hive-shims ---
[INFO] Tests are skipped.
[INFO]
[INFO] --- maven-jar-plugin:2.2:jar (default-jar) @ hive-shims ---
[WARNING] JAR will be empty - no content was marked for inclusion!
[INFO] Building jar: /data/hive-ptest/working/apache-svn-trunk-source/shims/assembly/target/hive-shims-0.13.0-SNAPSHOT.jar
[INFO]
[INFO] --- maven-assembly-plugin:2.3:single (uberjar) @ hive-shims ---
[INFO] Reading assembly descriptor: src/assemble/uberjar.xml
[WARNING] Artifact: org.apache.hive:hive-shims:jar:0.13.0-SNAPSHOT references the same file as the assembly destination file. Moving it to a temporary location for inclusion.
[INFO] META-INF/MANIFEST.MF already added, skipping
[INFO] META-INF/MANIFEST.MF already added, skipping
[INFO] META-INF/MANIFEST.MF already added, skipping
[INFO] META-INF/MANIFEST.MF already added, skipping
[INFO] META-INF/MANIFEST.MF already added, skipping
[INFO] Building jar: /data/hive-ptest/working/apache-svn-trunk-source/shims/assembly/target/hive-shims-0.13.0-SNAPSHOT.jar
[INFO] META-INF/MANIFEST.MF already added, skipping
[INFO] META-INF/MANIFEST.MF already added, skipping
[INFO] META-INF/MANIFEST.MF already added, skipping
[INFO] META-INF/MANIFEST.MF already added, skipping
[INFO] META-INF/MANIFEST.MF already added, skipping
[WARNING] Configuration options: 'appendAssemblyId' is set to false, and 'classifier' is missing. Instead of attaching the assembly file: /data/hive-ptest/working/apache-svn-trunk-source/shims/assembly/target/hive-shims-0.13.0-SNAPSHOT.jar, it will become the file for main project artifact. NOTE: If multiple descriptors or descriptor-formats are provided for this project, the value of this file will be non-deterministic!
[WARNING] Replacing pre-existing project main-artifact file: /data/hive-ptest/working/apache-svn-trunk-source/shims/assembly/target/archive-tmp/hive-shims-0.13.0-SNAPSHOT.jar with assembly file: /data/hive-ptest/working/apache-svn-trunk-source/shims/assembly/target/hive-shims-0.13.0-SNAPSHOT.jar
[INFO]
[INFO] --- maven-install-plugin:2.4:install (default-install) @ hive-shims ---
[INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/shims/assembly/target/hive-shims-0.13.0-SNAPSHOT.jar to /data/hive-ptest/working/maven/org/apache/hive/hive-shims/0.13.0-SNAPSHOT/hive-shims-0.13.0-SNAPSHOT.jar
[INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/shims/assembly/pom.xml to /data/hive-ptest/working/maven/org/apache/hive/hive-shims/0.13.0-SNAPSHOT/hive-shims-0.13.0-SNAPSHOT.pom
[INFO]
[INFO] ------------------------------------------------------------------------
[INFO] Building Hive Common 0.13.0-SNAPSHOT
[INFO] ------------------------------------------------------------------------
[INFO]
[INFO] --- maven-clean-plugin:2.5:clean (default-clean) @ hive-common ---
[INFO] Deleting /data/hive-ptest/working/apache-svn-trunk-source/common (includes = [datanucleus.log, derby.log], excludes = [])
[INFO]
[INFO] --- maven-antrun-plugin:1.7:run (generate-version-annotation) @ hive-common ---
[INFO] Executing tasks
main:
[INFO] Executed tasks
[INFO]
[INFO] --- build-helper-maven-plugin:1.8:add-source (add-source) @ hive-common ---
[INFO] Source directory: /data/hive-ptest/working/apache-svn-trunk-source/common/src/gen added.
[INFO]
[INFO] --- maven-resources-plugin:2.5:resources (default-resources) @ hive-common ---
[debug] execute contextualize
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] Copying 1 resource
[INFO]
[INFO] --- maven-antrun-plugin:1.7:run (define-classpath) @ hive-common ---
[INFO] Executing tasks
main:
[INFO] Executed tasks
[INFO]
[INFO] --- maven-compiler-plugin:3.1:compile (default-compile) @ hive-common ---
[INFO] Compiling 31 source files to /data/hive-ptest/working/apache-svn-trunk-source/common/target/classes
[WARNING] Note: /data/hive-ptest/working/apache-svn-trunk-source/common/src/java/org/apache/hadoop/hive/common/ObjectPair.java uses unchecked or unsafe operations.
[WARNING] Note: Recompile with -Xlint:unchecked for details.
[INFO]
[INFO] --- maven-resources-plugin:2.5:testResources (default-testResources) @ hive-common ---
[debug] execute contextualize
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] Copying 4 resources
[INFO]
[INFO] --- maven-antrun-plugin:1.7:run (setup-test-dirs) @ hive-common ---
[INFO] Executing tasks
main:
[mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/common/target/tmp
[mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/common/target/warehouse
[mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/common/target/tmp/conf
[copy] Copying 4 files to /data/hive-ptest/working/apache-svn-trunk-source/common/target/tmp/conf
[INFO] Executed tasks
[INFO]
[INFO] --- maven-compiler-plugin:3.1:testCompile (default-testCompile) @ hive-common ---
[INFO] Compiling 9 source files to /data/hive-ptest/working/apache-svn-trunk-source/common/target/test-classes
[INFO]
[INFO] --- maven-surefire-plugin:2.16:test (default-test) @ hive-common ---
[INFO] Tests are skipped.
[INFO]
[INFO] --- maven-jar-plugin:2.2:jar (default-jar) @ hive-common ---
[INFO] Building jar: /data/hive-ptest/working/apache-svn-trunk-source/common/target/hive-common-0.13.0-SNAPSHOT.jar
[INFO]
[INFO] --- maven-install-plugin:2.4:install (default-install) @ hive-common ---
[INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/common/target/hive-common-0.13.0-SNAPSHOT.jar to /data/hive-ptest/working/maven/org/apache/hive/hive-common/0.13.0-SNAPSHOT/hive-common-0.13.0-SNAPSHOT.jar
[INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/common/pom.xml to /data/hive-ptest/working/maven/org/apache/hive/hive-common/0.13.0-SNAPSHOT/hive-common-0.13.0-SNAPSHOT.pom
[INFO]
[INFO] ------------------------------------------------------------------------
[INFO] Building Hive Serde 0.13.0-SNAPSHOT
[INFO] ------------------------------------------------------------------------
[INFO]
[INFO] --- maven-clean-plugin:2.5:clean (default-clean) @ hive-serde ---
[INFO] Deleting /data/hive-ptest/working/apache-svn-trunk-source/serde (includes = [datanucleus.log, derby.log], excludes = [])
[INFO]
[INFO] --- build-helper-maven-plugin:1.8:add-source (add-source) @ hive-serde ---
[INFO] Source directory: /data/hive-ptest/working/apache-svn-trunk-source/serde/src/gen/protobuf/gen-java added.
[INFO] Source directory: /data/hive-ptest/working/apache-svn-trunk-source/serde/src/gen/thrift/gen-javabean added.
[INFO]
[INFO] --- maven-resources-plugin:2.5:resources (default-resources) @ hive-serde ---
[debug] execute contextualize
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-trunk-source/serde/src/main/resources
[INFO]
[INFO] --- maven-antrun-plugin:1.7:run (define-classpath) @ hive-serde ---
[INFO] Executing tasks
main:
[INFO] Executed tasks
[INFO]
[INFO] --- maven-compiler-plugin:3.1:compile (default-compile) @ hive-serde ---
[INFO] Compiling 351 source files to /data/hive-ptest/working/apache-svn-trunk-source/serde/target/classes
[WARNING] Note: Some input files use or override a deprecated API.
[WARNING] Note: Recompile with -Xlint:deprecation for details.
[WARNING] Note: Some input files use unchecked or unsafe operations.
[WARNING] Note: Recompile with -Xlint:unchecked for details.
[INFO]
[INFO] --- maven-resources-plugin:2.5:testResources (default-testResources) @ hive-serde ---
[debug] execute contextualize
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-trunk-source/serde/src/test/resources
[INFO]
[INFO] --- maven-antrun-plugin:1.7:run (setup-test-dirs) @ hive-serde ---
[INFO] Executing tasks
main:
[mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/serde/target/tmp
[mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/serde/target/warehouse
[mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/serde/target/tmp/conf
[copy] Copying 4 files to /data/hive-ptest/working/apache-svn-trunk-source/serde/target/tmp/conf
[INFO] Executed tasks
[INFO]
[INFO] --- maven-compiler-plugin:3.1:testCompile (default-testCompile) @ hive-serde ---
[INFO] Compiling 42 source files to /data/hive-ptest/working/apache-svn-trunk-source/serde/target/test-classes
[WARNING] Note: Some input files use or override a deprecated API.
[WARNING] Note: Recompile with -Xlint:deprecation for details.
[WARNING] Note: Some input files use unchecked or unsafe operations.
[WARNING] Note: Recompile with -Xlint:unchecked for details.
[INFO]
[INFO] --- maven-surefire-plugin:2.16:test (default-test) @ hive-serde ---
[INFO] Tests are skipped.
[INFO]
[INFO] --- maven-jar-plugin:2.2:jar (default-jar) @ hive-serde ---
[INFO] Building jar: /data/hive-ptest/working/apache-svn-trunk-source/serde/target/hive-serde-0.13.0-SNAPSHOT.jar
[INFO]
[INFO] --- maven-install-plugin:2.4:install (default-install) @ hive-serde ---
[INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/serde/target/hive-serde-0.13.0-SNAPSHOT.jar to /data/hive-ptest/working/maven/org/apache/hive/hive-serde/0.13.0-SNAPSHOT/hive-serde-0.13.0-SNAPSHOT.jar
[INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/serde/pom.xml to /data/hive-ptest/working/maven/org/apache/hive/hive-serde/0.13.0-SNAPSHOT/hive-serde-0.13.0-SNAPSHOT.pom
[INFO]
[INFO] ------------------------------------------------------------------------
[INFO] Building Hive Metastore 0.13.0-SNAPSHOT
[INFO] ------------------------------------------------------------------------
[INFO]
[INFO] --- maven-clean-plugin:2.5:clean (default-clean) @ hive-metastore ---
[INFO] Deleting /data/hive-ptest/working/apache-svn-trunk-source/metastore (includes = [datanucleus.log, derby.log], excludes = [])
[INFO]
[INFO] --- build-helper-maven-plugin:1.8:add-source (add-source) @ hive-metastore ---
[INFO] Source directory: /data/hive-ptest/working/apache-svn-trunk-source/metastore/src/model added.
[INFO] Source directory: /data/hive-ptest/working/apache-svn-trunk-source/metastore/src/gen/thrift/gen-javabean added.
[INFO]
[INFO] --- antlr3-maven-plugin:3.4:antlr (default) @ hive-metastore ---
[INFO] ANTLR: Processing source directory /data/hive-ptest/working/apache-svn-trunk-source/metastore/src/java
ANTLR Parser Generator Version 3.4
org/apache/hadoop/hive/metastore/parser/Filter.g
[INFO]
[INFO] --- maven-resources-plugin:2.5:resources (default-resources) @ hive-metastore ---
[debug] execute contextualize
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] Copying 1 resource
[INFO]
[INFO] --- maven-antrun-plugin:1.7:run (define-classpath) @ hive-metastore ---
[INFO] Executing tasks
main:
[INFO] Executed tasks
[INFO]
[INFO] --- maven-compiler-plugin:3.1:compile (default-compile) @ hive-metastore ---
[INFO] Compiling 132 source files to /data/hive-ptest/working/apache-svn-trunk-source/metastore/target/classes
[WARNING] Note: Some input files use or override a deprecated API.
[WARNING] Note: Recompile with -Xlint:deprecation for details.
[WARNING] Note: Some input files use unchecked or unsafe operations.
[WARNING] Note: Recompile with -Xlint:unchecked for details.
[INFO]
[INFO] --- datanucleus-maven-plugin:3.3.0-release:enhance (default) @ hive-metastore ---
[INFO] DataNucleus Enhancer (version 3.2.2) for API "JDO" using JRE "1.6"
DataNucleus Enhancer : Classpath
>> /data/hive-ptest/working/maven/org/datanucleus/datanucleus-maven-plugin/3.3.0-release/datanucleus-maven-plugin-3.3.0-release.jar
>> /data/hive-ptest/working/maven/org/datanucleus/datanucleus-core/3.2.2/datanucleus-core-3.2.2.jar
>> /data/hive-ptest/working/maven/org/codehaus/plexus/plexus-utils/3.0.8/plexus-utils-3.0.8.jar
>> /data/hive-ptest/working/maven/org/codehaus/plexus/plexus-component-annotations/1.5.5/plexus-component-annotations-1.5.5.jar
>> /data/hive-ptest/working/maven/org/sonatype/sisu/sisu-inject-bean/2.3.0/sisu-inject-bean-2.3.0.jar
>> /data/hive-ptest/working/maven/org/sonatype/sisu/sisu-guice/3.1.0/sisu-guice-3.1.0-no_aop.jar
>> /data/hive-ptest/working/maven/org/sonatype/sisu/sisu-guava/0.9.9/sisu-guava-0.9.9.jar
>> /data/hive-ptest/working/maven/org/apache/xbean/xbean-reflect/3.4/xbean-reflect-3.4.jar
>> /data/hive-ptest/working/maven/log4j/log4j/1.2.12/log4j-1.2.12.jar
>> /data/hive-ptest/working/maven/commons-logging/commons-logging-api/1.1/commons-logging-api-1.1.jar
>> /data/hive-ptest/working/maven/com/google/collections/google-collections/1.0/google-collections-1.0.jar
>>
/data/hive-ptest/working/maven/junit/junit/3.8.2/junit-3.8.2.jar
>> /data/hive-ptest/working/apache-svn-trunk-source/metastore/target/classes
>> /data/hive-ptest/working/apache-svn-trunk-source/serde/target/hive-serde-0.13.0-SNAPSHOT.jar
>> /data/hive-ptest/working/apache-svn-trunk-source/common/target/hive-common-0.13.0-SNAPSHOT.jar
>> /data/hive-ptest/working/maven/org/apache/commons/commons-compress/1.4.1/commons-compress-1.4.1.jar
>> /data/hive-ptest/working/maven/org/tukaani/xz/1.0/xz-1.0.jar
>> /data/hive-ptest/working/maven/commons-codec/commons-codec/1.4/commons-codec-1.4.jar
>> /data/hive-ptest/working/maven/org/apache/avro/avro/1.7.5/avro-1.7.5.jar
>> /data/hive-ptest/working/maven/org/codehaus/jackson/jackson-core-asl/1.9.2/jackson-core-asl-1.9.2.jar
>> /data/hive-ptest/working/maven/com/thoughtworks/paranamer/paranamer/2.3/paranamer-2.3.jar
>> /data/hive-ptest/working/maven/org/xerial/snappy/snappy-java/1.0.5/snappy-java-1.0.5.jar
>> /data/hive-ptest/working/apache-svn-trunk-source/shims/assembly/target/hive-shims-0.13.0-SNAPSHOT.jar
>> /data/hive-ptest/working/maven/com/google/guava/guava/11.0.2/guava-11.0.2.jar
>> /data/hive-ptest/working/maven/com/google/code/findbugs/jsr305/1.3.9/jsr305-1.3.9.jar
>> /data/hive-ptest/working/maven/commons-cli/commons-cli/1.2/commons-cli-1.2.jar
>> /data/hive-ptest/working/maven/commons-lang/commons-lang/2.4/commons-lang-2.4.jar
>> /data/hive-ptest/working/maven/commons-logging/commons-logging/1.1.3/commons-logging-1.1.3.jar
>> /data/hive-ptest/working/maven/org/apache/derby/derby/10.4.2.0/derby-10.4.2.0.jar
>> /data/hive-ptest/working/maven/org/datanucleus/datanucleus-api-jdo/3.2.1/datanucleus-api-jdo-3.2.1.jar
>> /data/hive-ptest/working/maven/org/datanucleus/datanucleus-rdbms/3.2.1/datanucleus-rdbms-3.2.1.jar
>> /data/hive-ptest/working/maven/javax/jdo/jdo-api/3.0.1/jdo-api-3.0.1.jar
>> /data/hive-ptest/working/maven/javax/transaction/jta/1.1/jta-1.1.jar
>>
/data/hive-ptest/working/maven/org/antlr/antlr-runtime/3.4/antlr-runtime-3.4.jar
>> /data/hive-ptest/working/maven/org/antlr/stringtemplate/3.2.1/stringtemplate-3.2.1.jar
>> /data/hive-ptest/working/maven/antlr/antlr/2.7.7/antlr-2.7.7.jar
>> /data/hive-ptest/working/maven/org/apache/thrift/libfb303/0.9.0/libfb303-0.9.0.jar
>> /data/hive-ptest/working/maven/org/apache/thrift/libthrift/0.9.0/libthrift-0.9.0.jar
>> /data/hive-ptest/working/maven/org/apache/httpcomponents/httpclient/4.2.5/httpclient-4.2.5.jar
>> /data/hive-ptest/working/maven/org/apache/httpcomponents/httpcore/4.2.4/httpcore-4.2.4.jar
>> /data/hive-ptest/working/maven/org/apache/hadoop/hadoop-core/1.2.1/hadoop-core-1.2.1.jar
>> /data/hive-ptest/working/maven/xmlenc/xmlenc/0.52/xmlenc-0.52.jar
>> /data/hive-ptest/working/maven/com/sun/jersey/jersey-core/1.14/jersey-core-1.14.jar
>> /data/hive-ptest/working/maven/com/sun/jersey/jersey-json/1.14/jersey-json-1.14.jar
>> /data/hive-ptest/working/maven/org/codehaus/jettison/jettison/1.1/jettison-1.1.jar
>> /data/hive-ptest/working/maven/stax/stax-api/1.0.1/stax-api-1.0.1.jar
>> /data/hive-ptest/working/maven/com/sun/xml/bind/jaxb-impl/2.2.3-1/jaxb-impl-2.2.3-1.jar
>> /data/hive-ptest/working/maven/javax/xml/bind/jaxb-api/2.2.2/jaxb-api-2.2.2.jar
>> /data/hive-ptest/working/maven/javax/xml/stream/stax-api/1.0-2/stax-api-1.0-2.jar
>> /data/hive-ptest/working/maven/javax/activation/activation/1.1/activation-1.1.jar
>> /data/hive-ptest/working/maven/org/codehaus/jackson/jackson-jaxrs/1.9.2/jackson-jaxrs-1.9.2.jar
>> /data/hive-ptest/working/maven/org/codehaus/jackson/jackson-xc/1.9.2/jackson-xc-1.9.2.jar
>> /data/hive-ptest/working/maven/com/sun/jersey/jersey-server/1.14/jersey-server-1.14.jar
>> /data/hive-ptest/working/maven/asm/asm/3.1/asm-3.1.jar
>> /data/hive-ptest/working/maven/commons-io/commons-io/2.4/commons-io-2.4.jar
>> /data/hive-ptest/working/maven/commons-httpclient/commons-httpclient/3.0.1/commons-httpclient-3.0.1.jar
>>
/data/hive-ptest/working/maven/org/apache/commons/commons-math/2.1/commons-math-2.1.jar
>> /data/hive-ptest/working/maven/commons-configuration/commons-configuration/1.6/commons-configuration-1.6.jar
>> /data/hive-ptest/working/maven/commons-collections/commons-collections/3.2.1/commons-collections-3.2.1.jar
>> /data/hive-ptest/working/maven/commons-digester/commons-digester/1.8/commons-digester-1.8.jar
>> /data/hive-ptest/working/maven/commons-beanutils/commons-beanutils/1.7.0/commons-beanutils-1.7.0.jar
>> /data/hive-ptest/working/maven/commons-beanutils/commons-beanutils-core/1.8.0/commons-beanutils-core-1.8.0.jar
>> /data/hive-ptest/working/maven/commons-net/commons-net/1.4.1/commons-net-1.4.1.jar
>> /data/hive-ptest/working/maven/org/mortbay/jetty/jetty/6.1.26/jetty-6.1.26.jar
>> /data/hive-ptest/working/maven/org/mortbay/jetty/servlet-api/2.5-20081211/servlet-api-2.5-20081211.jar
>> /data/hive-ptest/working/maven/org/mortbay/jetty/jetty-util/6.1.26/jetty-util-6.1.26.jar
>> /data/hive-ptest/working/maven/tomcat/jasper-runtime/5.5.12/jasper-runtime-5.5.12.jar
>> /data/hive-ptest/working/maven/tomcat/jasper-compiler/5.5.12/jasper-compiler-5.5.12.jar
>> /data/hive-ptest/working/maven/org/mortbay/jetty/jsp-api-2.1/6.1.14/jsp-api-2.1-6.1.14.jar
>> /data/hive-ptest/working/maven/org/mortbay/jetty/servlet-api-2.5/6.1.14/servlet-api-2.5-6.1.14.jar
>> /data/hive-ptest/working/maven/org/mortbay/jetty/jsp-2.1/6.1.14/jsp-2.1-6.1.14.jar
>> /data/hive-ptest/working/maven/ant/ant/1.6.5/ant-1.6.5.jar
>> /data/hive-ptest/working/maven/commons-el/commons-el/1.0/commons-el-1.0.jar
>> /data/hive-ptest/working/maven/net/java/dev/jets3t/jets3t/0.6.1/jets3t-0.6.1.jar
>> /data/hive-ptest/working/maven/hsqldb/hsqldb/1.8.0.10/hsqldb-1.8.0.10.jar
>> /data/hive-ptest/working/maven/oro/oro/2.0.8/oro-2.0.8.jar
>> /data/hive-ptest/working/maven/org/eclipse/jdt/core/3.1.1/core-3.1.1.jar
>>
/data/hive-ptest/working/maven/org/codehaus/jackson/jackson-mapper-asl/1.9.2/jackson-mapper-asl-1.9.2.jar
>> /data/hive-ptest/working/maven/org/slf4j/slf4j-api/1.7.5/slf4j-api-1.7.5.jar
>> /data/hive-ptest/working/maven/org/slf4j/slf4j-log4j12/1.7.5/slf4j-log4j12-1.7.5.jar
>> /data/hive-ptest/working/maven/log4j/log4j/1.2.16/log4j-1.2.16.jar
ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MDatabase
ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MFieldSchema
ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MType
ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MTable
ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MSerDeInfo
ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MOrder
ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MColumnDescriptor
ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MStringList
ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MStorageDescriptor
ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MPartition
ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MIndex
ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MRole
ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MRoleMap
ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MGlobalPrivilege
ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MDBPrivilege
ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MTablePrivilege
ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MPartitionPrivilege
ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MTableColumnPrivilege
ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MPartitionColumnPrivilege
ENHANCED (PersistenceCapable) :
org.apache.hadoop.hive.metastore.model.MPartitionEvent
ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MMasterKey
ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MDelegationToken
ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MTableColumnStatistics
ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MPartitionColumnStatistics
ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MVersionTable
DataNucleus Enhancer completed with success for 25 classes. Timings : input=642 ms, enhance=961 ms, total=1603 ms. Consult the log for full details
[INFO]
[INFO] --- maven-resources-plugin:2.5:testResources (default-testResources) @ hive-metastore ---
[debug] execute contextualize
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-trunk-source/metastore/src/test/resources
[INFO]
[INFO] --- maven-antrun-plugin:1.7:run (setup-test-dirs) @ hive-metastore ---
[INFO] Executing tasks
main:
[mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/metastore/target/tmp
[mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/metastore/target/warehouse
[mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/metastore/target/tmp/conf
[copy] Copying 4 files to /data/hive-ptest/working/apache-svn-trunk-source/metastore/target/tmp/conf
[INFO] Executed tasks
[INFO]
[INFO] --- maven-compiler-plugin:3.1:testCompile (default-testCompile) @ hive-metastore ---
[INFO] Compiling 10 source files to /data/hive-ptest/working/apache-svn-trunk-source/metastore/target/test-classes
[INFO]
[INFO] --- maven-surefire-plugin:2.16:test (default-test) @ hive-metastore ---
[INFO] Tests are skipped.
[INFO] [INFO] --- maven-jar-plugin:2.2:jar (default-jar) @ hive-metastore --- [INFO] Building jar: /data/hive-ptest/working/apache-svn-trunk-source/metastore/target/hive-metastore-0.13.0-SNAPSHOT.jar [INFO] [INFO] --- maven-jar-plugin:2.2:test-jar (default) @ hive-metastore --- [INFO] Building jar: /data/hive-ptest/working/apache-svn-trunk-source/metastore/target/hive-metastore-0.13.0-SNAPSHOT-tests.jar [INFO] [INFO] --- maven-install-plugin:2.4:install (default-install) @ hive-metastore --- [INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/metastore/target/hive-metastore-0.13.0-SNAPSHOT.jar to /data/hive-ptest/working/maven/org/apache/hive/hive-metastore/0.13.0-SNAPSHOT/hive-metastore-0.13.0-SNAPSHOT.jar [INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/metastore/pom.xml to /data/hive-ptest/working/maven/org/apache/hive/hive-metastore/0.13.0-SNAPSHOT/hive-metastore-0.13.0-SNAPSHOT.pom [INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/metastore/target/hive-metastore-0.13.0-SNAPSHOT-tests.jar to /data/hive-ptest/working/maven/org/apache/hive/hive-metastore/0.13.0-SNAPSHOT/hive-metastore-0.13.0-SNAPSHOT-tests.jar [INFO] [INFO] ------------------------------------------------------------------------ [INFO] Building Hive Query Language 0.13.0-SNAPSHOT [INFO] ------------------------------------------------------------------------ [INFO] [INFO] --- maven-clean-plugin:2.5:clean (default-clean) @ hive-exec --- [INFO] Deleting /data/hive-ptest/working/apache-svn-trunk-source/ql (includes = [datanucleus.log, derby.log], excludes = []) [INFO] [INFO] --- maven-antrun-plugin:1.7:run (generate-sources) @ hive-exec --- [INFO] Executing tasks main: [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/ql/target/generated-sources/java/org/apache/hadoop/hive/ql/exec/vector/expressions/gen [mkdir] Created dir: 
/data/hive-ptest/working/apache-svn-trunk-source/ql/target/generated-sources/java/org/apache/hadoop/hive/ql/exec/vector/expressions/aggregates/gen [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/ql/target/generated-test-sources/java/org/apache/hadoop/hive/ql/exec/vector/expressions/gen Generating vector expression code Generating vector expression test code [INFO] Executed tasks [INFO] [INFO] --- build-helper-maven-plugin:1.8:add-source (add-source) @ hive-exec --- [INFO] Source directory: /data/hive-ptest/working/apache-svn-trunk-source/ql/src/gen/protobuf/gen-java added. [INFO] Source directory: /data/hive-ptest/working/apache-svn-trunk-source/ql/src/gen/thrift/gen-javabean added. [INFO] Source directory: /data/hive-ptest/working/apache-svn-trunk-source/ql/target/generated-sources/java added. [INFO] [INFO] --- antlr3-maven-plugin:3.4:antlr (default) @ hive-exec --- [INFO] ANTLR: Processing source directory /data/hive-ptest/working/apache-svn-trunk-source/ql/src/java ANTLR Parser Generator Version 3.4 org/apache/hadoop/hive/ql/parse/HiveLexer.g org/apache/hadoop/hive/ql/parse/HiveParser.g warning(200): org/apache/hadoop/hive/ql/parse/HiveParser.g:874:5: Decision can match input such as "Identifier KW_RENAME KW_TO" using multiple alternatives: 1, 10 As a result, alternative(s) 10 were disabled for that input warning(200): org/apache/hadoop/hive/ql/parse/HiveParser.g:1179:5: Decision can match input such as "KW_SEQUENCEFILE" using multiple alternatives: 1, 6 As a result, alternative(s) 6 were disabled for that input warning(200): org/apache/hadoop/hive/ql/parse/HiveParser.g:1179:5: Decision can match input such as "KW_ORCFILE" using multiple alternatives: 4, 6 As a result, alternative(s) 6 were disabled for that input warning(200): org/apache/hadoop/hive/ql/parse/HiveParser.g:1179:5: Decision can match input such as "KW_TEXTFILE" using multiple alternatives: 2, 6 As a result, alternative(s) 6 were disabled for that input warning(200): 
org/apache/hadoop/hive/ql/parse/HiveParser.g:1179:5: Decision can match input such as "KW_RCFILE" using multiple alternatives: 3, 6 As a result, alternative(s) 6 were disabled for that input warning(200): org/apache/hadoop/hive/ql/parse/HiveParser.g:1192:23: Decision can match input such as "KW_KEY_TYPE" using multiple alternatives: 2, 4 As a result, alternative(s) 4 were disabled for that input warning(200): org/apache/hadoop/hive/ql/parse/HiveParser.g:1192:23: Decision can match input such as "KW_ELEM_TYPE" using multiple alternatives: 1, 4 As a result, alternative(s) 4 were disabled for that input warning(200): org/apache/hadoop/hive/ql/parse/HiveParser.g:1192:23: Decision can match input such as "KW_VALUE_TYPE" using multiple alternatives: 3, 4 As a result, alternative(s) 4 were disabled for that input warning(200): org/apache/hadoop/hive/ql/parse/HiveParser.g:1199:23: Decision can match input such as "KW_KEY_TYPE" using multiple alternatives: 2, 4 As a result, alternative(s) 4 were disabled for that input warning(200): org/apache/hadoop/hive/ql/parse/HiveParser.g:1199:23: Decision can match input such as "KW_ELEM_TYPE" using multiple alternatives: 1, 4 As a result, alternative(s) 4 were disabled for that input warning(200): org/apache/hadoop/hive/ql/parse/HiveParser.g:1199:23: Decision can match input such as "KW_VALUE_TYPE" using multiple alternatives: 3, 4 As a result, alternative(s) 4 were disabled for that input warning(200): org/apache/hadoop/hive/ql/parse/HiveParser.g:1217:29: Decision can match input such as "KW_FORMATTED KW_PARTITION" using multiple alternatives: 1, 4 As a result, alternative(s) 4 were disabled for that input warning(200): org/apache/hadoop/hive/ql/parse/HiveParser.g:1217:29: Decision can match input such as "KW_PRETTY KW_PARTITION" using multiple alternatives: 3, 4 As a result, alternative(s) 4 were disabled for that input warning(200): org/apache/hadoop/hive/ql/parse/HiveParser.g:1217:29: Decision can match input such as "KW_PRETTY 
{KW_ADD..KW_AFTER, KW_ALTER..KW_ANALYZE, KW_ARCHIVE..KW_CASCADE, KW_CHANGE, KW_CLUSTER..KW_COLLECTION, KW_COLUMNS..KW_CREATE, KW_CUBE, KW_CURSOR..KW_DATA, KW_DATABASES..KW_DISABLE, KW_DISTRIBUTE..KW_ELEM_TYPE, KW_ENABLE, KW_ESCAPED, KW_EXCLUSIVE..KW_EXPORT, KW_EXTERNAL..KW_FLOAT, KW_FOR..KW_FORMATTED, KW_FULL, KW_FUNCTIONS..KW_GROUPING, KW_HOLD_DDLTIME..KW_IDXPROPERTIES, KW_IGNORE..KW_ITEMS, KW_KEYS..KW_LEFT, KW_LIKE..KW_LONG, KW_MAPJOIN..KW_MINUS, KW_MSCK..KW_NOSCAN, KW_NO_DROP..KW_OFFLINE, KW_OPTION, KW_ORCFILE..KW_OUTPUTFORMAT, KW_OVERWRITE, KW_PARTITIONED..KW_PLUS, KW_PRETTY..KW_RECORDWRITER, KW_REGEXP..KW_SCHEMAS, KW_SEMI..KW_TABLES, KW_TBLPROPERTIES..KW_TEXTFILE, KW_TIMESTAMP..KW_TOUCH, KW_TRIGGER..KW_UNARCHIVE, KW_UNDO..KW_UNIONTYPE, KW_UNLOCK..KW_VALUE_TYPE, KW_VIEW, KW_WHILE, KW_WITH}" using multiple alternatives: 3, 4 As a result, alternative(s) 4 were disabled for that input warning(200): org/apache/hadoop/hive/ql/parse/HiveParser.g:1217:29: Decision can match input such as "KW_PRETTY Identifier" using multiple alternatives: 3, 4 As a result, alternative(s) 4 were disabled for that input warning(200): org/apache/hadoop/hive/ql/parse/HiveParser.g:1217:29: Decision can match input such as "KW_FORMATTED Identifier" using multiple alternatives: 1, 4 As a result, alternative(s) 4 were disabled for that input warning(200): org/apache/hadoop/hive/ql/parse/HiveParser.g:1217:29: Decision can match input such as "KW_FORMATTED {KW_ADD..KW_AFTER, KW_ALTER..KW_ANALYZE, KW_ARCHIVE..KW_CASCADE, KW_CHANGE, KW_CLUSTER..KW_COLLECTION, KW_COLUMNS..KW_CREATE, KW_CUBE, KW_CURSOR..KW_DATA, KW_DATABASES..KW_DISABLE, KW_DISTRIBUTE..KW_ELEM_TYPE, KW_ENABLE, KW_ESCAPED, KW_EXCLUSIVE..KW_EXPORT, KW_EXTERNAL..KW_FLOAT, KW_FOR..KW_FORMATTED, KW_FULL, KW_FUNCTIONS..KW_GROUPING, KW_HOLD_DDLTIME..KW_IDXPROPERTIES, KW_IGNORE..KW_ITEMS, KW_KEYS..KW_LEFT, KW_LIKE..KW_LONG, KW_MAPJOIN..KW_MINUS, KW_MSCK..KW_NOSCAN, KW_NO_DROP..KW_OFFLINE, KW_OPTION, KW_ORCFILE..KW_OUTPUTFORMAT, 
KW_OVERWRITE, KW_PARTITIONED..KW_PLUS, KW_PRETTY..KW_RECORDWRITER, KW_REGEXP..KW_SCHEMAS, KW_SEMI..KW_TABLES, KW_TBLPROPERTIES..KW_TEXTFILE, KW_TIMESTAMP..KW_TOUCH, KW_TRIGGER..KW_UNARCHIVE, KW_UNDO..KW_UNIONTYPE, KW_UNLOCK..KW_VALUE_TYPE, KW_VIEW, KW_WHILE, KW_WITH}" using multiple alternatives: 1, 4 As a result, alternative(s) 4 were disabled for that input warning(200): org/apache/hadoop/hive/ql/parse/HiveParser.g:1488:116: Decision can match input such as "KW_STORED KW_AS KW_DIRECTORIES" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): org/apache/hadoop/hive/ql/parse/HiveParser.g:1611:5: Decision can match input such as "KW_STORED KW_AS KW_TEXTFILE" using multiple alternatives: 2, 7 As a result, alternative(s) 7 were disabled for that input warning(200): org/apache/hadoop/hive/ql/parse/HiveParser.g:1611:5: Decision can match input such as "KW_STORED KW_AS KW_INPUTFORMAT" using multiple alternatives: 5, 7 As a result, alternative(s) 7 were disabled for that input warning(200): org/apache/hadoop/hive/ql/parse/HiveParser.g:1611:5: Decision can match input such as "KW_STORED KW_AS KW_RCFILE" using multiple alternatives: 3, 7 As a result, alternative(s) 7 were disabled for that input warning(200): org/apache/hadoop/hive/ql/parse/HiveParser.g:1611:5: Decision can match input such as "KW_STORED KW_AS KW_ORCFILE" using multiple alternatives: 4, 7 As a result, alternative(s) 7 were disabled for that input warning(200): org/apache/hadoop/hive/ql/parse/HiveParser.g:1611:5: Decision can match input such as "KW_STORED KW_AS KW_SEQUENCEFILE" using multiple alternatives: 1, 7 As a result, alternative(s) 7 were disabled for that input warning(200): SelectClauseParser.g:149:5: Decision can match input such as "KW_NULL DOT Identifier" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): SelectClauseParser.g:149:5: Decision can match input such as "KW_NULL DOT 
{KW_ADD..KW_AFTER, KW_ALTER..KW_ANALYZE, KW_ARCHIVE..KW_CASCADE, KW_CHANGE, KW_CLUSTER..KW_COLLECTION, KW_COLUMNS..KW_CREATE, KW_CUBE, KW_CURSOR..KW_DATA, KW_DATABASES..KW_DISABLE, KW_DISTRIBUTE..KW_ELEM_TYPE, KW_ENABLE, KW_ESCAPED, KW_EXCLUSIVE..KW_EXPORT, KW_EXTERNAL..KW_FLOAT, KW_FOR..KW_FORMATTED, KW_FULL, KW_FUNCTIONS..KW_GROUPING, KW_HOLD_DDLTIME..KW_IDXPROPERTIES, KW_IGNORE..KW_ITEMS, KW_KEYS..KW_LEFT, KW_LIKE..KW_LONG, KW_MAPJOIN..KW_MINUS, KW_MSCK..KW_NOSCAN, KW_NO_DROP..KW_OFFLINE, KW_OPTION, KW_ORCFILE..KW_OUTPUTFORMAT, KW_OVERWRITE, KW_PARTITION..KW_PLUS, KW_PRETTY..KW_RECORDWRITER, KW_REGEXP..KW_SCHEMAS, KW_SEMI..KW_TABLES, KW_TBLPROPERTIES..KW_TEXTFILE, KW_TIMESTAMP..KW_TOUCH, KW_TRIGGER..KW_UNARCHIVE, KW_UNDO..KW_UNIONTYPE, KW_UNLOCK..KW_VALUE_TYPE, KW_VIEW, KW_WHILE, KW_WITH}" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): FromClauseParser.g:147:2: Decision can match input such as "KW_LATERAL KW_VIEW KW_OUTER" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): FromClauseParser.g:199:25: Decision can match input such as "LPAREN StringLiteral COMMA" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): FromClauseParser.g:199:25: Decision can match input such as "LPAREN StringLiteral RPAREN" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): FromClauseParser.g:199:25: Decision can match input such as "LPAREN StringLiteral EQUAL" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): FromClauseParser.g:199:68: Decision can match input such as "Identifier LPAREN BigintLiteral" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): FromClauseParser.g:199:68: Decision can match input such as "Identifier LPAREN StringLiteral" using 
multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): FromClauseParser.g:199:68: Decision can match input such as "Identifier LPAREN KW_EXISTS" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): FromClauseParser.g:199:68: Decision can match input such as "Identifier LPAREN KW_IF" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): FromClauseParser.g:199:68: Decision can match input such as "Identifier LPAREN SmallintLiteral" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): FromClauseParser.g:199:68: Decision can match input such as "Identifier LPAREN KW_UNIONTYPE" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): FromClauseParser.g:199:68: Decision can match input such as "Identifier LPAREN {MINUS, PLUS, TILDE}" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): FromClauseParser.g:199:68: Decision can match input such as "Identifier LPAREN {KW_ADD..KW_AFTER, KW_ALTER..KW_ANALYZE, KW_ARCHIVE, KW_AS..KW_CASCADE, KW_CHANGE, KW_CLUSTER..KW_COLLECTION, KW_COLUMNS..KW_CREATE, KW_CUBE, KW_CURSOR..KW_DATA, KW_DATABASES, KW_DATETIME..KW_DISABLE, KW_DISTRIBUTE..KW_ELEM_TYPE, KW_ENABLE, KW_ESCAPED, KW_EXCLUSIVE, KW_EXPLAIN..KW_EXPORT, KW_EXTERNAL, KW_FETCH..KW_FLOAT, KW_FOR..KW_FORMATTED, KW_FULL, KW_FUNCTIONS..KW_GROUPING, KW_HOLD_DDLTIME..KW_IDXPROPERTIES, KW_IGNORE..KW_ITEMS, KW_KEYS..KW_LEFT, KW_LIKE..KW_LONG, KW_MAPJOIN..KW_MINUS, KW_MSCK..KW_NOSCAN, KW_NO_DROP, KW_OF..KW_OFFLINE, KW_OPTION, KW_ORCFILE..KW_OUTPUTFORMAT, KW_OVERWRITE, KW_PARTITION..KW_PLUS, KW_PRETTY..KW_RECORDWRITER, KW_REGEXP..KW_SCHEMAS, KW_SEMI..KW_STRING, KW_TABLE..KW_TABLES, KW_TBLPROPERTIES..KW_TEXTFILE, KW_TIMESTAMP..KW_TOUCH, KW_TRIGGER, KW_TRUNCATE..KW_UNARCHIVE, 
KW_UNDO..KW_UNION, KW_UNLOCK..KW_VALUE_TYPE, KW_VIEW, KW_WHILE, KW_WITH}" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): FromClauseParser.g:199:68: Decision can match input such as "Identifier LPAREN TinyintLiteral" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): FromClauseParser.g:199:68: Decision can match input such as "Identifier LPAREN KW_CASE" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): FromClauseParser.g:199:68: Decision can match input such as "Identifier LPAREN KW_NOT" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): FromClauseParser.g:199:68: Decision can match input such as "Identifier LPAREN KW_NULL" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): FromClauseParser.g:199:68: Decision can match input such as "Identifier LPAREN DecimalLiteral" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): FromClauseParser.g:199:68: Decision can match input such as "Identifier LPAREN CharSetName" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): FromClauseParser.g:199:68: Decision can match input such as "Identifier LPAREN Number" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): FromClauseParser.g:199:68: Decision can match input such as "Identifier LPAREN KW_CAST" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): FromClauseParser.g:199:68: Decision can match input such as "Identifier LPAREN KW_STRUCT" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): FromClauseParser.g:199:68: Decision can match 
input such as "Identifier LPAREN LPAREN" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): FromClauseParser.g:199:68: Decision can match input such as "Identifier LPAREN KW_FALSE" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): FromClauseParser.g:199:68: Decision can match input such as "Identifier LPAREN KW_ARRAY" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): FromClauseParser.g:199:68: Decision can match input such as "Identifier LPAREN KW_DATE" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): FromClauseParser.g:199:68: Decision can match input such as "Identifier LPAREN Identifier" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): FromClauseParser.g:199:68: Decision can match input such as "Identifier LPAREN KW_MAP" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): FromClauseParser.g:199:68: Decision can match input such as "Identifier LPAREN KW_TRUE" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): FromClauseParser.g:257:16: Decision can match input such as "Identifier LPAREN BigintLiteral" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): FromClauseParser.g:257:16: Decision can match input such as "Identifier LPAREN StringLiteral" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): FromClauseParser.g:257:16: Decision can match input such as "Identifier LPAREN KW_EXISTS" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): FromClauseParser.g:257:16: Decision can match input such as "Identifier LPAREN 
KW_IF" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): FromClauseParser.g:257:16: Decision can match input such as "Identifier LPAREN SmallintLiteral" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): FromClauseParser.g:257:16: Decision can match input such as "Identifier LPAREN KW_UNIONTYPE" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): FromClauseParser.g:257:16: Decision can match input such as "Identifier LPAREN {MINUS, PLUS, TILDE}" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): FromClauseParser.g:257:16: Decision can match input such as "Identifier LPAREN {KW_ADD..KW_AFTER, KW_ALTER..KW_ANALYZE, KW_ARCHIVE, KW_AS..KW_CASCADE, KW_CHANGE, KW_CLUSTER..KW_COLLECTION, KW_COLUMNS..KW_CREATE, KW_CUBE, KW_CURSOR..KW_DATA, KW_DATABASES, KW_DATETIME..KW_DISABLE, KW_DISTRIBUTE..KW_ELEM_TYPE, KW_ENABLE, KW_ESCAPED, KW_EXCLUSIVE, KW_EXPLAIN..KW_EXPORT, KW_EXTERNAL, KW_FETCH..KW_FLOAT, KW_FOR..KW_FORMATTED, KW_FULL, KW_FUNCTIONS..KW_GROUPING, KW_HOLD_DDLTIME..KW_IDXPROPERTIES, KW_IGNORE..KW_ITEMS, KW_KEYS..KW_LEFT, KW_LIKE..KW_LONG, KW_MAPJOIN..KW_MINUS, KW_MSCK..KW_NOSCAN, KW_NO_DROP, KW_OF..KW_OFFLINE, KW_OPTION, KW_ORCFILE..KW_OUTPUTFORMAT, KW_OVERWRITE, KW_PARTITION..KW_PLUS, KW_PRETTY..KW_RECORDWRITER, KW_REGEXP..KW_SCHEMAS, KW_SEMI..KW_STRING, KW_TABLE..KW_TABLES, KW_TBLPROPERTIES..KW_TEXTFILE, KW_TIMESTAMP..KW_TOUCH, KW_TRIGGER, KW_TRUNCATE..KW_UNARCHIVE, KW_UNDO..KW_UNION, KW_UNLOCK..KW_VALUE_TYPE, KW_VIEW, KW_WHILE, KW_WITH}" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): FromClauseParser.g:257:16: Decision can match input such as "Identifier LPAREN TinyintLiteral" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): 
FromClauseParser.g:257:16: Decision can match input such as "Identifier LPAREN KW_CASE" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): FromClauseParser.g:257:16: Decision can match input such as "Identifier LPAREN KW_NOT" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): FromClauseParser.g:257:16: Decision can match input such as "Identifier LPAREN KW_NULL" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): FromClauseParser.g:257:16: Decision can match input such as "Identifier LPAREN DecimalLiteral" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): FromClauseParser.g:257:16: Decision can match input such as "Identifier LPAREN CharSetName" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): FromClauseParser.g:257:16: Decision can match input such as "Identifier LPAREN Number" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): FromClauseParser.g:257:16: Decision can match input such as "Identifier LPAREN KW_CAST" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): FromClauseParser.g:257:16: Decision can match input such as "Identifier LPAREN KW_STRUCT" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): FromClauseParser.g:257:16: Decision can match input such as "Identifier LPAREN LPAREN" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): FromClauseParser.g:257:16: Decision can match input such as "Identifier LPAREN KW_FALSE" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): FromClauseParser.g:257:16: Decision 
can match input such as "Identifier LPAREN KW_ARRAY" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): FromClauseParser.g:257:16: Decision can match input such as "Identifier LPAREN KW_DATE" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): FromClauseParser.g:257:16: Decision can match input such as "Identifier LPAREN Identifier" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): FromClauseParser.g:257:16: Decision can match input such as "Identifier LPAREN KW_MAP" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): FromClauseParser.g:257:16: Decision can match input such as "Identifier LPAREN KW_TRUE" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): IdentifiersParser.g:68:4: Decision can match input such as "LPAREN KW_NULL KW_OR" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): IdentifiersParser.g:68:4: Decision can match input such as "LPAREN KW_NULL KW_AND" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): IdentifiersParser.g:68:4: Decision can match input such as "LPAREN LPAREN Number" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): IdentifiersParser.g:68:4: Decision can match input such as "LPAREN KW_NULL {MINUS, PLUS}" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): IdentifiersParser.g:68:4: Decision can match input such as "LPAREN KW_NULL LPAREN" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): IdentifiersParser.g:68:4: Decision can match input such as "LPAREN KW_NULL KW_NOT" using multiple 
alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): IdentifiersParser.g:68:4: Decision can match input such as "LPAREN KW_CASE KW_EXISTS" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): IdentifiersParser.g:68:4: Decision can match input such as "LPAREN KW_NOT {KW_ADD..KW_AFTER, KW_ALTER..KW_ANALYZE, KW_ARCHIVE, KW_AS..KW_CASCADE, KW_CHANGE, KW_CLUSTER..KW_COLLECTION, KW_COLUMNS..KW_CREATE, KW_CUBE, KW_CURSOR..KW_DATA, KW_DATABASES, KW_DATETIME..KW_DISABLE, KW_DISTRIBUTE..KW_ELEM_TYPE, KW_ENABLE, KW_ESCAPED, KW_EXCLUSIVE, KW_EXPLAIN..KW_EXPORT, KW_EXTERNAL, KW_FETCH..KW_FLOAT, KW_FOR..KW_FORMATTED, KW_FULL, KW_FUNCTIONS..KW_GROUPING, KW_HOLD_DDLTIME..KW_IDXPROPERTIES, KW_IGNORE..KW_ITEMS, KW_KEYS..KW_LEFT, KW_LIKE..KW_LONG, KW_MAPJOIN..KW_MINUS, KW_MSCK..KW_NOSCAN, KW_NO_DROP, KW_OF..KW_OFFLINE, KW_OPTION, KW_ORCFILE..KW_OUTPUTFORMAT, KW_OVERWRITE, KW_PARTITION..KW_PLUS, KW_PRETTY..KW_RECORDWRITER, KW_REGEXP..KW_SCHEMAS, KW_SEMI..KW_STRING, KW_TABLE..KW_TABLES, KW_TBLPROPERTIES..KW_TEXTFILE, KW_TIMESTAMP..KW_TOUCH, KW_TRIGGER, KW_TRUNCATE..KW_UNARCHIVE, KW_UNDO..KW_UNION, KW_UNLOCK..KW_VALUE_TYPE, KW_VIEW, KW_WHILE, KW_WITH}" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): IdentifiersParser.g:68:4: Decision can match input such as "LPAREN LPAREN KW_FALSE" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): IdentifiersParser.g:68:4: Decision can match input such as "LPAREN LPAREN KW_STRUCT" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): IdentifiersParser.g:68:4: Decision can match input such as "LPAREN KW_NOT KW_IF" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): IdentifiersParser.g:68:4: Decision can match input such as "LPAREN KW_CASE 
KW_IF" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): IdentifiersParser.g:68:4: Decision can match input such as "LPAREN LPAREN KW_IF" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): IdentifiersParser.g:68:4: Decision can match input such as "LPAREN KW_NULL {DIV..DIVIDE, MOD, STAR}" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): IdentifiersParser.g:68:4: Decision can match input such as "LPAREN KW_NOT Identifier" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): IdentifiersParser.g:68:4: Decision can match input such as "LPAREN LPAREN KW_CASE" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): IdentifiersParser.g:68:4: Decision can match input such as "LPAREN KW_NOT KW_CASE" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): IdentifiersParser.g:68:4: Decision can match input such as "LPAREN KW_CASE KW_CASE" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): IdentifiersParser.g:68:4: Decision can match input such as "LPAREN KW_CASE KW_WHEN" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): IdentifiersParser.g:68:4: Decision can match input such as "LPAREN LPAREN KW_ARRAY" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): IdentifiersParser.g:68:4: Decision can match input such as "LPAREN LPAREN KW_TRUE" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): IdentifiersParser.g:68:4: Decision can match input such as "LPAREN KW_NULL BITWISEOR" using multiple alternatives: 1, 2 As a result, alternative(s) 2 
were disabled for that input warning(200): IdentifiersParser.g:68:4: Decision can match input such as "LPAREN KW_EXISTS LPAREN" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): IdentifiersParser.g:68:4: Decision can match input such as "LPAREN LPAREN KW_UNIONTYPE" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): IdentifiersParser.g:68:4: Decision can match input such as "LPAREN KW_NOT KW_MAP" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): IdentifiersParser.g:68:4: Decision can match input such as "LPAREN KW_CASE KW_MAP" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): IdentifiersParser.g:68:4: Decision can match input such as "LPAREN LPAREN KW_MAP" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): IdentifiersParser.g:68:4: Decision can match input such as "LPAREN KW_NULL DOT" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): IdentifiersParser.g:68:4: Decision can match input such as "LPAREN KW_NOT {MINUS, PLUS, TILDE}" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): IdentifiersParser.g:68:4: Decision can match input such as "LPAREN KW_CASE {MINUS, PLUS, TILDE}" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): IdentifiersParser.g:68:4: Decision can match input such as "LPAREN LPAREN {MINUS, PLUS, TILDE}" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): IdentifiersParser.g:68:4: Decision can match input such as "LPAREN KW_NULL RPAREN" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): 
        IdentifiersParser.g:68:4: 
        Decision can match input such as "LPAREN KW_CASE Identifier" using multiple alternatives: 1, 2
        
        As a result, alternative(s) 2 were disabled for that input
        [... further repetitive warning(200) ambiguity notices from IdentifiersParser.g (decisions at lines 68, 108, 121, 133, 144, 155, 172, 185, 267, 399, and 524) omitted; each reports that an ambiguous decision disabled one of its alternatives ...]
        [INFO] 
        [INFO] --- maven-resources-plugin:2.5:resources (default-resources) @ hive-exec ---
        [debug] execute contextualize
        [INFO] Using 'UTF-8' encoding to copy filtered resources.
        [INFO] Copying 1 resource
        [INFO] 
        [INFO] --- maven-antrun-plugin:1.7:run (define-classpath) @ hive-exec ---
        [INFO] Executing tasks
        
        main:
        [INFO] Executed tasks
        [INFO] 
        [INFO] --- maven-compiler-plugin:3.1:compile (default-compile) @ hive-exec ---
        [INFO] Compiling 1399 source files to /data/hive-ptest/working/apache-svn-trunk-source/ql/target/classes
        [INFO] -------------------------------------------------------------
        [WARNING] COMPILATION WARNING : 
        [INFO] -------------------------------------------------------------
        [WARNING] Note: Some input files use or override a deprecated API.
        [WARNING] Note: Recompile with -Xlint:deprecation for details.
        [WARNING] Note: Some input files use unchecked or unsafe operations.
        [WARNING] Note: Recompile with -Xlint:unchecked for details.
        [INFO] 4 warnings
        [INFO] -------------------------------------------------------------
        [INFO] -------------------------------------------------------------
        [ERROR] COMPILATION ERROR : 
        [INFO] -------------------------------------------------------------
        [ERROR] /data/hive-ptest/working/apache-svn-trunk-source/ql/src/java/org/apache/hadoop/hive/ql/plan/SkewContext.java:[118,49] cannot find symbol
        symbol  : method getRandom()
        location: class org.apache.hadoop.hive.ql.io.HiveKey
        [INFO] 1 error
        [INFO] -------------------------------------------------------------
        [INFO] ------------------------------------------------------------------------
        [INFO] Reactor Summary:
        [INFO] 
        [INFO] Hive .............................................. SUCCESS [4.754s]
        [INFO] Hive Ant Utilities ................................ SUCCESS [7.617s]
        [INFO] Hive Shims Common ................................. SUCCESS [3.455s]
        [INFO] Hive Shims 0.20 ................................... SUCCESS [2.263s]
        [INFO] Hive Shims Secure Common .......................... SUCCESS [2.832s]
        [INFO] Hive Shims 0.20S .................................. SUCCESS [1.630s]
        [INFO] Hive Shims 0.23 ................................... SUCCESS [4.482s]
        [INFO] Hive Shims ........................................ SUCCESS [3.006s]
        [INFO] Hive Common ....................................... SUCCESS [5.342s]
        [INFO] Hive Serde ........................................ SUCCESS [11.590s]
        [INFO] Hive Metastore .................................... SUCCESS [26.085s]
        [INFO] Hive Query Language ............................... FAILURE [35.747s]
        [INFO] Hive Service ...................................... SKIPPED
        [INFO] Hive JDBC ......................................... SKIPPED
        [INFO] Hive Beeline ...................................... SKIPPED
        [INFO] Hive CLI .......................................... SKIPPED
        [INFO] Hive Contrib ...................................... SKIPPED
        [INFO] Hive HBase Handler ................................ SKIPPED
        [INFO] Hive HCatalog ..................................... SKIPPED
        [INFO] Hive HCatalog Core ................................ SKIPPED
        [INFO] Hive HCatalog Pig Adapter ......................... SKIPPED
        [INFO] Hive HCatalog Server Extensions ................... SKIPPED
        [INFO] Hive HCatalog Webhcat Java Client ................. SKIPPED
        [INFO] Hive HCatalog Webhcat ............................. SKIPPED
        [INFO] Hive HCatalog HBase Storage Handler ............... SKIPPED
        [INFO] Hive HWI .......................................... SKIPPED
        [INFO] Hive ODBC ......................................... SKIPPED
        [INFO] Hive Shims Aggregator ............................. SKIPPED
        [INFO] Hive TestUtils .................................... SKIPPED
        [INFO] Hive Packaging .................................... SKIPPED
        [INFO] ------------------------------------------------------------------------
        [INFO] BUILD FAILURE
        [INFO] ------------------------------------------------------------------------
        [INFO] Total time: 1:51.912s
        [INFO] Finished at: Tue Nov 26 13:27:52 EST 2013
        [INFO] Final Memory: 53M/369M
        [INFO] ------------------------------------------------------------------------
        [ERROR] Failed to execute goal org.apache.maven.plugins:maven-compiler-plugin:3.1:compile (default-compile) on project hive-exec: Compilation failure
        [ERROR] /data/hive-ptest/working/apache-svn-trunk-source/ql/src/java/org/apache/hadoop/hive/ql/plan/SkewContext.java:[118,49] cannot find symbol
        [ERROR] symbol  : method getRandom()
        [ERROR] location: class org.apache.hadoop.hive.ql.io.HiveKey
        [ERROR] -> [Help 1]
        [ERROR] 
        [ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
        [ERROR] Re-run Maven using the -X switch to enable full debug logging.
        [ERROR] 
        [ERROR] For more information about the errors and possible solutions, please read the following articles:
        [ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
        [ERROR] 
        [ERROR] After correcting the problems, you can resume the build with the command
        [ERROR] mvn <goals> -rf :hive-exec
        + exit 1
        '
        
        This message is automatically generated.
        
        ATTACHMENT ID: 12615766
        Navis added a comment -

        Rebased to trunk. Running test.

        Hive QA added a comment -

        Overall: -1 no tests executed

        Here are the results of testing the latest attachment:
        https://issues.apache.org/jira/secure/attachment/12616156/HIVE-3286.13.patch.txt

        Test results: http://bigtop01.cloudera.org:8080/job/PreCommit-HIVE-Build/472/testReport
        Console output: http://bigtop01.cloudera.org:8080/job/PreCommit-HIVE-Build/472/console

        Messages:

        **** This message was trimmed, see log for full details ****
        As a result, alternative(s) 2 were disabled for that input
        warning(200): IdentifiersParser.g:68:4: 
        Decision can match input such as "LPAREN KW_CASE TinyintLiteral" using multiple alternatives: 1, 2
        
        As a result, alternative(s) 2 were disabled for that input
        warning(200): IdentifiersParser.g:68:4: 
        Decision can match input such as "LPAREN KW_NULL GREATERTHAN" using multiple alternatives: 1, 2
        
        As a result, alternative(s) 2 were disabled for that input
        warning(200): IdentifiersParser.g:68:4: 
        Decision can match input such as "LPAREN KW_NOT DecimalLiteral" using multiple alternatives: 1, 2
        
        As a result, alternative(s) 2 were disabled for that input
        warning(200): IdentifiersParser.g:68:4: 
        Decision can match input such as "LPAREN KW_CASE DecimalLiteral" using multiple alternatives: 1, 2
        
        As a result, alternative(s) 2 were disabled for that input
        warning(200): IdentifiersParser.g:108:5: 
        Decision can match input such as "KW_ORDER KW_BY LPAREN" using multiple alternatives: 1, 2
        
        As a result, alternative(s) 2 were disabled for that input
        warning(200): IdentifiersParser.g:121:5: 
        Decision can match input such as "KW_CLUSTER KW_BY LPAREN" using multiple alternatives: 1, 2
        
        As a result, alternative(s) 2 were disabled for that input
        warning(200): IdentifiersParser.g:133:5: 
        Decision can match input such as "KW_PARTITION KW_BY LPAREN" using multiple alternatives: 1, 2
        
        As a result, alternative(s) 2 were disabled for that input
        warning(200): IdentifiersParser.g:144:5: 
        Decision can match input such as "KW_DISTRIBUTE KW_BY LPAREN" using multiple alternatives: 1, 2
        
        As a result, alternative(s) 2 were disabled for that input
        warning(200): IdentifiersParser.g:155:5: 
        Decision can match input such as "KW_SORT KW_BY LPAREN" using multiple alternatives: 1, 2
        
        As a result, alternative(s) 2 were disabled for that input
        warning(200): IdentifiersParser.g:172:7: 
        Decision can match input such as "STAR" using multiple alternatives: 1, 2
        
        As a result, alternative(s) 2 were disabled for that input
        warning(200): IdentifiersParser.g:185:5: 
        Decision can match input such as "KW_ARRAY" using multiple alternatives: 2, 6
        
        As a result, alternative(s) 6 were disabled for that input
        warning(200): IdentifiersParser.g:185:5: 
        Decision can match input such as "KW_UNIONTYPE" using multiple alternatives: 5, 6
        
        As a result, alternative(s) 6 were disabled for that input
        warning(200): IdentifiersParser.g:185:5: 
        Decision can match input such as "KW_STRUCT" using multiple alternatives: 4, 6
        
        As a result, alternative(s) 6 were disabled for that input
        warning(200): IdentifiersParser.g:267:5: 
        Decision can match input such as "KW_NULL" using multiple alternatives: 1, 8
        
        As a result, alternative(s) 8 were disabled for that input
        warning(200): IdentifiersParser.g:267:5: 
        Decision can match input such as "KW_FALSE" using multiple alternatives: 3, 8
        
        As a result, alternative(s) 8 were disabled for that input
        warning(200): IdentifiersParser.g:267:5: 
        Decision can match input such as "KW_TRUE" using multiple alternatives: 3, 8
        
        As a result, alternative(s) 8 were disabled for that input
        warning(200): IdentifiersParser.g:267:5: 
        Decision can match input such as "KW_DATE StringLiteral" using multiple alternatives: 2, 3
        
        As a result, alternative(s) 3 were disabled for that input
        warning(200): IdentifiersParser.g:399:5: 
        Decision can match input such as "{KW_LIKE, KW_REGEXP, KW_RLIKE} KW_SORT KW_BY" using multiple alternatives: 2, 9
        
        As a result, alternative(s) 9 were disabled for that input
        warning(200): IdentifiersParser.g:399:5: 
        Decision can match input such as "{KW_LIKE, KW_REGEXP, KW_RLIKE} KW_GROUP KW_BY" using multiple alternatives: 2, 9
        
        As a result, alternative(s) 9 were disabled for that input
        warning(200): IdentifiersParser.g:399:5: 
        Decision can match input such as "{KW_LIKE, KW_REGEXP, KW_RLIKE} KW_INSERT KW_OVERWRITE" using multiple alternatives: 2, 9
        
        As a result, alternative(s) 9 were disabled for that input
        warning(200): IdentifiersParser.g:399:5: 
        Decision can match input such as "{KW_LIKE, KW_REGEXP, KW_RLIKE} KW_MAP LPAREN" using multiple alternatives: 2, 9
        
        As a result, alternative(s) 9 were disabled for that input
        warning(200): IdentifiersParser.g:399:5: 
        Decision can match input such as "{KW_LIKE, KW_REGEXP, KW_RLIKE} KW_ORDER KW_BY" using multiple alternatives: 2, 9
        
        As a result, alternative(s) 9 were disabled for that input
        warning(200): IdentifiersParser.g:399:5: 
        Decision can match input such as "KW_BETWEEN KW_MAP LPAREN" using multiple alternatives: 8, 9
        
        As a result, alternative(s) 9 were disabled for that input
        warning(200): IdentifiersParser.g:399:5: 
        Decision can match input such as "{KW_LIKE, KW_REGEXP, KW_RLIKE} KW_LATERAL KW_VIEW" using multiple alternatives: 2, 9
        
        As a result, alternative(s) 9 were disabled for that input
        warning(200): IdentifiersParser.g:399:5: 
        Decision can match input such as "{KW_LIKE, KW_REGEXP, KW_RLIKE} KW_CLUSTER KW_BY" using multiple alternatives: 2, 9
        
        As a result, alternative(s) 9 were disabled for that input
        warning(200): IdentifiersParser.g:399:5: 
        Decision can match input such as "{KW_LIKE, KW_REGEXP, KW_RLIKE} KW_INSERT KW_INTO" using multiple alternatives: 2, 9
        
        As a result, alternative(s) 9 were disabled for that input
        warning(200): IdentifiersParser.g:399:5: 
        Decision can match input such as "{KW_LIKE, KW_REGEXP, KW_RLIKE} KW_DISTRIBUTE KW_BY" using multiple alternatives: 2, 9
        
        As a result, alternative(s) 9 were disabled for that input
        warning(200): IdentifiersParser.g:524:5: 
        Decision can match input such as "{AMPERSAND..BITWISEXOR, DIV..DIVIDE, EQUAL..EQUAL_NS, GREATERTHAN..GREATERTHANOREQUALTO, KW_AND, KW_ARRAY, KW_BETWEEN..KW_BOOLEAN, KW_CASE, KW_DOUBLE, KW_FLOAT, KW_IF, KW_IN, KW_INT, KW_LIKE, KW_MAP, KW_NOT, KW_OR, KW_REGEXP, KW_RLIKE, KW_SMALLINT, KW_STRING..KW_STRUCT, KW_TINYINT, KW_UNIONTYPE, KW_WHEN, LESSTHAN..LESSTHANOREQUALTO, MINUS..NOTEQUAL, PLUS, STAR, TILDE}" using multiple alternatives: 1, 3
        
        As a result, alternative(s) 3 were disabled for that input
        [INFO] 
        [INFO] --- maven-resources-plugin:2.5:resources (default-resources) @ hive-exec ---
        [debug] execute contextualize
        [INFO] Using 'UTF-8' encoding to copy filtered resources.
        [INFO] Copying 1 resource
        [INFO] 
        [INFO] --- maven-antrun-plugin:1.7:run (define-classpath) @ hive-exec ---
        [INFO] Executing tasks
        
        main:
        [INFO] Executed tasks
        [INFO] 
        [INFO] --- maven-compiler-plugin:3.1:compile (default-compile) @ hive-exec ---
        [INFO] Compiling 1400 source files to /data/hive-ptest/working/apache-svn-trunk-source/ql/target/classes
        [INFO] -------------------------------------------------------------
        [WARNING] COMPILATION WARNING : 
        [INFO] -------------------------------------------------------------
        [WARNING] Note: Some input files use or override a deprecated API.
        [WARNING] Note: Recompile with -Xlint:deprecation for details.
        [WARNING] Note: Some input files use unchecked or unsafe operations.
        [WARNING] Note: Recompile with -Xlint:unchecked for details.
        [INFO] 4 warnings 
        [INFO] -------------------------------------------------------------
        [INFO] -------------------------------------------------------------
        [ERROR] COMPILATION ERROR : 
        [INFO] -------------------------------------------------------------
        [ERROR] /data/hive-ptest/working/apache-svn-trunk-source/ql/src/java/org/apache/hadoop/hive/ql/plan/SkewContext.java:[118,49] cannot find symbol
        symbol  : method getRandom()
        location: class org.apache.hadoop.hive.ql.io.HiveKey
        [INFO] 1 error
        [INFO] -------------------------------------------------------------
        [INFO] ------------------------------------------------------------------------
        [INFO] Reactor Summary:
        [INFO] 
        [INFO] Hive .............................................. SUCCESS [2.846s]
        [INFO] Hive Ant Utilities ................................ SUCCESS [9.072s]
        [INFO] Hive Shims Common ................................. SUCCESS [3.459s]
        [INFO] Hive Shims 0.20 ................................... SUCCESS [2.210s]
        [INFO] Hive Shims Secure Common .......................... SUCCESS [2.711s]
        [INFO] Hive Shims 0.20S .................................. SUCCESS [1.397s]
        [INFO] Hive Shims 0.23 ................................... SUCCESS [3.679s]
        [INFO] Hive Shims ........................................ SUCCESS [3.073s]
        [INFO] Hive Common ....................................... SUCCESS [13.388s]
        [INFO] Hive Serde ........................................ SUCCESS [11.713s]
        [INFO] Hive Metastore .................................... SUCCESS [26.188s]
        [INFO] Hive Query Language ............................... FAILURE [30.091s]
        [INFO] Hive Service ...................................... SKIPPED
        [INFO] Hive JDBC ......................................... SKIPPED
        [INFO] Hive Beeline ...................................... SKIPPED
        [INFO] Hive CLI .......................................... SKIPPED
        [INFO] Hive Contrib ...................................... SKIPPED
        [INFO] Hive HBase Handler ................................ SKIPPED
        [INFO] Hive HCatalog ..................................... SKIPPED
        [INFO] Hive HCatalog Core ................................ SKIPPED
        [INFO] Hive HCatalog Pig Adapter ......................... SKIPPED
        [INFO] Hive HCatalog Server Extensions ................... SKIPPED
        [INFO] Hive HCatalog Webhcat Java Client ................. SKIPPED
        [INFO] Hive HCatalog Webhcat ............................. SKIPPED
        [INFO] Hive HCatalog HBase Storage Handler ............... SKIPPED
        [INFO] Hive HWI .......................................... SKIPPED
        [INFO] Hive ODBC ......................................... SKIPPED
        [INFO] Hive Shims Aggregator ............................. SKIPPED
        [INFO] Hive TestUtils .................................... SKIPPED
        [INFO] Hive Packaging .................................... SKIPPED
        [INFO] ------------------------------------------------------------------------
        [INFO] BUILD FAILURE
        [INFO] ------------------------------------------------------------------------
        [INFO] Total time: 1:52.444s
        [INFO] Finished at: Wed Nov 27 20:19:46 EST 2013
        [INFO] Final Memory: 51M/371M
        [INFO] ------------------------------------------------------------------------
        [ERROR] Failed to execute goal org.apache.maven.plugins:maven-compiler-plugin:3.1:compile (default-compile) on project hive-exec: Compilation failure
        [ERROR] /data/hive-ptest/working/apache-svn-trunk-source/ql/src/java/org/apache/hadoop/hive/ql/plan/SkewContext.java:[118,49] cannot find symbol
        [ERROR] symbol  : method getRandom()
        [ERROR] location: class org.apache.hadoop.hive.ql.io.HiveKey
        [ERROR] -> [Help 1]
        [ERROR] 
        [ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
        [ERROR] Re-run Maven using the -X switch to enable full debug logging.
        [ERROR] 
        [ERROR] For more information about the errors and possible solutions, please read the following articles:
        [ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
        [ERROR] 
        [ERROR] After correcting the problems, you can resume the build with the command
        [ERROR]   mvn <goals> -rf :hive-exec
        + exit 1
        '
        

        This message is automatically generated.

        ATTACHMENT ID: 12616156

        Navis added a comment -

        Fixed the build failure.

        Hive QA added a comment -

        Overall: -1 at least one tests failed

        Here are the results of testing the latest attachment:
        https://issues.apache.org/jira/secure/attachment/12616193/HIVE-3286.14.patch.txt

        ERROR: -1 due to 1 failed/errored test(s), 4745 tests executed
        Failed tests:

        org.apache.hadoop.hive.ql.TestErrorMsg.testUniqueErrorCode
        

        Test results: http://bigtop01.cloudera.org:8080/job/PreCommit-HIVE-Build/480/testReport
        Console output: http://bigtop01.cloudera.org:8080/job/PreCommit-HIVE-Build/480/console

        Messages:

        Executing org.apache.hive.ptest.execution.PrepPhase
        Executing org.apache.hive.ptest.execution.ExecutionPhase
        Executing org.apache.hive.ptest.execution.ReportingPhase
        Tests exited with: TestsFailedException: 1 tests failed
        

        This message is automatically generated.

        ATTACHMENT ID: 12616193

        Hive QA added a comment -

        Overall: -1 at least one tests failed

        Here are the results of testing the latest attachment:
        https://issues.apache.org/jira/secure/attachment/12616208/HIVE-3286.15.patch.txt

        ERROR: -1 due to 4 failed/errored test(s), 4745 tests executed
        Failed tests:

        org.apache.hadoop.hive.cli.TestNegativeCliDriver.testNegativeCliDriver_skewjoin_explicit_invalid1
        org.apache.hadoop.hive.cli.TestNegativeCliDriver.testNegativeCliDriver_skewjoin_explicit_invalid2
        org.apache.hadoop.hive.cli.TestNegativeCliDriver.testNegativeCliDriver_skewjoin_explicit_invalid3
        org.apache.hadoop.hive.cli.TestNegativeCliDriver.testNegativeCliDriver_skewjoin_explicit_invalid4
        

        Test results: http://bigtop01.cloudera.org:8080/job/PreCommit-HIVE-Build/482/testReport
        Console output: http://bigtop01.cloudera.org:8080/job/PreCommit-HIVE-Build/482/console

        Messages:

        Executing org.apache.hive.ptest.execution.PrepPhase
        Executing org.apache.hive.ptest.execution.ExecutionPhase
        Executing org.apache.hive.ptest.execution.ReportingPhase
        Tests exited with: TestsFailedException: 4 tests failed
        

        This message is automatically generated.

        ATTACHMENT ID: 12616208

        Navis added a comment -

        Rebased & fixed test failures.

        Hive QA added a comment -

        Overall: -1 at least one tests failed

        Here are the results of testing the latest attachment:
        https://issues.apache.org/jira/secure/attachment/12618867/HIVE-3286.16.patch.txt

        ERROR: -1 due to 4 failed/errored test(s), 4790 tests executed
        Failed tests:

        org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver_skewjoin_explicit
        org.apache.hadoop.hive.cli.TestNegativeCliDriver.testNegativeCliDriver_skewjoin_explicit_invalid1
        org.apache.hadoop.hive.cli.TestNegativeCliDriver.testNegativeCliDriver_skewjoin_explicit_invalid2
        org.apache.hadoop.hive.cli.TestNegativeCliDriver.testNegativeCliDriver_skewjoin_explicit_invalid3
        

        Test results: http://bigtop01.cloudera.org:8080/job/PreCommit-HIVE-Build/652/testReport
        Console output: http://bigtop01.cloudera.org:8080/job/PreCommit-HIVE-Build/652/console

        Messages:

        Executing org.apache.hive.ptest.execution.PrepPhase
        Executing org.apache.hive.ptest.execution.ExecutionPhase
        Executing org.apache.hive.ptest.execution.ReportingPhase
        Tests exited with: TestsFailedException: 4 tests failed
        

        This message is automatically generated.

        ATTACHMENT ID: 12618867

        Hive QA added a comment -

        Overall: +1 all checks pass

        Here are the results of testing the latest attachment:
        https://issues.apache.org/jira/secure/attachment/12619277/HIVE-3286.17.patch.txt

        SUCCESS: +1 4796 tests passed

        Test results: http://bigtop01.cloudera.org:8080/job/PreCommit-HIVE-Build/688/testReport
        Console output: http://bigtop01.cloudera.org:8080/job/PreCommit-HIVE-Build/688/console

        Messages:

        Executing org.apache.hive.ptest.execution.PrepPhase
        Executing org.apache.hive.ptest.execution.ExecutionPhase
        Executing org.apache.hive.ptest.execution.ReportingPhase
        

        This message is automatically generated.

        ATTACHMENT ID: 12619277

        Navis added a comment -

        Rebased to trunk

        Hive QA added a comment -

        Overall: -1 at least one tests failed

        Here are the results of testing the latest attachment:
        https://issues.apache.org/jira/secure/attachment/12673046/HIVE-3286.19.patch.txt

        ERROR: -1 due to 1 failed/errored test(s), 6529 tests executed
        Failed tests:

        org.apache.hadoop.hive.ql.TestErrorMsg.testUniqueErrorCode
        

        Test results: http://ec2-174-129-184-35.compute-1.amazonaws.com/jenkins/job/PreCommit-HIVE-TRUNK-Build/1127/testReport
        Console output: http://ec2-174-129-184-35.compute-1.amazonaws.com/jenkins/job/PreCommit-HIVE-TRUNK-Build/1127/console
        Test logs: http://ec2-174-129-184-35.compute-1.amazonaws.com/logs/PreCommit-HIVE-TRUNK-Build-1127/

        Messages:

        Executing org.apache.hive.ptest.execution.PrepPhase
        Executing org.apache.hive.ptest.execution.ExecutionPhase
        Executing org.apache.hive.ptest.execution.ReportingPhase
        Tests exited with: TestsFailedException: 1 tests failed
        

        This message is automatically generated.

        ATTACHMENT ID: 12673046


          People

          • Assignee: Navis
          • Reporter: Navis