HIVE-6405: Support append feature for HCatalog

    Details

      Description

      HCatalog currently treats all tables as "immutable" - i.e. all tables and partitions can be written to only once, and not appended to. The nuances of what this means are as follows:

      • A non-partitioned table can be written to once; from then on, its data is never updated unless the table is dropped and recreated.
      • A partitioned table supports a limited form of "appending" by adding new partitions to the table, but once written, the partitions themselves cannot have any new data added to them.

      Hive, on the other hand, does allow "INSERT INTO" on a table, giving us append semantics. There is benefit to both of these models, so our goal is as follows:

      a) Introduce the notion of an immutable table, controlled by a table property; tables are not immutable by default. If this property is set for a table and we attempt to write to that table (or to a partition) when it already has data, disallow "INSERT INTO" from Hive. Setting this property lets Hive mimic HCatalog's current immutable-table behaviour. (I'm going to create a separate sub-task to cover this bit, and focus on the HCatalog side in this jira.)

      b) As long as that flag is not set, HCatalog should be changed to allow appends as well, rather than simply erroring out if data already exists in a table. A sketch of both behaviours follows below.
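
      To make (a) and (b) concrete, here is a minimal HiveQL sketch; the property name 'immutable' and the staging tables are assumptions for illustration, since the exact property name is to be settled in the sub-task:

          -- Tables are mutable by default, so repeated INSERT INTO appends data.
          CREATE TABLE pageviews (url STRING, hits INT);
          INSERT INTO TABLE pageviews SELECT url, hits FROM staging_day1;
          INSERT INTO TABLE pageviews SELECT url, hits FROM staging_day2;   -- appends

          -- Marking a table immutable mimics HCatalog's current behaviour: once the
          -- table (or a partition) holds data, a further INSERT INTO should be rejected,
          -- while INSERT OVERWRITE would still be allowed to replace the data.
          CREATE TABLE pageviews_immutable (url STRING, hits INT)
            TBLPROPERTIES ('immutable'='true');
          INSERT INTO TABLE pageviews_immutable SELECT url, hits FROM staging_day1;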

        Activity

        Sushanth Sowmyan added a comment -

        Given that we added HIVE-6465 for the ql grammar work, and there is still additional work on Hive and HCatalog required to make append on HCatalog work with dynamic partitioning, I'm going to treat this bug as an epic task and create a new jira for the specific implementation of the append feature on HCatalog.

        (Also, the QA failures here are because this patch would have worked only after HIVE-6406 was merged in, which it wasn't at the time the tests ran.)

        Hive QA added a comment -

        Overall: -1 no tests executed

        Here are the results of testing the latest attachment:
        https://issues.apache.org/jira/secure/attachment/12629811/HIVE-6405.patch

        Test results: http://bigtop01.cloudera.org:8080/job/PreCommit-HIVE-Build/1418/testReport
        Console output: http://bigtop01.cloudera.org:8080/job/PreCommit-HIVE-Build/1418/console

        Messages:

        **** This message was trimmed, see log for full details ****
        [INFO] ------------------------------------------------------------------------
        [INFO] 
        [INFO] --- maven-clean-plugin:2.5:clean (default-clean) @ hive-hbase-handler ---
        [INFO] Deleting /data/hive-ptest/working/apache-svn-trunk-source/hbase-handler (includes = [datanucleus.log, derby.log], excludes = [])
        [INFO] 
        [INFO] --- maven-resources-plugin:2.5:resources (default-resources) @ hive-hbase-handler ---
        [debug] execute contextualize
        [INFO] Using 'UTF-8' encoding to copy filtered resources.
        [INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-trunk-source/hbase-handler/src/main/resources
        [INFO] 
        [INFO] --- maven-antrun-plugin:1.7:run (define-classpath) @ hive-hbase-handler ---
        [INFO] Executing tasks
        
        main:
        [INFO] Executed tasks
        [INFO] 
        [INFO] --- maven-compiler-plugin:3.1:compile (default-compile) @ hive-hbase-handler ---
        [INFO] Compiling 18 source files to /data/hive-ptest/working/apache-svn-trunk-source/hbase-handler/target/classes
        [WARNING] Note: Some input files use or override a deprecated API.
        [WARNING] Note: Recompile with -Xlint:deprecation for details.
        [INFO] 
        [INFO] --- maven-resources-plugin:2.5:testResources (default-testResources) @ hive-hbase-handler ---
        [debug] execute contextualize
        [INFO] Using 'UTF-8' encoding to copy filtered resources.
        [INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-trunk-source/hbase-handler/src/test/resources
        [INFO] 
        [INFO] --- maven-antrun-plugin:1.7:run (setup-test-dirs) @ hive-hbase-handler ---
        [INFO] Executing tasks
        
        main:
            [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/hbase-handler/target/tmp
            [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/hbase-handler/target/warehouse
            [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/hbase-handler/target/tmp/conf
             [copy] Copying 5 files to /data/hive-ptest/working/apache-svn-trunk-source/hbase-handler/target/tmp/conf
        [INFO] Executed tasks
        [INFO] 
        [INFO] --- maven-compiler-plugin:3.1:testCompile (default-testCompile) @ hive-hbase-handler ---
        [INFO] Compiling 4 source files to /data/hive-ptest/working/apache-svn-trunk-source/hbase-handler/target/test-classes
        [WARNING] Note: Some input files use or override a deprecated API.
        [WARNING] Note: Recompile with -Xlint:deprecation for details.
        [INFO] 
        [INFO] --- maven-surefire-plugin:2.16:test (default-test) @ hive-hbase-handler ---
        [INFO] Tests are skipped.
        [INFO] 
        [INFO] --- maven-jar-plugin:2.2:jar (default-jar) @ hive-hbase-handler ---
        [INFO] Building jar: /data/hive-ptest/working/apache-svn-trunk-source/hbase-handler/target/hive-hbase-handler-0.13.0-SNAPSHOT.jar
        [INFO] 
        [INFO] --- maven-install-plugin:2.4:install (default-install) @ hive-hbase-handler ---
        [INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/hbase-handler/target/hive-hbase-handler-0.13.0-SNAPSHOT.jar to /data/hive-ptest/working/maven/org/apache/hive/hive-hbase-handler/0.13.0-SNAPSHOT/hive-hbase-handler-0.13.0-SNAPSHOT.jar
        [INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/hbase-handler/pom.xml to /data/hive-ptest/working/maven/org/apache/hive/hive-hbase-handler/0.13.0-SNAPSHOT/hive-hbase-handler-0.13.0-SNAPSHOT.pom
        [INFO]                                                                         
        [INFO] ------------------------------------------------------------------------
        [INFO] Building Hive HCatalog 0.13.0-SNAPSHOT
        [INFO] ------------------------------------------------------------------------
        [INFO] 
        [INFO] --- maven-clean-plugin:2.5:clean (default-clean) @ hive-hcatalog ---
        [INFO] Deleting /data/hive-ptest/working/apache-svn-trunk-source/hcatalog (includes = [datanucleus.log, derby.log], excludes = [])
        [INFO] 
        [INFO] --- maven-antrun-plugin:1.7:run (define-classpath) @ hive-hcatalog ---
        [INFO] Executing tasks
        
        main:
        [INFO] Executed tasks
        [INFO] 
        [INFO] --- maven-antrun-plugin:1.7:run (setup-test-dirs) @ hive-hcatalog ---
        [INFO] Executing tasks
        
        main:
            [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/target/tmp
            [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/target/warehouse
            [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/target/tmp/conf
             [copy] Copying 5 files to /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/target/tmp/conf
        [INFO] Executed tasks
        [INFO] 
        [INFO] --- maven-install-plugin:2.4:install (default-install) @ hive-hcatalog ---
        [INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/pom.xml to /data/hive-ptest/working/maven/org/apache/hive/hcatalog/hive-hcatalog/0.13.0-SNAPSHOT/hive-hcatalog-0.13.0-SNAPSHOT.pom
        [INFO]                                                                         
        [INFO] ------------------------------------------------------------------------
        [INFO] Building Hive HCatalog Core 0.13.0-SNAPSHOT
        [INFO] ------------------------------------------------------------------------
        [INFO] 
        [INFO] --- maven-clean-plugin:2.5:clean (default-clean) @ hive-hcatalog-core ---
        [INFO] Deleting /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/core (includes = [datanucleus.log, derby.log], excludes = [])
        [INFO] 
        [INFO] --- maven-resources-plugin:2.5:resources (default-resources) @ hive-hcatalog-core ---
        [debug] execute contextualize
        [INFO] Using 'UTF-8' encoding to copy filtered resources.
        [INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/core/src/main/resources
        [INFO] 
        [INFO] --- maven-antrun-plugin:1.7:run (define-classpath) @ hive-hcatalog-core ---
        [INFO] Executing tasks
        
        main:
        [INFO] Executed tasks
        [INFO] 
        [INFO] --- maven-compiler-plugin:3.1:compile (default-compile) @ hive-hcatalog-core ---
        [INFO] Compiling 147 source files to /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/core/target/classes
        [INFO] -------------------------------------------------------------
        [WARNING] COMPILATION WARNING : 
        [INFO] -------------------------------------------------------------
        [WARNING] Note: Some input files use or override a deprecated API.
        [WARNING] Note: Recompile with -Xlint:deprecation for details.
        [WARNING] Note: Some input files use unchecked or unsafe operations.
        [WARNING] Note: Recompile with -Xlint:unchecked for details.
        [INFO] 4 warnings 
        [INFO] -------------------------------------------------------------
        [INFO] -------------------------------------------------------------
        [ERROR] COMPILATION ERROR : 
        [INFO] -------------------------------------------------------------
        [ERROR] /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/core/src/main/java/org/apache/hive/hcatalog/mapreduce/FileOutputFormatContainer.java:[185,15] cannot find symbol
        symbol  : method isImmutable()
        location: class org.apache.hadoop.hive.ql.metadata.Table
        [ERROR] /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/core/src/main/java/org/apache/hive/hcatalog/mapreduce/FileOutputCommitterContainer.java:[573,72] cannot find symbol
        symbol  : method isDirEmpty(org.apache.hadoop.fs.FileSystem,org.apache.hadoop.fs.Path)
        location: class org.apache.hadoop.hive.metastore.MetaStoreUtils
        [ERROR] /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/core/src/main/java/org/apache/hive/hcatalog/mapreduce/FileOutputCommitterContainer.java:[737,58] cannot find symbol
        symbol  : method isImmutable()
        location: class org.apache.hadoop.hive.ql.metadata.Table
        [ERROR] /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/core/src/main/java/org/apache/hive/hcatalog/mapreduce/FileOutputCommitterContainer.java:[847,60] cannot find symbol
        symbol  : method isImmutable()
        location: class org.apache.hadoop.hive.ql.metadata.Table
        [ERROR] /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/core/src/main/java/org/apache/hive/hcatalog/mapreduce/FileOutputCommitterContainer.java:[848,56] cannot find symbol
        symbol  : method isImmutable()
        location: class org.apache.hadoop.hive.ql.metadata.Table
        [ERROR] /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/core/src/main/java/org/apache/hive/hcatalog/mapreduce/FileOutputCommitterContainer.java:[860,26] cannot find symbol
        symbol  : method isImmutable()
        location: class org.apache.hadoop.hive.ql.metadata.Table
        [INFO] 6 errors 
        [INFO] -------------------------------------------------------------
        [INFO] ------------------------------------------------------------------------
        [INFO] Reactor Summary:
        [INFO] 
        [INFO] Hive .............................................. SUCCESS [4.504s]
        [INFO] Hive Ant Utilities ................................ SUCCESS [8.190s]
        [INFO] Hive Shims Common ................................. SUCCESS [3.798s]
        [INFO] Hive Shims 0.20 ................................... SUCCESS [2.134s]
        [INFO] Hive Shims Secure Common .......................... SUCCESS [3.062s]
        [INFO] Hive Shims 0.20S .................................. SUCCESS [1.466s]
        [INFO] Hive Shims 0.23 ................................... SUCCESS [4.103s]
        [INFO] Hive Shims ........................................ SUCCESS [0.707s]
        [INFO] Hive Common ....................................... SUCCESS [12.528s]
        [INFO] Hive Serde ........................................ SUCCESS [8.717s]
        [INFO] Hive Metastore .................................... SUCCESS [28.691s]
        [INFO] Hive Query Language ............................... SUCCESS [1:08.445s]
        [INFO] Hive Service ...................................... SUCCESS [5.348s]
        [INFO] Hive JDBC ......................................... SUCCESS [1.966s]
        [INFO] Hive Beeline ...................................... SUCCESS [0.875s]
        [INFO] Hive CLI .......................................... SUCCESS [1.795s]
        [INFO] Hive Contrib ...................................... SUCCESS [1.654s]
        [INFO] Hive HBase Handler ................................ SUCCESS [2.747s]
        [INFO] Hive HCatalog ..................................... SUCCESS [0.346s]
        [INFO] Hive HCatalog Core ................................ FAILURE [1.736s]
        [INFO] Hive HCatalog Pig Adapter ......................... SKIPPED
        [INFO] Hive HCatalog Server Extensions ................... SKIPPED
        [INFO] Hive HCatalog Webhcat Java Client ................. SKIPPED
        [INFO] Hive HCatalog Webhcat ............................. SKIPPED
        [INFO] Hive HCatalog HBase Storage Handler ............... SKIPPED
        [INFO] Hive HWI .......................................... SKIPPED
        [INFO] Hive ODBC ......................................... SKIPPED
        [INFO] Hive Shims Aggregator ............................. SKIPPED
        [INFO] Hive TestUtils .................................... SKIPPED
        [INFO] Hive Packaging .................................... SKIPPED
        [INFO] ------------------------------------------------------------------------
        [INFO] BUILD FAILURE
        [INFO] ------------------------------------------------------------------------
        [INFO] Total time: 2:46.025s
        [INFO] Finished at: Wed Feb 19 16:43:08 EST 2014
        [INFO] Final Memory: 70M/431M
        [INFO] ------------------------------------------------------------------------
        [ERROR] Failed to execute goal org.apache.maven.plugins:maven-compiler-plugin:3.1:compile (default-compile) on project hive-hcatalog-core: Compilation failure: Compilation failure:
        [ERROR] /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/core/src/main/java/org/apache/hive/hcatalog/mapreduce/FileOutputFormatContainer.java:[185,15] cannot find symbol
        [ERROR] symbol  : method isImmutable()
        [ERROR] location: class org.apache.hadoop.hive.ql.metadata.Table
        [ERROR] /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/core/src/main/java/org/apache/hive/hcatalog/mapreduce/FileOutputCommitterContainer.java:[573,72] cannot find symbol
        [ERROR] symbol  : method isDirEmpty(org.apache.hadoop.fs.FileSystem,org.apache.hadoop.fs.Path)
        [ERROR] location: class org.apache.hadoop.hive.metastore.MetaStoreUtils
        [ERROR] /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/core/src/main/java/org/apache/hive/hcatalog/mapreduce/FileOutputCommitterContainer.java:[737,58] cannot find symbol
        [ERROR] symbol  : method isImmutable()
        [ERROR] location: class org.apache.hadoop.hive.ql.metadata.Table
        [ERROR] /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/core/src/main/java/org/apache/hive/hcatalog/mapreduce/FileOutputCommitterContainer.java:[847,60] cannot find symbol
        [ERROR] symbol  : method isImmutable()
        [ERROR] location: class org.apache.hadoop.hive.ql.metadata.Table
        [ERROR] /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/core/src/main/java/org/apache/hive/hcatalog/mapreduce/FileOutputCommitterContainer.java:[848,56] cannot find symbol
        [ERROR] symbol  : method isImmutable()
        [ERROR] location: class org.apache.hadoop.hive.ql.metadata.Table
        [ERROR] /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/core/src/main/java/org/apache/hive/hcatalog/mapreduce/FileOutputCommitterContainer.java:[860,26] cannot find symbol
        [ERROR] symbol  : method isImmutable()
        [ERROR] location: class org.apache.hadoop.hive.ql.metadata.Table
        [ERROR] -> [Help 1]
        [ERROR] 
        [ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
        [ERROR] Re-run Maven using the -X switch to enable full debug logging.
        [ERROR] 
        [ERROR] For more information about the errors and possible solutions, please read the following articles:
        [ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
        [ERROR] 
        [ERROR] After correcting the problems, you can resume the build with the command
        [ERROR]   mvn <goals> -rf :hive-hcatalog-core
        + exit 1
        '
        

        This message is automatically generated.

        ATTACHMENT ID: 12629811

        Sushanth Sowmyan added a comment -

        After a brief discussion with Ashutosh on the nature of HIVE-6406, I'm going to create another task to add ql grammar supporting modification of the immutability property, in a manner similar to the existing grammar for NO_DROP/OFFLINE, so that immutability can be treated as another kind of data protection and users will not have to modify TBLPROPERTIES explicitly.
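
        For reference, a minimal HiveQL sketch of how that grammar might read, modelled on the existing NO_DROP/OFFLINE protection syntax; the IMMUTABLE keyword and the 'immutable' property name are hypothetical until the new task defines them:

            -- Existing data-protection grammar:
            ALTER TABLE pageviews ENABLE NO_DROP;
            ALTER TABLE pageviews DISABLE OFFLINE;

            -- Hypothetical analogue for immutability:
            ALTER TABLE pageviews ENABLE IMMUTABLE;
            ALTER TABLE pageviews DISABLE IMMUTABLE;

            -- Without such grammar, users would have to set the property directly:
            ALTER TABLE pageviews SET TBLPROPERTIES ('immutable'='true');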

        Sushanth Sowmyan added a comment -

        Attaching patch; this depends on HIVE-6406 being patched in.


          People

          • Assignee: Sushanth Sowmyan
          • Reporter: Sushanth Sowmyan
          • Votes: 1
          • Watchers: 12
