Hive / HIVE-6329

Support column level encryption/decryption

    Details

    • Type: New Feature
    • Status: Patch Available
    • Priority: Minor
    • Resolution: Unresolved
    • Affects Version/s: None
    • Fix Version/s: None
    • Labels:
      None

      Description

      We have been receiving some requirements for encryption recently, but Hive does not support it. Until the full implementation arrives via HIVE-5207, this might be useful for some cases.

      hive> create table encode_test(id int, name STRING, phone STRING, address STRING) 
          > ROW FORMAT SERDE 'org.apache.hadoop.hive.serde2.lazy.LazySimpleSerDe' 
          > WITH SERDEPROPERTIES ('column.encode.indices'='2,3', 'column.encode.classname'='org.apache.hadoop.hive.serde2.Base64WriteOnly') STORED AS TEXTFILE;
      OK
      Time taken: 0.584 seconds
      hive> insert into table encode_test select 100,'navis','010-0000-0000','Seoul, Seocho' from src tablesample (1 rows);
      ......
      OK
      Time taken: 5.121 seconds
      hive> select * from encode_test;
      OK
      100	navis	  MDEwLTAwMDAtMDAwMA==	U2VvdWwsIFNlb2Nobw==
      Time taken: 0.078 seconds, Fetched: 1 row(s)
      hive> 
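      The SELECT output above shows the phone and address columns (indices 2 and 3, as listed in column.encode.indices) stored as plain Base64 of the original values. As a quick sanity check of that claim, here is a sketch using Python's standard base64 module (not part of the patch itself):

      ```python
      import base64

      # Values inserted in the example session above; columns 2 and 3
      # are the ones the SerDe encodes on write.
      originals = ["010-0000-0000", "Seoul, Seocho"]
      encoded = [base64.b64encode(v.encode("utf-8")).decode("ascii") for v in originals]
      print(encoded)
      # -> ['MDEwLTAwMDAtMDAwMA==', 'U2VvdWwsIFNlb2Nobw==']
      # which matches the encoded cells in the SELECT output.
      ```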
      
      1. HIVE-6329.1.patch.txt
        91 kB
        Navis
      2. HIVE-6329.2.patch.txt
        98 kB
        Navis
      3. HIVE-6329.3.patch.txt
        98 kB
        Navis
      4. HIVE-6329.4.patch.txt
        116 kB
        Navis
      5. HIVE-6329.5.patch.txt
        116 kB
        Navis
      6. HIVE-6329.6.patch.txt
        125 kB
        Navis
      7. HIVE-6329.7.patch.txt
        125 kB
        Navis
      8. HIVE-6329.8.patch.txt
        133 kB
        Navis

        Issue Links

          Activity

          Navis added a comment -

          Running preliminary test

          Hive QA added a comment -

          Overall: -1 no tests executed

          Here are the results of testing the latest attachment:
          https://issues.apache.org/jira/secure/attachment/12625792/HIVE-6329.1.patch.txt

          Test results: http://bigtop01.cloudera.org:8080/job/PreCommit-HIVE-Build/1095/testReport
          Console output: http://bigtop01.cloudera.org:8080/job/PreCommit-HIVE-Build/1095/console

          Messages:

          **** This message was trimmed, see log for full details ****
          [INFO] Compiling 4 source files to /data/hive-ptest/working/apache-svn-trunk-source/cli/target/classes
          [WARNING] /data/hive-ptest/working/apache-svn-trunk-source/cli/src/java/org/apache/hadoop/hive/cli/CliDriver.java:[73,16] warning: sun.misc.Signal is Sun proprietary API and may be removed in a future release
          [WARNING] /data/hive-ptest/working/apache-svn-trunk-source/cli/src/java/org/apache/hadoop/hive/cli/CliDriver.java:[74,16] warning: sun.misc.SignalHandler is Sun proprietary API and may be removed in a future release
          [WARNING] /data/hive-ptest/working/apache-svn-trunk-source/cli/src/java/org/apache/hadoop/hive/cli/CliDriver.java:[373,5] warning: sun.misc.SignalHandler is Sun proprietary API and may be removed in a future release
          [WARNING] /data/hive-ptest/working/apache-svn-trunk-source/cli/src/java/org/apache/hadoop/hive/cli/CliDriver.java:[374,5] warning: sun.misc.Signal is Sun proprietary API and may be removed in a future release
          [WARNING] /data/hive-ptest/working/apache-svn-trunk-source/cli/src/java/org/apache/hadoop/hive/cli/CliDriver.java:[379,27] warning: sun.misc.Signal is Sun proprietary API and may be removed in a future release
          [WARNING] /data/hive-ptest/working/apache-svn-trunk-source/cli/src/java/org/apache/hadoop/hive/cli/CliDriver.java:[380,52] warning: sun.misc.SignalHandler is Sun proprietary API and may be removed in a future release
          [WARNING] /data/hive-ptest/working/apache-svn-trunk-source/cli/src/java/org/apache/hadoop/hive/cli/CliDriver.java:[380,52] warning: sun.misc.SignalHandler is Sun proprietary API and may be removed in a future release
          [WARNING] /data/hive-ptest/working/apache-svn-trunk-source/cli/src/java/org/apache/hadoop/hive/cli/CliDriver.java:[385,28] warning: sun.misc.Signal is Sun proprietary API and may be removed in a future release
          [WARNING] /data/hive-ptest/working/apache-svn-trunk-source/cli/src/java/org/apache/hadoop/hive/cli/CliDriver.java:[380,19] warning: sun.misc.Signal is Sun proprietary API and may be removed in a future release
          [WARNING] /data/hive-ptest/working/apache-svn-trunk-source/cli/src/java/org/apache/hadoop/hive/cli/CliDriver.java:[441,9] warning: sun.misc.Signal is Sun proprietary API and may be removed in a future release
          [WARNING] Note: /data/hive-ptest/working/apache-svn-trunk-source/cli/src/java/org/apache/hadoop/hive/cli/RCFileCat.java uses or overrides a deprecated API.
          [WARNING] Note: Recompile with -Xlint:deprecation for details.
          [WARNING] Note: /data/hive-ptest/working/apache-svn-trunk-source/cli/src/java/org/apache/hadoop/hive/cli/CliDriver.java uses unchecked or unsafe operations.
          [WARNING] Note: Recompile with -Xlint:unchecked for details.
          [INFO] 
          [INFO] --- maven-resources-plugin:2.5:testResources (default-testResources) @ hive-cli ---
          [debug] execute contextualize
          [INFO] Using 'UTF-8' encoding to copy filtered resources.
          [INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-trunk-source/cli/src/test/resources
          [INFO] 
          [INFO] --- maven-antrun-plugin:1.7:run (setup-test-dirs) @ hive-cli ---
          [INFO] Executing tasks
          
          main:
              [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/cli/target/tmp
              [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/cli/target/warehouse
              [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/cli/target/tmp/conf
               [copy] Copying 5 files to /data/hive-ptest/working/apache-svn-trunk-source/cli/target/tmp/conf
          [INFO] Executed tasks
          [INFO] 
          [INFO] --- maven-compiler-plugin:3.1:testCompile (default-testCompile) @ hive-cli ---
          [INFO] Compiling 4 source files to /data/hive-ptest/working/apache-svn-trunk-source/cli/target/test-classes
          [WARNING] Note: /data/hive-ptest/working/apache-svn-trunk-source/cli/src/test/org/apache/hadoop/hive/cli/TestCliDriverMethods.java uses unchecked or unsafe operations.
          [WARNING] Note: Recompile with -Xlint:unchecked for details.
          [INFO] 
          [INFO] --- maven-surefire-plugin:2.16:test (default-test) @ hive-cli ---
          [INFO] Tests are skipped.
          [INFO] 
          [INFO] --- maven-jar-plugin:2.2:jar (default-jar) @ hive-cli ---
          [INFO] Building jar: /data/hive-ptest/working/apache-svn-trunk-source/cli/target/hive-cli-0.13.0-SNAPSHOT.jar
          [INFO] 
          [INFO] --- maven-install-plugin:2.4:install (default-install) @ hive-cli ---
          [INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/cli/target/hive-cli-0.13.0-SNAPSHOT.jar to /data/hive-ptest/working/maven/org/apache/hive/hive-cli/0.13.0-SNAPSHOT/hive-cli-0.13.0-SNAPSHOT.jar
          [INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/cli/pom.xml to /data/hive-ptest/working/maven/org/apache/hive/hive-cli/0.13.0-SNAPSHOT/hive-cli-0.13.0-SNAPSHOT.pom
          [INFO]                                                                         
          [INFO] ------------------------------------------------------------------------
          [INFO] Building Hive Contrib 0.13.0-SNAPSHOT
          [INFO] ------------------------------------------------------------------------
          [INFO] 
          [INFO] --- maven-clean-plugin:2.5:clean (default-clean) @ hive-contrib ---
          [INFO] Deleting /data/hive-ptest/working/apache-svn-trunk-source/contrib (includes = [datanucleus.log, derby.log], excludes = [])
          [INFO] 
          [INFO] --- maven-resources-plugin:2.5:resources (default-resources) @ hive-contrib ---
          [debug] execute contextualize
          [INFO] Using 'UTF-8' encoding to copy filtered resources.
          [INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-trunk-source/contrib/src/main/resources
          [INFO] 
          [INFO] --- maven-antrun-plugin:1.7:run (define-classpath) @ hive-contrib ---
          [INFO] Executing tasks
          
          main:
          [INFO] Executed tasks
          [INFO] 
          [INFO] --- maven-compiler-plugin:3.1:compile (default-compile) @ hive-contrib ---
          [INFO] Compiling 39 source files to /data/hive-ptest/working/apache-svn-trunk-source/contrib/target/classes
          [WARNING] Note: Some input files use or override a deprecated API.
          [WARNING] Note: Recompile with -Xlint:deprecation for details.
          [WARNING] Note: /data/hive-ptest/working/apache-svn-trunk-source/contrib/src/java/org/apache/hadoop/hive/contrib/udf/example/UDFExampleStructPrint.java uses unchecked or unsafe operations.
          [WARNING] Note: Recompile with -Xlint:unchecked for details.
          [INFO] 
          [INFO] --- maven-resources-plugin:2.5:testResources (default-testResources) @ hive-contrib ---
          [debug] execute contextualize
          [INFO] Using 'UTF-8' encoding to copy filtered resources.
          [INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-trunk-source/contrib/src/test/resources
          [INFO] 
          [INFO] --- maven-antrun-plugin:1.7:run (setup-test-dirs) @ hive-contrib ---
          [INFO] Executing tasks
          
          main:
              [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/contrib/target/tmp
              [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/contrib/target/warehouse
              [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/contrib/target/tmp/conf
               [copy] Copying 5 files to /data/hive-ptest/working/apache-svn-trunk-source/contrib/target/tmp/conf
          [INFO] Executed tasks
          [INFO] 
          [INFO] --- maven-compiler-plugin:3.1:testCompile (default-testCompile) @ hive-contrib ---
          [INFO] Compiling 2 source files to /data/hive-ptest/working/apache-svn-trunk-source/contrib/target/test-classes
          [WARNING] Note: /data/hive-ptest/working/apache-svn-trunk-source/contrib/src/test/org/apache/hadoop/hive/contrib/serde2/TestRegexSerDe.java uses or overrides a deprecated API.
          [WARNING] Note: Recompile with -Xlint:deprecation for details.
          [INFO] 
          [INFO] --- maven-surefire-plugin:2.16:test (default-test) @ hive-contrib ---
          [INFO] Tests are skipped.
          [INFO] 
          [INFO] --- maven-jar-plugin:2.2:jar (default-jar) @ hive-contrib ---
          [INFO] Building jar: /data/hive-ptest/working/apache-svn-trunk-source/contrib/target/hive-contrib-0.13.0-SNAPSHOT.jar
          [INFO] 
          [INFO] --- maven-install-plugin:2.4:install (default-install) @ hive-contrib ---
          [INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/contrib/target/hive-contrib-0.13.0-SNAPSHOT.jar to /data/hive-ptest/working/maven/org/apache/hive/hive-contrib/0.13.0-SNAPSHOT/hive-contrib-0.13.0-SNAPSHOT.jar
          [INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/contrib/pom.xml to /data/hive-ptest/working/maven/org/apache/hive/hive-contrib/0.13.0-SNAPSHOT/hive-contrib-0.13.0-SNAPSHOT.pom
          [INFO]                                                                         
          [INFO] ------------------------------------------------------------------------
          [INFO] Building Hive HBase Handler 0.13.0-SNAPSHOT
          [INFO] ------------------------------------------------------------------------
          [INFO] 
          [INFO] --- maven-clean-plugin:2.5:clean (default-clean) @ hive-hbase-handler ---
          [INFO] Deleting /data/hive-ptest/working/apache-svn-trunk-source/hbase-handler (includes = [datanucleus.log, derby.log], excludes = [])
          [INFO] 
          [INFO] --- maven-resources-plugin:2.5:resources (default-resources) @ hive-hbase-handler ---
          [debug] execute contextualize
          [INFO] Using 'UTF-8' encoding to copy filtered resources.
          [INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-trunk-source/hbase-handler/src/main/resources
          [INFO] 
          [INFO] --- maven-antrun-plugin:1.7:run (define-classpath) @ hive-hbase-handler ---
          [INFO] Executing tasks
          
          main:
          [INFO] Executed tasks
          [INFO] 
          [INFO] --- maven-compiler-plugin:3.1:compile (default-compile) @ hive-hbase-handler ---
          [INFO] Compiling 18 source files to /data/hive-ptest/working/apache-svn-trunk-source/hbase-handler/target/classes
          [INFO] -------------------------------------------------------------
          [WARNING] COMPILATION WARNING : 
          [INFO] -------------------------------------------------------------
          [WARNING] Note: Some input files use or override a deprecated API.
          [WARNING] Note: Recompile with -Xlint:deprecation for details.
          [INFO] 2 warnings 
          [INFO] -------------------------------------------------------------
          [INFO] -------------------------------------------------------------
          [ERROR] COMPILATION ERROR : 
          [INFO] -------------------------------------------------------------
          [ERROR] /data/hive-ptest/working/apache-svn-trunk-source/hbase-handler/src/java/org/apache/hadoop/hive/hbase/LazyHBaseRow.java:[184,24] init(byte[],int,int) in org.apache.hadoop.hive.serde2.lazy.LazyObjectBase cannot be applied to (org.apache.hadoop.hive.serde2.lazy.ByteArrayRef,int,int)
          [ERROR] /data/hive-ptest/working/apache-svn-trunk-source/hbase-handler/src/java/org/apache/hadoop/hive/hbase/LazyHBaseRow.java:[189,41] cannot find symbol
          symbol  : method init(org.apache.hadoop.hive.serde2.lazy.ByteArrayRef,int,int)
          location: class org.apache.hadoop.hive.serde2.lazy.LazyStruct
          [ERROR] /data/hive-ptest/working/apache-svn-trunk-source/hbase-handler/src/java/org/apache/hadoop/hive/hbase/HBaseCompositeKey.java:[95,15] init(byte[],int,int) in org.apache.hadoop.hive.serde2.lazy.LazyObjectBase cannot be applied to (org.apache.hadoop.hive.serde2.lazy.ByteArrayRef,int,int)
          [ERROR] /data/hive-ptest/working/apache-svn-trunk-source/hbase-handler/src/java/org/apache/hadoop/hive/hbase/LazyHBaseCellMap.java:[110,12] init(byte[],int,int) in org.apache.hadoop.hive.serde2.lazy.LazyObjectBase cannot be applied to (org.apache.hadoop.hive.serde2.lazy.ByteArrayRef,int,int)
          [ERROR] /data/hive-ptest/working/apache-svn-trunk-source/hbase-handler/src/java/org/apache/hadoop/hive/hbase/LazyHBaseCellMap.java:[119,14] init(byte[],int,int) in org.apache.hadoop.hive.serde2.lazy.LazyObjectBase cannot be applied to (org.apache.hadoop.hive.serde2.lazy.ByteArrayRef,int,int)
          [INFO] 5 errors 
          [INFO] -------------------------------------------------------------
          [INFO] ------------------------------------------------------------------------
          [INFO] Reactor Summary:
          [INFO] 
          [INFO] Hive .............................................. SUCCESS [6.296s]
          [INFO] Hive Ant Utilities ................................ SUCCESS [9.563s]
          [INFO] Hive Shims Common ................................. SUCCESS [3.091s]
          [INFO] Hive Shims 0.20 ................................... SUCCESS [2.251s]
          [INFO] Hive Shims Secure Common .......................... SUCCESS [3.526s]
          [INFO] Hive Shims 0.20S .................................. SUCCESS [1.868s]
          [INFO] Hive Shims 0.23 ................................... SUCCESS [4.149s]
          [INFO] Hive Shims ........................................ SUCCESS [0.764s]
          [INFO] Hive Common ....................................... SUCCESS [8.436s]
          [INFO] Hive Serde ........................................ SUCCESS [16.084s]
          [INFO] Hive Metastore .................................... SUCCESS [25.645s]
          [INFO] Hive Query Language ............................... SUCCESS [1:08.715s]
          [INFO] Hive Service ...................................... SUCCESS [5.131s]
          [INFO] Hive JDBC ......................................... SUCCESS [2.169s]
          [INFO] Hive Beeline ...................................... SUCCESS [0.885s]
          [INFO] Hive CLI .......................................... SUCCESS [0.735s]
          [INFO] Hive Contrib ...................................... SUCCESS [1.592s]
          [INFO] Hive HBase Handler ................................ FAILURE [1.772s]
          [INFO] Hive HCatalog ..................................... SKIPPED
          [INFO] Hive HCatalog Core ................................ SKIPPED
          [INFO] Hive HCatalog Pig Adapter ......................... SKIPPED
          [INFO] Hive HCatalog Server Extensions ................... SKIPPED
          [INFO] Hive HCatalog Webhcat Java Client ................. SKIPPED
          [INFO] Hive HCatalog Webhcat ............................. SKIPPED
          [INFO] Hive HCatalog HBase Storage Handler ............... SKIPPED
          [INFO] Hive HWI .......................................... SKIPPED
          [INFO] Hive ODBC ......................................... SKIPPED
          [INFO] Hive Shims Aggregator ............................. SKIPPED
          [INFO] Hive TestUtils .................................... SKIPPED
          [INFO] Hive Packaging .................................... SKIPPED
          [INFO] ------------------------------------------------------------------------
          [INFO] BUILD FAILURE
          [INFO] ------------------------------------------------------------------------
          [INFO] Total time: 2:46.056s
          [INFO] Finished at: Wed Jan 29 06:41:17 EST 2014
          [INFO] Final Memory: 53M/404M
          [INFO] ------------------------------------------------------------------------
          [ERROR] Failed to execute goal org.apache.maven.plugins:maven-compiler-plugin:3.1:compile (default-compile) on project hive-hbase-handler: Compilation failure: Compilation failure:
          [ERROR] /data/hive-ptest/working/apache-svn-trunk-source/hbase-handler/src/java/org/apache/hadoop/hive/hbase/LazyHBaseRow.java:[184,24] init(byte[],int,int) in org.apache.hadoop.hive.serde2.lazy.LazyObjectBase cannot be applied to (org.apache.hadoop.hive.serde2.lazy.ByteArrayRef,int,int)
          [ERROR] /data/hive-ptest/working/apache-svn-trunk-source/hbase-handler/src/java/org/apache/hadoop/hive/hbase/LazyHBaseRow.java:[189,41] cannot find symbol
          [ERROR] symbol  : method init(org.apache.hadoop.hive.serde2.lazy.ByteArrayRef,int,int)
          [ERROR] location: class org.apache.hadoop.hive.serde2.lazy.LazyStruct
          [ERROR] /data/hive-ptest/working/apache-svn-trunk-source/hbase-handler/src/java/org/apache/hadoop/hive/hbase/HBaseCompositeKey.java:[95,15] init(byte[],int,int) in org.apache.hadoop.hive.serde2.lazy.LazyObjectBase cannot be applied to (org.apache.hadoop.hive.serde2.lazy.ByteArrayRef,int,int)
          [ERROR] /data/hive-ptest/working/apache-svn-trunk-source/hbase-handler/src/java/org/apache/hadoop/hive/hbase/LazyHBaseCellMap.java:[110,12] init(byte[],int,int) in org.apache.hadoop.hive.serde2.lazy.LazyObjectBase cannot be applied to (org.apache.hadoop.hive.serde2.lazy.ByteArrayRef,int,int)
          [ERROR] /data/hive-ptest/working/apache-svn-trunk-source/hbase-handler/src/java/org/apache/hadoop/hive/hbase/LazyHBaseCellMap.java:[119,14] init(byte[],int,int) in org.apache.hadoop.hive.serde2.lazy.LazyObjectBase cannot be applied to (org.apache.hadoop.hive.serde2.lazy.ByteArrayRef,int,int)
          [ERROR] -> [Help 1]
          [ERROR] 
          [ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
          [ERROR] Re-run Maven using the -X switch to enable full debug logging.
          [ERROR] 
          [ERROR] For more information about the errors and possible solutions, please read the following articles:
          [ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
          [ERROR] 
          [ERROR] After correcting the problems, you can resume the build with the command
          [ERROR]   mvn <goals> -rf :hive-hbase-handler
          + exit 1
          

          This message is automatically generated.

          ATTACHMENT ID: 12625792

init(org.apache.hadoop.hive.serde2.lazy.ByteArrayRef,int,int) [ERROR] location: class org.apache.hadoop.hive.serde2.lazy.LazyStruct [ERROR] /data/hive-ptest/working/apache-svn-trunk-source/hbase-handler/src/java/org/apache/hadoop/hive/hbase/HBaseCompositeKey.java:[95,15] init(byte[],int,int) in org.apache.hadoop.hive.serde2.lazy.LazyObjectBase cannot be applied to (org.apache.hadoop.hive.serde2.lazy.ByteArrayRef,int,int) [ERROR] /data/hive-ptest/working/apache-svn-trunk-source/hbase-handler/src/java/org/apache/hadoop/hive/hbase/LazyHBaseCellMap.java:[110,12] init(byte[],int,int) in org.apache.hadoop.hive.serde2.lazy.LazyObjectBase cannot be applied to (org.apache.hadoop.hive.serde2.lazy.ByteArrayRef,int,int) [ERROR] /data/hive-ptest/working/apache-svn-trunk-source/hbase-handler/src/java/org/apache/hadoop/hive/hbase/LazyHBaseCellMap.java:[119,14] init(byte[],int,int) in org.apache.hadoop.hive.serde2.lazy.LazyObjectBase cannot be applied to (org.apache.hadoop.hive.serde2.lazy.ByteArrayRef,int,int) [ERROR] -> [Help 1] [ERROR] [ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch. [ERROR] Re-run Maven using the -X switch to enable full debug logging. [ERROR] [ERROR] For more information about the errors and possible solutions, please read the following articles: [ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException [ERROR] [ERROR] After correcting the problems, you can resume the build with the command [ERROR] mvn <goals> -rf :hive-hbase-handler + exit 1 ' This message is automatically generated. ATTACHMENT ID: 12625792
          Hive QA added a comment -

          Overall: -1 at least one tests failed

          Here are the results of testing the latest attachment:
          https://issues.apache.org/jira/secure/attachment/12626580/HIVE-6329.2.patch.txt

          ERROR: -1 due to 1 failed/errored test(s), 4998 tests executed
          Failed tests:

          org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver_column_encoding
          

          Test results: http://bigtop01.cloudera.org:8080/job/PreCommit-HIVE-Build/1155/testReport
          Console output: http://bigtop01.cloudera.org:8080/job/PreCommit-HIVE-Build/1155/console

          Messages:

          Executing org.apache.hive.ptest.execution.PrepPhase
          Executing org.apache.hive.ptest.execution.ExecutionPhase
          Executing org.apache.hive.ptest.execution.ReportingPhase
          Tests exited with: TestsFailedException: 1 tests failed
          

          This message is automatically generated.

          ATTACHMENT ID: 12626580

          Hive QA added a comment -

          Overall: -1 at least one tests failed

          Here are the results of testing the latest attachment:
          https://issues.apache.org/jira/secure/attachment/12626591/HIVE-6329.3.patch.txt

          ERROR: -1 due to 3 failed/errored test(s), 4998 tests executed
          Failed tests:

          org.apache.hadoop.hive.cli.TestMinimrCliDriver.testCliDriver_auto_sortmerge_join_16
          org.apache.hadoop.hive.cli.TestMinimrCliDriver.testCliDriver_infer_bucket_sort_bucketed_table
          org.apache.hadoop.hive.cli.TestMinimrCliDriver.testCliDriver_schemeAuthority2
          

          Test results: http://bigtop01.cloudera.org:8080/job/PreCommit-HIVE-Build/1163/testReport
          Console output: http://bigtop01.cloudera.org:8080/job/PreCommit-HIVE-Build/1163/console

          Messages:

          Executing org.apache.hive.ptest.execution.PrepPhase
          Executing org.apache.hive.ptest.execution.ExecutionPhase
          Executing org.apache.hive.ptest.execution.ReportingPhase
          Tests exited with: TestsFailedException: 3 tests failed
          

          This message is automatically generated.

          ATTACHMENT ID: 12626591

          Navis added a comment -

The failures seem unrelated to this patch.

          Hive QA added a comment -

          Overall: -1 at least one tests failed

          Here are the results of testing the latest attachment:
          https://issues.apache.org/jira/secure/attachment/12627084/HIVE-6329.4.patch.txt

          ERROR: -1 due to 2 failed/errored test(s), 5035 tests executed
          Failed tests:

          org.apache.hadoop.hive.cli.TestHBaseCliDriver.testCliDriver_hbase_single_sourced_multi_insert
          org.apache.hadoop.hive.cli.TestMinimrCliDriver.testCliDriver_auto_sortmerge_join_16
          

          Test results: http://bigtop01.cloudera.org:8080/job/PreCommit-HIVE-Build/1209/testReport
          Console output: http://bigtop01.cloudera.org:8080/job/PreCommit-HIVE-Build/1209/console

          Messages:

          Executing org.apache.hive.ptest.execution.PrepPhase
          Executing org.apache.hive.ptest.execution.ExecutionPhase
          Executing org.apache.hive.ptest.execution.ReportingPhase
          Tests exited with: TestsFailedException: 2 tests failed
          

          This message is automatically generated.

          ATTACHMENT ID: 12627084

          Navis added a comment -

Adds encoding/decoding support for HBaseSerDe.

          Hive QA added a comment -

          Overall: -1 at least one tests failed

          Here are the results of testing the latest attachment:
          https://issues.apache.org/jira/secure/attachment/12627524/HIVE-6329.5.patch.txt

          ERROR: -1 due to 1 failed/errored test(s), 5043 tests executed
          Failed tests:

          org.apache.hadoop.hive.cli.TestHBaseCliDriver.testCliDriver_hbase_single_sourced_multi_insert
          

          Test results: http://bigtop01.cloudera.org:8080/job/PreCommit-HIVE-Build/1232/testReport
          Console output: http://bigtop01.cloudera.org:8080/job/PreCommit-HIVE-Build/1232/console

          Messages:

          Executing org.apache.hive.ptest.execution.PrepPhase
          Executing org.apache.hive.ptest.execution.ExecutionPhase
          Executing org.apache.hive.ptest.execution.ReportingPhase
          Tests exited with: TestsFailedException: 1 tests failed
          

          This message is automatically generated.

          ATTACHMENT ID: 12627524

          Owen O'Malley added a comment -

          How are keys managed by your patch, Navis?

          Over in Hadoop, we've been working on creating a key management api. It was originally added in HADOOP-10141.

          The critical parts of key management for Hadoop are:

          • support for third-party key management solutions
          • support for both distributed and local key management
          • versioned keys, so that new versions of keys can be rolled out

          You can see the API here:

          http://svn.apache.org/repos/asf/hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/crypto/key/KeyProvider.java

          Owen O'Malley added a comment -

          There are effectively two parts of information that you need for encryption:

          • a secret key
          • an initialization vector (IV) to prevent dictionary attacks

          I'd propose that we standardize on something like:

          CREATE TABLE encode_test (
             id INT,
             name STRING,
             phone STRING,
             address STRING) 
          ROW FORMAT SERDE 'org.apache.hadoop.hive.serde2.lazy.LazyEncryptingSerDe' 
          WITH SERDEPROPERTIES
               ('column.encrypt.1.key'='pii-key', 'column.encrypt.1.iv'='12345', 'column.encrypt.2.key'='supersecret', 'column.encrypt.2.iv'='3141')
           STORED AS TEXTFILE;
          
          Larry McCay added a comment -

          Owen O'Malley I certainly agree that we need the IV but I'm not sure that I like it in the DDL.

          Owen O'Malley added a comment -

          Fair enough, Larry. I guess it would be better to pick an IV per row and encode it in the row itself. What about something like:

          <iv>^a<col 0>^a<base64 encoded encrypted col 1>^a<base64 encoded encrypted col 2>^a<col 3>\n
          

          The IV could just be picked sequentially, starting from a random number.

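          As a standalone sketch of that layout (not code from the patch: the AES/CBC cipher, a single shared key, and base64-encoding the IV in the first field are assumptions made here for illustration; the thread later settles on a separate key per column):

          ```java
          import javax.crypto.Cipher;
          import javax.crypto.spec.IvParameterSpec;
          import javax.crypto.spec.SecretKeySpec;
          import java.nio.charset.StandardCharsets;
          import java.security.SecureRandom;
          import java.util.Base64;

          public class RowIvSketch {
            static final char SEP = '\u0001';  // Hive's default field separator (^A)

            static byte[] crypt(int mode, byte[] key, byte[] iv, byte[] data) throws Exception {
              Cipher c = Cipher.getInstance("AES/CBC/PKCS5Padding");
              c.init(mode, new SecretKeySpec(key, "AES"), new IvParameterSpec(iv));
              return c.doFinal(data);
            }

            // Serialize: <base64 iv>^A<col0>^A<base64 encrypted col1>^A...
            static String writeRow(byte[] key, String[] cols, boolean[] encrypted) throws Exception {
              byte[] iv = new byte[16];
              new SecureRandom().nextBytes(iv);  // fresh IV for this row
              StringBuilder row = new StringBuilder(Base64.getEncoder().encodeToString(iv));
              for (int i = 0; i < cols.length; i++) {
                byte[] raw = cols[i].getBytes(StandardCharsets.UTF_8);
                row.append(SEP).append(encrypted[i]
                    ? Base64.getEncoder().encodeToString(crypt(Cipher.ENCRYPT_MODE, key, iv, raw))
                    : cols[i]);
              }
              return row.toString();
            }

            // Deserialize: read the IV from the first field, then decrypt marked columns.
            static String[] readRow(byte[] key, String row, boolean[] encrypted) throws Exception {
              String[] parts = row.split(String.valueOf(SEP));
              byte[] iv = Base64.getDecoder().decode(parts[0]);
              String[] cols = new String[parts.length - 1];
              for (int i = 0; i < cols.length; i++) {
                cols[i] = encrypted[i]
                    ? new String(crypt(Cipher.DECRYPT_MODE, key, iv,
                        Base64.getDecoder().decode(parts[i + 1])), StandardCharsets.UTF_8)
                    : parts[i + 1];
              }
              return cols;
            }
          }
          ```

          Base64-encoding the IV and the ciphertext keeps the row free of the ^A separator, since the base64 alphabet cannot collide with it.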
          Larry McCay added a comment -

          That works.
          We would also need a way to determine that a particular row has no IV.
          This could be done with some well-known constant, or with the size of the IV:
          a size of 0 would indicate that there is no encryption in the row.

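          That convention can be sketched as a simple length-prefixed framing (illustrative only; this byte layout is an assumption, not something from the patch):

          ```java
          import java.io.ByteArrayOutputStream;
          import java.io.DataOutputStream;
          import java.io.IOException;

          public class IvFraming {
            // Prefix the payload with the IV length; a length of 0 marks an unencrypted row.
            static byte[] frame(byte[] iv, byte[] payload) throws IOException {
              ByteArrayOutputStream bos = new ByteArrayOutputStream();
              DataOutputStream out = new DataOutputStream(bos);
              out.writeByte(iv == null ? 0 : iv.length);
              if (iv != null) out.write(iv);
              out.write(payload);
              return bos.toByteArray();
            }

            static byte[] readIv(byte[] framed) {
              int len = framed[0] & 0xff;
              if (len == 0) return null;  // size 0 -> row is not encrypted
              byte[] iv = new byte[len];
              System.arraycopy(framed, 1, iv, 0, len);
              return iv;
            }
          }
          ```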
          Remus Rusanu added a comment -

          I think a per-row IV still leaks information between columns. For instance, one could tell whether col1 and col2 have the same content within a row. In SQL Server, for example, the ENCRYPTBYKEY function generates a new IV and stores it with each individual encrypted value, i.e. per column per row.

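          The leak described here is easy to demonstrate: with a block cipher in CBC mode, reusing the same key and IV makes equal plaintexts produce equal ciphertexts (a standalone illustration, not code from the patch):

          ```java
          import javax.crypto.Cipher;
          import javax.crypto.spec.IvParameterSpec;
          import javax.crypto.spec.SecretKeySpec;

          public class IvReuseLeak {
            // AES/CBC is deterministic for a fixed (key, IV) pair: same input -> same output.
            static byte[] encrypt(byte[] key, byte[] iv, byte[] plain) throws Exception {
              Cipher c = Cipher.getInstance("AES/CBC/PKCS5Padding");
              c.init(Cipher.ENCRYPT_MODE, new SecretKeySpec(key, "AES"), new IvParameterSpec(iv));
              return c.doFinal(plain);
            }
          }
          ```

          Changing either the IV or the key for each value breaks the equality, which is the point of both the per-value IV and the per-column key in this thread.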
          Larry McCay added a comment -

          I think it is important to understand that there is a separate key per column; therefore you wouldn't get the same ciphertext for the same cleartext in different columns.
          Remus Rusanu added a comment -

          Larry McCay Sorry, I did not know that. I withdraw my objection.

          Navis added a comment -

          I think any methodology could be used here by implementing FieldRewriter (I'm not a fan of the name), given a proper init method (handing over the SerDe properties, etc.). For us, still at an experimental stage, we use the same IV for each table, and with an OTP fingerprint generated in a HiveServer hook, each task acquires the private key for decryption via SSL. I'm not a security guy, so I cannot confirm that this is the right direction.

          Navis added a comment -

          The interface is supposed to be:

          public interface FieldRewriter {
            void init(List<String> columnNames, List<TypeInfo> columnTypes, Properties properties) throws IOException;
            void encode(int index, ByteStream.Input input, ByteStream.Output output) throws IOException;
            void decode(int index, ByteStream.Input input, ByteStream.Output output) throws IOException;
          }
          
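          As an illustration of that contract, a base64 rewriter might look like the following. Note that ByteStream.Input/Output are Hive-internal classes, so this self-contained sketch substitutes plain java.io streams and a simplified init; it is not code from the patch:

          ```java
          import java.io.ByteArrayInputStream;
          import java.io.ByteArrayOutputStream;
          import java.io.IOException;
          import java.io.InputStream;
          import java.io.OutputStream;
          import java.util.Base64;
          import java.util.HashSet;
          import java.util.Properties;
          import java.util.Set;

          // Stand-in for the proposed FieldRewriter, with Hive's ByteStream types
          // replaced by java.io streams so the sketch compiles on its own.
          public class Base64FieldRewriter {
            private final Set<Integer> targets = new HashSet<>();

            // Analogue of init(): read which column indices to rewrite from the
            // 'column.encode.indices' SerDe property shown in the issue description.
            public void init(Properties properties) {
              for (String s : properties.getProperty("column.encode.indices", "").split(",")) {
                if (!s.isEmpty()) targets.add(Integer.parseInt(s.trim()));
              }
            }

            public void encode(int index, InputStream input, OutputStream output) throws IOException {
              byte[] raw = input.readAllBytes();
              output.write(targets.contains(index) ? Base64.getEncoder().encode(raw) : raw);
            }

            public void decode(int index, InputStream input, OutputStream output) throws IOException {
              byte[] raw = input.readAllBytes();
              output.write(targets.contains(index) ? Base64.getDecoder().decode(raw) : raw);
            }
          }
          ```

          With indices '2,3' configured, encoding the phone column reproduces the base64 value shown in the description, and columns outside the target set pass through untouched.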
          Brock Noland added a comment -

          Hi,

          It looks like this makes some changes to the init() method. I think this will impact existing Hive SerDes. Is it possible to make this change without changing the init() method?

          Navis added a comment -

          I mistakenly replied to the dev mailing list; writing it here again.

          Yes, the patch removed ByteArrayRef from LazyObjectBase, which seemed to be just useless overhead. Can we simply remove it? I think the impact is restricted to the inside of Hive.

          Navis added a comment -

          Fixed index error & rebased on trunk.

          Hive QA added a comment -

          Overall: -1 at least one tests failed

          Here are the results of testing the latest attachment:
          https://issues.apache.org/jira/secure/attachment/12630251/HIVE-6329.6.patch.txt

          ERROR: -1 due to 2 failed/errored test(s), 5178 tests executed
          Failed tests:

          org.apache.hadoop.hive.cli.TestHBaseCliDriver.testCliDriver_hbase_column_encoding
          org.apache.hadoop.hive.cli.TestMinimrCliDriver.testCliDriver_root_dir_external_table
          

          Test results: http://bigtop01.cloudera.org:8080/job/PreCommit-HIVE-Build/1459/testReport
          Console output: http://bigtop01.cloudera.org:8080/job/PreCommit-HIVE-Build/1459/console

          Messages:

          Executing org.apache.hive.ptest.execution.PrepPhase
          Executing org.apache.hive.ptest.execution.ExecutionPhase
          Executing org.apache.hive.ptest.execution.ReportingPhase
          Tests exited with: TestsFailedException: 2 tests failed
          

          This message is automatically generated.

          ATTACHMENT ID: 12630251

          Navis added a comment -

          Strange... I cannot reproduce the failure of hbase_column_encoding.q.

          Hive QA added a comment -

          Overall: -1 at least one tests failed

          Here are the results of testing the latest attachment:
          https://issues.apache.org/jira/secure/attachment/12630599/HIVE-6329.7.patch.txt

          ERROR: -1 due to 3 failed/errored test(s), 5181 tests executed
          Failed tests:

          org.apache.hadoop.hive.cli.TestMinimrCliDriver.testCliDriver_auto_sortmerge_join_16
          org.apache.hadoop.hive.cli.TestMinimrCliDriver.testCliDriver_bucketmapjoin6
          org.apache.hive.service.cli.TestEmbeddedThriftBinaryCLIService.testExecuteStatementAsync
          

          Test results: http://bigtop01.cloudera.org:8080/job/PreCommit-HIVE-Build/1486/testReport
          Console output: http://bigtop01.cloudera.org:8080/job/PreCommit-HIVE-Build/1486/console

          Messages:

          Executing org.apache.hive.ptest.execution.PrepPhase
          Executing org.apache.hive.ptest.execution.ExecutionPhase
          Executing org.apache.hive.ptest.execution.ReportingPhase
          Tests exited with: TestsFailedException: 3 tests failed
          

          This message is automatically generated.

          ATTACHMENT ID: 12630599

          Hive QA added a comment -

          Overall: +1 all checks pass

          Here are the results of testing the latest attachment:
          https://issues.apache.org/jira/secure/attachment/12637978/HIVE-6329.8.patch.txt

          SUCCESS: +1 5515 tests passed

          Test results: http://bigtop01.cloudera.org:8080/job/PreCommit-HIVE-Build/2064/testReport
          Console output: http://bigtop01.cloudera.org:8080/job/PreCommit-HIVE-Build/2064/console

          Messages:

          Executing org.apache.hive.ptest.execution.PrepPhase
          Executing org.apache.hive.ptest.execution.ExecutionPhase
          Executing org.apache.hive.ptest.execution.ReportingPhase
          

          This message is automatically generated.

          ATTACHMENT ID: 12637978


            People

            • Assignee: Navis
            • Reporter: Navis
            • Votes: 1
            • Watchers: 7