HIVE-6185

DDLTask is inconsistent in creating a table and adding a partition when dealing with location

    Details

    • Type: Bug
    • Status: Resolved
    • Priority: Major
    • Resolution: Fixed
    • Affects Version/s: 0.12.0
    • Fix Version/s: 0.13.0
    • Component/s: Query Processor
    • Labels:
      None

      Description

      When creating a table, Hive uses a URI to represent the location:

          if (crtTbl.getLocation() != null) {
            tbl.setDataLocation(new Path(crtTbl.getLocation()).toUri());
          }
      

      When adding a partition, Hive uses a Path to represent the location:

            // set partition path relative to table
            db.createPartition(tbl, addPartitionDesc.getPartSpec(), new Path(tbl
                          .getPath(), addPartitionDesc.getLocation()), addPartitionDesc.getPartParams(),
                          addPartitionDesc.getInputFormat(),
                          addPartitionDesc.getOutputFormat(),
                          addPartitionDesc.getNumBuckets(),
                          addPartitionDesc.getCols(),
                          addPartitionDesc.getSerializationLib(),
                          addPartitionDesc.getSerdeParams(),
                          addPartitionDesc.getBucketCols(),
                          addPartitionDesc.getSortCols());
      

      This disparity means the location values stored in the metastore are encoded differently, which causes problems with special characters, as demonstrated in HIVE-5446. It also means the code that handles a table's location differs from the code that handles a partition's location, creating a maintenance burden.

      We should standardize on Path, in line with the other Path-related cleanup efforts.
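
      The following is a minimal, hypothetical sketch (not part of any attached patch) of why the two representations can diverge: when a location contains special characters such as spaces, Hadoop's Path percent-encodes them in its URI form but keeps them raw in its string form, so the same location can end up recorded two different ways in the metastore. The class name and location below are made up for illustration.

          // Illustrative only; assumes hadoop-common on the classpath.
          import org.apache.hadoop.fs.Path;

          public class LocationEncodingDemo {
            public static void main(String[] args) {
              // Hypothetical table location containing a space.
              String location = "/user/hive/warehouse/sales/region=north america";
              Path p = new Path(location);

              // URI form (as used when creating a table): the space is
              // percent-encoded, e.g. ".../region=north%20america".
              System.out.println("as URI : " + p.toUri().toString());

              // Path form (as used when adding a partition): the space is
              // kept as-is, e.g. ".../region=north america".
              System.out.println("as Path: " + p.toString());
            }
          }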

      1. HIVE-6185.3.patch
        42 kB
        Xuefu Zhang
      2. HIVE-6185.2.patch
        42 kB
        Xuefu Zhang
      3. HIVE-6185.1.patch
        41 kB
        Xuefu Zhang
      4. HIVE-6185.patch
        40 kB
        Xuefu Zhang
      5. HIVE-6185.patch
        25 kB
        Xuefu Zhang


          Activity

          Hive QA added a comment -

          Overall: -1 no tests executed

          Here are the results of testing the latest attachment:
          https://issues.apache.org/jira/secure/attachment/12622492/HIVE-6185.patch

          Test results: http://bigtop01.cloudera.org:8080/job/PreCommit-HIVE-Build/863/testReport
          Console output: http://bigtop01.cloudera.org:8080/job/PreCommit-HIVE-Build/863/console

          Messages:

          **** This message was trimmed, see log for full details ****
          [INFO] Deleting /data/hive-ptest/working/apache-svn-trunk-source/itests (includes = [datanucleus.log, derby.log], excludes = [])
          [INFO] 
          [INFO] --- maven-antrun-plugin:1.7:run (define-classpath) @ hive-it ---
          [INFO] Executing tasks
          
          main:
          [INFO] Executed tasks
          [INFO] 
          [INFO] --- maven-antrun-plugin:1.7:run (setup-test-dirs) @ hive-it ---
          [INFO] Executing tasks
          
          main:
              [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/itests/target/tmp
              [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/itests/target/warehouse
              [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/itests/target/tmp/conf
               [copy] Copying 5 files to /data/hive-ptest/working/apache-svn-trunk-source/itests/target/tmp/conf
          [INFO] Executed tasks
          [INFO] 
          [INFO] --- maven-install-plugin:2.4:install (default-install) @ hive-it ---
          [INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/itests/pom.xml to /data/hive-ptest/working/maven/org/apache/hive/hive-it/0.13.0-SNAPSHOT/hive-it-0.13.0-SNAPSHOT.pom
          [INFO]                                                                         
          [INFO] ------------------------------------------------------------------------
          [INFO] Building Hive Integration - Custom Serde 0.13.0-SNAPSHOT
          [INFO] ------------------------------------------------------------------------
          [INFO] 
          [INFO] --- maven-clean-plugin:2.5:clean (default-clean) @ hive-it-custom-serde ---
          [INFO] Deleting /data/hive-ptest/working/apache-svn-trunk-source/itests/custom-serde (includes = [datanucleus.log, derby.log], excludes = [])
          [INFO] 
          [INFO] --- maven-resources-plugin:2.5:resources (default-resources) @ hive-it-custom-serde ---
          [debug] execute contextualize
          [INFO] Using 'UTF-8' encoding to copy filtered resources.
          [INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-trunk-source/itests/custom-serde/src/main/resources
          [INFO] 
          [INFO] --- maven-antrun-plugin:1.7:run (define-classpath) @ hive-it-custom-serde ---
          [INFO] Executing tasks
          
          main:
          [INFO] Executed tasks
          [INFO] 
          [INFO] --- maven-compiler-plugin:3.1:compile (default-compile) @ hive-it-custom-serde ---
          [INFO] Compiling 8 source files to /data/hive-ptest/working/apache-svn-trunk-source/itests/custom-serde/target/classes
          [INFO] 
          [INFO] --- maven-resources-plugin:2.5:testResources (default-testResources) @ hive-it-custom-serde ---
          [debug] execute contextualize
          [INFO] Using 'UTF-8' encoding to copy filtered resources.
          [INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-trunk-source/itests/custom-serde/src/test/resources
          [INFO] 
          [INFO] --- maven-antrun-plugin:1.7:run (setup-test-dirs) @ hive-it-custom-serde ---
          [INFO] Executing tasks
          
          main:
              [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/itests/custom-serde/target/tmp
              [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/itests/custom-serde/target/warehouse
              [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/itests/custom-serde/target/tmp/conf
               [copy] Copying 5 files to /data/hive-ptest/working/apache-svn-trunk-source/itests/custom-serde/target/tmp/conf
          [INFO] Executed tasks
          [INFO] 
          [INFO] --- maven-compiler-plugin:3.1:testCompile (default-testCompile) @ hive-it-custom-serde ---
          [INFO] No sources to compile
          [INFO] 
          [INFO] --- maven-surefire-plugin:2.16:test (default-test) @ hive-it-custom-serde ---
          [INFO] Tests are skipped.
          [INFO] 
          [INFO] --- maven-jar-plugin:2.2:jar (default-jar) @ hive-it-custom-serde ---
          [INFO] Building jar: /data/hive-ptest/working/apache-svn-trunk-source/itests/custom-serde/target/hive-it-custom-serde-0.13.0-SNAPSHOT.jar
          [INFO] 
          [INFO] --- maven-install-plugin:2.4:install (default-install) @ hive-it-custom-serde ---
          [INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/itests/custom-serde/target/hive-it-custom-serde-0.13.0-SNAPSHOT.jar to /data/hive-ptest/working/maven/org/apache/hive/hive-it-custom-serde/0.13.0-SNAPSHOT/hive-it-custom-serde-0.13.0-SNAPSHOT.jar
          [INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/itests/custom-serde/pom.xml to /data/hive-ptest/working/maven/org/apache/hive/hive-it-custom-serde/0.13.0-SNAPSHOT/hive-it-custom-serde-0.13.0-SNAPSHOT.pom
          [INFO]                                                                         
          [INFO] ------------------------------------------------------------------------
          [INFO] Building Hive Integration - HCatalog Unit Tests 0.13.0-SNAPSHOT
          [INFO] ------------------------------------------------------------------------
          [INFO] 
          [INFO] --- maven-clean-plugin:2.5:clean (default-clean) @ hive-hcatalog-it-unit ---
          [INFO] Deleting /data/hive-ptest/working/apache-svn-trunk-source/itests/hcatalog-unit (includes = [datanucleus.log, derby.log], excludes = [])
          [INFO] 
          [INFO] --- maven-resources-plugin:2.5:resources (default-resources) @ hive-hcatalog-it-unit ---
          [debug] execute contextualize
          [INFO] Using 'UTF-8' encoding to copy filtered resources.
          [INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-trunk-source/itests/hcatalog-unit/src/main/resources
          [INFO] 
          [INFO] --- maven-antrun-plugin:1.7:run (define-classpath) @ hive-hcatalog-it-unit ---
          [INFO] Executing tasks
          
          main:
          [INFO] Executed tasks
          [INFO] 
          [INFO] --- maven-compiler-plugin:3.1:compile (default-compile) @ hive-hcatalog-it-unit ---
          [INFO] No sources to compile
          [INFO] 
          [INFO] --- maven-resources-plugin:2.5:testResources (default-testResources) @ hive-hcatalog-it-unit ---
          [debug] execute contextualize
          [INFO] Using 'UTF-8' encoding to copy filtered resources.
          [INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-trunk-source/itests/hcatalog-unit/src/test/resources
          [INFO] 
          [INFO] --- maven-antrun-plugin:1.7:run (setup-test-dirs) @ hive-hcatalog-it-unit ---
          [INFO] Executing tasks
          
          main:
              [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/itests/hcatalog-unit/target/tmp
              [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/itests/hcatalog-unit/target/warehouse
              [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/itests/hcatalog-unit/target/tmp/conf
               [copy] Copying 5 files to /data/hive-ptest/working/apache-svn-trunk-source/itests/hcatalog-unit/target/tmp/conf
          [INFO] Executed tasks
          [INFO] 
          [INFO] --- maven-compiler-plugin:3.1:testCompile (default-testCompile) @ hive-hcatalog-it-unit ---
          [INFO] Compiling 7 source files to /data/hive-ptest/working/apache-svn-trunk-source/itests/hcatalog-unit/target/test-classes
          [WARNING] Note: Some input files use or override a deprecated API.
          [WARNING] Note: Recompile with -Xlint:deprecation for details.
          [INFO] 
          [INFO] --- maven-surefire-plugin:2.16:test (default-test) @ hive-hcatalog-it-unit ---
          [INFO] Tests are skipped.
          [INFO] 
          [INFO] --- maven-jar-plugin:2.2:jar (default-jar) @ hive-hcatalog-it-unit ---
          [WARNING] JAR will be empty - no content was marked for inclusion!
          [INFO] Building jar: /data/hive-ptest/working/apache-svn-trunk-source/itests/hcatalog-unit/target/hive-hcatalog-it-unit-0.13.0-SNAPSHOT.jar
          [INFO] 
          [INFO] --- maven-jar-plugin:2.2:test-jar (default) @ hive-hcatalog-it-unit ---
          [INFO] Building jar: /data/hive-ptest/working/apache-svn-trunk-source/itests/hcatalog-unit/target/hive-hcatalog-it-unit-0.13.0-SNAPSHOT-tests.jar
          [INFO] 
          [INFO] --- maven-install-plugin:2.4:install (default-install) @ hive-hcatalog-it-unit ---
          [INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/itests/hcatalog-unit/target/hive-hcatalog-it-unit-0.13.0-SNAPSHOT.jar to /data/hive-ptest/working/maven/org/apache/hive/hive-hcatalog-it-unit/0.13.0-SNAPSHOT/hive-hcatalog-it-unit-0.13.0-SNAPSHOT.jar
          [INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/itests/hcatalog-unit/pom.xml to /data/hive-ptest/working/maven/org/apache/hive/hive-hcatalog-it-unit/0.13.0-SNAPSHOT/hive-hcatalog-it-unit-0.13.0-SNAPSHOT.pom
          [INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/itests/hcatalog-unit/target/hive-hcatalog-it-unit-0.13.0-SNAPSHOT-tests.jar to /data/hive-ptest/working/maven/org/apache/hive/hive-hcatalog-it-unit/0.13.0-SNAPSHOT/hive-hcatalog-it-unit-0.13.0-SNAPSHOT-tests.jar
          [INFO]                                                                         
          [INFO] ------------------------------------------------------------------------
          [INFO] Building Hive Integration - Testing Utilities 0.13.0-SNAPSHOT
          [INFO] ------------------------------------------------------------------------
          [INFO] 
          [INFO] --- maven-clean-plugin:2.5:clean (default-clean) @ hive-it-util ---
          [INFO] Deleting /data/hive-ptest/working/apache-svn-trunk-source/itests/util (includes = [datanucleus.log, derby.log], excludes = [])
          [INFO] 
          [INFO] --- maven-resources-plugin:2.5:resources (default-resources) @ hive-it-util ---
          [debug] execute contextualize
          [INFO] Using 'UTF-8' encoding to copy filtered resources.
          [INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-trunk-source/itests/util/src/main/resources
          [INFO] 
          [INFO] --- maven-antrun-plugin:1.7:run (define-classpath) @ hive-it-util ---
          [INFO] Executing tasks
          
          main:
          [INFO] Executed tasks
          [INFO] 
          [INFO] --- maven-compiler-plugin:3.1:compile (default-compile) @ hive-it-util ---
          [INFO] Compiling 41 source files to /data/hive-ptest/working/apache-svn-trunk-source/itests/util/target/classes
          [INFO] -------------------------------------------------------------
          [WARNING] COMPILATION WARNING : 
          [INFO] -------------------------------------------------------------
          [WARNING] Note: Some input files use or override a deprecated API.
          [WARNING] Note: Recompile with -Xlint:deprecation for details.
          [INFO] 2 warnings 
          [INFO] -------------------------------------------------------------
          [INFO] -------------------------------------------------------------
          [ERROR] COMPILATION ERROR : 
          [INFO] -------------------------------------------------------------
          [ERROR] /data/hive-ptest/working/apache-svn-trunk-source/itests/util/src/main/java/org/apache/hadoop/hive/ql/hooks/VerifyPartitionIsNotSubdirectoryOfTableHook.java:[47,18] cannot find symbol
          symbol  : method getPartitionPath()
          location: class org.apache.hadoop.hive.ql.metadata.Partition
          [ERROR] /data/hive-ptest/working/apache-svn-trunk-source/itests/util/src/main/java/org/apache/hadoop/hive/ql/hooks/VerifyPartitionIsSubdirectoryOfTableHook.java:[46,18] cannot find symbol
          symbol  : method getPartitionPath()
          location: class org.apache.hadoop.hive.ql.metadata.Partition
          [INFO] 2 errors 
          [INFO] -------------------------------------------------------------
          [INFO] ------------------------------------------------------------------------
          [INFO] Reactor Summary:
          [INFO] 
          [INFO] Hive Integration - Parent ......................... SUCCESS [4.620s]
          [INFO] Hive Integration - Custom Serde ................... SUCCESS [11.056s]
          [INFO] Hive Integration - HCatalog Unit Tests ............ SUCCESS [5.939s]
          [INFO] Hive Integration - Testing Utilities .............. FAILURE [3.795s]
          [INFO] Hive Integration - Unit Tests ..................... SKIPPED
          [INFO] Hive Integration - Test Serde ..................... SKIPPED
          [INFO] Hive Integration - QFile Tests .................... SKIPPED
          [INFO] ------------------------------------------------------------------------
          [INFO] BUILD FAILURE
          [INFO] ------------------------------------------------------------------------
          [INFO] Total time: 27.238s
          [INFO] Finished at: Sat Jan 11 09:30:55 EST 2014
          [INFO] Final Memory: 28M/85M
          [INFO] ------------------------------------------------------------------------
          [ERROR] Failed to execute goal org.apache.maven.plugins:maven-compiler-plugin:3.1:compile (default-compile) on project hive-it-util: Compilation failure: Compilation failure:
          [ERROR] /data/hive-ptest/working/apache-svn-trunk-source/itests/util/src/main/java/org/apache/hadoop/hive/ql/hooks/VerifyPartitionIsNotSubdirectoryOfTableHook.java:[47,18] cannot find symbol
          [ERROR] symbol  : method getPartitionPath()
          [ERROR] location: class org.apache.hadoop.hive.ql.metadata.Partition
          [ERROR] /data/hive-ptest/working/apache-svn-trunk-source/itests/util/src/main/java/org/apache/hadoop/hive/ql/hooks/VerifyPartitionIsSubdirectoryOfTableHook.java:[46,18] cannot find symbol
          [ERROR] symbol  : method getPartitionPath()
          [ERROR] location: class org.apache.hadoop.hive.ql.metadata.Partition
          [ERROR] -> [Help 1]
          [ERROR] 
          [ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
          [ERROR] Re-run Maven using the -X switch to enable full debug logging.
          [ERROR] 
          [ERROR] For more information about the errors and possible solutions, please read the following articles:
          [ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
          [ERROR] 
          [ERROR] After correcting the problems, you can resume the build with the command
          [ERROR]   mvn <goals> -rf :hive-it-util
          + exit 1
          '
          

          This message is automatically generated.

          ATTACHMENT ID: 12622492

          Hive QA added a comment -

          Overall: -1 at least one tests failed

          Here are the results of testing the latest attachment:
          https://issues.apache.org/jira/secure/attachment/12622513/HIVE-6185.1.patch

          ERROR: -1 due to 1 failed/errored test(s), 4917 tests executed
          Failed tests:

          org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver_exim_14_managed_location_over_existing
          

          Test results: http://bigtop01.cloudera.org:8080/job/PreCommit-HIVE-Build/866/testReport
          Console output: http://bigtop01.cloudera.org:8080/job/PreCommit-HIVE-Build/866/console

          Messages:

          Executing org.apache.hive.ptest.execution.PrepPhase
          Executing org.apache.hive.ptest.execution.ExecutionPhase
          Executing org.apache.hive.ptest.execution.ReportingPhase
          Tests exited with: TestsFailedException: 1 tests failed
          

          This message is automatically generated.

          ATTACHMENT ID: 12622513

          Xuefu Zhang added a comment -

          Patch #2 fixed the above test failure.

          Xuefu Zhang added a comment -

          RB: https://reviews.apache.org/r/16806/

          Hive QA added a comment -

          Overall: +1 all checks pass

          Here are the results of testing the latest attachment:
          https://issues.apache.org/jira/secure/attachment/12622525/HIVE-6185.2.patch

          SUCCESS: +1 4917 tests passed

          Test results: http://bigtop01.cloudera.org:8080/job/PreCommit-HIVE-Build/871/testReport
          Console output: http://bigtop01.cloudera.org:8080/job/PreCommit-HIVE-Build/871/console

          Messages:

          Executing org.apache.hive.ptest.execution.PrepPhase
          Executing org.apache.hive.ptest.execution.ExecutionPhase
          Executing org.apache.hive.ptest.execution.ReportingPhase
          

          This message is automatically generated.

          ATTACHMENT ID: 12622525

          Mohammad Kamrul Islam added a comment -

          Patch looks good!
          A few comments:
          1. In Partition::setBucketCount(),
          FileSystem fs = FileSystem.get(getDataLocation().toUri(), Hive.get().getConf())
          can be rewritten (to be consistent with other places) as:
          FileSystem fs = getDataLocation().getFileSystem(Hive.get().getConf());

          2. Same thing in SamplePruner::limitPrune():
          FileSystem fs = FileSystem.get(part.getDataLocation().toUri(), Hive.get().getConf());
          can be rewritten as
          FileSystem fs = part.getDataLocation().getFileSystem(Hive.get().getConf());

          3. In Partition.java

          A new method "public Path getDataLocation()" is introduced. Is it replacing "public Path getPartitionPath()" or "final public URI getDataLocation()"? If it is the latter, do we need to keep the "final" modifier?
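
          Below is a minimal editorial sketch (not from the patch or the comment above) of the equivalence suggested in items 1 and 2: Path.getFileSystem(conf) resolves the same FileSystem as FileSystem.get(path.toUri(), conf). The class name, path, and configuration are placeholders.

              // Illustrative only; uses a local file:// path so it runs with hadoop-common alone.
              import org.apache.hadoop.conf.Configuration;
              import org.apache.hadoop.fs.FileSystem;
              import org.apache.hadoop.fs.Path;

              public class FileSystemLookupDemo {
                public static void main(String[] args) throws Exception {
                  Configuration conf = new Configuration();                    // stand-in for Hive.get().getConf()
                  Path dataLocation = new Path("file:///tmp/warehouse/t/p=1"); // stand-in for getDataLocation()

                  // Form currently in the patch: resolve the FileSystem from the URI.
                  FileSystem fs1 = FileSystem.get(dataLocation.toUri(), conf);

                  // Suggested form: let the Path resolve its own FileSystem.
                  FileSystem fs2 = dataLocation.getFileSystem(conf);

                  // Both yield a FileSystem for the same scheme/authority.
                  System.out.println(fs1.getUri() + " == " + fs2.getUri());
                }
              }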

          Ashutosh Chauhan added a comment -

          +1, left a minor comment on RB.

          Xuefu Zhang added a comment -

          Patch #3 incorporated the review feedback.

          Hive QA added a comment -

          Overall: +1 all checks pass

          Here are the results of testing the latest attachment:
          https://issues.apache.org/jira/secure/attachment/12622690/HIVE-6185.3.patch

          SUCCESS: +1 4924 tests passed

          Test results: http://bigtop01.cloudera.org:8080/job/PreCommit-HIVE-Build/889/testReport
          Console output: http://bigtop01.cloudera.org:8080/job/PreCommit-HIVE-Build/889/console

          Messages:

          Executing org.apache.hive.ptest.execution.PrepPhase
          Executing org.apache.hive.ptest.execution.ExecutionPhase
          Executing org.apache.hive.ptest.execution.ReportingPhase
          

          This message is automatically generated.

          ATTACHMENT ID: 12622690

          Ashutosh Chauhan added a comment -

          Committed to trunk. Thanks, Xuefu!


            People

            • Assignee:
              Xuefu Zhang
              Reporter:
              Xuefu Zhang
            • Votes:
              0
              Watchers:
              2
