Attachments

    1. HIVE-3746.7.patch.txt (430 kB, Navis)
    2. HIVE-3746.6.patch.txt (430 kB, Navis)
    3. HIVE-3746.5.patch.txt (430 kB, Navis)
    4. HIVE-3746.4.patch.txt (506 kB, Navis)
    5. HIVE-3746.3.patch.txt (408 kB, Navis)
    6. HIVE-3746.2.patch.txt (408 kB, Navis)
    7. HIVE-3746.1.patch.txt (407 kB, Navis)

        Activity

        zhuyu added a comment -

        How do I solve this?
        main:
        [INFO] Executed tasks
        [INFO]
        [INFO] --- maven-compiler-plugin:3.1:compile (default-compile) @ hive-exec ---
        [INFO] Compiling 7 source files to /mnt/public/workspace/linux/hive-0.13/ql/target/classes
        [INFO]
        [INFO] ------------------------------------------------------------------------
        [INFO] Building Hive Service 0.13.0
        [INFO] ------------------------------------------------------------------------
        Downloading: http://www.datanucleus.org/downloads/maven2/org/apache/hive/hive-exec/0.13.0/hive-exec-0.13.0-tests.jar
        Downloading: http://repo.maven.apache.org/maven2/org/apache/hive/hive-exec/0.13.0/hive-exec-0.13.0-tests.jar
        [INFO] ------------------------------------------------------------------------
        [INFO] Reactor Summary:
        [INFO]
        [INFO] Hive .............................................. SUCCESS [2.648s]
        [INFO] Hive Ant Utilities ................................ SUCCESS [1.184s]
        [INFO] Hive Shims Common ................................. SUCCESS [1.419s]
        [INFO] Hive Shims 0.20 ................................... SUCCESS [0.739s]
        [INFO] Hive Shims Secure Common .......................... SUCCESS [1.021s]
        [INFO] Hive Shims 0.20S .................................. SUCCESS [0.796s]
        [INFO] Hive Shims 0.23 ................................... SUCCESS [2.539s]
        [INFO] Hive Shims ........................................ SUCCESS [0.415s]
        [INFO] Hive Common ....................................... SUCCESS [4.014s]
        [INFO] Hive Serde ........................................ SUCCESS [1.722s]
        [INFO] Hive Metastore .................................... SUCCESS [2.513s]
        [INFO] Hive Query Language ............................... SUCCESS [7.495s]
        [INFO] Hive Service ...................................... FAILURE [2.921s]
        [INFO] Hive JDBC ......................................... SKIPPED
        [INFO] Hive Beeline ...................................... SKIPPED
        [INFO] Hive CLI .......................................... SKIPPED
        [INFO] Hive Contrib ...................................... SKIPPED
        [INFO] Hive HBase Handler ................................ SKIPPED
        [INFO] Hive HCatalog ..................................... SKIPPED
        [INFO] Hive HCatalog Core ................................ SKIPPED
        [INFO] Hive HCatalog Pig Adapter ......................... SKIPPED
        [INFO] Hive HCatalog Server Extensions ................... SKIPPED
        [INFO] Hive HCatalog Webhcat Java Client ................. SKIPPED
        [INFO] Hive HCatalog Webhcat ............................. SKIPPED
        [INFO] Hive HCatalog HBase Storage Handler ............... SKIPPED
        [INFO] Hive HWI .......................................... SKIPPED
        [INFO] Hive ODBC ......................................... SKIPPED
        [INFO] Hive Shims Aggregator ............................. SKIPPED
        [INFO] Hive TestUtils .................................... SKIPPED
        [INFO] Hive Packaging .................................... SKIPPED
        [INFO] ------------------------------------------------------------------------
        [INFO] BUILD FAILURE
        [INFO] ------------------------------------------------------------------------
        [INFO] Total time: 31.330s
        [INFO] Finished at: Mon Mar 17 00:05:57 HKT 2014
        [INFO] Final Memory: 41M/236M
        [INFO] ------------------------------------------------------------------------
        [ERROR] Failed to execute goal on project hive-service: Could not resolve dependencies for project org.apache.hive:hive-service:jar:0.13.0: Could not find artifact org.apache.hive:hive-exec:jar:tests:0.13.0 in datanucleus (http://www.datanucleus.org/downloads/maven2) -> [Help 1]
        [ERROR]
        [ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
        [ERROR] Re-run Maven using the -X switch to enable full debug logging.
        [ERROR]
        [ERROR] For more information about the errors and possible solutions, please read the following articles:
        [ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/DependencyResolutionException
        [ERROR]
        [ERROR] After correcting the problems, you can resume the build with the command
        [ERROR] mvn <goals> -rf :hive-service
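
        One likely cause, assuming a standard source checkout: the hive-exec tests jar is never published to the remote repositories the build is checking, so it has to come from the local repository. Installing the whole tree locally first, for example with

            mvn clean install -DskipTests

        should make hive-exec-0.13.0-tests.jar resolvable when the hive-service module builds, though the exact root cause can vary.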

        Brody Messmer added a comment -

        I'm currently out of the office but will be checking email. I'll be back in the office on Monday, January 13th. For any urgent issues, please contact Jim Silhavy.

        Thanks,
        Brody

        Navis added a comment - edited

        [~vaibhavgumashta] This is a rather new patch and hasn't been applied even to our product version, so I don't have any performance numbers. Sorry. Jay Bennett: Thanks for the reassuring comment.

        Jay Bennett added a comment -

        Brock Noland: The follow-on JIRA has been created (HIVE-6160).
        [~vaibhavgumashta] We have run some preliminary performance numbers; it looks to be at least a 2.5x improvement in fetch time. Stay tuned for more detailed results.

        Vaibhav Gumashta added a comment -

        Navis: Thanks so much for the patch! I was also curious whether any performance instrumentation was done on your side.

        Carl Steinbach added a comment -

        Committed to trunk. Thanks Navis!

        Carl Steinbach added a comment -

        [~vaibhavgumashta] Cool!
        Navis Thanks again for putting this patch together! I'm in the process of committing it now.

        Vaibhav Gumashta added a comment -

        Carl Steinbach I can take a stab at what you pointed out (CLIService referencing Thrift). Otherwise, the patch looks good!

        Carl Steinbach added a comment -

        +1

        I'm fine with committing the patch in its current state, but there's one thing I think we definitely need to fix ASAP in a follow-up patch. Up to this point we have managed to avoid polluting the client and service class interfaces (i.e. CLIService and CLIServiceClient) with direct references to the Thrift serialization layer. This patch breaks that rule by exposing TProtocolVersion in the public methods of CLIService. Only ThriftCLIService should need to know that the client is using a specific version of the Thrift serialization layer.
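
        A minimal sketch of the layering described above, using stand-in types (only the TProtocolVersion / CLIService / ThriftCLIService split mirrors the discussion; all method and field names here are hypothetical, not Hive's actual API):

            // Illustrative stand-ins only, not Hive's actual classes.
            enum TProtocolVersion { HIVE_CLI_SERVICE_PROTOCOL_V1 }

            // Service layer: its public surface carries no Thrift types.
            class CLIService {
              String openSession(String user) {
                return "session-" + user;   // create and return a session handle
              }
            }

            // Thrift facade: the only layer that sees the wire-level protocol version.
            class ThriftCLIService {
              private final CLIService service = new CLIService();

              String openSession(String user, TProtocolVersion clientVersion) {
                // Negotiate/record clientVersion here, then delegate;
                // CLIService's methods stay free of Thrift types.
                return service.openSession(user);
              }
            }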

        Brock Noland added a comment -

        Hey guys, this patch is a huge improvement over the existing RS serialization, and it's a big patch, so we don't want to have Navis continually rebasing. I think we should have a follow-on JIRA to:

        1) test backwards compatibility with the older driver and fix any outstanding issues
        2) remove the debug stuff that is included (printStackTrace and System.out)

        Brock
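
        For item 2 above, the usual cleanup is to route errors through the project's logger instead of printStackTrace and System.out. A generic sketch, assuming an slf4j-style logger (slf4j-api appears on Hive's classpath in the build logs below); the class and method names are hypothetical:

            import org.slf4j.Logger;
            import org.slf4j.LoggerFactory;

            class ResultFetcher {                  // hypothetical class name
              private static final Logger LOG = LoggerFactory.getLogger(ResultFetcher.class);

              void fetchRows() {
                try {
                  // ... fetch work ...
                } catch (Exception e) {
                  // Replaces e.printStackTrace() and System.out debugging:
                  LOG.error("Failed to fetch rows", e);
                }
              }
            }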

        Hive QA added a comment -

        Overall: +1 all checks pass

        Here are the results of testing the latest attachment:
        https://issues.apache.org/jira/secure/attachment/12620811/HIVE-3746.7.patch.txt

        SUCCESS: +1 4818 tests passed

        Test results: http://bigtop01.cloudera.org:8080/job/PreCommit-HIVE-Build/770/testReport
        Console output: http://bigtop01.cloudera.org:8080/job/PreCommit-HIVE-Build/770/console

        Messages:

        Executing org.apache.hive.ptest.execution.PrepPhase
        Executing org.apache.hive.ptest.execution.ExecutionPhase
        Executing org.apache.hive.ptest.execution.ReportingPhase
        

        This message is automatically generated.

        ATTACHMENT ID: 12620811

        Hive QA added a comment -

        Overall: -1 no tests executed

        Here are the results of testing the latest attachment:
        https://issues.apache.org/jira/secure/attachment/12619276/HIVE-3746.6.patch.txt

        Test results: http://bigtop01.cloudera.org:8080/job/PreCommit-HIVE-Build/686/testReport
        Console output: http://bigtop01.cloudera.org:8080/job/PreCommit-HIVE-Build/686/console

        Messages:

        **** This message was trimmed, see log for full details ****
        [INFO] Including org.json:json:jar:20090211 in the shaded jar.
        [INFO] Excluding stax:stax-api:jar:1.0.1 from the shaded jar.
        [INFO] Excluding org.apache.hadoop:hadoop-core:jar:1.2.1 from the shaded jar.
        [INFO] Excluding xmlenc:xmlenc:jar:0.52 from the shaded jar.
        [INFO] Excluding com.sun.jersey:jersey-core:jar:1.14 from the shaded jar.
        [INFO] Excluding com.sun.jersey:jersey-json:jar:1.14 from the shaded jar.
        [INFO] Excluding org.codehaus.jettison:jettison:jar:1.1 from the shaded jar.
        [INFO] Excluding com.sun.xml.bind:jaxb-impl:jar:2.2.3-1 from the shaded jar.
        [INFO] Excluding javax.xml.bind:jaxb-api:jar:2.2.2 from the shaded jar.
        [INFO] Excluding javax.xml.stream:stax-api:jar:1.0-2 from the shaded jar.
        [INFO] Excluding javax.activation:activation:jar:1.1 from the shaded jar.
        [INFO] Excluding org.codehaus.jackson:jackson-jaxrs:jar:1.9.2 from the shaded jar.
        [INFO] Excluding org.codehaus.jackson:jackson-xc:jar:1.9.2 from the shaded jar.
        [INFO] Excluding com.sun.jersey:jersey-server:jar:1.14 from the shaded jar.
        [INFO] Excluding asm:asm:jar:3.1 from the shaded jar.
        [INFO] Excluding org.apache.commons:commons-math:jar:2.1 from the shaded jar.
        [INFO] Excluding commons-configuration:commons-configuration:jar:1.6 from the shaded jar.
        [INFO] Excluding commons-digester:commons-digester:jar:1.8 from the shaded jar.
        [INFO] Excluding commons-beanutils:commons-beanutils:jar:1.7.0 from the shaded jar.
        [INFO] Excluding commons-beanutils:commons-beanutils-core:jar:1.8.0 from the shaded jar.
        [INFO] Excluding commons-net:commons-net:jar:1.4.1 from the shaded jar.
        [INFO] Excluding org.mortbay.jetty:jetty:jar:6.1.26 from the shaded jar.
        [INFO] Excluding org.mortbay.jetty:jetty-util:jar:6.1.26 from the shaded jar.
        [INFO] Excluding tomcat:jasper-runtime:jar:5.5.12 from the shaded jar.
        [INFO] Excluding tomcat:jasper-compiler:jar:5.5.12 from the shaded jar.
        [INFO] Excluding org.mortbay.jetty:jsp-api-2.1:jar:6.1.14 from the shaded jar.
        [INFO] Excluding org.mortbay.jetty:servlet-api-2.5:jar:6.1.14 from the shaded jar.
        [INFO] Excluding org.mortbay.jetty:jsp-2.1:jar:6.1.14 from the shaded jar.
        [INFO] Excluding ant:ant:jar:1.6.5 from the shaded jar.
        [INFO] Excluding commons-el:commons-el:jar:1.0 from the shaded jar.
        [INFO] Excluding net.java.dev.jets3t:jets3t:jar:0.6.1 from the shaded jar.
        [INFO] Excluding hsqldb:hsqldb:jar:1.8.0.10 from the shaded jar.
        [INFO] Excluding oro:oro:jar:2.0.8 from the shaded jar.
        [INFO] Excluding org.eclipse.jdt:core:jar:3.1.1 from the shaded jar.
        [INFO] Excluding org.slf4j:slf4j-api:jar:1.7.5 from the shaded jar.
        [INFO] Excluding org.slf4j:slf4j-log4j12:jar:1.7.5 from the shaded jar.
        [INFO] Replacing original artifact with shaded artifact.
        [INFO] Replacing /data/hive-ptest/working/apache-svn-trunk-source/ql/target/hive-exec-0.13.0-SNAPSHOT.jar with /data/hive-ptest/working/apache-svn-trunk-source/ql/target/hive-exec-0.13.0-SNAPSHOT-shaded.jar
        [INFO] Dependency-reduced POM written at: /data/hive-ptest/working/apache-svn-trunk-source/ql/dependency-reduced-pom.xml
        [INFO] Dependency-reduced POM written at: /data/hive-ptest/working/apache-svn-trunk-source/ql/dependency-reduced-pom.xml
        [INFO] 
        [INFO] --- maven-install-plugin:2.4:install (default-install) @ hive-exec ---
        [INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/ql/target/hive-exec-0.13.0-SNAPSHOT.jar to /data/hive-ptest/working/maven/org/apache/hive/hive-exec/0.13.0-SNAPSHOT/hive-exec-0.13.0-SNAPSHOT.jar
        [INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/ql/dependency-reduced-pom.xml to /data/hive-ptest/working/maven/org/apache/hive/hive-exec/0.13.0-SNAPSHOT/hive-exec-0.13.0-SNAPSHOT.pom
        [INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/ql/target/hive-exec-0.13.0-SNAPSHOT-tests.jar to /data/hive-ptest/working/maven/org/apache/hive/hive-exec/0.13.0-SNAPSHOT/hive-exec-0.13.0-SNAPSHOT-tests.jar
        [INFO]                                                                         
        [INFO] ------------------------------------------------------------------------
        [INFO] Building Hive Service 0.13.0-SNAPSHOT
        [INFO] ------------------------------------------------------------------------
        [INFO] 
        [INFO] --- maven-clean-plugin:2.5:clean (default-clean) @ hive-service ---
        [INFO] Deleting /data/hive-ptest/working/apache-svn-trunk-source/service (includes = [datanucleus.log, derby.log], excludes = [])
        [INFO] 
        [INFO] --- build-helper-maven-plugin:1.8:add-source (add-source) @ hive-service ---
        [INFO] Source directory: /data/hive-ptest/working/apache-svn-trunk-source/service/src/model added.
        [INFO] Source directory: /data/hive-ptest/working/apache-svn-trunk-source/service/src/gen/thrift/gen-javabean added.
        [INFO] 
        [INFO] --- maven-resources-plugin:2.5:resources (default-resources) @ hive-service ---
        [debug] execute contextualize
        [INFO] Using 'UTF-8' encoding to copy filtered resources.
        [INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-trunk-source/service/src/main/resources
        [INFO] 
        [INFO] --- maven-antrun-plugin:1.7:run (define-classpath) @ hive-service ---
        [INFO] Executing tasks
        
        main:
        [INFO] Executed tasks
        [INFO] 
        [INFO] --- maven-compiler-plugin:3.1:compile (default-compile) @ hive-service ---
        [INFO] Compiling 165 source files to /data/hive-ptest/working/apache-svn-trunk-source/service/target/classes
        [WARNING] Note: /data/hive-ptest/working/apache-svn-trunk-source/service/src/java/org/apache/hive/service/cli/operation/SQLOperation.java uses or overrides a deprecated API.
        [WARNING] Note: Recompile with -Xlint:deprecation for details.
        [WARNING] Note: Some input files use unchecked or unsafe operations.
        [WARNING] Note: Recompile with -Xlint:unchecked for details.
        [INFO] 
        [INFO] --- maven-resources-plugin:2.5:testResources (default-testResources) @ hive-service ---
        [debug] execute contextualize
        [INFO] Using 'UTF-8' encoding to copy filtered resources.
        [INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-trunk-source/service/src/test/resources
        [INFO] 
        [INFO] --- maven-antrun-plugin:1.7:run (setup-test-dirs) @ hive-service ---
        [INFO] Executing tasks
        
        main:
            [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/service/target/tmp
            [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/service/target/warehouse
            [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/service/target/tmp/conf
             [copy] Copying 4 files to /data/hive-ptest/working/apache-svn-trunk-source/service/target/tmp/conf
        [INFO] Executed tasks
        [INFO] 
        [INFO] --- maven-compiler-plugin:3.1:testCompile (default-testCompile) @ hive-service ---
        [INFO] Compiling 7 source files to /data/hive-ptest/working/apache-svn-trunk-source/service/target/test-classes
        [INFO] 
        [INFO] --- maven-surefire-plugin:2.16:test (default-test) @ hive-service ---
        [INFO] Tests are skipped.
        [INFO] 
        [INFO] --- maven-jar-plugin:2.2:jar (default-jar) @ hive-service ---
        [INFO] Building jar: /data/hive-ptest/working/apache-svn-trunk-source/service/target/hive-service-0.13.0-SNAPSHOT.jar
        [INFO] 
        [INFO] --- maven-jar-plugin:2.2:test-jar (default) @ hive-service ---
        [INFO] Building jar: /data/hive-ptest/working/apache-svn-trunk-source/service/target/hive-service-0.13.0-SNAPSHOT-tests.jar
        [INFO] 
        [INFO] --- maven-install-plugin:2.4:install (default-install) @ hive-service ---
        [INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/service/target/hive-service-0.13.0-SNAPSHOT.jar to /data/hive-ptest/working/maven/org/apache/hive/hive-service/0.13.0-SNAPSHOT/hive-service-0.13.0-SNAPSHOT.jar
        [INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/service/pom.xml to /data/hive-ptest/working/maven/org/apache/hive/hive-service/0.13.0-SNAPSHOT/hive-service-0.13.0-SNAPSHOT.pom
        [INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/service/target/hive-service-0.13.0-SNAPSHOT-tests.jar to /data/hive-ptest/working/maven/org/apache/hive/hive-service/0.13.0-SNAPSHOT/hive-service-0.13.0-SNAPSHOT-tests.jar
        [INFO]                                                                         
        [INFO] ------------------------------------------------------------------------
        [INFO] Building Hive JDBC 0.13.0-SNAPSHOT
        [INFO] ------------------------------------------------------------------------
        [INFO] 
        [INFO] --- maven-clean-plugin:2.5:clean (default-clean) @ hive-jdbc ---
        [INFO] Deleting /data/hive-ptest/working/apache-svn-trunk-source/jdbc (includes = [datanucleus.log, derby.log], excludes = [])
        [INFO] 
        [INFO] --- maven-resources-plugin:2.5:resources (default-resources) @ hive-jdbc ---
        [debug] execute contextualize
        [INFO] Using 'UTF-8' encoding to copy filtered resources.
        [INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-trunk-source/jdbc/src/main/resources
        [INFO] 
        [INFO] --- maven-antrun-plugin:1.7:run (define-classpath) @ hive-jdbc ---
        [INFO] Executing tasks
        
        main:
        [INFO] Executed tasks
        [INFO] 
        [INFO] --- maven-compiler-plugin:3.1:compile (default-compile) @ hive-jdbc ---
        [INFO] Compiling 30 source files to /data/hive-ptest/working/apache-svn-trunk-source/jdbc/target/classes
        [INFO] -------------------------------------------------------------
        [WARNING] COMPILATION WARNING : 
        [INFO] -------------------------------------------------------------
        [WARNING] Note: Some input files use or override a deprecated API.
        [WARNING] Note: Recompile with -Xlint:deprecation for details.
        [WARNING] Note: Some input files use unchecked or unsafe operations.
        [WARNING] Note: Recompile with -Xlint:unchecked for details.
        [INFO] 4 warnings 
        [INFO] -------------------------------------------------------------
        [INFO] -------------------------------------------------------------
        [ERROR] COMPILATION ERROR : 
        [INFO] -------------------------------------------------------------
        [ERROR] /data/hive-ptest/working/apache-svn-trunk-source/jdbc/src/java/org/apache/hive/jdbc/HiveMetaDataResultSet.java:[34,5] cannot find symbol
        symbol  : constructor HiveBaseResultSet(org.apache.hive.service.cli.thrift.TProtocolVersion)
        location: class org.apache.hive.jdbc.HiveBaseResultSet
        [INFO] 1 error
        [INFO] -------------------------------------------------------------
        [INFO] ------------------------------------------------------------------------
        [INFO] Reactor Summary:
        [INFO] 
        [INFO] Hive .............................................. SUCCESS [4.652s]
        [INFO] Hive Ant Utilities ................................ SUCCESS [7.207s]
        [INFO] Hive Shims Common ................................. SUCCESS [2.862s]
        [INFO] Hive Shims 0.20 ................................... SUCCESS [2.002s]
        [INFO] Hive Shims Secure Common .......................... SUCCESS [2.749s]
        [INFO] Hive Shims 0.20S .................................. SUCCESS [1.416s]
        [INFO] Hive Shims 0.23 ................................... SUCCESS [3.131s]
        [INFO] Hive Shims ........................................ SUCCESS [3.813s]
        [INFO] Hive Common ....................................... SUCCESS [8.180s]
        [INFO] Hive Serde ........................................ SUCCESS [11.959s]
        [INFO] Hive Metastore .................................... SUCCESS [26.266s]
        [INFO] Hive Query Language ............................... SUCCESS [56.032s]
        [INFO] Hive Service ...................................... SUCCESS [5.523s]
        [INFO] Hive JDBC ......................................... FAILURE [1.268s]
        [INFO] Hive Beeline ...................................... SKIPPED
        [INFO] Hive CLI .......................................... SKIPPED
        [INFO] Hive Contrib ...................................... SKIPPED
        [INFO] Hive HBase Handler ................................ SKIPPED
        [INFO] Hive HCatalog ..................................... SKIPPED
        [INFO] Hive HCatalog Core ................................ SKIPPED
        [INFO] Hive HCatalog Pig Adapter ......................... SKIPPED
        [INFO] Hive HCatalog Server Extensions ................... SKIPPED
        [INFO] Hive HCatalog Webhcat Java Client ................. SKIPPED
        [INFO] Hive HCatalog Webhcat ............................. SKIPPED
        [INFO] Hive HCatalog HBase Storage Handler ............... SKIPPED
        [INFO] Hive HWI .......................................... SKIPPED
        [INFO] Hive ODBC ......................................... SKIPPED
        [INFO] Hive Shims Aggregator ............................. SKIPPED
        [INFO] Hive TestUtils .................................... SKIPPED
        [INFO] Hive Packaging .................................... SKIPPED
        [INFO] ------------------------------------------------------------------------
        [INFO] BUILD FAILURE
        [INFO] ------------------------------------------------------------------------
        [INFO] Total time: 2:20.084s
        [INFO] Finished at: Wed Dec 18 03:44:29 EST 2013
        [INFO] Final Memory: 50M/466M
        [INFO] ------------------------------------------------------------------------
        [ERROR] Failed to execute goal org.apache.maven.plugins:maven-compiler-plugin:3.1:compile (default-compile) on project hive-jdbc: Compilation failure
        [ERROR] /data/hive-ptest/working/apache-svn-trunk-source/jdbc/src/java/org/apache/hive/jdbc/HiveMetaDataResultSet.java:[34,5] cannot find symbol
        [ERROR] symbol  : constructor HiveBaseResultSet(org.apache.hive.service.cli.thrift.TProtocolVersion)
        [ERROR] location: class org.apache.hive.jdbc.HiveBaseResultSet
        [ERROR] -> [Help 1]
        [ERROR] 
        [ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
        [ERROR] Re-run Maven using the -X switch to enable full debug logging.
        [ERROR] 
        [ERROR] For more information about the errors and possible solutions, please read the following articles:
        [ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
        [ERROR] 
        [ERROR] After correcting the problems, you can resume the build with the command
        [ERROR]   mvn <goals> -rf :hive-jdbc
        + exit 1
        '
        

        This message is automatically generated.

        ATTACHMENT ID: 12619276
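
        The -1 above comes from the compilation error in the log: HiveMetaDataResultSet (line 34) calls a HiveBaseResultSet constructor taking a TProtocolVersion that this revision of the patch does not declare. A self-contained reconstruction of the mismatch and the shape of the fix (stand-in types; only the class names and the failing call mirror the log, the rest is illustrative):

            // Stand-in types, not the actual Hive sources.
            enum TProtocolVersion { HIVE_CLI_SERVICE_PROTOCOL_V1 }

            abstract class HiveBaseResultSet {
              protected final TProtocolVersion protocol;

              // Without this constructor, the subclass's super(...) call fails with
              // "cannot find symbol: constructor HiveBaseResultSet(TProtocolVersion)".
              protected HiveBaseResultSet(TProtocolVersion protocol) {
                this.protocol = protocol;
              }
            }

            class HiveMetaDataResultSet extends HiveBaseResultSet {
              HiveMetaDataResultSet(TProtocolVersion protocol) {
                super(protocol);   // the call site the compiler reported at [34,5]
              }
            }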

        Hive QA added a comment -

        Overall: -1 no tests executed

        Here are the results of testing the latest attachment:
        https://issues.apache.org/jira/secure/attachment/12619267/HIVE-3746.5.patch.txt

        Test results: http://bigtop01.cloudera.org:8080/job/PreCommit-HIVE-Build/685/testReport
        Console output: http://bigtop01.cloudera.org:8080/job/PreCommit-HIVE-Build/685/console

        Messages:

        **** This message was trimmed, see log for full details ****
        [INFO] Including org.json:json:jar:20090211 in the shaded jar.
        [INFO] Excluding stax:stax-api:jar:1.0.1 from the shaded jar.
        [INFO] Excluding org.apache.hadoop:hadoop-core:jar:1.2.1 from the shaded jar.
        [INFO] Excluding xmlenc:xmlenc:jar:0.52 from the shaded jar.
        [INFO] Excluding com.sun.jersey:jersey-core:jar:1.14 from the shaded jar.
        [INFO] Excluding com.sun.jersey:jersey-json:jar:1.14 from the shaded jar.
        [INFO] Excluding org.codehaus.jettison:jettison:jar:1.1 from the shaded jar.
        [INFO] Excluding com.sun.xml.bind:jaxb-impl:jar:2.2.3-1 from the shaded jar.
        [INFO] Excluding javax.xml.bind:jaxb-api:jar:2.2.2 from the shaded jar.
        [INFO] Excluding javax.xml.stream:stax-api:jar:1.0-2 from the shaded jar.
        [INFO] Excluding javax.activation:activation:jar:1.1 from the shaded jar.
        [INFO] Excluding org.codehaus.jackson:jackson-jaxrs:jar:1.9.2 from the shaded jar.
        [INFO] Excluding org.codehaus.jackson:jackson-xc:jar:1.9.2 from the shaded jar.
        [INFO] Excluding com.sun.jersey:jersey-server:jar:1.14 from the shaded jar.
        [INFO] Excluding asm:asm:jar:3.1 from the shaded jar.
        [INFO] Excluding org.apache.commons:commons-math:jar:2.1 from the shaded jar.
        [INFO] Excluding commons-configuration:commons-configuration:jar:1.6 from the shaded jar.
        [INFO] Excluding commons-digester:commons-digester:jar:1.8 from the shaded jar.
        [INFO] Excluding commons-beanutils:commons-beanutils:jar:1.7.0 from the shaded jar.
        [INFO] Excluding commons-beanutils:commons-beanutils-core:jar:1.8.0 from the shaded jar.
        [INFO] Excluding commons-net:commons-net:jar:1.4.1 from the shaded jar.
        [INFO] Excluding org.mortbay.jetty:jetty:jar:6.1.26 from the shaded jar.
        [INFO] Excluding org.mortbay.jetty:jetty-util:jar:6.1.26 from the shaded jar.
        [INFO] Excluding tomcat:jasper-runtime:jar:5.5.12 from the shaded jar.
        [INFO] Excluding tomcat:jasper-compiler:jar:5.5.12 from the shaded jar.
        [INFO] Excluding org.mortbay.jetty:jsp-api-2.1:jar:6.1.14 from the shaded jar.
        [INFO] Excluding org.mortbay.jetty:servlet-api-2.5:jar:6.1.14 from the shaded jar.
        [INFO] Excluding org.mortbay.jetty:jsp-2.1:jar:6.1.14 from the shaded jar.
        [INFO] Excluding ant:ant:jar:1.6.5 from the shaded jar.
        [INFO] Excluding commons-el:commons-el:jar:1.0 from the shaded jar.
        [INFO] Excluding net.java.dev.jets3t:jets3t:jar:0.6.1 from the shaded jar.
        [INFO] Excluding hsqldb:hsqldb:jar:1.8.0.10 from the shaded jar.
        [INFO] Excluding oro:oro:jar:2.0.8 from the shaded jar.
        [INFO] Excluding org.eclipse.jdt:core:jar:3.1.1 from the shaded jar.
        [INFO] Excluding org.slf4j:slf4j-api:jar:1.7.5 from the shaded jar.
        [INFO] Excluding org.slf4j:slf4j-log4j12:jar:1.7.5 from the shaded jar.
        [INFO] Replacing original artifact with shaded artifact.
        [INFO] Replacing /data/hive-ptest/working/apache-svn-trunk-source/ql/target/hive-exec-0.13.0-SNAPSHOT.jar with /data/hive-ptest/working/apache-svn-trunk-source/ql/target/hive-exec-0.13.0-SNAPSHOT-shaded.jar
        [INFO] Dependency-reduced POM written at: /data/hive-ptest/working/apache-svn-trunk-source/ql/dependency-reduced-pom.xml
        [INFO] Dependency-reduced POM written at: /data/hive-ptest/working/apache-svn-trunk-source/ql/dependency-reduced-pom.xml
        [INFO] 
        [INFO] --- maven-install-plugin:2.4:install (default-install) @ hive-exec ---
        [INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/ql/target/hive-exec-0.13.0-SNAPSHOT.jar to /data/hive-ptest/working/maven/org/apache/hive/hive-exec/0.13.0-SNAPSHOT/hive-exec-0.13.0-SNAPSHOT.jar
        [INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/ql/dependency-reduced-pom.xml to /data/hive-ptest/working/maven/org/apache/hive/hive-exec/0.13.0-SNAPSHOT/hive-exec-0.13.0-SNAPSHOT.pom
        [INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/ql/target/hive-exec-0.13.0-SNAPSHOT-tests.jar to /data/hive-ptest/working/maven/org/apache/hive/hive-exec/0.13.0-SNAPSHOT/hive-exec-0.13.0-SNAPSHOT-tests.jar
        [INFO]                                                                         
        [INFO] ------------------------------------------------------------------------
        [INFO] Building Hive Service 0.13.0-SNAPSHOT
        [INFO] ------------------------------------------------------------------------
        [INFO] 
        [INFO] --- maven-clean-plugin:2.5:clean (default-clean) @ hive-service ---
        [INFO] Deleting /data/hive-ptest/working/apache-svn-trunk-source/service (includes = [datanucleus.log, derby.log], excludes = [])
        [INFO] 
        [INFO] --- build-helper-maven-plugin:1.8:add-source (add-source) @ hive-service ---
        [INFO] Source directory: /data/hive-ptest/working/apache-svn-trunk-source/service/src/model added.
        [INFO] Source directory: /data/hive-ptest/working/apache-svn-trunk-source/service/src/gen/thrift/gen-javabean added.
        [INFO] 
        [INFO] --- maven-resources-plugin:2.5:resources (default-resources) @ hive-service ---
        [debug] execute contextualize
        [INFO] Using 'UTF-8' encoding to copy filtered resources.
        [INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-trunk-source/service/src/main/resources
        [INFO] 
        [INFO] --- maven-antrun-plugin:1.7:run (define-classpath) @ hive-service ---
        [INFO] Executing tasks
        
        main:
        [INFO] Executed tasks
        [INFO] 
        [INFO] --- maven-compiler-plugin:3.1:compile (default-compile) @ hive-service ---
        [INFO] Compiling 165 source files to /data/hive-ptest/working/apache-svn-trunk-source/service/target/classes
        [WARNING] Note: /data/hive-ptest/working/apache-svn-trunk-source/service/src/java/org/apache/hive/service/cli/operation/SQLOperation.java uses or overrides a deprecated API.
        [WARNING] Note: Recompile with -Xlint:deprecation for details.
        [WARNING] Note: Some input files use unchecked or unsafe operations.
        [WARNING] Note: Recompile with -Xlint:unchecked for details.
        [INFO] 
        [INFO] --- maven-resources-plugin:2.5:testResources (default-testResources) @ hive-service ---
        [debug] execute contextualize
        [INFO] Using 'UTF-8' encoding to copy filtered resources.
        [INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-trunk-source/service/src/test/resources
        [INFO] 
        [INFO] --- maven-antrun-plugin:1.7:run (setup-test-dirs) @ hive-service ---
        [INFO] Executing tasks
        
        main:
            [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/service/target/tmp
            [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/service/target/warehouse
            [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/service/target/tmp/conf
             [copy] Copying 4 files to /data/hive-ptest/working/apache-svn-trunk-source/service/target/tmp/conf
        [INFO] Executed tasks
        [INFO] 
        [INFO] --- maven-compiler-plugin:3.1:testCompile (default-testCompile) @ hive-service ---
        [INFO] Compiling 7 source files to /data/hive-ptest/working/apache-svn-trunk-source/service/target/test-classes
        [INFO] 
        [INFO] --- maven-surefire-plugin:2.16:test (default-test) @ hive-service ---
        [INFO] Tests are skipped.
        [INFO] 
        [INFO] --- maven-jar-plugin:2.2:jar (default-jar) @ hive-service ---
        [INFO] Building jar: /data/hive-ptest/working/apache-svn-trunk-source/service/target/hive-service-0.13.0-SNAPSHOT.jar
        [INFO] 
        [INFO] --- maven-jar-plugin:2.2:test-jar (default) @ hive-service ---
        [INFO] Building jar: /data/hive-ptest/working/apache-svn-trunk-source/service/target/hive-service-0.13.0-SNAPSHOT-tests.jar
        [INFO] 
        [INFO] --- maven-install-plugin:2.4:install (default-install) @ hive-service ---
        [INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/service/target/hive-service-0.13.0-SNAPSHOT.jar to /data/hive-ptest/working/maven/org/apache/hive/hive-service/0.13.0-SNAPSHOT/hive-service-0.13.0-SNAPSHOT.jar
        [INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/service/pom.xml to /data/hive-ptest/working/maven/org/apache/hive/hive-service/0.13.0-SNAPSHOT/hive-service-0.13.0-SNAPSHOT.pom
        [INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/service/target/hive-service-0.13.0-SNAPSHOT-tests.jar to /data/hive-ptest/working/maven/org/apache/hive/hive-service/0.13.0-SNAPSHOT/hive-service-0.13.0-SNAPSHOT-tests.jar
        [INFO]                                                                         
        [INFO] ------------------------------------------------------------------------
        [INFO] Building Hive JDBC 0.13.0-SNAPSHOT
        [INFO] ------------------------------------------------------------------------
        [INFO] 
        [INFO] --- maven-clean-plugin:2.5:clean (default-clean) @ hive-jdbc ---
        [INFO] Deleting /data/hive-ptest/working/apache-svn-trunk-source/jdbc (includes = [datanucleus.log, derby.log], excludes = [])
        [INFO] 
        [INFO] --- maven-resources-plugin:2.5:resources (default-resources) @ hive-jdbc ---
        [debug] execute contextualize
        [INFO] Using 'UTF-8' encoding to copy filtered resources.
        [INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-trunk-source/jdbc/src/main/resources
        [INFO] 
        [INFO] --- maven-antrun-plugin:1.7:run (define-classpath) @ hive-jdbc ---
        [INFO] Executing tasks
        
        main:
        [INFO] Executed tasks
        [INFO] 
        [INFO] --- maven-compiler-plugin:3.1:compile (default-compile) @ hive-jdbc ---
        [INFO] Compiling 30 source files to /data/hive-ptest/working/apache-svn-trunk-source/jdbc/target/classes
        [INFO] -------------------------------------------------------------
        [WARNING] COMPILATION WARNING : 
        [INFO] -------------------------------------------------------------
        [WARNING] Note: Some input files use or override a deprecated API.
        [WARNING] Note: Recompile with -Xlint:deprecation for details.
        [WARNING] Note: Some input files use unchecked or unsafe operations.
        [WARNING] Note: Recompile with -Xlint:unchecked for details.
        [INFO] 4 warnings 
        [INFO] -------------------------------------------------------------
        [INFO] -------------------------------------------------------------
        [ERROR] COMPILATION ERROR : 
        [INFO] -------------------------------------------------------------
        [ERROR] /data/hive-ptest/working/apache-svn-trunk-source/jdbc/src/java/org/apache/hive/jdbc/HiveMetaDataResultSet.java:[34,5] cannot find symbol
        symbol  : constructor HiveBaseResultSet(org.apache.hive.service.cli.thrift.TProtocolVersion)
        location: class org.apache.hive.jdbc.HiveBaseResultSet
        [INFO] 1 error
        [INFO] -------------------------------------------------------------
        [INFO] ------------------------------------------------------------------------
        [INFO] Reactor Summary:
        [INFO] 
        [INFO] Hive .............................................. SUCCESS [4.733s]
        [INFO] Hive Ant Utilities ................................ SUCCESS [7.838s]
        [INFO] Hive Shims Common ................................. SUCCESS [3.038s]
        [INFO] Hive Shims 0.20 ................................... SUCCESS [1.840s]
        [INFO] Hive Shims Secure Common .......................... SUCCESS [2.807s]
        [INFO] Hive Shims 0.20S .................................. SUCCESS [1.428s]
        [INFO] Hive Shims 0.23 ................................... SUCCESS [3.298s]
        [INFO] Hive Shims ........................................ SUCCESS [3.872s]
        [INFO] Hive Common ....................................... SUCCESS [14.402s]
        [INFO] Hive Serde ........................................ SUCCESS [16.462s]
        [INFO] Hive Metastore .................................... SUCCESS [25.901s]
        [INFO] Hive Query Language ............................... SUCCESS [55.551s]
        [INFO] Hive Service ...................................... SUCCESS [4.094s]
        [INFO] Hive JDBC ......................................... FAILURE [1.269s]
        [INFO] Hive Beeline ...................................... SKIPPED
        [INFO] Hive CLI .......................................... SKIPPED
        [INFO] Hive Contrib ...................................... SKIPPED
        [INFO] Hive HBase Handler ................................ SKIPPED
        [INFO] Hive HCatalog ..................................... SKIPPED
        [INFO] Hive HCatalog Core ................................ SKIPPED
        [INFO] Hive HCatalog Pig Adapter ......................... SKIPPED
        [INFO] Hive HCatalog Server Extensions ................... SKIPPED
        [INFO] Hive HCatalog Webhcat Java Client ................. SKIPPED
        [INFO] Hive HCatalog Webhcat ............................. SKIPPED
        [INFO] Hive HCatalog HBase Storage Handler ............... SKIPPED
        [INFO] Hive HWI .......................................... SKIPPED
        [INFO] Hive ODBC ......................................... SKIPPED
        [INFO] Hive Shims Aggregator ............................. SKIPPED
        [INFO] Hive TestUtils .................................... SKIPPED
        [INFO] Hive Packaging .................................... SKIPPED
        [INFO] ------------------------------------------------------------------------
        [INFO] BUILD FAILURE
        [INFO] ------------------------------------------------------------------------
        [INFO] Total time: 2:29.486s
        [INFO] Finished at: Wed Dec 18 03:12:01 EST 2013
        [INFO] Final Memory: 68M/512M
        [INFO] ------------------------------------------------------------------------
        [ERROR] Failed to execute goal org.apache.maven.plugins:maven-compiler-plugin:3.1:compile (default-compile) on project hive-jdbc: Compilation failure
        [ERROR] /data/hive-ptest/working/apache-svn-trunk-source/jdbc/src/java/org/apache/hive/jdbc/HiveMetaDataResultSet.java:[34,5] cannot find symbol
        [ERROR] symbol  : constructor HiveBaseResultSet(org.apache.hive.service.cli.thrift.TProtocolVersion)
        [ERROR] location: class org.apache.hive.jdbc.HiveBaseResultSet
        [ERROR] -> [Help 1]
        [ERROR] 
        [ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
        [ERROR] Re-run Maven using the -X switch to enable full debug logging.
        [ERROR] 
        [ERROR] For more information about the errors and possible solutions, please read the following articles:
        [ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
        [ERROR] 
        [ERROR] After correcting the problems, you can resume the build with the command
        [ERROR]   mvn <goals> -rf :hive-jdbc
        + exit 1
        '
        

        This message is automatically generated.

        ATTACHMENT ID: 12619267

        navis Navis added a comment -

        Rebased to trunk

        hiveqa Hive QA added a comment -

        Overall: -1 no tests executed

        Here are the results of testing the latest attachment:
        https://issues.apache.org/jira/secure/attachment/12619227/HIVE-3746.4.patch.txt

        Test results: http://bigtop01.cloudera.org:8080/job/PreCommit-HIVE-Build/680/testReport
        Console output: http://bigtop01.cloudera.org:8080/job/PreCommit-HIVE-Build/680/console

        Messages:

        Executing org.apache.hive.ptest.execution.PrepPhase
        Tests exited with: NonZeroExitCodeException
        Command 'bash /data/hive-ptest/working/scratch/source-prep.sh' failed with exit status 1 and output '+ [[ -n '' ]]
        + export 'ANT_OPTS=-Xmx1g -XX:MaxPermSize=256m '
        + ANT_OPTS='-Xmx1g -XX:MaxPermSize=256m '
        + export 'M2_OPTS=-Xmx1g -XX:MaxPermSize=256m -Dhttp.proxyHost=localhost -Dhttp.proxyPort=3128'
        + M2_OPTS='-Xmx1g -XX:MaxPermSize=256m -Dhttp.proxyHost=localhost -Dhttp.proxyPort=3128'
        + cd /data/hive-ptest/working/
        + tee /data/hive-ptest/logs/PreCommit-HIVE-Build-680/source-prep.txt
        + [[ false == \t\r\u\e ]]
        + mkdir -p maven ivy
        + [[ svn = \s\v\n ]]
        + [[ -n '' ]]
        + [[ -d apache-svn-trunk-source ]]
        + [[ ! -d apache-svn-trunk-source/.svn ]]
        + [[ ! -d apache-svn-trunk-source ]]
        + cd apache-svn-trunk-source
        + svn revert -R .
        ++ awk '{print $2}'
        ++ egrep -v '^X|^Performing status on external'
        ++ svn status --no-ignore
        + rm -rf target datanucleus.log ant/target shims/target shims/0.20/target shims/assembly/target shims/0.20S/target shims/0.23/target shims/common/target shims/common-secure/target packaging/target hbase-handler/target testutils/target jdbc/target metastore/target itests/target itests/hcatalog-unit/target itests/test-serde/target itests/qtest/target itests/hive-unit/target itests/custom-serde/target itests/util/target hcatalog/target hcatalog/storage-handlers/hbase/target hcatalog/server-extensions/target hcatalog/core/target hcatalog/webhcat/svr/target hcatalog/webhcat/java-client/target hcatalog/hcatalog-pig-adapter/target hwi/target common/target common/src/gen contrib/target service/target serde/target beeline/target odbc/target cli/target ql/dependency-reduced-pom.xml ql/target
        + svn update
        
        Fetching external item into 'hcatalog/src/test/e2e/harness'
        External at revision 1551852.
        
        At revision 1551852.
        + patchCommandPath=/data/hive-ptest/working/scratch/smart-apply-patch.sh
        + patchFilePath=/data/hive-ptest/working/scratch/build.patch
        + [[ -f /data/hive-ptest/working/scratch/build.patch ]]
        + chmod +x /data/hive-ptest/working/scratch/smart-apply-patch.sh
        + /data/hive-ptest/working/scratch/smart-apply-patch.sh /data/hive-ptest/working/scratch/build.patch
        The patch does not appear to apply with p0, p1, or p2
        + exit 1
        '
        

        This message is automatically generated.

        ATTACHMENT ID: 12619227

        navis Navis added a comment -

        Running test

        hiveqa Hive QA added a comment -

        Overall: +1 all checks pass

        Here are the results of testing the latest attachment:
        https://issues.apache.org/jira/secure/attachment/12618851/HIVE-3746.3.patch.txt

        SUCCESS: +1 4785 tests passed

        Test results: http://bigtop01.cloudera.org:8080/job/PreCommit-HIVE-Build/647/testReport
        Console output: http://bigtop01.cloudera.org:8080/job/PreCommit-HIVE-Build/647/console

        Messages:

        Executing org.apache.hive.ptest.execution.PrepPhase
        Executing org.apache.hive.ptest.execution.ExecutionPhase
        Executing org.apache.hive.ptest.execution.ReportingPhase
        

        This message is automatically generated.

        ATTACHMENT ID: 12618851

        overcoil George Chow added a comment -

        My team noticed the breaking change too, but the protocol version indicator is at least available to detect it.
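
        That detection could look like the following client-side sketch. This is not driver code: openThriftClient() is a hypothetical helper, and COLUMNAR_RESULTS_MIN_VERSION stands in for whichever TProtocolVersion value actually introduces the new serialization.

        // Java sketch: open a session, then branch on the advertised protocol version.
        TCLIService.Iface client = openThriftClient();            // hypothetical helper
        TOpenSessionResp openResp = client.OpenSession(new TOpenSessionReq());
        TProtocolVersion serverVersion = openResp.getServerProtocolVersion();
        if (serverVersion.getValue() >= COLUMNAR_RESULTS_MIN_VERSION.getValue()) {
          // Safe to decode fetch responses using the column-oriented TRowSet.
        } else {
          // Fall back to the original row-oriented decoding path.
        }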

        cwsteinbach Carl Steinbach added a comment -

        Navis Thanks for working on this! I added some comments on RB. My main concern with the current patch is that it breaks backward compatibility with older clients.

        hiveqa Hive QA added a comment -

        Overall: -1 at least one test failed

        Here are the results of testing the latest attachment:
        https://issues.apache.org/jira/secure/attachment/12618843/HIVE-3746.2.patch.txt

        ERROR: -1 due to 1 failed/errored test(s), 4785 tests executed
        Failed tests:

        org.apache.hive.jdbc.TestJdbcDriver2.testDataTypes
        

        Test results: http://bigtop01.cloudera.org:8080/job/PreCommit-HIVE-Build/645/testReport
        Console output: http://bigtop01.cloudera.org:8080/job/PreCommit-HIVE-Build/645/console

        Messages:

        Executing org.apache.hive.ptest.execution.PrepPhase
        Executing org.apache.hive.ptest.execution.ExecutionPhase
        Executing org.apache.hive.ptest.execution.ReportingPhase
        Tests exited with: TestsFailedException: 1 tests failed
        

        This message is automatically generated.

        ATTACHMENT ID: 12618843

        navis Navis added a comment -

        Rebased to trunk

        hiveqa Hive QA added a comment -

        Overall: +1 all checks pass

        Here are the results of testing the latest attachment:
        https://issues.apache.org/jira/secure/attachment/12617332/HIVE-3746.1.patch.txt

        SUCCESS: +1 4458 tests passed

        Test results: http://bigtop01.cloudera.org:8080/job/PreCommit-HIVE-Build/548/testReport
        Console output: http://bigtop01.cloudera.org:8080/job/PreCommit-HIVE-Build/548/console

        Messages:

        Executing org.apache.hive.ptest.execution.PrepPhase
        Executing org.apache.hive.ptest.execution.ExecutionPhase
        Executing org.apache.hive.ptest.execution.ReportingPhase
        

        This message is automatically generated.

        ATTACHMENT ID: 12617332

        navis Navis added a comment -

        I have a patch similar to this one, based on hive-0.11.0. I'll check whether it can be rebased on trunk.

        qwert a bc added a comment -

        Is hiveserver2 much slower than hiveserver1?

        We are building an MS SQL cube through a linked server that connects to hiveserver via Cloudera's ODBC driver.

        There are two test results:

        1. hiveserver1, running on 2 CPUs and 8G mem, took about 8 hours
        2. hiveserver2, running on 4 CPUs and 16G mem, took about 13 hours and 27 min (it never finished on a machine with 2 CPUs and 8G mem)

        In both cases, almost all CPUs were busy while building the cube.

        Still, I cannot understand why hiveserver2 is so much slower than hiveserver1. The docs say HS2 supports concurrency, so shouldn't it be faster than HS1?

        CDH4.3 on CentOS6.

        cwsteinbach Carl Steinbach added a comment -

        I also want to add that the HS2 API is capable of supporting multiple resultset serialization formats. Clearly the original resultset serialization format is a failure and needs to be deprecated in favor of something else. I outlined one alternative format in the comments above, but it probably makes more sense to first add support for the serialization format used by HS1.

        cwsteinbach Carl Steinbach added a comment -

        Updating the summary of this ticket and removing myself as the assignee since I won't be working on it anytime soon.

        pprudich Phil Prudich added a comment -

        This limitation is a consequence of the fact that we're using a message oriented RPC layer (Thrift) to handle communication and data transfer between the client and server.

        This limitation is only present for a client that is linked to the Thrift library. A client that is coded directly to the protocol itself is free to leave outstanding data on the wire while it returns the initial results.

        cwsteinbach Carl Steinbach added a comment -

        If an application has requested a single row, and the client has requested n rows from the server in an effort to reduce round trips, then n-1 intervening values from the first column must be cached off somewhere before the first value for the second column can be accessed.

        If the fetch size is n, then the client is going to end up storing n rows in memory regardless of whether the result set is represented in a row-major or column-major format. Put another way, the unit of data transfer between the server and client is a variable sized resultset. The client has the option of setting the result size very low in order to achieve lower latency, or making it very large in order to get higher overall throughput. However, the key limitation is that the client is not able to provide access to any of the rows contained in a resultset until the entire resultset has been transferred from the server to the client. This limitation is a consequence of the fact that we're using a message oriented RPC layer (Thrift) to handle communication and data transfer between the client and server.
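
        To make that latency/throughput knob concrete, here is a minimal JDBC sketch. The driver class, URL, credentials, and query are placeholders, and the assumption is that the Hive driver maps the standard setFetchSize() hint onto the maxRows field of each Thrift fetch request:

        import java.sql.Connection;
        import java.sql.DriverManager;
        import java.sql.ResultSet;
        import java.sql.Statement;

        public class FetchSizeDemo {
          public static void main(String[] args) throws Exception {
            // Driver class and URL are illustrative; adjust for the deployment.
            Class.forName("org.apache.hive.jdbc.HiveDriver");
            Connection conn = DriverManager.getConnection(
                "jdbc:hive2://localhost:10000/default", "user", "");
            Statement stmt = conn.createStatement();
            // Larger batches favor throughput; smaller ones favor first-row latency.
            stmt.setFetchSize(1000);
            ResultSet rs = stmt.executeQuery("SELECT * FROM src"); // placeholder query
            while (rs.next()) {
              // Rows inside one fetched batch are served from client memory; the
              // next fetch request is issued only when this batch is exhausted.
              System.out.println(rs.getString(1));
            }
            rs.close();
            stmt.close();
            conn.close();
          }
        }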

        pprudich Phil Prudich added a comment -

        The client is able to specify the maximum number of rows that should be returned in each fetch request.

        Right – that's specifically the case I'm concerned with. If an application has requested a single row, and the client has requested n rows from the server in an effort to reduce round trips, then n-1 intervening values from the first column must be cached off somewhere before the first value for the second column can be accessed.

        Trying to address this by setting the row limit to 1 would also negatively impact performance by requiring a round trip per row.

        Without intimate knowledge of the server code, I could see row orientation becoming an optional argument to TFetchResultsReq, but I would encourage keeping the row-oriented behavior available as an option.
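
        For illustration only, that suggestion might look like the sketch below. The first three fields are paraphrased from memory of TCLIService.thrift; the fourth is purely hypothetical and does not exist today:

        struct TFetchResultsReq {
          1: required TOperationHandle operationHandle
          2: required TFetchOrientation orientation = TFetchOrientation.FETCH_NEXT
          3: required i64 maxRows
          // Proposed: let the client opt in to the column-oriented layout
          // while keeping row-oriented as the default.
          4: optional bool columnOriented = false
        }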

        cwsteinbach Carl Steinbach added a comment -

        However, any client trying to return rows one-at-a-time to an application would be required to read, process, and buffer almost an entire reply-worth of data before being able to return the first complete row.

        The client is able to specify the maximum number of rows that should be returned in each fetch request. This is true regardless of whether we use a column-major or row-major format for the row data. Please refer to the definition of TFetchResultsReq in TCLIService.thrift for more information.

        pprudich Phil Prudich added a comment -

        To make sure I'm reading the new thrift definitions correctly – does this mean that all rows' column 1 values will come first on the wire, and then be followed by all rows' values for column 2, and so on? I clearly see how this would save bytes on the wire.

        However, any client trying to return rows one-at-a-time to an application would be required to read, process, and buffer almost an entire reply-worth of data before being able to return the first complete row.

        I'm unfamiliar with the server code, but similar buffering may be needed there as well.

        Is my understanding of the issue correct?
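
        (As a concrete reading of the proposed layout: for three rows (1, 'a'), (2, 'b'), (3, 'c'), a column-oriented TRowSet would carry 1, 2, 3 first and then 'a', 'b', 'c', rather than interleaving values row by row.)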

        cwsteinbach Carl Steinbach added a comment -

        This would probably be more efficient:

        // Represents a rowset
        struct TRowSet {
          // The starting row offset of this rowset.
          1: required i64 startRowOffset
          2: required list<TColumn> columns
        }
        
        struct TColumn {
          1: required list<i32> nullOffsets
          2: required TColumnData columnData
        }
        
        union TColumnData {
          1: list<bool> boolColumn
          2: list<byte> byteColumn
          3: list<i16> i16Column
          4: list<i32> i32Column
          5: list<i64> i64Column
          6: list<double> doubleColumn
          7: list<string> stringColumn
        }
        

        We may be able to make this even more compact by using a run-length encoding scheme for the nullOffset vector (and possibly the ColumnData list too).
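
        A sketch of that run-length idea, assuming nulls arrive as a per-row boolean mask: the encoding stores alternating run lengths, with the first run counting non-null entries, so a mostly-non-null column compresses to a couple of integers.

        import java.util.ArrayList;
        import java.util.List;

        public class NullRle {
          // encode(new boolean[]{false, false, true, true, true, false}) -> [2, 3, 1]
          static List<Integer> encode(boolean[] nulls) {
            List<Integer> runs = new ArrayList<Integer>();
            boolean current = false;  // the first run counts non-null entries
            int runLength = 0;
            for (boolean isNull : nulls) {
              if (isNull == current) {
                runLength++;
              } else {
                // Value flipped: close the current run and start a new one.
                runs.add(runLength);
                current = isNull;
                runLength = 1;
              }
            }
            runs.add(runLength);
            return runs;
          }
        }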

        cwsteinbach Carl Steinbach added a comment -

        Currently HS2 uses the following Thrift structures to represent a resultset:

        // Represents a rowset
        struct TRowSet {
          // The starting row offset of this rowset.
          1: required i64 startRowOffset
          2: required list<TRow> rows
        }
        
        // Represents a row in a rowset.
        struct TRow {
          1: required list<TColumnValue> colVals
        }
        
        union TColumnValue {
          1: TBoolValue   boolVal      // BOOLEAN
          2: TByteValue   byteVal      // TINYINT
          3: TI16Value    i16Val       // SMALLINT
          4: TI32Value    i32Val       // INT
          5: TI64Value    i64Val       // BIGINT, TIMESTAMP
          6: TDoubleValue doubleVal    // FLOAT, DOUBLE
          7: TStringValue stringVal    // STRING, LIST, MAP, STRUCT, UNIONTYPE, BINARY
        }
        
        // A Boolean column value.
        struct TBoolValue {
          // NULL if value is unset.
          1: optional bool value
        }
        
        ...
        
        struct TStringValue {
          1: optional string value
        }
        

        The problem with this approach is that Thrift unions are not very efficient, and we pay that cost on a per-field basis. Instead, we should make the result set structure column-oriented, as follows:

        // Represents a rowset
        struct TRowSet {
          // The starting row offset of this rowset.
          1: required i64 startRowOffset
          2: required list<TColumn> columns
        }
        
        union TColumn {
          1: list<TBoolValue> boolColumn
          2: list<TByteValue> byteColumn
          3: list<TI16Value> i16Column
          4: list<TI32Value> i32Column
          5: list<TI64Value> i64Column
          6: list<TDoubleValue> doubleColumn
          7: list<TStringValue> stringColumn
        }
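
        One consequence worth spelling out (it is the crux of Phil Prudich's buffering concern above): with this layout a client rebuilds row i by taking element i from every column list, so an entire batch must be materialized before the first row is complete. A sketch, where sizeOf(), valueAt(), and handleRow() are hypothetical helpers:

        // 'columns' is the decoded list<TColumn> of one TRowSet batch.
        int rowCount = columns.isEmpty() ? 0 : sizeOf(columns.get(0));
        for (int row = 0; row < rowCount; row++) {
          List<Object> reassembled = new ArrayList<Object>();
          for (TColumn col : columns) {
            // valueAt() switches on whichever arm of the union is set and
            // indexes into that column's value list.
            reassembled.add(valueAt(col, row));
          }
          handleRow(reassembled); // only now is a complete row available
        }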
        

          People

          • Assignee: navis Navis
          • Reporter: cwsteinbach Carl Steinbach
          • Votes: 3
          • Watchers: 21
