IMPALA-6040: test_multi_compression_types uses hive in incompatible environments

      Description

      Seen so far on Isilon and the local filesystem:

      =================================== FAILURES ===================================
       TestParquet.test_multi_compression_types[exec_option: {'batch_size': 0, 'num_nodes': 0, 'disable_codegen_rows_threshold': 0, 'disable_codegen': False, 'abort_on_error': 1, 'exec_single_node_rows_threshold': 0} | table_format: parquet/none] 
      [gw3] linux2 -- Python 2.6.6 /data/jenkins/workspace/impala-umbrella-build-and-test/repos/Impala/bin/../infra/python/env/bin/python
      query_test/test_scanners.py:345: in test_multi_compression_types
          check_call(['hive', '-e', hql_format.format(codec="snappy", year=2010, month=1)])
      /usr/lib64/python2.6/subprocess.py:505: in check_call
          raise CalledProcessError(retcode, cmd)
      E   CalledProcessError: Command '['hive', '-e', 'set parquet.compression=snappy;insert into table test_multi_compression_types_cc30cc12.alltypes_multi_compression  partition (year = 2010, month = 1)  select id, bool_col, tinyint_col, smallint_col, int_col, bigint_col,    float_col, double_col,date_string_col,string_col,timestamp_col  from functional_parquet.alltypes  where year = 2010 and month = 1']' returned non-zero exit status 10
      ---------------------------- Captured stderr setup -----------------------------
      SET sync_ddl=False;
      -- executing against localhost:21000
      DROP DATABASE IF EXISTS `test_multi_compression_types_cc30cc12` CASCADE;
      
      SET sync_ddl=False;
      -- executing against localhost:21000
      CREATE DATABASE `test_multi_compression_types_cc30cc12`;
      
      MainThread: Created database "test_multi_compression_types_cc30cc12" for test ID "query_test/test_scanners.py::TestParquet::()::test_multi_compression_types[exec_option: {'batch_size': 0, 'num_nodes': 0, 'disable_codegen_rows_threshold': 0, 'disable_codegen': False, 'abort_on_error': 1, 'exec_single_node_rows_threshold': 0} | table_format: parquet/none]"
      ----------------------------- Captured stderr call -----------------------------
      -- executing against localhost:21000
      create table test_multi_compression_types_cc30cc12.alltypes_multi_compression like functional_parquet.alltypes;
      
      SLF4J: Class path contains multiple SLF4J bindings.
      SLF4J: Found binding in [jar:file:/data/jenkins/workspace/impala-umbrella-build-and-test/Impala-Toolchain/cdh_components/hbase-1.2.0-cdh5.14.0-SNAPSHOT/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
      SLF4J: Found binding in [jar:file:/data/jenkins/workspace/impala-umbrella-build-and-test/Impala-Toolchain/cdh_components/hadoop-2.6.0-cdh5.14.0-SNAPSHOT/share/hadoop/common/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
      SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
      SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
      17/10/11 03:21:43 WARN conf.HiveConf: HiveConf of name hive.access.conf.url does not exist
      
      Logging initialized using configuration in file:/data/jenkins/workspace/impala-umbrella-build-and-test/repos/Impala/fe/src/test/resources/hive-log4j.properties
      Unable to acquire IMPLICIT, SHARED lock functional_parquet after 100 attempts.
      Error in acquireLocks...
      FAILED: Error in acquiring locks: Locks on the underlying objects cannot be acquired. retry after some time
      

      The fix is to skip this test on these filesystems, and on any others where invoking Hive from the test is not supported. A sketch of such a skip follows.
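      A minimal sketch of one way to implement the skip: gate the test on the target filesystem before it shells out to Hive. This is illustrative only; the TARGET_FILESYSTEM environment variable and the marker name below are assumptions for this sketch, not the exact Impala test-framework hooks.

      # Sketch only: skip tests that shell out to the hive CLI on filesystems
      # where that is known to fail. TARGET_FILESYSTEM and the marker name are
      # assumptions, not necessarily the real Impala test-framework API.
      import os
      import pytest

      # Per this report, Isilon and the local filesystem cannot run the Hive step.
      HIVE_INCOMPATIBLE_FILESYSTEMS = {'isilon', 'local'}

      skip_if_hive_incompatible = pytest.mark.skipif(
          os.environ.get('TARGET_FILESYSTEM', 'hdfs') in HIVE_INCOMPATIBLE_FILESYSTEMS,
          reason="test invokes the hive CLI, which is unsupported on this filesystem")

      class TestParquet(object):
        @skip_if_hive_incompatible
        def test_multi_compression_types(self):
          # The real test also takes the vector and unique_database fixtures and
          # runs the INSERT through check_call(['hive', '-e', ...]), which is the
          # call that raised CalledProcessError in the log above.
          pass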


              People

              • Assignee: mikesbrown (Michael Brown)
              • Reporter: mikesbrown (Michael Brown)
