IMPALA-4887: Broken local filesystem TestHdfsParquetTableStatsWriter


Details

    • Type: Bug
    • Status: Resolved
    • Priority: Blocker
    • Resolution: Fixed
    • Affects Version/s: Impala 2.9.0
    • Fix Version/s: Impala 2.9.0
    • Component/s: Backend

    Description

      I believe https://git-wip-us.apache.org/repos/asf?p=incubator-impala.git;a=commit;h=6251d8b4ddac3bdd6fb651f000aea15b7a0d1603 to be the likely culprit.

      =================================== FAILURES ===================================
       TestHdfsParquetTableStatsWriter.test_write_statistics_alltypes[exec_option: {'disable_codegen': False, 'abort_on_error': 1, 'exec_single_node_rows_threshold': 0, 'batch_size': 0, 'num_nodes': 0} | table_format: parquet/none] 
      [gw3] linux2 -- Python 2.6.6 /data/jenkins/workspace/impala-umbrella-build-and-test/repos/Impala/bin/../infra/python/env/bin/python
      query_test/test_insert_parquet.py:364: in test_write_statistics_alltypes
          expected_min_max_values, hive_skip_col_idx)
      query_test/test_insert_parquet.py:324: in _ctas_table_and_verify_stats
          "{1}".format(qualified_table_name, source_table))
      common/impala_test_suite.py:574: in run_stmt_in_hive
          raise RuntimeError(stderr)
      E   RuntimeError: SLF4J: Class path contains multiple SLF4J bindings.
      E   SLF4J: Found binding in [jar:file:/data/jenkins/workspace/impala-umbrella-build-and-test/Impala-Toolchain/cdh_components/hbase-1.2.0-cdh5.11.0-SNAPSHOT/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
      E   SLF4J: Found binding in [jar:file:/data/jenkins/workspace/impala-umbrella-build-and-test/Impala-Toolchain/cdh_components/hadoop-2.6.0-cdh5.11.0-SNAPSHOT/share/hadoop/common/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
      E   SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
      E   SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
      E   scan complete in 3ms
      E   Connecting to jdbc:hive2://localhost:11050
      E   Could not open connection to the HS2 server. Please check the server URI and if the URI is correct, then ask the administrator to check the server status.
      E   Error: Could not open client transport with JDBC Uri: jdbc:hive2://localhost:11050: java.net.ConnectException: Connection refused (state=08S01,code=0)
      E   No current connection
      ---------------------------- Captured stderr setup -----------------------------
      -- connecting to: localhost:21000
      SET sync_ddl=False;
      -- executing against localhost:21000
      DROP DATABASE IF EXISTS `test_write_statistics_alltypes_1df32760` CASCADE;
      
      SET sync_ddl=False;
      -- executing against localhost:21000
      CREATE DATABASE `test_write_statistics_alltypes_1df32760`;
      
      MainThread: Created database "test_write_statistics_alltypes_1df32760" for test ID "query_test/test_insert_parquet.py::TestHdfsParquetTableStatsWriter::()::test_write_statistics_alltypes[exec_option: {'disable_codegen': False, 'abort_on_error': 1, 'exec_single_node_rows_threshold': 0, 'batch_size': 0, 'num_nodes': 0} | table_format: parquet/none]"
      ----------------------------- Captured stderr call -----------------------------
      -- executing against localhost:21000
      drop table if exists test_write_statistics_alltypes_1df32760.test_hdfs_parquet_table_writer;
      
       TestHdfsParquetTableStatsWriter.test_write_statistics_decimal[exec_option: {'disable_codegen': False, 'abort_on_error': 1, 'exec_single_node_rows_threshold': 0, 'batch_size': 0, 'num_nodes': 0} | table_format: parquet/none] 
      [gw3] linux2 -- Python 2.6.6 /data/jenkins/workspace/impala-umbrella-build-and-test/repos/Impala/bin/../infra/python/env/bin/python
      query_test/test_insert_parquet.py:375: in test_write_statistics_decimal
          expected_min_max_values, hive_skip_col_idx)
      query_test/test_insert_parquet.py:324: in _ctas_table_and_verify_stats
          "{1}".format(qualified_table_name, source_table))
      common/impala_test_suite.py:574: in run_stmt_in_hive
          raise RuntimeError(stderr)
      E   RuntimeError: SLF4J: Class path contains multiple SLF4J bindings.
      E   SLF4J: Found binding in [jar:file:/data/jenkins/workspace/impala-umbrella-build-and-test/Impala-Toolchain/cdh_components/hbase-1.2.0-cdh5.11.0-SNAPSHOT/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
      E   SLF4J: Found binding in [jar:file:/data/jenkins/workspace/impala-umbrella-build-and-test/Impala-Toolchain/cdh_components/hadoop-2.6.0-cdh5.11.0-SNAPSHOT/share/hadoop/common/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
      E   SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
      E   SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
      E   scan complete in 3ms
      E   Connecting to jdbc:hive2://localhost:11050
      E   Could not open connection to the HS2 server. Please check the server URI and if the URI is correct, then ask the administrator to check the server status.
      E   Error: Could not open client transport with JDBC Uri: jdbc:hive2://localhost:11050: java.net.ConnectException: Connection refused (state=08S01,code=0)
      E   No current connection
      ---------------------------- Captured stderr setup -----------------------------
      SET sync_ddl=False;
      -- executing against localhost:21000
      DROP DATABASE IF EXISTS `test_write_statistics_decimal_8e4dc3c` CASCADE;
      
      SET sync_ddl=False;
      -- executing against localhost:21000
      CREATE DATABASE `test_write_statistics_decimal_8e4dc3c`;
      
      MainThread: Created database "test_write_statistics_decimal_8e4dc3c" for test ID "query_test/test_insert_parquet.py::TestHdfsParquetTableStatsWriter::()::test_write_statistics_decimal[exec_option: {'disable_codegen': False, 'abort_on_error': 1, 'exec_single_node_rows_threshold': 0, 'batch_size': 0, 'num_nodes': 0} | table_format: parquet/none]"
      ----------------------------- Captured stderr call -----------------------------
      -- executing against localhost:21000
      drop table if exists test_write_statistics_decimal_8e4dc3c.test_hdfs_parquet_table_writer;
      
       TestHdfsParquetTableStatsWriter.test_write_statistics_multi_page[exec_option: {'disable_codegen': False, 'abort_on_error': 1, 'exec_single_node_rows_threshold': 0, 'batch_size': 0, 'num_nodes': 0} | table_format: parquet/none] 
      [gw3] linux2 -- Python 2.6.6 /data/jenkins/workspace/impala-umbrella-build-and-test/repos/Impala/bin/../infra/python/env/bin/python
      query_test/test_insert_parquet.py:397: in test_write_statistics_multi_page
          expected_min_max_values, hive_skip_col_idx)
      query_test/test_insert_parquet.py:324: in _ctas_table_and_verify_stats
          "{1}".format(qualified_table_name, source_table))
      common/impala_test_suite.py:574: in run_stmt_in_hive
          raise RuntimeError(stderr)
      E   RuntimeError: SLF4J: Class path contains multiple SLF4J bindings.
      E   SLF4J: Found binding in [jar:file:/data/jenkins/workspace/impala-umbrella-build-and-test/Impala-Toolchain/cdh_components/hbase-1.2.0-cdh5.11.0-SNAPSHOT/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
      E   SLF4J: Found binding in [jar:file:/data/jenkins/workspace/impala-umbrella-build-and-test/Impala-Toolchain/cdh_components/hadoop-2.6.0-cdh5.11.0-SNAPSHOT/share/hadoop/common/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
      E   SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
      E   SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
      E   scan complete in 3ms
      E   Connecting to jdbc:hive2://localhost:11050
      E   Could not open connection to the HS2 server. Please check the server URI and if the URI is correct, then ask the administrator to check the server status.
      E   Error: Could not open client transport with JDBC Uri: jdbc:hive2://localhost:11050: java.net.ConnectException: Connection refused (state=08S01,code=0)
      E   No current connection
      ---------------------------- Captured stderr setup -----------------------------
      SET sync_ddl=False;
      -- executing against localhost:21000
      DROP DATABASE IF EXISTS `test_write_statistics_multi_page_b1cc43c7` CASCADE;
      
      SET sync_ddl=False;
      -- executing against localhost:21000
      CREATE DATABASE `test_write_statistics_multi_page_b1cc43c7`;
      
      MainThread: Created database "test_write_statistics_multi_page_b1cc43c7" for test ID "query_test/test_insert_parquet.py::TestHdfsParquetTableStatsWriter::()::test_write_statistics_multi_page[exec_option: {'disable_codegen': False, 'abort_on_error': 1, 'exec_single_node_rows_threshold': 0, 'batch_size': 0, 'num_nodes': 0} | table_format: parquet/none]"
      ----------------------------- Captured stderr call -----------------------------
      -- executing against localhost:21000
      drop table if exists test_write_statistics_multi_page_b1cc43c7.test_hdfs_parquet_table_writer;
      
       TestHdfsParquetTableStatsWriter.test_write_statistics_null[exec_option: {'disable_codegen': False, 'abort_on_error': 1, 'exec_single_node_rows_threshold': 0, 'batch_size': 0, 'num_nodes': 0} | table_format: parquet/none] 
      [gw3] linux2 -- Python 2.6.6 /data/jenkins/workspace/impala-umbrella-build-and-test/repos/Impala/bin/../infra/python/env/bin/python
      query_test/test_insert_parquet.py:407: in test_write_statistics_null
          expected_min_max_values, hive_skip_col_idx)
      query_test/test_insert_parquet.py:324: in _ctas_table_and_verify_stats
          "{1}".format(qualified_table_name, source_table))
      common/impala_test_suite.py:574: in run_stmt_in_hive
          raise RuntimeError(stderr)
      E   RuntimeError: SLF4J: Class path contains multiple SLF4J bindings.
      E   SLF4J: Found binding in [jar:file:/data/jenkins/workspace/impala-umbrella-build-and-test/Impala-Toolchain/cdh_components/hbase-1.2.0-cdh5.11.0-SNAPSHOT/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
      E   SLF4J: Found binding in [jar:file:/data/jenkins/workspace/impala-umbrella-build-and-test/Impala-Toolchain/cdh_components/hadoop-2.6.0-cdh5.11.0-SNAPSHOT/share/hadoop/common/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
      E   SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
      E   SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
      E   scan complete in 3ms
      E   Connecting to jdbc:hive2://localhost:11050
      E   Could not open connection to the HS2 server. Please check the server URI and if the URI is correct, then ask the administrator to check the server status.
      E   Error: Could not open client transport with JDBC Uri: jdbc:hive2://localhost:11050: java.net.ConnectException: Connection refused (state=08S01,code=0)
      E   No current connection
      ---------------------------- Captured stderr setup -----------------------------
      SET sync_ddl=False;
      -- executing against localhost:21000
      DROP DATABASE IF EXISTS `test_write_statistics_null_1eb44261` CASCADE;
      
      SET sync_ddl=False;
      -- executing against localhost:21000
      CREATE DATABASE `test_write_statistics_null_1eb44261`;
      
      MainThread: Created database "test_write_statistics_null_1eb44261" for test ID "query_test/test_insert_parquet.py::TestHdfsParquetTableStatsWriter::()::test_write_statistics_null[exec_option: {'disable_codegen': False, 'abort_on_error': 1, 'exec_single_node_rows_threshold': 0, 'batch_size': 0, 'num_nodes': 0} | table_format: parquet/none]"
      ----------------------------- Captured stderr call -----------------------------
      -- executing against localhost:21000
      drop table if exists test_write_statistics_null_1eb44261.test_hdfs_parquet_table_writer;
      
       TestHdfsParquetTableStatsWriter.test_write_statistics_char_types[exec_option: {'disable_codegen': False, 'abort_on_error': 1, 'exec_single_node_rows_threshold': 0, 'batch_size': 0, 'num_nodes': 0} | table_format: parquet/none] 
      [gw3] linux2 -- Python 2.6.6 /data/jenkins/workspace/impala-umbrella-build-and-test/repos/Impala/bin/../infra/python/env/bin/python
      query_test/test_insert_parquet.py:417: in test_write_statistics_char_types
          expected_min_max_values, hive_skip_col_idx)
      query_test/test_insert_parquet.py:324: in _ctas_table_and_verify_stats
          "{1}".format(qualified_table_name, source_table))
      common/impala_test_suite.py:574: in run_stmt_in_hive
          raise RuntimeError(stderr)
      E   RuntimeError: SLF4J: Class path contains multiple SLF4J bindings.
      E   SLF4J: Found binding in [jar:file:/data/jenkins/workspace/impala-umbrella-build-and-test/Impala-Toolchain/cdh_components/hbase-1.2.0-cdh5.11.0-SNAPSHOT/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
      E   SLF4J: Found binding in [jar:file:/data/jenkins/workspace/impala-umbrella-build-and-test/Impala-Toolchain/cdh_components/hadoop-2.6.0-cdh5.11.0-SNAPSHOT/share/hadoop/common/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
      E   SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
      E   SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
      E   scan complete in 3ms
      E   Connecting to jdbc:hive2://localhost:11050
      E   Could not open connection to the HS2 server. Please check the server URI and if the URI is correct, then ask the administrator to check the server status.
      E   Error: Could not open client transport with JDBC Uri: jdbc:hive2://localhost:11050: java.net.ConnectException: Connection refused (state=08S01,code=0)
      E   No current connection
      ---------------------------- Captured stderr setup -----------------------------
      SET sync_ddl=False;
      -- executing against localhost:21000
      DROP DATABASE IF EXISTS `test_write_statistics_char_types_e85462d7` CASCADE;
      
      SET sync_ddl=False;
      -- executing against localhost:21000
      CREATE DATABASE `test_write_statistics_char_types_e85462d7`;
      
      MainThread: Created database "test_write_statistics_char_types_e85462d7" for test ID "query_test/test_insert_parquet.py::TestHdfsParquetTableStatsWriter::()::test_write_statistics_char_types[exec_option: {'disable_codegen': False, 'abort_on_error': 1, 'exec_single_node_rows_threshold': 0, 'batch_size': 0, 'num_nodes': 0} | table_format: parquet/none]"
      ----------------------------- Captured stderr call -----------------------------
      -- executing against localhost:21000
      drop table if exists test_write_statistics_char_types_e85462d7.test_hdfs_parquet_table_writer;
      
       TestHdfsParquetTableStatsWriter.test_write_statistics_negative[exec_option: {'disable_codegen': False, 'abort_on_error': 1, 'exec_single_node_rows_threshold': 0, 'batch_size': 0, 'num_nodes': 0} | table_format: parquet/none] 
      [gw3] linux2 -- Python 2.6.6 /data/jenkins/workspace/impala-umbrella-build-and-test/repos/Impala/bin/../infra/python/env/bin/python
      query_test/test_insert_parquet.py:444: in test_write_statistics_negative
          expected_min_max_values)
      query_test/test_insert_parquet.py:324: in _ctas_table_and_verify_stats
          "{1}".format(qualified_table_name, source_table))
      common/impala_test_suite.py:574: in run_stmt_in_hive
          raise RuntimeError(stderr)
      E   RuntimeError: SLF4J: Class path contains multiple SLF4J bindings.
      E   SLF4J: Found binding in [jar:file:/data/jenkins/workspace/impala-umbrella-build-and-test/Impala-Toolchain/cdh_components/hbase-1.2.0-cdh5.11.0-SNAPSHOT/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
      E   SLF4J: Found binding in [jar:file:/data/jenkins/workspace/impala-umbrella-build-and-test/Impala-Toolchain/cdh_components/hadoop-2.6.0-cdh5.11.0-SNAPSHOT/share/hadoop/common/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
      E   SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
      E   SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
      E   scan complete in 3ms
      E   Connecting to jdbc:hive2://localhost:11050
      E   Could not open connection to the HS2 server. Please check the server URI and if the URI is correct, then ask the administrator to check the server status.
      E   Error: Could not open client transport with JDBC Uri: jdbc:hive2://localhost:11050: java.net.ConnectException: Connection refused (state=08S01,code=0)
      E   No current connection
      ---------------------------- Captured stderr setup -----------------------------
      SET sync_ddl=False;
      -- executing against localhost:21000
      DROP DATABASE IF EXISTS `test_write_statistics_negative_fe85ecda` CASCADE;
      
      SET sync_ddl=False;
      -- executing against localhost:21000
      CREATE DATABASE `test_write_statistics_negative_fe85ecda`;
      
      MainThread: Created database "test_write_statistics_negative_fe85ecda" for test ID "query_test/test_insert_parquet.py::TestHdfsParquetTableStatsWriter::()::test_write_statistics_negative[exec_option: {'disable_codegen': False, 'abort_on_error': 1, 'exec_single_node_rows_threshold': 0, 'batch_size': 0, 'num_nodes': 0} | table_format: parquet/none]"
      ----------------------------- Captured stderr call -----------------------------
      -- executing against localhost:21000
      create view test_write_statistics_negative_fe85ecda.test_negative_view as select
              id * cast(pow(-1, id % 2) as int) as id,
              int_col * cast(pow(-1, id % 2) as int) as int_col,
              bigint_col * cast(pow(-1, id % 2) as bigint) as bigint_col,
              float_col * pow(-1, id % 2) as float_col,
              double_col * pow(-1, id % 2) as double_col
              from functional.alltypes;
      
      -- executing against localhost:21000
      drop table if exists test_write_statistics_negative_fe85ecda.test_hdfs_parquet_table_writer;
      
       TestHdfsParquetTableStatsWriter.test_write_statistics_float_infinity[exec_option: {'disable_codegen': False, 'abort_on_error': 1, 'exec_single_node_rows_threshold': 0, 'batch_size': 0, 'num_nodes': 0} | table_format: parquet/none] 
      [gw3] linux2 -- Python 2.6.6 /data/jenkins/workspace/impala-umbrella-build-and-test/repos/Impala/bin/../infra/python/env/bin/python
      query_test/test_insert_parquet.py:500: in test_write_statistics_float_infinity
          expected_min_max_values)
      query_test/test_insert_parquet.py:324: in _ctas_table_and_verify_stats
          "{1}".format(qualified_table_name, source_table))
      common/impala_test_suite.py:574: in run_stmt_in_hive
          raise RuntimeError(stderr)
      E   RuntimeError: SLF4J: Class path contains multiple SLF4J bindings.
      E   SLF4J: Found binding in [jar:file:/data/jenkins/workspace/impala-umbrella-build-and-test/Impala-Toolchain/cdh_components/hbase-1.2.0-cdh5.11.0-SNAPSHOT/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
      E   SLF4J: Found binding in [jar:file:/data/jenkins/workspace/impala-umbrella-build-and-test/Impala-Toolchain/cdh_components/hadoop-2.6.0-cdh5.11.0-SNAPSHOT/share/hadoop/common/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
      E   SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
      E   SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
      E   scan complete in 4ms
      E   Connecting to jdbc:hive2://localhost:11050
      E   Could not open connection to the HS2 server. Please check the server URI and if the URI is correct, then ask the administrator to check the server status.
      E   Error: Could not open client transport with JDBC Uri: jdbc:hive2://localhost:11050: java.net.ConnectException: Connection refused (state=08S01,code=0)
      E   No current connection
      ---------------------------- Captured stderr setup -----------------------------
      SET sync_ddl=False;
      -- executing against localhost:21000
      DROP DATABASE IF EXISTS `test_write_statistics_float_infinity_a39d9014` CASCADE;
      
      SET sync_ddl=False;
      -- executing against localhost:21000
      CREATE DATABASE `test_write_statistics_float_infinity_a39d9014`;
      
      MainThread: Created database "test_write_statistics_float_infinity_a39d9014" for test ID "query_test/test_insert_parquet.py::TestHdfsParquetTableStatsWriter::()::test_write_statistics_float_infinity[exec_option: {'disable_codegen': False, 'abort_on_error': 1, 'exec_single_node_rows_threshold': 0, 'batch_size': 0, 'num_nodes': 0} | table_format: parquet/none]"
      ----------------------------- Captured stderr call -----------------------------
      -- executing against localhost:21000
      create table test_write_statistics_float_infinity_a39d9014.test_float_infinity (f float, d double);;
      
      -- executing against localhost:21000
      insert into test_write_statistics_float_infinity_a39d9014.test_float_infinity values
              (cast('-inf' as float), cast('-inf' as double)),
              (cast('inf' as float), cast('inf' as double));
      
      -- executing against localhost:21000
      drop table if exists test_write_statistics_float_infinity_a39d9014.test_hdfs_parquet_table_writer;
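
      All of the failures above come from run_stmt_in_hive(), which connects to HiveServer2 at jdbc:hive2://localhost:11050; on a local-filesystem minicluster no HS2 is running, so the connection is refused. Below is a minimal, illustrative sketch (not the actual fix) of how such Hive-dependent tests can be guarded with a pytest skip marker; the TARGET_FILESYSTEM environment variable and the marker name are assumptions, not verbatim from the Impala test suite.

      import os
      import pytest

      # Assumed convention: the test environment exports TARGET_FILESYSTEM=local
      # when the minicluster is backed by the local filesystem instead of HDFS.
      IS_LOCAL_FS = os.environ.get("TARGET_FILESYSTEM") == "local"

      # Skip Hive-dependent tests on local-filesystem builds, where HiveServer2 on
      # localhost:11050 is not started and run_stmt_in_hive() fails with
      # "Connection refused".
      skip_if_local_fs = pytest.mark.skipif(
          IS_LOCAL_FS, reason="Hive is not running on local-filesystem builds")

      class TestHdfsParquetTableStatsWriterSketch(object):
          @skip_if_local_fs
          def test_write_statistics_alltypes(self):
              # CTAS through Hive and verify the written Parquet min/max stats.
              pass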
      

            People

              Assignee: Lars Volker (lv)
              Reporter: Jim Apple (jbapple)
