20:58:35 /data/jenkins/workspace/impala-asf-master-core-data-load/repos/Impala/bin/load-data.py --workloads functional-query -e exhaustive --impalad localhost:21000 --hive_hs2_hostport localhost:11050 --hdfs_namenode localhost:20500
20:58:35 Starting data load for the following workloads: functional-query
20:58:35 Running with 8 threads
20:58:35 Executing Generate Schema Command: generate-schema-statements.py --exploration_strategy=exhaustive --workload=functional-query --scale_factor= --verbose --hive_warehouse_dir=/test-warehouse --hdfs_namenode=localhost:20500 --backend=localhost:21000
Target Dataset: functional
HDFS path: /test-warehouse/alltypes does not exists or is empty. Data will be loaded.
HDFS path: /test-warehouse/alltypesnopart does not exists or is empty. Data will be loaded.
Empty base table load for alltypesnopart. Skipping load generation
HDFS path: /test-warehouse/alltypessmall does not exists or is empty. Data will be loaded.
HDFS path: /test-warehouse/alltypestiny does not exists or is empty. Data will be loaded.
HDFS path: /test-warehouse/alltypesinsert does not exists or is empty. Data will be loaded.
Empty base table load for alltypesinsert. Skipping load generation
HDFS path: /test-warehouse/alltypesnopart_insert does not exists or is empty. Data will be loaded.
Empty base table load for alltypesnopart_insert. Skipping load generation
HDFS path: /test-warehouse/insert_overwrite_nopart does not exists or is empty. Data will be loaded.
Empty base table load for insert_overwrite_nopart. Skipping load generation
HDFS path: /test-warehouse/insert_overwrite_partitioned does not exists or is empty. Data will be loaded.
Empty base table load for insert_overwrite_partitioned. Skipping load generation
HDFS path: /test-warehouse/insert_string_partitioned does not exists or is empty. Data will be loaded.
Empty base table load for insert_string_partitioned. Skipping load generation
HDFS path: /test-warehouse/alltypeserror does not exists or is empty. Data will be loaded.
Skipping 'functional.hbasealltypeserror' due to include constraint match.
Skipping 'functional.hbasecolumnfamilies' due to include constraint match.
HDFS path: /test-warehouse/alltypeserrornonulls does not exists or is empty. Data will be loaded.
Skipping 'functional.hbasealltypeserrornonulls' due to include constraint match.
HDFS path: /test-warehouse/alltypesagg does not exists or is empty. Data will be loaded.
HDFS path: /test-warehouse/alltypesaggnonulls does not exists or is empty. Data will be loaded.
HDFS path: /test-warehouse/allcomplextypes does not exists or is empty. Data will be loaded.
Empty base table load for allcomplextypes. Skipping load generation
Skipping 'functional.complextypestbl' due to include constraint match.
HDFS path: /test-warehouse/complextypes_fileformat does not exists or is empty. Data will be loaded.
HDFS path: /test-warehouse/complextypes_multifileformat does not exists or is empty. Data will be loaded.
HDFS path: /test-warehouse/testtbl does not exists or is empty. Data will be loaded.
Empty base table load for testtbl. Skipping load generation
HDFS path: /test-warehouse/dimtbl does not exists or is empty. Data will be loaded.
HDFS path: /test-warehouse/jointbl does not exists or is empty. Data will be loaded.
HDFS path: /test-warehouse/liketbl does not exists or is empty. Data will be loaded.
Skipping 'functional.alltypessmallbinary' due to include constraint match.
Skipping 'functional.insertalltypesaggbinary' due to include constraint match.
Skipping 'functional.insertalltypesagg' due to include constraint match.
Skipping 'functional.stringids' due to include constraint match.
HDFS path: /test-warehouse/alltypes_view does not exists or is empty. Data will be loaded.
Empty base table load for alltypes_view. Skipping load generation
HDFS path: /test-warehouse/alltypes_hive_view does not exists or is empty. Data will be loaded.
Empty base table load for alltypes_hive_view. Skipping load generation
HDFS path: /test-warehouse/alltypes_view_sub does not exists or is empty. Data will be loaded.
Empty base table load for alltypes_view_sub. Skipping load generation
HDFS path: /test-warehouse/complex_view does not exists or is empty. Data will be loaded.
Empty base table load for complex_view. Skipping load generation
HDFS path: /test-warehouse/view_view does not exists or is empty. Data will be loaded.
Empty base table load for view_view. Skipping load generation
Skipping 'functional.subquery_view' due to include constraint match.
HDFS path: /test-warehouse/alltypes_parens does not exists or is empty. Data will be loaded.
Empty base table load for alltypes_parens. Skipping load generation
HDFS path: /test-warehouse/text_comma_backslash_newline does not exists or is empty. Data will be loaded.
HDFS path: /test-warehouse/text_dollar_hash_pipe does not exists or is empty. Data will be loaded.
HDFS path: /test-warehouse/text_thorn_ecirc_newline does not exists or is empty. Data will be loaded.
HDFS path: /test-warehouse/overflow does not exists or is empty. Data will be loaded.
HDFS path: /test-warehouse/widerow does not exists or is empty. Data will be loaded.
HDFS path: /test-warehouse/greptiny does not exists or is empty. Data will be loaded.
HDFS path: /test-warehouse/rankingssmall does not exists or is empty. Data will be loaded.
HDFS path: /test-warehouse/uservisitssmall does not exists or is empty. Data will be loaded.
HDFS path: /test-warehouse/emptytable does not exists or is empty. Data will be loaded.
Empty base table load for emptytable. Skipping load generation
HDFS path: /test-warehouse/alltypesaggmultifiles does not exists or is empty. Data will be loaded.
HDFS path: /test-warehouse/alltypesaggmultifilesnopart does not exists or is empty. Data will be loaded.
HDFS path: /test-warehouse/stringpartitionkey does not exists or is empty. Data will be loaded.
HDFS path: /test-warehouse/tinytable does not exists or is empty. Data will be loaded.
HDFS path: /test-warehouse/tinyinttable does not exists or is empty. Data will be loaded.
HDFS path: /test-warehouse/nulltable does not exists or is empty. Data will be loaded.
HDFS path: /test-warehouse/nullescapedtable does not exists or is empty. Data will be loaded.
HDFS path: /test-warehouse/nullformat_custom does not exists or is empty. Data will be loaded.
Empty base table load for nullformat_custom. Skipping load generation
HDFS path: /test-warehouse/TblWithRaggedColumns does not exists or is empty. Data will be loaded.
HDFS path: /test-warehouse/nullinsert does not exists or is empty. Data will be loaded.
Empty base table load for nullinsert. Skipping load generation
HDFS path: /test-warehouse/zipcode_incomes does not exists or is empty. Data will be loaded.
HDFS path: /test-warehouse/unsupported_types does not exists or is empty. Data will be loaded.
HDFS path: /test-warehouse/unsupported_partition_types does not exists or is empty. Data will be loaded.
Empty base table load for unsupported_partition_types. Skipping load generation
Skipping 'functional.old_rcfile_table' due to include constraint match.
Skipping 'functional.bad_text_lzo' due to include constraint match.
Skipping 'functional.bad_text_gzip' due to include constraint match.
Skipping 'functional.bad_seq_snap' due to include constraint match.
Skipping 'functional.bad_avro_snap_strings' due to include constraint match.
Skipping 'functional.bad_avro_snap_floats' due to include constraint match.
Skipping 'functional.bad_avro_decimal_schema' due to include constraint match.
Skipping 'functional.bad_parquet' due to include constraint match.
Skipping 'functional.bad_parquet_strings_negative_len' due to include constraint match.
Skipping 'functional.bad_parquet_strings_out_of_bounds' due to include constraint match.
Skipping 'functional.bad_magic_number' due to include constraint match.
Skipping 'functional.alltypesagg_hive_13_1' due to include constraint match.
Skipping 'functional.bad_metadata_len' due to include constraint match.
Skipping 'functional.bad_dict_page_offset' due to include constraint match.
Skipping 'functional.bad_compressed_size' due to include constraint match.
Skipping 'functional.kite_required_fields' due to include constraint match.
Skipping 'functional.bad_column_metadata' due to include constraint match.
HDFS path: /test-warehouse/bad_serde does not exists or is empty. Data will be loaded.
Empty base table load for bad_serde. Skipping load generation
Skipping 'functional.rcfile_lazy_binary_serde' due to include constraint match.
HDFS path: /test-warehouse/decimal_tbl does not exists or is empty. Data will be loaded.
HDFS path: /test-warehouse/decimal_tiny does not exists or is empty. Data will be loaded.
HDFS path: /test-warehouse/chars_tiny does not exists or is empty. Data will be loaded.
18/06/04 21:20:26 WARN hdfs.DataStreamer: Exception for BP-1407206351-127.0.0.1-1528170335185:blk_1073743620_2799
java.net.SocketTimeoutException: 75000 millis timeout while waiting for channel to be ready for read. ch : java.nio.channels.SocketChannel[connected local=/127.0.0.1:56483 remote=/127.0.0.1:31000]
	at org.apache.hadoop.net.SocketIOWithTimeout.doIO(SocketIOWithTimeout.java:164)
	at org.apache.hadoop.net.SocketInputStream.read(SocketInputStream.java:161)
	at org.apache.hadoop.net.SocketInputStream.read(SocketInputStream.java:131)
	at org.apache.hadoop.net.SocketInputStream.read(SocketInputStream.java:118)
	at java.io.FilterInputStream.read(FilterInputStream.java:83)
	at java.io.FilterInputStream.read(FilterInputStream.java:83)
	at org.apache.hadoop.hdfs.protocolPB.PBHelperClient.vintPrefixed(PBHelperClient.java:446)
	at org.apache.hadoop.hdfs.protocol.datatransfer.PipelineAck.readFields(PipelineAck.java:213)
	at org.apache.hadoop.hdfs.DataStreamer$ResponseProcessor.run(DataStreamer.java:1086)
18/06/04 21:20:29 WARN hdfs.DataStreamer: Error Recovery for BP-1407206351-127.0.0.1-1528170335185:blk_1073743620_2799 in pipeline [DatanodeInfoWithStorage[127.0.0.1:31000,DS-37cfc57c-ab39-443c-80c9-e440cb18b63d,DISK], DatanodeInfoWithStorage[127.0.0.1:31001,DS-2bc41558-4f2c-460f-ae87-5d1a6acbf42f,DISK], DatanodeInfoWithStorage[127.0.0.1:31002,DS-4ba4d3a0-af31-4eaf-b43d-89b408231481,DISK]]: datanode 0(DatanodeInfoWithStorage[127.0.0.1:31000,DS-37cfc57c-ab39-443c-80c9-e440cb18b63d,DISK]) is bad.
18/06/04 21:21:29 INFO hdfs.DataStreamer: Exception in createBlockOutputStream blk_1073743620_2799
java.io.IOException: Got error, status=ERROR, status message , ack with firstBadLink as 127.0.0.1:31002
	at org.apache.hadoop.hdfs.protocol.datatransfer.DataTransferProtoUtil.checkBlockOpStatus(DataTransferProtoUtil.java:134)
	at org.apache.hadoop.hdfs.protocol.datatransfer.DataTransferProtoUtil.checkBlockOpStatus(DataTransferProtoUtil.java:110)
	at org.apache.hadoop.hdfs.DataStreamer.createBlockOutputStream(DataStreamer.java:1778)
	at org.apache.hadoop.hdfs.DataStreamer.setupPipelineInternal(DataStreamer.java:1507)
	at org.apache.hadoop.hdfs.DataStreamer.setupPipelineForAppendOrRecovery(DataStreamer.java:1481)
	at org.apache.hadoop.hdfs.DataStreamer.processDatanodeOrExternalError(DataStreamer.java:1256)
	at org.apache.hadoop.hdfs.DataStreamer.run(DataStreamer.java:667)
18/06/04 21:21:29 WARN hdfs.DataStreamer: Error Recovery for BP-1407206351-127.0.0.1-1528170335185:blk_1073743620_2799 in pipeline [DatanodeInfoWithStorage[127.0.0.1:31001,DS-2bc41558-4f2c-460f-ae87-5d1a6acbf42f,DISK], DatanodeInfoWithStorage[127.0.0.1:31002,DS-4ba4d3a0-af31-4eaf-b43d-89b408231481,DISK]]: datanode 1(DatanodeInfoWithStorage[127.0.0.1:31002,DS-4ba4d3a0-af31-4eaf-b43d-89b408231481,DISK]) is bad.
18/06/04 21:21:29 WARN hdfs.DataStreamer: DataStreamer Exception
java.io.IOException: Failed to replace a bad datanode on the existing pipeline due to no more good datanodes being available to try. (Nodes: current=[DatanodeInfoWithStorage[127.0.0.1:31001,DS-2bc41558-4f2c-460f-ae87-5d1a6acbf42f,DISK]], original=[DatanodeInfoWithStorage[127.0.0.1:31001,DS-2bc41558-4f2c-460f-ae87-5d1a6acbf42f,DISK]]). The current failed datanode replacement policy is DEFAULT, and a client may configure this via 'dfs.client.block.write.replace-datanode-on-failure.policy' in its configuration.
	at org.apache.hadoop.hdfs.DataStreamer.findNewDatanode(DataStreamer.java:1304)
	at org.apache.hadoop.hdfs.DataStreamer.addDatanode2ExistingPipeline(DataStreamer.java:1372)
	at org.apache.hadoop.hdfs.DataStreamer.handleDatanodeReplacement(DataStreamer.java:1598)
	at org.apache.hadoop.hdfs.DataStreamer.setupPipelineInternal(DataStreamer.java:1499)
	at org.apache.hadoop.hdfs.DataStreamer.setupPipelineForAppendOrRecovery(DataStreamer.java:1481)
	at org.apache.hadoop.hdfs.DataStreamer.processDatanodeOrExternalError(DataStreamer.java:1256)
	at org.apache.hadoop.hdfs.DataStreamer.run(DataStreamer.java:667)
put: Failed to replace a bad datanode on the existing pipeline due to no more good datanodes being available to try. (Nodes: current=[DatanodeInfoWithStorage[127.0.0.1:31001,DS-2bc41558-4f2c-460f-ae87-5d1a6acbf42f,DISK]], original=[DatanodeInfoWithStorage[127.0.0.1:31001,DS-2bc41558-4f2c-460f-ae87-5d1a6acbf42f,DISK]]). The current failed datanode replacement policy is DEFAULT, and a client may configure this via 'dfs.client.block.write.replace-datanode-on-failure.policy' in its configuration.
18/06/04 21:24:25 INFO hdfs.DFSClient: Could not complete /test-warehouse/testescape_17_crlf/126._COPYING_ retrying...
Empty base table load for chars_tiny. Skipping load generation
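The DataStreamer failure above names `dfs.client.block.write.replace-datanode-on-failure.policy` as the relevant client setting: with the DEFAULT policy on a 3-datanode minicluster, losing one node mid-pipeline can leave no replacement candidate and fail the write outright. A commonly used workaround for small test clusters is to make replacement best-effort in `hdfs-site.xml`; a sketch with real property names from hdfs-default.xml (values here are illustrative, not the Impala minicluster's actual config):

```xml
<!-- Illustrative hdfs-site.xml fragment: keep writing on the surviving
     pipeline if a replacement datanode cannot be found, instead of
     failing with "Failed to replace a bad datanode". -->
<property>
  <name>dfs.client.block.write.replace-datanode-on-failure.best-effort</name>
  <value>true</value>
</property>
<!-- On very small clusters the policy itself can be relaxed to NEVER,
     at the cost of finishing blocks with reduced replication. -->
<property>
  <name>dfs.client.block.write.replace-datanode-on-failure.policy</name>
  <value>DEFAULT</value>
</property>
```

Note this only masks the symptom; the root cause here is the 75000 ms socket timeout to datanode 127.0.0.1:31000, which then cascades through pipeline recovery.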
HDFS path: /test-warehouse/widetable_250_cols does not exists or is empty. Data will be loaded.
HDFS path: /test-warehouse/widetable_500_cols does not exists or is empty. Data will be loaded.
HDFS path: /test-warehouse/widetable_1000_cols does not exists or is empty. Data will be loaded.
Skipping 'functional.avro_decimal_tbl' due to include constraint match.
Skipping 'functional.no_avro_schema' due to include constraint match.
HDFS path: /test-warehouse/table_no_newline does not exists or is empty. Data will be loaded.
Empty base table load for table_no_newline. Skipping load generation
HDFS path: /test-warehouse/table_no_newline_part does not exists or is empty. Data will be loaded.
Empty base table load for table_no_newline_part. Skipping load generation
HDFS path: /test-warehouse/testescape_16_lf does not exists or is empty. Data will be loaded.
Empty base table load for testescape_16_lf. Skipping load generation
HDFS path: /test-warehouse/testescape_16_crlf does not exists or is empty. Data will be loaded.
Empty base table load for testescape_16_crlf. Skipping load generation
HDFS path: /test-warehouse/testescape_17_lf does not exists or is empty. Data will be loaded.
Empty base table load for testescape_17_lf. Skipping load generation
Traceback (most recent call last):
  File "/data/jenkins/workspace/impala-asf-master-core-data-load/repos/Impala/testdata/bin/generate-schema-statements.py", line 836, in <module>
    test_vectors, sections, include_constraints, exclude_constraints, only_constraints)
  File "/data/jenkins/workspace/impala-asf-master-core-data-load/repos/Impala/testdata/bin/generate-schema-statements.py", line 595, in generate_statements
    load = eval_section(section['LOAD'])
  File "/data/jenkins/workspace/impala-asf-master-core-data-load/repos/Impala/testdata/bin/generate-schema-statements.py", line 533, in eval_section
    assert p.returncode == 0
AssertionError
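The traceback ends in a bare `AssertionError` because `eval_section` in generate-schema-statements.py checks the subprocess with `assert p.returncode == 0`, which discards the failing command and its stderr (the HDFS `put:` error above had to be correlated by hand). A hypothetical sketch of a more informative check — not the actual Impala code, just an illustration of surfacing the exit status and stderr:

```python
import subprocess

def eval_section(cmd):
    """Run a LOAD-section shell command, failing with context on error.

    Hypothetical helper: unlike a bare `assert p.returncode == 0`, this
    raises with the command, exit code, and stderr so the log shows why
    generation failed.
    """
    p = subprocess.run(cmd, shell=True, capture_output=True, text=True)
    if p.returncode != 0:
        raise RuntimeError(
            "LOAD section command failed (exit %d): %s\nstderr: %s"
            % (p.returncode, cmd, p.stderr.strip()))
    return p.stdout
```

With this shape, the 21:35:26 failure line would carry the offending `hdfs dfs -put` invocation and its error instead of an anonymous assertion.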
21:35:26 Error generating schema statements for workload: functional-query
