Details
- Type: Task
- Status: Resolved
- Priority: Major
- Resolution: Fixed
Description
The following issues were found in the data / log checksum matching in System Test:
1. kafka_system_test_utils.validate_simple_consumer_data_matched
It reports PASSED even when some log segments don't match.
2. kafka_system_test_utils.validate_data_matched (this has been fixed and patched in the local Hudson for some time)
It reports PASSED in the Ack=1 cases even when data loss is greater than the tolerance (1%).
3. kafka_system_test_utils.validate_simple_consumer_data_matched
It currently builds a unique (deduplicated) set of MessageIDs to validate. It should leave all MessageIDs as is (no dedup needed), and the test case should fail if the sorted MessageIDs don't match across the replicas (see the sketch after the list below).
4. The Ack=1 test cases use a data loss tolerance of 1%. Currently 1% is too strict, and some random failures are seen due to 2 ~ 3% data loss. The tolerance will be increased to 5% so that the System Test gets a more consistent passing rate in those test cases (see the tolerance sketch after the list). The following functions in kafka_system_test_utils will be updated to the 5% tolerance:
validate_data_matched
validate_simple_consumer_data_matched
validate_data_matched_in_multi_topics_from_single_consumer_producer
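A minimal sketch of the intended check in item 3, assuming the MessageIDs fetched from each replica are collected into plain lists keyed by replica id (the function and variable names are illustrative, not the actual kafka_system_test_utils code):

def replica_message_ids_match(msg_ids_per_replica):
    # msg_ids_per_replica: dict of replica id -> list of MessageIDs,
    # with duplicates kept (no dedup).
    sorted_lists = [sorted(ids) for ids in msg_ids_per_replica.values()]
    # Every replica must yield exactly the same sorted list; any missing,
    # extra, or duplicated MessageID makes the comparison (and the test
    # case) fail.
    return all(lst == sorted_lists[0] for lst in sorted_lists[1:])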
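Similarly, a sketch of the Ack=1 tolerance check described in items 2 and 4, again with illustrative names; only the 5% threshold itself comes from this ticket:

DATA_LOSS_TOLERANCE = 0.05  # raised from 1% to 5% per this ticket

def data_loss_within_tolerance(num_msgs_sent, num_msgs_received,
                               tolerance=DATA_LOSS_TOLERANCE):
    # PASSED only when the fraction of lost messages is within the
    # tolerance; a loss above the tolerance must be reported as FAILED.
    if num_msgs_sent == 0:
        return True
    lost = num_msgs_sent - num_msgs_received
    return float(lost) / num_msgs_sent <= tolerance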