Details
Description
We are running Kafka Connect in distributed mode on 3 nodes, using the Debezium (MongoDB) and Confluent S3 connectors. When adding a new connector via the REST API, the connector is created in the RUNNING state, but no tasks are created for it.
Pausing and resuming the connector does not help. When we stop all workers and then start them again, the tasks are created and everything runs as it should.
The issue does not show up if we run only a single node.
The issue is not caused by the connector plugins, because we see the same behaviour for both the Debezium and S3 connectors. Also, in the debug logs we can see that Debezium correctly returns a task configuration from the Connector.taskConfigs() method.
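The symptom is visible in the response of the Connect REST API's GET /connectors/&lt;name&gt;/status endpoint: the connector reports RUNNING while the tasks array stays empty. As a hypothetical illustration (the helper function and sample body below are ours, not taken from the report):

```python
import json

def task_count(status_body: str) -> int:
    # Count entries in the "tasks" array of a Connect /status response.
    return len(json.loads(status_body).get("tasks", []))

# Sample /status body shaped like what the bug produces:
# connector state is RUNNING, but no tasks were created.
sample = '''{
  "name": "qa-mongodb-comp-converter-task|1",
  "connector": {"state": "RUNNING", "worker_id": "node-1:10083"},
  "tasks": []
}'''
print(task_count(sample))  # 0
```

A healthy connector would return one entry per task in the "tasks" array, each with its own "state" field.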
Connector configuration examples
Debezium:
{
  "name": "qa-mongodb-comp-converter-task|1",
  "config": {
    "connector.class": "io.debezium.connector.mongodb.MongoDbConnector",
    "mongodb.hosts": "mongodb-qa-001:27017,mongodb-qa-002:27017,mongodb-qa-003:27017",
    "mongodb.name": "qa-debezium-comp",
    "mongodb.ssl.enabled": true,
    "collection.whitelist": "converter[.]task",
    "tombstones.on.delete": true
  }
}
S3 Connector:
{
  "name": "qa-s3-sink-task|1",
  "config": {
    "connector.class": "io.confluent.connect.s3.S3SinkConnector",
    "topics": "qa-debezium-comp.converter.task",
    "topics.dir": "data/env/qa",
    "s3.region": "eu-west-1",
    "s3.bucket.name": "<bucket-name>",
    "flush.size": "15000",
    "rotate.interval.ms": "3600000",
    "storage.class": "io.confluent.connect.s3.storage.S3Storage",
    "format.class": "custom.kafka.connect.s3.format.plaintext.PlaintextFormat",
    "schema.generator.class": "io.confluent.connect.storage.hive.schema.DefaultSchemaGenerator",
    "partitioner.class": "io.confluent.connect.storage.partitioner.DefaultPartitioner",
    "schema.compatibility": "NONE",
    "key.converter": "org.apache.kafka.connect.json.JsonConverter",
    "value.converter": "org.apache.kafka.connect.json.JsonConverter",
    "key.converter.schemas.enable": false,
    "value.converter.schemas.enable": false,
    "transforms": "ExtractDocument",
    "transforms.ExtractDocument.type": "custom.kafka.connect.transforms.ExtractDocument$Value"
  }
}
The connectors are created using curl: curl -X POST -H "Content-Type: application/json" --data @<json_file> http://<connect_host>:10083/connectors
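Note that both connector names contain a | character. When such a connector is addressed by name in a REST URL path (e.g. for a status check or restart), the name must be percent-encoded; the linked KAFKA-13253 tracks task (re)configuration problems with special characters in connector names. A sketch using Python's standard urllib (the status URL below is an illustration, with <connect_host> as a placeholder as in the curl command above):

```python
from urllib.parse import quote

# '|' is not a valid raw character in a URL path segment, so the
# connector name must be percent-encoded when used in REST API paths.
name = "qa-mongodb-comp-converter-task|1"
encoded = quote(name, safe="")
print(encoded)  # qa-mongodb-comp-converter-task%7C1

# Illustrative status URL; <connect_host> is a placeholder.
status_url = f"http://<connect_host>:10083/connectors/{encoded}/status"
```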
Attachments
Issue Links
- is duplicated by:
  - KAFKA-9805 Running MirrorMaker in a Connect cluster, but the task not running (Resolved)
  - KAFKA-13253 Kafka Connect losing task (re)configuration when connector name has special characters (Resolved)
- relates to:
  - KAFKA-9805 Running MirrorMaker in a Connect cluster, but the task not running (Resolved)