Standard Output
[2019-07-10 00:04:36,523] INFO Created server with tickTime 800 minSessionTimeout 1600 maxSessionTimeout 16000 datadir /tmp/kafka-13166566021291859084/version-2 snapdir /tmp/kafka-3055410946196954549/version-2 (org.apache.zookeeper.server.ZooKeeperServer:174) [2019-07-10 00:04:36,523] INFO binding to port /127.0.0.1:0 (org.apache.zookeeper.server.NIOServerCnxnFactory:89) [2019-07-10 00:04:36,528] INFO SessionTrackerImpl exited loop! (org.apache.zookeeper.server.SessionTrackerImpl:163) [2019-07-10 00:04:36,530] INFO Initiating client connection, connectString=127.0.0.1:33883 sessionTimeout=6000 watcher=kafka.zookeeper.ZooKeeperClient$ZooKeeperClientWatcher$@4d71ec38 (org.apache.zookeeper.ZooKeeper:442) [2019-07-10 00:04:36,531] INFO Opening socket connection to server localhost/127.0.0.1:33883. Will not attempt to authenticate using SASL (unknown error) (org.apache.zookeeper.ClientCnxn:1025) [2019-07-10 00:04:36,531] WARN [simple-source|task-2] [Producer clientId=connector-producer-simple-source-2] Connection to node 0 (localhost/127.0.0.1:39387) could not be established. Broker may not be available. (org.apache.kafka.clients.NetworkClient:748) [2019-07-10 00:04:36,532] INFO Socket connection established to localhost/127.0.0.1:33883, initiating session (org.apache.zookeeper.ClientCnxn:879) [2019-07-10 00:04:36,532] INFO Accepted socket connection from /127.0.0.1:45016 (org.apache.zookeeper.server.NIOServerCnxnFactory:222) [2019-07-10 00:04:36,532] INFO Client attempting to establish new session at /127.0.0.1:45016 (org.apache.zookeeper.server.ZooKeeperServer:949) [2019-07-10 00:04:36,533] INFO Creating new log file: log.1 (org.apache.zookeeper.server.persistence.FileTxnLog:216) [2019-07-10 00:04:36,534] INFO Established session 0x1054986967c0000 with negotiated timeout 6000 for client /127.0.0.1:45016 (org.apache.zookeeper.server.ZooKeeperServer:694) [2019-07-10 00:04:36,534] INFO Session establishment complete on server localhost/127.0.0.1:33883, sessionid = 0x1054986967c0000, negotiated timeout = 6000 (org.apache.zookeeper.ClientCnxn:1299) [2019-07-10 00:04:36,537] INFO Got user-level KeeperException when processing sessionid:0x1054986967c0000 type:create cxid:0x2 zxid:0x3 txntype:-1 reqpath:n/a Error Path:/brokers Error:KeeperErrorCode = NoNode for /brokers (org.apache.zookeeper.server.PrepRequestProcessor:653) [2019-07-10 00:04:36,539] INFO Got user-level KeeperException when processing sessionid:0x1054986967c0000 type:create cxid:0x6 zxid:0x7 txntype:-1 reqpath:n/a Error Path:/config Error:KeeperErrorCode = NoNode for /config (org.apache.zookeeper.server.PrepRequestProcessor:653) [2019-07-10 00:04:36,541] INFO Got user-level KeeperException when processing sessionid:0x1054986967c0000 type:create cxid:0x9 zxid:0xa txntype:-1 reqpath:n/a Error Path:/admin Error:KeeperErrorCode = NoNode for /admin (org.apache.zookeeper.server.PrepRequestProcessor:653) [2019-07-10 00:04:36,546] INFO Got user-level KeeperException when processing sessionid:0x1054986967c0000 type:create cxid:0x15 zxid:0x15 txntype:-1 reqpath:n/a Error Path:/cluster Error:KeeperErrorCode = NoNode for /cluster (org.apache.zookeeper.server.PrepRequestProcessor:653) [2019-07-10 00:04:36,547] WARN No meta.properties file under dir /tmp/junit2197542855963278499/junit2716825144168849176/meta.properties (kafka.server.BrokerMetadataCheckpoint:70) [2019-07-10 00:04:36,601] WARN [error-conn|task-0] [Producer clientId=connector-dlq-producer-error-conn-0] Connection to node 0 (localhost/127.0.0.1:39055) could not be 
established. Broker may not be available. (org.apache.kafka.clients.NetworkClient:748) [2019-07-10 00:04:36,657] WARN [simple-source|task-3] [Producer clientId=connector-producer-simple-source-3] Connection to node 0 (localhost/127.0.0.1:39387) could not be established. Broker may not be available. (org.apache.kafka.clients.NetworkClient:748) [2019-07-10 00:04:36,717] WARN [simple-source|task-3] [Producer clientId=connector-producer-simple-source-3] Connection to node 0 (localhost/127.0.0.1:39387) could not be established. Broker may not be available. (org.apache.kafka.clients.NetworkClient:748) [2019-07-10 00:04:36,743] WARN No meta.properties file under dir /tmp/junit2197542855963278499/junit2716825144168849176/meta.properties (kafka.server.BrokerMetadataCheckpoint:70) [2019-07-10 00:04:36,817] WARN [simple-source|task-0] [Producer clientId=connector-producer-simple-source-0] Connection to node 0 (localhost/127.0.0.1:39387) could not be established. Broker may not be available. (org.apache.kafka.clients.NetworkClient:748) [2019-07-10 00:04:36,817] INFO Kafka version: 2.4.0-SNAPSHOT (org.apache.kafka.common.utils.AppInfoParser:117) [2019-07-10 00:04:36,818] INFO Kafka commitId: 3352b1493f680a96 (org.apache.kafka.common.utils.AppInfoParser:118) [2019-07-10 00:04:36,818] INFO Kafka startTimeMs: 1562717076522 (org.apache.kafka.common.utils.AppInfoParser:119) [2019-07-10 00:04:36,819] INFO Got user-level KeeperException when processing sessionid:0x1054986967c0000 type:multi cxid:0x38 zxid:0x1c txntype:-1 reqpath:n/a aborting remaining multi ops. Error Path:/admin/preferred_replica_election Error:KeeperErrorCode = NoNode for /admin/preferred_replica_election (org.apache.zookeeper.server.PrepRequestProcessor:596) [2019-07-10 00:04:36,819] INFO ProducerConfig values: acks = 1 batch.size = 16384 bootstrap.servers = [localhost:41027] buffer.memory = 33554432 client.dns.lookup = default client.id = compression.type = none connections.max.idle.ms = 540000 delivery.timeout.ms = 120000 enable.idempotence = false interceptor.classes = [] key.serializer = class org.apache.kafka.common.serialization.ByteArraySerializer linger.ms = 0 max.block.ms = 60000 max.in.flight.requests.per.connection = 5 max.request.size = 1048576 metadata.max.age.ms = 300000 metric.reporters = [] metrics.num.samples = 2 metrics.recording.level = INFO metrics.sample.window.ms = 30000 partitioner.class = class org.apache.kafka.clients.producer.internals.DefaultPartitioner receive.buffer.bytes = 32768 reconnect.backoff.max.ms = 1000 reconnect.backoff.ms = 50 request.timeout.ms = 30000 retries = 2147483647 retry.backoff.ms = 100 sasl.client.callback.handler.class = null sasl.jaas.config = null sasl.kerberos.kinit.cmd = /usr/bin/kinit sasl.kerberos.min.time.before.relogin = 60000 sasl.kerberos.service.name = null sasl.kerberos.ticket.renew.jitter = 0.05 sasl.kerberos.ticket.renew.window.factor = 0.8 sasl.login.callback.handler.class = null sasl.login.class = null sasl.login.refresh.buffer.seconds = 300 sasl.login.refresh.min.period.seconds = 60 sasl.login.refresh.window.factor = 0.8 sasl.login.refresh.window.jitter = 0.05 sasl.mechanism = GSSAPI security.protocol = PLAINTEXT send.buffer.bytes = 131072 ssl.cipher.suites = null ssl.enabled.protocols = [TLSv1.2, TLSv1.1, TLSv1] ssl.endpoint.identification.algorithm = https ssl.key.password = null ssl.keymanager.algorithm = SunX509 ssl.keystore.location = null ssl.keystore.password = null ssl.keystore.type = JKS ssl.protocol = TLS ssl.provider = null ssl.secure.random.implementation = 
null ssl.trustmanager.algorithm = PKIX ssl.truststore.location = null ssl.truststore.password = null ssl.truststore.type = JKS transaction.timeout.ms = 60000 transactional.id = null value.serializer = class org.apache.kafka.common.serialization.ByteArraySerializer (org.apache.kafka.clients.producer.ProducerConfig:347) [2019-07-10 00:04:36,822] INFO Kafka version: 2.4.0-SNAPSHOT (org.apache.kafka.common.utils.AppInfoParser:117) [2019-07-10 00:04:36,822] INFO Kafka commitId: 3352b1493f680a96 (org.apache.kafka.common.utils.AppInfoParser:118) [2019-07-10 00:04:36,823] INFO Kafka startTimeMs: 1562717076822 (org.apache.kafka.common.utils.AppInfoParser:119) [2019-07-10 00:04:36,823] INFO Starting Connect cluster 'connect-cluster' with 3 workers (org.apache.kafka.connect.util.clusters.EmbeddedConnectCluster:208) [2019-07-10 00:04:36,823] INFO Scanning for plugin classes. This might take a moment ... (org.apache.kafka.connect.cli.ConnectDistributed:90) [2019-07-10 00:04:36,929] INFO [Producer clientId=producer-18] Cluster ID: UlRcuZFpRTOg7gH7IDjAFw (org.apache.kafka.clients.Metadata:266) [2019-07-10 00:04:37,372] WARN [simple-source|task-2] [Producer clientId=connector-producer-simple-source-2] Connection to node 0 (localhost/127.0.0.1:39387) could not be established. Broker may not be available. (org.apache.kafka.clients.NetworkClient:748) [2019-07-10 00:04:37,422] WARN [simple-source|task-1] [Producer clientId=connector-producer-simple-source-1] Connection to node 0 (localhost/127.0.0.1:39387) could not be established. Broker may not be available. (org.apache.kafka.clients.NetworkClient:748) [2019-07-10 00:04:37,522] WARN [error-conn|task-0] [Producer clientId=connector-dlq-producer-error-conn-0] Connection to node 0 (localhost/127.0.0.1:39055) could not be established. Broker may not be available. 
(org.apache.kafka.clients.NetworkClient:748) [2019-07-10 00:04:37,542] DEBUG Skipping class org.apache.kafka.connect.source.SourceConnector as it is not concrete implementation (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:333) [2019-07-10 00:04:37,543] DEBUG Skipping class org.apache.kafka.connect.runtime.standalone.StandaloneHerderTest$BogusSinkConnector as it is not concrete implementation (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:333) [2019-07-10 00:04:37,543] DEBUG Skipping class org.apache.kafka.connect.runtime.WorkerConnectorTest$TestConnector as it is not concrete implementation (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:333) [2019-07-10 00:04:37,544] DEBUG Skipping class org.apache.kafka.connect.runtime.AbstractHerderTest$BogusSourceConnector as it is not concrete implementation (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:333) [2019-07-10 00:04:37,544] DEBUG Skipping class org.apache.kafka.connect.runtime.standalone.StandaloneHerderTest$BogusSourceConnector as it is not concrete implementation (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:333) [2019-07-10 00:04:37,544] DEBUG Skipping class org.apache.kafka.connect.runtime.ConnectorConfigTest$TestConnector as it is not concrete implementation (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:333) [2019-07-10 00:04:37,544] DEBUG Skipping class org.apache.kafka.connect.runtime.distributed.DistributedHerderTest$BogusSourceConnector as it is not concrete implementation (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:333) [2019-07-10 00:04:37,545] DEBUG Skipping class org.apache.kafka.connect.sink.SinkConnector as it is not concrete implementation (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:333) [2019-07-10 00:04:37,545] DEBUG Skipping class org.apache.kafka.connect.converters.NumberConverter as it is not concrete implementation (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:333) [2019-07-10 00:04:37,546] DEBUG Skipping class org.apache.kafka.connect.converters.NumberConverter as it is not concrete implementation (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:333) [2019-07-10 00:04:37,547] DEBUG Skipping class org.apache.kafka.connect.transforms.ExtractField as it is not concrete implementation (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:333) [2019-07-10 00:04:37,547] DEBUG Skipping class org.apache.kafka.connect.transforms.TimestampConverter as it is not concrete implementation (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:333) [2019-07-10 00:04:37,548] DEBUG Skipping class org.apache.kafka.connect.transforms.Cast as it is not concrete implementation (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:333) [2019-07-10 00:04:37,548] DEBUG Skipping class org.apache.kafka.connect.transforms.InsertField as it is not concrete implementation (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:333) [2019-07-10 00:04:37,548] DEBUG Skipping class org.apache.kafka.connect.transforms.ReplaceField as it is not concrete implementation (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:333) [2019-07-10 00:04:37,549] DEBUG Skipping class org.apache.kafka.connect.transforms.Flatten as it is not concrete implementation (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:333) [2019-07-10 00:04:37,549] DEBUG Skipping class 
org.apache.kafka.connect.transforms.SetSchemaMetadata as it is not concrete implementation (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:333) [2019-07-10 00:04:37,549] DEBUG Skipping class org.apache.kafka.connect.transforms.HoistField as it is not concrete implementation (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:333) [2019-07-10 00:04:37,549] DEBUG Skipping class org.apache.kafka.connect.transforms.MaskField as it is not concrete implementation (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:333) [2019-07-10 00:04:37,551] INFO Registered loader: jdk.internal.loader.ClassLoaders$AppClassLoader@2c13da15 (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:245) [2019-07-10 00:04:37,552] INFO Added plugin 'org.apache.kafka.connect.tools.MockConnector' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:174) [2019-07-10 00:04:37,552] INFO Added plugin 'org.apache.kafka.connect.tools.SchemaSourceConnector' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:174) [2019-07-10 00:04:37,552] INFO Added plugin 'org.apache.kafka.connect.tools.MockSinkConnector' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:174) [2019-07-10 00:04:37,552] INFO Added plugin 'org.apache.kafka.connect.runtime.TestSourceConnector' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:174) [2019-07-10 00:04:37,552] INFO Added plugin 'org.apache.kafka.connect.runtime.WorkerTest$WorkerTestConnector' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:174) [2019-07-10 00:04:37,553] INFO Added plugin 'org.apache.kafka.connect.tools.VerifiableSinkConnector' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:174) [2019-07-10 00:04:37,553] INFO Added plugin 'org.apache.kafka.connect.integration.MonitorableSourceConnector' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:174) [2019-07-10 00:04:37,553] INFO Added plugin 'org.apache.kafka.connect.runtime.TestSinkConnector' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:174) [2019-07-10 00:04:37,553] INFO Added plugin 'org.apache.kafka.connect.integration.MonitorableSinkConnector' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:174) [2019-07-10 00:04:37,554] INFO Added plugin 'org.apache.kafka.connect.tools.MockSourceConnector' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:174) [2019-07-10 00:04:37,554] INFO Added plugin 'org.apache.kafka.connect.tools.VerifiableSourceConnector' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:174) [2019-07-10 00:04:37,554] INFO Added plugin 'org.apache.kafka.connect.runtime.rest.resources.ConnectorPluginsResourceTest$ConnectorPluginsResourceTestConnector' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:174) [2019-07-10 00:04:37,554] INFO Added plugin 'org.apache.kafka.connect.converters.FloatConverter' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:174) [2019-07-10 00:04:37,554] INFO Added plugin 'org.apache.kafka.connect.converters.DoubleConverter' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:174) [2019-07-10 00:04:37,555] INFO Added plugin 'org.apache.kafka.connect.storage.StringConverter' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:174) [2019-07-10 00:04:37,555] INFO Added plugin 'org.apache.kafka.connect.runtime.isolation.PluginsTest$TestConverter' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:174) 
[2019-07-10 00:04:37,555] INFO Added plugin 'org.apache.kafka.connect.converters.ByteArrayConverter' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:174) [2019-07-10 00:04:37,555] INFO Added plugin 'org.apache.kafka.connect.json.JsonConverter' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:174) [2019-07-10 00:04:37,556] INFO Added plugin 'org.apache.kafka.connect.converters.LongConverter' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:174) [2019-07-10 00:04:37,556] INFO Added plugin 'org.apache.kafka.connect.converters.IntegerConverter' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:174) [2019-07-10 00:04:37,556] INFO Added plugin 'org.apache.kafka.connect.runtime.isolation.PluginsTest$TestInternalConverter' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:174) [2019-07-10 00:04:37,556] INFO Added plugin 'org.apache.kafka.connect.runtime.WorkerTest$TestConverter' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:174) [2019-07-10 00:04:37,556] INFO Added plugin 'org.apache.kafka.connect.runtime.WorkerTest$TestConfigurableConverter' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:174) [2019-07-10 00:04:37,557] INFO Added plugin 'org.apache.kafka.connect.converters.ShortConverter' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:174) [2019-07-10 00:04:37,557] INFO Added plugin 'org.apache.kafka.connect.runtime.ErrorHandlingTaskTest$FaultyConverter' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:174) [2019-07-10 00:04:37,557] INFO Added plugin 'org.apache.kafka.connect.storage.SimpleHeaderConverter' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:174) [2019-07-10 00:04:37,557] INFO Added plugin 'org.apache.kafka.connect.runtime.isolation.PluginsTest$TestHeaderConverter' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:174) [2019-07-10 00:04:37,558] INFO Added plugin 'org.apache.kafka.connect.runtime.AbstractHerderTest$SampleTransformation' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:174) [2019-07-10 00:04:37,558] INFO Added plugin 'org.apache.kafka.connect.transforms.RegexRouter' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:174) [2019-07-10 00:04:37,558] INFO Added plugin 'org.apache.kafka.connect.transforms.Flatten$Key' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:174) [2019-07-10 00:04:37,558] INFO Added plugin 'org.apache.kafka.connect.transforms.ExtractField$Value' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:174) [2019-07-10 00:04:37,558] INFO Added plugin 'org.apache.kafka.connect.transforms.SetSchemaMetadata$Value' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:174) [2019-07-10 00:04:37,559] INFO Added plugin 'org.apache.kafka.connect.transforms.Flatten$Value' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:174) [2019-07-10 00:04:37,559] INFO Added plugin 'org.apache.kafka.connect.transforms.Cast$Value' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:174) [2019-07-10 00:04:37,559] INFO Added plugin 'org.apache.kafka.connect.transforms.ExtractField$Key' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:174) [2019-07-10 00:04:37,559] INFO Added plugin 'org.apache.kafka.connect.transforms.ReplaceField$Key' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:174) [2019-07-10 00:04:37,560] INFO Added plugin 
'org.apache.kafka.connect.transforms.HoistField$Key' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:174) [2019-07-10 00:04:37,560] INFO Added plugin 'org.apache.kafka.connect.transforms.HoistField$Value' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:174) [2019-07-10 00:04:37,560] INFO Added plugin 'org.apache.kafka.connect.transforms.ValueToKey' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:174) [2019-07-10 00:04:37,560] INFO Added plugin 'org.apache.kafka.connect.transforms.TimestampConverter$Value' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:174) [2019-07-10 00:04:37,560] INFO Added plugin 'org.apache.kafka.connect.transforms.ReplaceField$Value' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:174) [2019-07-10 00:04:37,561] INFO Added plugin 'org.apache.kafka.connect.transforms.SetSchemaMetadata$Key' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:174) [2019-07-10 00:04:37,561] INFO Added plugin 'org.apache.kafka.connect.transforms.TimestampRouter' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:174) [2019-07-10 00:04:37,561] INFO Added plugin 'org.apache.kafka.connect.runtime.ErrorHandlingTaskTest$FaultyPassthrough' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:174) [2019-07-10 00:04:37,561] INFO Added plugin 'org.apache.kafka.connect.transforms.MaskField$Value' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:174) [2019-07-10 00:04:37,562] INFO Added plugin 'org.apache.kafka.connect.transforms.Cast$Key' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:174) [2019-07-10 00:04:37,562] INFO Added plugin 'org.apache.kafka.connect.transforms.TimestampConverter$Key' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:174) [2019-07-10 00:04:37,562] INFO Added plugin 'org.apache.kafka.connect.integration.ErrorHandlingIntegrationTest$FaultyPassthrough' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:174) [2019-07-10 00:04:37,562] INFO Added plugin 'org.apache.kafka.connect.transforms.MaskField$Key' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:174) [2019-07-10 00:04:37,562] INFO Added plugin 'org.apache.kafka.connect.transforms.InsertField$Value' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:174) [2019-07-10 00:04:37,563] INFO Added plugin 'org.apache.kafka.connect.runtime.ConnectorConfigTest$SimpleTransformation' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:174) [2019-07-10 00:04:37,563] INFO Added plugin 'org.apache.kafka.connect.transforms.InsertField$Key' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:174) [2019-07-10 00:04:37,563] INFO Added plugin 'org.apache.kafka.common.config.provider.FileConfigProvider' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:174) [2019-07-10 00:04:37,563] INFO Added plugin 'org.apache.kafka.connect.runtime.isolation.PluginsTest$TestConnectRestExtension' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:174) [2019-07-10 00:04:37,564] INFO Added plugin 'org.apache.kafka.connect.connector.policy.AllConnectorClientConfigOverridePolicy' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:174) [2019-07-10 00:04:37,564] INFO Added plugin 'org.apache.kafka.connect.connector.policy.PrincipalConnectorClientConfigOverridePolicy' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:174) [2019-07-10 00:04:37,564] INFO 
Added plugin 'org.apache.kafka.connect.connector.policy.NoneConnectorClientConfigOverridePolicy' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:174) [2019-07-10 00:04:37,565] INFO Added aliases 'MonitorableSinkConnector' and 'MonitorableSink' to plugin 'org.apache.kafka.connect.integration.MonitorableSinkConnector' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:397) [2019-07-10 00:04:37,565] INFO Added aliases 'MonitorableSourceConnector' and 'MonitorableSource' to plugin 'org.apache.kafka.connect.integration.MonitorableSourceConnector' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:397) [2019-07-10 00:04:37,565] INFO Added aliases 'TestSinkConnector' and 'TestSink' to plugin 'org.apache.kafka.connect.runtime.TestSinkConnector' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:397) [2019-07-10 00:04:37,565] INFO Added aliases 'TestSourceConnector' and 'TestSource' to plugin 'org.apache.kafka.connect.runtime.TestSourceConnector' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:397) [2019-07-10 00:04:37,565] INFO Added aliases 'WorkerTestConnector' and 'WorkerTest' to plugin 'org.apache.kafka.connect.runtime.WorkerTest$WorkerTestConnector' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:397) [2019-07-10 00:04:37,566] INFO Added aliases 'ConnectorPluginsResourceTestConnector' and 'ConnectorPluginsResourceTest' to plugin 'org.apache.kafka.connect.runtime.rest.resources.ConnectorPluginsResourceTest$ConnectorPluginsResourceTestConnector' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:397) [2019-07-10 00:04:37,566] INFO Added aliases 'MockConnector' and 'Mock' to plugin 'org.apache.kafka.connect.tools.MockConnector' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:397) [2019-07-10 00:04:37,566] INFO Added aliases 'MockSinkConnector' and 'MockSink' to plugin 'org.apache.kafka.connect.tools.MockSinkConnector' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:397) [2019-07-10 00:04:37,566] INFO Added aliases 'MockSourceConnector' and 'MockSource' to plugin 'org.apache.kafka.connect.tools.MockSourceConnector' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:397) [2019-07-10 00:04:37,567] INFO Added aliases 'SchemaSourceConnector' and 'SchemaSource' to plugin 'org.apache.kafka.connect.tools.SchemaSourceConnector' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:397) [2019-07-10 00:04:37,567] INFO Added aliases 'VerifiableSinkConnector' and 'VerifiableSink' to plugin 'org.apache.kafka.connect.tools.VerifiableSinkConnector' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:397) [2019-07-10 00:04:37,567] INFO Added aliases 'VerifiableSourceConnector' and 'VerifiableSource' to plugin 'org.apache.kafka.connect.tools.VerifiableSourceConnector' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:397) [2019-07-10 00:04:37,567] INFO Added aliases 'ByteArrayConverter' and 'ByteArray' to plugin 'org.apache.kafka.connect.converters.ByteArrayConverter' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:397) [2019-07-10 00:04:37,567] INFO Added aliases 'DoubleConverter' and 'Double' to plugin 'org.apache.kafka.connect.converters.DoubleConverter' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:397) [2019-07-10 00:04:37,568] INFO Added aliases 'FloatConverter' and 'Float' to plugin 'org.apache.kafka.connect.converters.FloatConverter' 
(org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:397) [2019-07-10 00:04:37,568] INFO Added aliases 'IntegerConverter' and 'Integer' to plugin 'org.apache.kafka.connect.converters.IntegerConverter' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:397) [2019-07-10 00:04:37,568] INFO Added aliases 'LongConverter' and 'Long' to plugin 'org.apache.kafka.connect.converters.LongConverter' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:397) [2019-07-10 00:04:37,568] INFO Added aliases 'ShortConverter' and 'Short' to plugin 'org.apache.kafka.connect.converters.ShortConverter' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:397) [2019-07-10 00:04:37,569] INFO Added aliases 'JsonConverter' and 'Json' to plugin 'org.apache.kafka.connect.json.JsonConverter' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:397) [2019-07-10 00:04:37,569] INFO Added aliases 'FaultyConverter' and 'Faulty' to plugin 'org.apache.kafka.connect.runtime.ErrorHandlingTaskTest$FaultyConverter' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:397) [2019-07-10 00:04:37,569] INFO Added aliases 'TestConfigurableConverter' and 'TestConfigurable' to plugin 'org.apache.kafka.connect.runtime.WorkerTest$TestConfigurableConverter' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:397) [2019-07-10 00:04:37,569] INFO Added aliases 'TestInternalConverter' and 'TestInternal' to plugin 'org.apache.kafka.connect.runtime.isolation.PluginsTest$TestInternalConverter' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:397) [2019-07-10 00:04:37,569] INFO Added aliases 'StringConverter' and 'String' to plugin 'org.apache.kafka.connect.storage.StringConverter' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:397) [2019-07-10 00:04:37,570] INFO Added aliases 'ByteArrayConverter' and 'ByteArray' to plugin 'org.apache.kafka.connect.converters.ByteArrayConverter' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:397) [2019-07-10 00:04:37,570] INFO Added aliases 'DoubleConverter' and 'Double' to plugin 'org.apache.kafka.connect.converters.DoubleConverter' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:397) [2019-07-10 00:04:37,570] INFO Added aliases 'FloatConverter' and 'Float' to plugin 'org.apache.kafka.connect.converters.FloatConverter' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:397) [2019-07-10 00:04:37,570] INFO Added aliases 'IntegerConverter' and 'Integer' to plugin 'org.apache.kafka.connect.converters.IntegerConverter' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:397) [2019-07-10 00:04:37,571] INFO Added aliases 'LongConverter' and 'Long' to plugin 'org.apache.kafka.connect.converters.LongConverter' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:397) [2019-07-10 00:04:37,571] INFO Added aliases 'ShortConverter' and 'Short' to plugin 'org.apache.kafka.connect.converters.ShortConverter' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:397) [2019-07-10 00:04:37,571] INFO Added aliases 'JsonConverter' and 'Json' to plugin 'org.apache.kafka.connect.json.JsonConverter' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:397) [2019-07-10 00:04:37,571] INFO Added aliases 'FaultyConverter' and 'Faulty' to plugin 'org.apache.kafka.connect.runtime.ErrorHandlingTaskTest$FaultyConverter' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:397) [2019-07-10 00:04:37,572] 
INFO Added alias 'TestHeaderConverter' to plugin 'org.apache.kafka.connect.runtime.isolation.PluginsTest$TestHeaderConverter' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:394) [2019-07-10 00:04:37,572] INFO Added aliases 'TestInternalConverter' and 'TestInternal' to plugin 'org.apache.kafka.connect.runtime.isolation.PluginsTest$TestInternalConverter' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:397) [2019-07-10 00:04:37,572] INFO Added alias 'SimpleHeaderConverter' to plugin 'org.apache.kafka.connect.storage.SimpleHeaderConverter' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:394) [2019-07-10 00:04:37,572] INFO Added aliases 'StringConverter' and 'String' to plugin 'org.apache.kafka.connect.storage.StringConverter' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:397) [2019-07-10 00:04:37,573] INFO Added aliases 'SampleTransformation' and 'Sample' to plugin 'org.apache.kafka.connect.runtime.AbstractHerderTest$SampleTransformation' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:397) [2019-07-10 00:04:37,573] INFO Added aliases 'SimpleTransformation' and 'Simple' to plugin 'org.apache.kafka.connect.runtime.ConnectorConfigTest$SimpleTransformation' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:397) [2019-07-10 00:04:37,573] INFO Added alias 'RegexRouter' to plugin 'org.apache.kafka.connect.transforms.RegexRouter' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:394) [2019-07-10 00:04:37,573] INFO Added alias 'TimestampRouter' to plugin 'org.apache.kafka.connect.transforms.TimestampRouter' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:394) [2019-07-10 00:04:37,574] INFO Added alias 'ValueToKey' to plugin 'org.apache.kafka.connect.transforms.ValueToKey' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:394) [2019-07-10 00:04:37,574] INFO Added aliases 'TestConnectRestExtension' and 'Test' to plugin 'org.apache.kafka.connect.runtime.isolation.PluginsTest$TestConnectRestExtension' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:397) [2019-07-10 00:04:37,574] INFO Added aliases 'AllConnectorClientConfigOverridePolicy' and 'All' to plugin 'org.apache.kafka.connect.connector.policy.AllConnectorClientConfigOverridePolicy' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:397) [2019-07-10 00:04:37,574] INFO Added aliases 'NoneConnectorClientConfigOverridePolicy' and 'None' to plugin 'org.apache.kafka.connect.connector.policy.NoneConnectorClientConfigOverridePolicy' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:397) [2019-07-10 00:04:37,574] INFO Added aliases 'PrincipalConnectorClientConfigOverridePolicy' and 'Principal' to plugin 'org.apache.kafka.connect.connector.policy.PrincipalConnectorClientConfigOverridePolicy' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:397) [2019-07-10 00:04:37,575] INFO DistributedConfig values: access.control.allow.methods = access.control.allow.origin = bootstrap.servers = [localhost:41027] client.dns.lookup = default client.id = config.providers = [] config.storage.replication.factor = 1 config.storage.topic = connect-config-topic-connect-cluster connect.protocol = compatible connections.max.idle.ms = 540000 connector.client.config.override.policy = None group.id = connect-integration-test-connect-cluster header.converter = class org.apache.kafka.connect.storage.SimpleHeaderConverter heartbeat.interval.ms = 3000 
internal.key.converter = class org.apache.kafka.connect.json.JsonConverter internal.value.converter = class org.apache.kafka.connect.json.JsonConverter key.converter = class org.apache.kafka.connect.storage.StringConverter listeners = null metadata.max.age.ms = 300000 metric.reporters = [] metrics.num.samples = 2 metrics.recording.level = INFO metrics.sample.window.ms = 30000 offset.flush.interval.ms = 5000 offset.flush.timeout.ms = 5000 offset.storage.partitions = 25 offset.storage.replication.factor = 1 offset.storage.topic = connect-offset-topic-connect-cluster plugin.path = null rebalance.timeout.ms = 60000 receive.buffer.bytes = 32768 reconnect.backoff.max.ms = 1000 reconnect.backoff.ms = 50 request.timeout.ms = 40000 rest.advertised.host.name = null rest.advertised.listener = null rest.advertised.port = null rest.extension.classes = [] rest.host.name = localhost rest.port = 0 retry.backoff.ms = 100 sasl.client.callback.handler.class = null sasl.jaas.config = null sasl.kerberos.kinit.cmd = /usr/bin/kinit sasl.kerberos.min.time.before.relogin = 60000 sasl.kerberos.service.name = null sasl.kerberos.ticket.renew.jitter = 0.05 sasl.kerberos.ticket.renew.window.factor = 0.8 sasl.login.callback.handler.class = null sasl.login.class = null sasl.login.refresh.buffer.seconds = 300 sasl.login.refresh.min.period.seconds = 60 sasl.login.refresh.window.factor = 0.8 sasl.login.refresh.window.jitter = 0.05 sasl.mechanism = GSSAPI scheduled.rebalance.max.delay.ms = 300000 security.protocol = PLAINTEXT send.buffer.bytes = 131072 session.timeout.ms = 10000 ssl.cipher.suites = null ssl.client.auth = none ssl.enabled.protocols = [TLSv1.2, TLSv1.1, TLSv1] ssl.endpoint.identification.algorithm = https ssl.key.password = null ssl.keymanager.algorithm = SunX509 ssl.keystore.location = null ssl.keystore.password = null ssl.keystore.type = JKS ssl.protocol = TLS ssl.provider = null ssl.secure.random.implementation = null ssl.trustmanager.algorithm = PKIX ssl.truststore.location = null ssl.truststore.password = null ssl.truststore.type = JKS status.storage.partitions = 5 status.storage.replication.factor = 1 status.storage.topic = connect-storage-topic-connect-cluster task.shutdown.graceful.timeout.ms = 5000 value.converter = class org.apache.kafka.connect.storage.StringConverter worker.sync.timeout.ms = 3000 worker.unsync.backoff.ms = 300000 (org.apache.kafka.connect.runtime.distributed.DistributedConfig:347) [2019-07-10 00:04:37,575] INFO Creating Kafka admin client (org.apache.kafka.connect.util.ConnectUtils:43) [2019-07-10 00:04:37,576] INFO AdminClientConfig values: bootstrap.servers = [localhost:41027] client.dns.lookup = default client.id = connections.max.idle.ms = 300000 metadata.max.age.ms = 300000 metric.reporters = [] metrics.num.samples = 2 metrics.recording.level = INFO metrics.sample.window.ms = 30000 receive.buffer.bytes = 65536 reconnect.backoff.max.ms = 1000 reconnect.backoff.ms = 50 request.timeout.ms = 120000 retries = 5 retry.backoff.ms = 100 sasl.client.callback.handler.class = null sasl.jaas.config = null sasl.kerberos.kinit.cmd = /usr/bin/kinit sasl.kerberos.min.time.before.relogin = 60000 sasl.kerberos.service.name = null sasl.kerberos.ticket.renew.jitter = 0.05 sasl.kerberos.ticket.renew.window.factor = 0.8 sasl.login.callback.handler.class = null sasl.login.class = null sasl.login.refresh.buffer.seconds = 300 sasl.login.refresh.min.period.seconds = 60 sasl.login.refresh.window.factor = 0.8 sasl.login.refresh.window.jitter = 0.05 sasl.mechanism = GSSAPI security.protocol = PLAINTEXT 
send.buffer.bytes = 131072 ssl.cipher.suites = null ssl.enabled.protocols = [TLSv1.2, TLSv1.1, TLSv1] ssl.endpoint.identification.algorithm = https ssl.key.password = null ssl.keymanager.algorithm = SunX509 ssl.keystore.location = null ssl.keystore.password = null ssl.keystore.type = JKS ssl.protocol = TLS ssl.provider = null ssl.secure.random.implementation = null ssl.trustmanager.algorithm = PKIX ssl.truststore.location = null ssl.truststore.password = null ssl.truststore.type = JKS (org.apache.kafka.clients.admin.AdminClientConfig:347) [2019-07-10 00:04:37,577] WARN The configuration 'config.storage.topic' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig:355) [2019-07-10 00:04:37,577] WARN The configuration 'group.id' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig:355) [2019-07-10 00:04:37,578] WARN The configuration 'status.storage.topic' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig:355) [2019-07-10 00:04:37,578] WARN The configuration 'rest.host.name' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig:355) [2019-07-10 00:04:37,578] WARN The configuration 'offset.flush.interval.ms' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig:355) [2019-07-10 00:04:37,578] WARN The configuration 'rest.port' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig:355) [2019-07-10 00:04:37,579] WARN The configuration 'config.storage.replication.factor' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig:355) [2019-07-10 00:04:37,579] WARN The configuration 'status.storage.replication.factor' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig:355) [2019-07-10 00:04:37,579] WARN The configuration 'offset.storage.replication.factor' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig:355) [2019-07-10 00:04:37,579] WARN The configuration 'offset.storage.topic' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig:355) [2019-07-10 00:04:37,579] WARN The configuration 'value.converter' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig:355) [2019-07-10 00:04:37,580] WARN The configuration 'key.converter' was supplied but isn't a known config. 
(org.apache.kafka.clients.admin.AdminClientConfig:355) [2019-07-10 00:04:37,580] INFO Kafka version: 2.4.0-SNAPSHOT (org.apache.kafka.common.utils.AppInfoParser:117) [2019-07-10 00:04:37,580] INFO Kafka commitId: 3352b1493f680a96 (org.apache.kafka.common.utils.AppInfoParser:118) [2019-07-10 00:04:37,580] INFO Kafka startTimeMs: 1562717077580 (org.apache.kafka.common.utils.AppInfoParser:119) [2019-07-10 00:04:37,581] DEBUG Looking up Kafka cluster ID (org.apache.kafka.connect.util.ConnectUtils:50) [2019-07-10 00:04:37,581] DEBUG Fetching Kafka cluster ID (org.apache.kafka.connect.util.ConnectUtils:57) [2019-07-10 00:04:37,585] INFO Kafka cluster ID: UlRcuZFpRTOg7gH7IDjAFw (org.apache.kafka.connect.util.ConnectUtils:59) [2019-07-10 00:04:37,586] DEBUG Kafka cluster ID: UlRcuZFpRTOg7gH7IDjAFw (org.apache.kafka.connect.cli.ConnectDistributed:96) [2019-07-10 00:04:37,587] INFO Added connector for http://localhost:0 (org.apache.kafka.connect.runtime.rest.RestServer:124) [2019-07-10 00:04:37,587] INFO Initializing REST server (org.apache.kafka.connect.runtime.rest.RestServer:168) [2019-07-10 00:04:37,588] INFO jetty-9.4.18.v20190429; built: 2019-04-29T20:42:08.989Z; git: e1bc35120a6617ee3df052294e433f3a25ce7097; jvm 11.0.1+13-LTS (org.eclipse.jetty.server.Server:370) [2019-07-10 00:04:37,592] INFO Started http_localhost0@2f637be4{HTTP/1.1,[http/1.1]}{localhost:42321} (org.eclipse.jetty.server.AbstractConnector:292) [2019-07-10 00:04:37,592] INFO Started @52550ms (org.eclipse.jetty.server.Server:410) [2019-07-10 00:04:37,592] INFO Advertised URI: http://localhost:42321/ (org.apache.kafka.connect.runtime.rest.RestServer:285) [2019-07-10 00:04:37,593] INFO REST server listening at http://localhost:42321/, advertising URL http://localhost:42321/ (org.apache.kafka.connect.runtime.rest.RestServer:183) [2019-07-10 00:04:37,593] INFO Advertised URI: http://localhost:42321/ (org.apache.kafka.connect.runtime.rest.RestServer:285) [2019-07-10 00:04:37,593] INFO Setting up None Policy for ConnectorClientConfigOverride. 
This will disallow any client configuration to be overridden (org.apache.kafka.connect.connector.policy.NoneConnectorClientConfigOverridePolicy:45) [2019-07-10 00:04:37,594] DEBUG Registering Connect metrics with JMX for worker 'localhost:42321' (org.apache.kafka.connect.runtime.ConnectMetrics:83) [2019-07-10 00:04:37,594] INFO Kafka version: 2.4.0-SNAPSHOT (org.apache.kafka.common.utils.AppInfoParser:117) [2019-07-10 00:04:37,594] INFO Kafka commitId: 3352b1493f680a96 (org.apache.kafka.common.utils.AppInfoParser:118) [2019-07-10 00:04:37,594] INFO Kafka startTimeMs: 1562717077594 (org.apache.kafka.common.utils.AppInfoParser:119) [2019-07-10 00:04:37,595] DEBUG Configuring the key converter with configuration keys: [] (org.apache.kafka.connect.runtime.isolation.Plugins:254) [2019-07-10 00:04:37,595] INFO JsonConverterConfig values: converter.type = key schemas.cache.size = 1000 schemas.enable = false (org.apache.kafka.connect.json.JsonConverterConfig:347) [2019-07-10 00:04:37,596] DEBUG Configuring the value converter with configuration keys: [] (org.apache.kafka.connect.runtime.isolation.Plugins:254) [2019-07-10 00:04:37,596] INFO JsonConverterConfig values: converter.type = value schemas.cache.size = 1000 schemas.enable = false (org.apache.kafka.connect.json.JsonConverterConfig:347) [2019-07-10 00:04:37,598] INFO Kafka version: 2.4.0-SNAPSHOT (org.apache.kafka.common.utils.AppInfoParser:117) [2019-07-10 00:04:37,598] INFO Kafka commitId: 3352b1493f680a96 (org.apache.kafka.common.utils.AppInfoParser:118) [2019-07-10 00:04:37,598] INFO Kafka startTimeMs: 1562717077597 (org.apache.kafka.common.utils.AppInfoParser:119) [2019-07-10 00:04:37,598] DEBUG [Worker clientId=connect-6, groupId=connect-integration-test-connect-cluster] Connect group member created (org.apache.kafka.connect.runtime.distributed.WorkerGroupMember:139) [2019-07-10 00:04:37,598] DEBUG Kafka Connect instance created (org.apache.kafka.connect.runtime.Connect:42) [2019-07-10 00:04:37,599] INFO Kafka Connect distributed worker initialization took 775ms (org.apache.kafka.connect.cli.ConnectDistributed:128) [2019-07-10 00:04:37,599] INFO Kafka Connect starting (org.apache.kafka.connect.runtime.Connect:50) [2019-07-10 00:04:37,599] INFO Initializing REST resources (org.apache.kafka.connect.runtime.rest.RestServer:187) [2019-07-10 00:04:37,599] INFO [Worker clientId=connect-6, groupId=connect-integration-test-connect-cluster] Herder starting (org.apache.kafka.connect.runtime.distributed.DistributedHerder:238) [2019-07-10 00:04:37,600] INFO Worker starting (org.apache.kafka.connect.runtime.Worker:182) [2019-07-10 00:04:37,600] INFO Starting KafkaOffsetBackingStore (org.apache.kafka.connect.storage.KafkaOffsetBackingStore:108) [2019-07-10 00:04:37,600] INFO Starting KafkaBasedLog with topic connect-offset-topic-connect-cluster (org.apache.kafka.connect.util.KafkaBasedLog:125) [2019-07-10 00:04:37,600] DEBUG Creating admin client to manage Connect internal offset topic (org.apache.kafka.connect.storage.KafkaOffsetBackingStore:97) [2019-07-10 00:04:37,600] INFO AdminClientConfig values: bootstrap.servers = [localhost:41027] client.dns.lookup = default client.id = connections.max.idle.ms = 300000 metadata.max.age.ms = 300000 metric.reporters = [] metrics.num.samples = 2 metrics.recording.level = INFO metrics.sample.window.ms = 30000 receive.buffer.bytes = 65536 reconnect.backoff.max.ms = 1000 reconnect.backoff.ms = 50 request.timeout.ms = 120000 retries = 5 retry.backoff.ms = 100 sasl.client.callback.handler.class = null 
sasl.jaas.config = null sasl.kerberos.kinit.cmd = /usr/bin/kinit sasl.kerberos.min.time.before.relogin = 60000 sasl.kerberos.service.name = null sasl.kerberos.ticket.renew.jitter = 0.05 sasl.kerberos.ticket.renew.window.factor = 0.8 sasl.login.callback.handler.class = null sasl.login.class = null sasl.login.refresh.buffer.seconds = 300 sasl.login.refresh.min.period.seconds = 60 sasl.login.refresh.window.factor = 0.8 sasl.login.refresh.window.jitter = 0.05 sasl.mechanism = GSSAPI security.protocol = PLAINTEXT send.buffer.bytes = 131072 ssl.cipher.suites = null ssl.enabled.protocols = [TLSv1.2, TLSv1.1, TLSv1] ssl.endpoint.identification.algorithm = https ssl.key.password = null ssl.keymanager.algorithm = SunX509 ssl.keystore.location = null ssl.keystore.password = null ssl.keystore.type = JKS ssl.protocol = TLS ssl.provider = null ssl.secure.random.implementation = null ssl.trustmanager.algorithm = PKIX ssl.truststore.location = null ssl.truststore.password = null ssl.truststore.type = JKS (org.apache.kafka.clients.admin.AdminClientConfig:347) [2019-07-10 00:04:37,602] WARN The configuration 'config.storage.topic' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig:355) [2019-07-10 00:04:37,602] WARN The configuration 'group.id' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig:355) [2019-07-10 00:04:37,602] WARN The configuration 'status.storage.topic' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig:355) [2019-07-10 00:04:37,602] WARN The configuration 'rest.host.name' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig:355) [2019-07-10 00:04:37,602] INFO DefaultSessionIdManager workerName=node0 (org.eclipse.jetty.server.session:365) [2019-07-10 00:04:37,602] WARN The configuration 'offset.flush.interval.ms' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig:355) [2019-07-10 00:04:37,603] INFO No SessionScavenger set, using defaults (org.eclipse.jetty.server.session:370) [2019-07-10 00:04:37,603] WARN The configuration 'rest.port' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig:355) [2019-07-10 00:04:37,603] INFO node0 Scavenging every 600000ms (org.eclipse.jetty.server.session:149) [2019-07-10 00:04:37,603] WARN The configuration 'config.storage.replication.factor' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig:355) [2019-07-10 00:04:37,604] WARN The configuration 'status.storage.replication.factor' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig:355) [2019-07-10 00:04:37,604] WARN The configuration 'offset.storage.replication.factor' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig:355) [2019-07-10 00:04:37,604] WARN The configuration 'offset.storage.topic' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig:355) [2019-07-10 00:04:37,604] WARN The configuration 'value.converter' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig:355) [2019-07-10 00:04:37,605] WARN The configuration 'key.converter' was supplied but isn't a known config. 
(org.apache.kafka.clients.admin.AdminClientConfig:355) [2019-07-10 00:04:37,605] INFO Kafka version: 2.4.0-SNAPSHOT (org.apache.kafka.common.utils.AppInfoParser:117) [2019-07-10 00:04:37,605] INFO Kafka commitId: 3352b1493f680a96 (org.apache.kafka.common.utils.AppInfoParser:118) [2019-07-10 00:04:37,605] INFO Kafka startTimeMs: 1562717077605 (org.apache.kafka.common.utils.AppInfoParser:119) [2019-07-10 00:04:37,612] INFO Got user-level KeeperException when processing sessionid:0x1054986967c0000 type:setData cxid:0x3e zxid:0x1d txntype:-1 reqpath:n/a Error Path:/config/topics/connect-offset-topic-connect-cluster Error:KeeperErrorCode = NoNode for /config/topics/connect-offset-topic-connect-cluster (org.apache.zookeeper.server.PrepRequestProcessor:653) [2019-07-10 00:04:37,705] WARN [simple-source|task-0] [Producer clientId=connector-producer-simple-source-0] Connection to node 0 (localhost/127.0.0.1:39387) could not be established. Broker may not be available. (org.apache.kafka.clients.NetworkClient:748) [2019-07-10 00:04:37,707] INFO Started o.e.j.s.ServletContextHandler@5316af68{/,null,AVAILABLE} (org.eclipse.j ...[truncated 122812112 chars]... als.Sender.run(Sender.java:278) at java.base/java.lang.Thread.run(Thread.java:834) [2019-07-10 00:05:52,708] DEBUG [simple-conn|task-2] WorkerSourceTask{id=simple-conn-2} Failed record: SourceRecord{sourcePartition={task.id=simple-conn-2}, sourceOffset={saved=91836}} ConnectRecord{topic='test-topic', kafkaPartition=null, key=key-simple-conn-2-91836, keySchema=Schema{STRING}, value=value-simple-conn-2-91836, valueSchema=Schema{STRING}, timestamp=null, headers=ConnectHeaders(headers=)} (org.apache.kafka.connect.runtime.WorkerSourceTask:331) [2019-07-10 00:05:52,708] DEBUG [simple-conn|task-1] WorkerSourceTask{id=simple-conn-1} Failed record: SourceRecord{sourcePartition={task.id=simple-conn-1}, sourceOffset={saved=65150}} ConnectRecord{topic='test-topic', kafkaPartition=null, key=key-simple-conn-1-65150, keySchema=Schema{STRING}, value=value-simple-conn-1-65150, valueSchema=Schema{STRING}, timestamp=null, headers=ConnectHeaders(headers=)} (org.apache.kafka.connect.runtime.WorkerSourceTask:331) [2019-07-10 00:05:52,708] ERROR [simple-conn|task-2] WorkerSourceTask{id=simple-conn-2} failed to send record to test-topic: (org.apache.kafka.connect.runtime.WorkerSourceTask:330) org.apache.kafka.common.KafkaException: Producer is closed forcefully. at org.apache.kafka.clients.producer.internals.RecordAccumulator.abortBatches(RecordAccumulator.java:729) at org.apache.kafka.clients.producer.internals.RecordAccumulator.abortIncompleteBatches(RecordAccumulator.java:716) at org.apache.kafka.clients.producer.internals.Sender.run(Sender.java:278) at java.base/java.lang.Thread.run(Thread.java:834) [2019-07-10 00:05:52,708] ERROR [simple-conn|task-1] WorkerSourceTask{id=simple-conn-1} failed to send record to test-topic: (org.apache.kafka.connect.runtime.WorkerSourceTask:330) org.apache.kafka.common.KafkaException: Producer is closed forcefully. 
at org.apache.kafka.clients.producer.internals.RecordAccumulator.abortBatches(RecordAccumulator.java:729) at org.apache.kafka.clients.producer.internals.RecordAccumulator.abortIncompleteBatches(RecordAccumulator.java:716) at org.apache.kafka.clients.producer.internals.Sender.run(Sender.java:278) at java.base/java.lang.Thread.run(Thread.java:834) [2019-07-10 00:05:52,708] DEBUG [simple-conn|task-2] WorkerSourceTask{id=simple-conn-2} Failed record: SourceRecord{sourcePartition={task.id=simple-conn-2}, sourceOffset={saved=91844}} ConnectRecord{topic='test-topic', kafkaPartition=null, key=key-simple-conn-2-91844, keySchema=Schema{STRING}, value=value-simple-conn-2-91844, valueSchema=Schema{STRING}, timestamp=null, headers=ConnectHeaders(headers=)} (org.apache.kafka.connect.runtime.WorkerSourceTask:331) [2019-07-10 00:05:52,708] DEBUG [simple-conn|task-1] WorkerSourceTask{id=simple-conn-1} Failed record: SourceRecord{sourcePartition={task.id=simple-conn-1}, sourceOffset={saved=65153}} ConnectRecord{topic='test-topic', kafkaPartition=null, key=key-simple-conn-1-65153, keySchema=Schema{STRING}, value=value-simple-conn-1-65153, valueSchema=Schema{STRING}, timestamp=null, headers=ConnectHeaders(headers=)} (org.apache.kafka.connect.runtime.WorkerSourceTask:331) [2019-07-10 00:05:52,708] ERROR [simple-conn|task-2] WorkerSourceTask{id=simple-conn-2} failed to send record to test-topic: (org.apache.kafka.connect.runtime.WorkerSourceTask:330) org.apache.kafka.common.KafkaException: Producer is closed forcefully. at org.apache.kafka.clients.producer.internals.RecordAccumulator.abortBatches(RecordAccumulator.java:729) at org.apache.kafka.clients.producer.internals.RecordAccumulator.abortIncompleteBatches(RecordAccumulator.java:716) at org.apache.kafka.clients.producer.internals.Sender.run(Sender.java:278) at java.base/java.lang.Thread.run(Thread.java:834) [2019-07-10 00:05:52,709] ERROR [simple-conn|task-1] WorkerSourceTask{id=simple-conn-1} failed to send record to test-topic: (org.apache.kafka.connect.runtime.WorkerSourceTask:330) org.apache.kafka.common.KafkaException: Producer is closed forcefully. 
at org.apache.kafka.clients.producer.internals.RecordAccumulator.abortBatches(RecordAccumulator.java:729) at org.apache.kafka.clients.producer.internals.RecordAccumulator.abortIncompleteBatches(RecordAccumulator.java:716) at org.apache.kafka.clients.producer.internals.Sender.run(Sender.java:278) at java.base/java.lang.Thread.run(Thread.java:834) [2019-07-10 00:05:52,709] DEBUG [simple-conn|task-2] WorkerSourceTask{id=simple-conn-2} Failed record: SourceRecord{sourcePartition={task.id=simple-conn-2}, sourceOffset={saved=91845}} ConnectRecord{topic='test-topic', kafkaPartition=null, key=key-simple-conn-2-91845, keySchema=Schema{STRING}, value=value-simple-conn-2-91845, valueSchema=Schema{STRING}, timestamp=null, headers=ConnectHeaders(headers=)} (org.apache.kafka.connect.runtime.WorkerSourceTask:331) [2019-07-10 00:05:52,709] DEBUG [simple-conn|task-1] WorkerSourceTask{id=simple-conn-1} Failed record: SourceRecord{sourcePartition={task.id=simple-conn-1}, sourceOffset={saved=65154}} ConnectRecord{topic='test-topic', kafkaPartition=null, key=key-simple-conn-1-65154, keySchema=Schema{STRING}, value=value-simple-conn-1-65154, valueSchema=Schema{STRING}, timestamp=null, headers=ConnectHeaders(headers=)} (org.apache.kafka.connect.runtime.WorkerSourceTask:331) [2019-07-10 00:05:52,709] ERROR [simple-conn|task-2] WorkerSourceTask{id=simple-conn-2} failed to send record to test-topic: (org.apache.kafka.connect.runtime.WorkerSourceTask:330) org.apache.kafka.common.KafkaException: Producer is closed forcefully. at org.apache.kafka.clients.producer.internals.RecordAccumulator.abortBatches(RecordAccumulator.java:729) at org.apache.kafka.clients.producer.internals.RecordAccumulator.abortIncompleteBatches(RecordAccumulator.java:716) at org.apache.kafka.clients.producer.internals.Sender.run(Sender.java:278) at java.base/java.lang.Thread.run(Thread.java:834) [2019-07-10 00:05:52,709] ERROR [simple-conn|task-1] WorkerSourceTask{id=simple-conn-1} failed to send record to test-topic: (org.apache.kafka.connect.runtime.WorkerSourceTask:330) org.apache.kafka.common.KafkaException: Producer is closed forcefully. 
at org.apache.kafka.clients.producer.internals.RecordAccumulator.abortBatches(RecordAccumulator.java:729) at org.apache.kafka.clients.producer.internals.RecordAccumulator.abortIncompleteBatches(RecordAccumulator.java:716) at org.apache.kafka.clients.producer.internals.Sender.run(Sender.java:278) at java.base/java.lang.Thread.run(Thread.java:834) [2019-07-10 00:05:52,709] DEBUG [simple-conn|task-2] WorkerSourceTask{id=simple-conn-2} Failed record: SourceRecord{sourcePartition={task.id=simple-conn-2}, sourceOffset={saved=91846}} ConnectRecord{topic='test-topic', kafkaPartition=null, key=key-simple-conn-2-91846, keySchema=Schema{STRING}, value=value-simple-conn-2-91846, valueSchema=Schema{STRING}, timestamp=null, headers=ConnectHeaders(headers=)} (org.apache.kafka.connect.runtime.WorkerSourceTask:331) [2019-07-10 00:05:52,710] DEBUG [simple-conn|task-1] WorkerSourceTask{id=simple-conn-1} Failed record: SourceRecord{sourcePartition={task.id=simple-conn-1}, sourceOffset={saved=65155}} ConnectRecord{topic='test-topic', kafkaPartition=null, key=key-simple-conn-1-65155, keySchema=Schema{STRING}, value=value-simple-conn-1-65155, valueSchema=Schema{STRING}, timestamp=null, headers=ConnectHeaders(headers=)} (org.apache.kafka.connect.runtime.WorkerSourceTask:331) [2019-07-10 00:05:52,710] ERROR [simple-conn|task-2] WorkerSourceTask{id=simple-conn-2} failed to send record to test-topic: (org.apache.kafka.connect.runtime.WorkerSourceTask:330) org.apache.kafka.common.KafkaException: Producer is closed forcefully. at org.apache.kafka.clients.producer.internals.RecordAccumulator.abortBatches(RecordAccumulator.java:729) at org.apache.kafka.clients.producer.internals.RecordAccumulator.abortIncompleteBatches(RecordAccumulator.java:716) at org.apache.kafka.clients.producer.internals.Sender.run(Sender.java:278) at java.base/java.lang.Thread.run(Thread.java:834) [2019-07-10 00:05:52,710] ERROR [simple-conn|task-1] WorkerSourceTask{id=simple-conn-1} failed to send record to test-topic: (org.apache.kafka.connect.runtime.WorkerSourceTask:330) org.apache.kafka.common.KafkaException: Producer is closed forcefully. 
at org.apache.kafka.clients.producer.internals.RecordAccumulator.abortBatches(RecordAccumulator.java:729) at org.apache.kafka.clients.producer.internals.RecordAccumulator.abortIncompleteBatches(RecordAccumulator.java:716) at org.apache.kafka.clients.producer.internals.Sender.run(Sender.java:278) at java.base/java.lang.Thread.run(Thread.java:834) [2019-07-10 00:05:52,710] DEBUG [simple-conn|task-2] WorkerSourceTask{id=simple-conn-2} Failed record: SourceRecord{sourcePartition={task.id=simple-conn-2}, sourceOffset={saved=91847}} ConnectRecord{topic='test-topic', kafkaPartition=null, key=key-simple-conn-2-91847, keySchema=Schema{STRING}, value=value-simple-conn-2-91847, valueSchema=Schema{STRING}, timestamp=null, headers=ConnectHeaders(headers=)} (org.apache.kafka.connect.runtime.WorkerSourceTask:331) [2019-07-10 00:05:52,710] DEBUG [simple-conn|task-1] WorkerSourceTask{id=simple-conn-1} Failed record: SourceRecord{sourcePartition={task.id=simple-conn-1}, sourceOffset={saved=65156}} ConnectRecord{topic='test-topic', kafkaPartition=null, key=key-simple-conn-1-65156, keySchema=Schema{STRING}, value=value-simple-conn-1-65156, valueSchema=Schema{STRING}, timestamp=null, headers=ConnectHeaders(headers=)} (org.apache.kafka.connect.runtime.WorkerSourceTask:331) [2019-07-10 00:05:52,710] ERROR [simple-conn|task-2] WorkerSourceTask{id=simple-conn-2} failed to send record to test-topic: (org.apache.kafka.connect.runtime.WorkerSourceTask:330) org.apache.kafka.common.KafkaException: Producer is closed forcefully. at org.apache.kafka.clients.producer.internals.RecordAccumulator.abortBatches(RecordAccumulator.java:729) at org.apache.kafka.clients.producer.internals.RecordAccumulator.abortIncompleteBatches(RecordAccumulator.java:716) at org.apache.kafka.clients.producer.internals.Sender.run(Sender.java:278) at java.base/java.lang.Thread.run(Thread.java:834) [2019-07-10 00:05:52,710] ERROR [simple-conn|task-1] WorkerSourceTask{id=simple-conn-1} failed to send record to test-topic: (org.apache.kafka.connect.runtime.WorkerSourceTask:330) org.apache.kafka.common.KafkaException: Producer is closed forcefully. 
at org.apache.kafka.clients.producer.internals.RecordAccumulator.abortBatches(RecordAccumulator.java:729) at org.apache.kafka.clients.producer.internals.RecordAccumulator.abortIncompleteBatches(RecordAccumulator.java:716) at org.apache.kafka.clients.producer.internals.Sender.run(Sender.java:278) at java.base/java.lang.Thread.run(Thread.java:834) [2019-07-10 00:05:52,711] DEBUG [simple-conn|task-2] WorkerSourceTask{id=simple-conn-2} Failed record: SourceRecord{sourcePartition={task.id=simple-conn-2}, sourceOffset={saved=91851}} ConnectRecord{topic='test-topic', kafkaPartition=null, key=key-simple-conn-2-91851, keySchema=Schema{STRING}, value=value-simple-conn-2-91851, valueSchema=Schema{STRING}, timestamp=null, headers=ConnectHeaders(headers=)} (org.apache.kafka.connect.runtime.WorkerSourceTask:331) [2019-07-10 00:05:52,711] DEBUG [simple-conn|task-1] WorkerSourceTask{id=simple-conn-1} Failed record: SourceRecord{sourcePartition={task.id=simple-conn-1}, sourceOffset={saved=65157}} ConnectRecord{topic='test-topic', kafkaPartition=null, key=key-simple-conn-1-65157, keySchema=Schema{STRING}, value=value-simple-conn-1-65157, valueSchema=Schema{STRING}, timestamp=null, headers=ConnectHeaders(headers=)} (org.apache.kafka.connect.runtime.WorkerSourceTask:331) [2019-07-10 00:05:52,711] ERROR [simple-conn|task-2] WorkerSourceTask{id=simple-conn-2} failed to send record to test-topic: (org.apache.kafka.connect.runtime.WorkerSourceTask:330) org.apache.kafka.common.KafkaException: Producer is closed forcefully. at org.apache.kafka.clients.producer.internals.RecordAccumulator.abortBatches(RecordAccumulator.java:729) at org.apache.kafka.clients.producer.internals.RecordAccumulator.abortIncompleteBatches(RecordAccumulator.java:716) at org.apache.kafka.clients.producer.internals.Sender.run(Sender.java:278) at java.base/java.lang.Thread.run(Thread.java:834) [2019-07-10 00:05:52,711] ERROR [simple-conn|task-1] WorkerSourceTask{id=simple-conn-1} failed to send record to test-topic: (org.apache.kafka.connect.runtime.WorkerSourceTask:330) org.apache.kafka.common.KafkaException: Producer is closed forcefully. 
at org.apache.kafka.clients.producer.internals.RecordAccumulator.abortBatches(RecordAccumulator.java:729) at org.apache.kafka.clients.producer.internals.RecordAccumulator.abortIncompleteBatches(RecordAccumulator.java:716) at org.apache.kafka.clients.producer.internals.Sender.run(Sender.java:278) at java.base/java.lang.Thread.run(Thread.java:834) [2019-07-10 00:05:52,711] DEBUG [simple-conn|task-2] WorkerSourceTask{id=simple-conn-2} Failed record: SourceRecord{sourcePartition={task.id=simple-conn-2}, sourceOffset={saved=91852}} ConnectRecord{topic='test-topic', kafkaPartition=null, key=key-simple-conn-2-91852, keySchema=Schema{STRING}, value=value-simple-conn-2-91852, valueSchema=Schema{STRING}, timestamp=null, headers=ConnectHeaders(headers=)} (org.apache.kafka.connect.runtime.WorkerSourceTask:331) [2019-07-10 00:05:52,711] DEBUG [simple-conn|task-1] WorkerSourceTask{id=simple-conn-1} Failed record: SourceRecord{sourcePartition={task.id=simple-conn-1}, sourceOffset={saved=65161}} ConnectRecord{topic='test-topic', kafkaPartition=null, key=key-simple-conn-1-65161, keySchema=Schema{STRING}, value=value-simple-conn-1-65161, valueSchema=Schema{STRING}, timestamp=null, headers=ConnectHeaders(headers=)} (org.apache.kafka.connect.runtime.WorkerSourceTask:331) [2019-07-10 00:05:52,711] ERROR [simple-conn|task-2] WorkerSourceTask{id=simple-conn-2} failed to send record to test-topic: (org.apache.kafka.connect.runtime.WorkerSourceTask:330) org.apache.kafka.common.KafkaException: Producer is closed forcefully. at org.apache.kafka.clients.producer.internals.RecordAccumulator.abortBatches(RecordAccumulator.java:729) at org.apache.kafka.clients.producer.internals.RecordAccumulator.abortIncompleteBatches(RecordAccumulator.java:716) at org.apache.kafka.clients.producer.internals.Sender.run(Sender.java:278) at java.base/java.lang.Thread.run(Thread.java:834) [2019-07-10 00:05:52,712] ERROR [simple-conn|task-1] WorkerSourceTask{id=simple-conn-1} failed to send record to test-topic: (org.apache.kafka.connect.runtime.WorkerSourceTask:330) org.apache.kafka.common.KafkaException: Producer is closed forcefully. 
at org.apache.kafka.clients.producer.internals.RecordAccumulator.abortBatches(RecordAccumulator.java:729) at org.apache.kafka.clients.producer.internals.RecordAccumulator.abortIncompleteBatches(RecordAccumulator.java:716) at org.apache.kafka.clients.producer.internals.Sender.run(Sender.java:278) at java.base/java.lang.Thread.run(Thread.java:834) [2019-07-10 00:05:52,712] DEBUG [simple-conn|task-2] WorkerSourceTask{id=simple-conn-2} Failed record: SourceRecord{sourcePartition={task.id=simple-conn-2}, sourceOffset={saved=91854}} ConnectRecord{topic='test-topic', kafkaPartition=null, key=key-simple-conn-2-91854, keySchema=Schema{STRING}, value=value-simple-conn-2-91854, valueSchema=Schema{STRING}, timestamp=null, headers=ConnectHeaders(headers=)} (org.apache.kafka.connect.runtime.WorkerSourceTask:331) [2019-07-10 00:05:52,712] DEBUG [simple-conn|task-1] WorkerSourceTask{id=simple-conn-1} Failed record: SourceRecord{sourcePartition={task.id=simple-conn-1}, sourceOffset={saved=65162}} ConnectRecord{topic='test-topic', kafkaPartition=null, key=key-simple-conn-1-65162, keySchema=Schema{STRING}, value=value-simple-conn-1-65162, valueSchema=Schema{STRING}, timestamp=null, headers=ConnectHeaders(headers=)} (org.apache.kafka.connect.runtime.WorkerSourceTask:331) [2019-07-10 00:05:52,712] ERROR [simple-conn|task-2] WorkerSourceTask{id=simple-conn-2} failed to send record to test-topic: (org.apache.kafka.connect.runtime.WorkerSourceTask:330) org.apache.kafka.common.KafkaException: Producer is closed forcefully. at org.apache.kafka.clients.producer.internals.RecordAccumulator.abortBatches(RecordAccumulator.java:729) at org.apache.kafka.clients.producer.internals.RecordAccumulator.abortIncompleteBatches(RecordAccumulator.java:716) at org.apache.kafka.clients.producer.internals.Sender.run(Sender.java:278) at java.base/java.lang.Thread.run(Thread.java:834) [2019-07-10 00:05:52,712] ERROR [simple-conn|task-1] WorkerSourceTask{id=simple-conn-1} failed to send record to test-topic: (org.apache.kafka.connect.runtime.WorkerSourceTask:330) org.apache.kafka.common.KafkaException: Producer is closed forcefully. 
at org.apache.kafka.clients.producer.internals.RecordAccumulator.abortBatches(RecordAccumulator.java:729) at org.apache.kafka.clients.producer.internals.RecordAccumulator.abortIncompleteBatches(RecordAccumulator.java:716) at org.apache.kafka.clients.producer.internals.Sender.run(Sender.java:278) at java.base/java.lang.Thread.run(Thread.java:834) [2019-07-10 00:05:52,712] DEBUG [simple-conn|task-2] WorkerSourceTask{id=simple-conn-2} Failed record: SourceRecord{sourcePartition={task.id=simple-conn-2}, sourceOffset={saved=91865}} ConnectRecord{topic='test-topic', kafkaPartition=null, key=key-simple-conn-2-91865, keySchema=Schema{STRING}, value=value-simple-conn-2-91865, valueSchema=Schema{STRING}, timestamp=null, headers=ConnectHeaders(headers=)} (org.apache.kafka.connect.runtime.WorkerSourceTask:331) [2019-07-10 00:05:52,713] DEBUG [simple-conn|task-1] WorkerSourceTask{id=simple-conn-1} Failed record: SourceRecord{sourcePartition={task.id=simple-conn-1}, sourceOffset={saved=65167}} ConnectRecord{topic='test-topic', kafkaPartition=null, key=key-simple-conn-1-65167, keySchema=Schema{STRING}, value=value-simple-conn-1-65167, valueSchema=Schema{STRING}, timestamp=null, headers=ConnectHeaders(headers=)} (org.apache.kafka.connect.runtime.WorkerSourceTask:331) [2019-07-10 00:05:52,713] ERROR [simple-conn|task-2] WorkerSourceTask{id=simple-conn-2} failed to send record to test-topic: (org.apache.kafka.connect.runtime.WorkerSourceTask:330) org.apache.kafka.common.KafkaException: Producer is closed forcefully. at org.apache.kafka.clients.producer.internals.RecordAccumulator.abortBatches(RecordAccumulator.java:729) at org.apache.kafka.clients.producer.internals.RecordAccumulator.abortIncompleteBatches(RecordAccumulator.java:716) at org.apache.kafka.clients.producer.internals.Sender.run(Sender.java:278) at java.base/java.lang.Thread.run(Thread.java:834) [2019-07-10 00:05:52,713] ERROR [simple-conn|task-1] WorkerSourceTask{id=simple-conn-1} failed to send record to test-topic: (org.apache.kafka.connect.runtime.WorkerSourceTask:330) org.apache.kafka.common.KafkaException: Producer is closed forcefully. 
at org.apache.kafka.clients.producer.internals.RecordAccumulator.abortBatches(RecordAccumulator.java:729) at org.apache.kafka.clients.producer.internals.RecordAccumulator.abortIncompleteBatches(RecordAccumulator.java:716) at org.apache.kafka.clients.producer.internals.Sender.run(Sender.java:278) at java.base/java.lang.Thread.run(Thread.java:834) [2019-07-10 00:05:52,713] DEBUG [simple-conn|task-2] WorkerSourceTask{id=simple-conn-2} Failed record: SourceRecord{sourcePartition={task.id=simple-conn-2}, sourceOffset={saved=91866}} ConnectRecord{topic='test-topic', kafkaPartition=null, key=key-simple-conn-2-91866, keySchema=Schema{STRING}, value=value-simple-conn-2-91866, valueSchema=Schema{STRING}, timestamp=null, headers=ConnectHeaders(headers=)} (org.apache.kafka.connect.runtime.WorkerSourceTask:331) [2019-07-10 00:05:52,713] DEBUG [simple-conn|task-1] WorkerSourceTask{id=simple-conn-1} Failed record: SourceRecord{sourcePartition={task.id=simple-conn-1}, sourceOffset={saved=65171}} ConnectRecord{topic='test-topic', kafkaPartition=null, key=key-simple-conn-1-65171, keySchema=Schema{STRING}, value=value-simple-conn-1-65171, valueSchema=Schema{STRING}, timestamp=null, headers=ConnectHeaders(headers=)} (org.apache.kafka.connect.runtime.WorkerSourceTask:331) [2019-07-10 00:05:52,713] ERROR [simple-conn|task-2] WorkerSourceTask{id=simple-conn-2} failed to send record to test-topic: (org.apache.kafka.connect.runtime.WorkerSourceTask:330) org.apache.kafka.common.KafkaException: Producer is closed forcefully. at org.apache.kafka.clients.producer.internals.RecordAccumulator.abortBatches(RecordAccumulator.java:729) at org.apache.kafka.clients.producer.internals.RecordAccumulator.abortIncompleteBatches(RecordAccumulator.java:716) at org.apache.kafka.clients.producer.internals.Sender.run(Sender.java:278) at java.base/java.lang.Thread.run(Thread.java:834) [2019-07-10 00:05:52,713] ERROR [simple-conn|task-1] WorkerSourceTask{id=simple-conn-1} failed to send record to test-topic: (org.apache.kafka.connect.runtime.WorkerSourceTask:330) org.apache.kafka.common.KafkaException: Producer is closed forcefully. 
at org.apache.kafka.clients.producer.internals.RecordAccumulator.abortBatches(RecordAccumulator.java:729) at org.apache.kafka.clients.producer.internals.RecordAccumulator.abortIncompleteBatches(RecordAccumulator.java:716) at org.apache.kafka.clients.producer.internals.Sender.run(Sender.java:278) at java.base/java.lang.Thread.run(Thread.java:834) [2019-07-10 00:05:52,714] DEBUG [simple-conn|task-2] WorkerSourceTask{id=simple-conn-2} Failed record: SourceRecord{sourcePartition={task.id=simple-conn-2}, sourceOffset={saved=91871}} ConnectRecord{topic='test-topic', kafkaPartition=null, key=key-simple-conn-2-91871, keySchema=Schema{STRING}, value=value-simple-conn-2-91871, valueSchema=Schema{STRING}, timestamp=null, headers=ConnectHeaders(headers=)} (org.apache.kafka.connect.runtime.WorkerSourceTask:331) [2019-07-10 00:05:52,714] DEBUG [simple-conn|task-1] WorkerSourceTask{id=simple-conn-1} Failed record: SourceRecord{sourcePartition={task.id=simple-conn-1}, sourceOffset={saved=65172}} ConnectRecord{topic='test-topic', kafkaPartition=null, key=key-simple-conn-1-65172, keySchema=Schema{STRING}, value=value-simple-conn-1-65172, valueSchema=Schema{STRING}, timestamp=null, headers=ConnectHeaders(headers=)} (org.apache.kafka.connect.runtime.WorkerSourceTask:331) [2019-07-10 00:05:52,714] ERROR [simple-conn|task-2] WorkerSourceTask{id=simple-conn-2} failed to send record to test-topic: (org.apache.kafka.connect.runtime.WorkerSourceTask:330) org.apache.kafka.common.KafkaException: Producer is closed forcefully. at org.apache.kafka.clients.producer.internals.RecordAccumulator.abortBatches(RecordAccumulator.java:729) at org.apache.kafka.clients.producer.internals.RecordAccumulator.abortIncompleteBatches(RecordAccumulator.java:716) at org.apache.kafka.clients.producer.internals.Sender.run(Sender.java:278) at java.base/java.lang.Thread.run(Thread.java:834) [2019-07-10 00:05:52,714] ERROR [simple-conn|task-1] WorkerSourceTask{id=simple-conn-1} failed to send record to test-topic: (org.apache.kafka.connect.runtime.WorkerSourceTask:330) org.apache.kafka.common.KafkaException: Producer is closed forcefully. 
at org.apache.kafka.clients.producer.internals.RecordAccumulator.abortBatches(RecordAccumulator.java:729) at org.apache.kafka.clients.producer.internals.RecordAccumulator.abortIncompleteBatches(RecordAccumulator.java:716) at org.apache.kafka.clients.producer.internals.Sender.run(Sender.java:278) at java.base/java.lang.Thread.run(Thread.java:834) [2019-07-10 00:05:52,714] DEBUG [simple-conn|task-2] WorkerSourceTask{id=simple-conn-2} Failed record: SourceRecord{sourcePartition={task.id=simple-conn-2}, sourceOffset={saved=91874}} ConnectRecord{topic='test-topic', kafkaPartition=null, key=key-simple-conn-2-91874, keySchema=Schema{STRING}, value=value-simple-conn-2-91874, valueSchema=Schema{STRING}, timestamp=null, headers=ConnectHeaders(headers=)} (org.apache.kafka.connect.runtime.WorkerSourceTask:331) [2019-07-10 00:05:52,714] DEBUG [simple-conn|task-1] WorkerSourceTask{id=simple-conn-1} Failed record: SourceRecord{sourcePartition={task.id=simple-conn-1}, sourceOffset={saved=65173}} ConnectRecord{topic='test-topic', kafkaPartition=null, key=key-simple-conn-1-65173, keySchema=Schema{STRING}, value=value-simple-conn-1-65173, valueSchema=Schema{STRING}, timestamp=null, headers=ConnectHeaders(headers=)} (org.apache.kafka.connect.runtime.WorkerSourceTask:331) [2019-07-10 00:05:52,714] ERROR [simple-conn|task-2] WorkerSourceTask{id=simple-conn-2} failed to send record to test-topic: (org.apache.kafka.connect.runtime.WorkerSourceTask:330) org.apache.kafka.common.KafkaException: Producer is closed forcefully. at org.apache.kafka.clients.producer.internals.RecordAccumulator.abortBatches(RecordAccumulator.java:729) at org.apache.kafka.clients.producer.internals.RecordAccumulator.abortIncompleteBatches(RecordAccumulator.java:716) at org.apache.kafka.clients.producer.internals.Sender.run(Sender.java:278) at java.base/java.lang.Thread.run(Thread.java:834) [2019-07-10 00:05:52,715] ERROR [simple-conn|task-1] WorkerSourceTask{id=simple-conn-1} failed to send record to test-topic: (org.apache.kafka.connect.runtime.WorkerSourceTask:330) org.apache.kafka.common.KafkaException: Producer is closed forcefully. 
at org.apache.kafka.clients.producer.internals.RecordAccumulator.abortBatches(RecordAccumulator.java:729) at org.apache.kafka.clients.producer.internals.RecordAccumulator.abortIncompleteBatches(RecordAccumulator.java:716) at org.apache.kafka.clients.producer.internals.Sender.run(Sender.java:278) at java.base/java.lang.Thread.run(Thread.java:834) [2019-07-10 00:05:52,715] DEBUG [simple-conn|task-2] WorkerSourceTask{id=simple-conn-2} Failed record: SourceRecord{sourcePartition={task.id=simple-conn-2}, sourceOffset={saved=91875}} ConnectRecord{topic='test-topic', kafkaPartition=null, key=key-simple-conn-2-91875, keySchema=Schema{STRING}, value=value-simple-conn-2-91875, valueSchema=Schema{STRING}, timestamp=null, headers=ConnectHeaders(headers=)} (org.apache.kafka.connect.runtime.WorkerSourceTask:331) [2019-07-10 00:05:52,715] DEBUG [simple-conn|task-1] WorkerSourceTask{id=simple-conn-1} Failed record: SourceRecord{sourcePartition={task.id=simple-conn-1}, sourceOffset={saved=65174}} ConnectRecord{topic='test-topic', kafkaPartition=null, key=key-simple-conn-1-65174, keySchema=Schema{STRING}, value=value-simple-conn-1-65174, valueSchema=Schema{STRING}, timestamp=null, headers=ConnectHeaders(headers=)} (org.apache.kafka.connect.runtime.WorkerSourceTask:331) [2019-07-10 00:05:52,715] ERROR [simple-conn|task-2] WorkerSourceTask{id=simple-conn-2} failed to send record to test-topic: (org.apache.kafka.connect.runtime.WorkerSourceTask:330) org.apache.kafka.common.KafkaException: Producer is closed forcefully. at org.apache.kafka.clients.producer.internals.RecordAccumulator.abortBatches(RecordAccumulator.java:729) at org.apache.kafka.clients.producer.internals.RecordAccumulator.abortIncompleteBatches(RecordAccumulator.java:716) at org.apache.kafka.clients.producer.internals.Sender.run(Sender.java:278) at java.base/java.lang.Thread.run(Thread.java:834) [2019-07-10 00:05:52,715] ERROR [simple-conn|task-1] WorkerSourceTask{id=simple-conn-1} failed to send record to test-topic: (org.apache.kafka.connect.runtime.WorkerSourceTask:330) org.apache.kafka.common.KafkaException: Producer is closed forcefully. 
at org.apache.kafka.clients.producer.internals.RecordAccumulator.abortBatches(RecordAccumulator.java:729) at org.apache.kafka.clients.producer.internals.RecordAccumulator.abortIncompleteBatches(RecordAccumulator.java:716) at org.apache.kafka.clients.producer.internals.Sender.run(Sender.java:278) at java.base/java.lang.Thread.run(Thread.java:834) [2019-07-10 00:05:52,715] DEBUG [simple-conn|task-2] WorkerSourceTask{id=simple-conn-2} Failed record: SourceRecord{sourcePartition={task.id=simple-conn-2}, sourceOffset={saved=91878}} ConnectRecord{topic='test-topic', kafkaPartition=null, key=key-simple-conn-2-91878, keySchema=Schema{STRING}, value=value-simple-conn-2-91878, valueSchema=Schema{STRING}, timestamp=null, headers=ConnectHeaders(headers=)} (org.apache.kafka.connect.runtime.WorkerSourceTask:331) [2019-07-10 00:05:52,715] DEBUG [simple-conn|task-1] WorkerSourceTask{id=simple-conn-1} Failed record: SourceRecord{sourcePartition={task.id=simple-conn-1}, sourceOffset={saved=65175}} ConnectRecord{topic='test-topic', kafkaPartition=null, key=key-simple-conn-1-65175, keySchema=Schema{STRING}, value=value-simple-conn-1-65175, valueSchema=Schema{STRING}, timestamp=null, headers=ConnectHeaders(headers=)} (org.apache.kafka.connect.runtime.WorkerSourceTask:331) [2019-07-10 00:05:52,716] ERROR [simple-conn|task-2] WorkerSourceTask{id=simple-conn-2} failed to send record to test-topic: (org.apache.kafka.connect.runtime.WorkerSourceTask:330) org.apache.kafka.common.KafkaException: Producer is closed forcefully. at org.apache.kafka.clients.producer.internals.RecordAccumulator.abortBatches(RecordAccumulator.java:729) at org.apache.kafka.clients.producer.internals.RecordAccumulator.abortIncompleteBatches(RecordAccumulator.java:716) at org.apache.kafka.clients.producer.internals.Sender.run(Sender.java:278) at java.base/java.lang.Thread.run(Thread.java:834) [2019-07-10 00:05:52,716] ERROR [simple-conn|task-1] WorkerSourceTask{id=simple-conn-1} failed to send record to test-topic: (org.apache.kafka.connect.runtime.WorkerSourceTask:330) org.apache.kafka.common.KafkaException: Producer is closed forcefully. 
at org.apache.kafka.clients.producer.internals.RecordAccumulator.abortBatches(RecordAccumulator.java:729) at org.apache.kafka.clients.producer.internals.RecordAccumulator.abortIncompleteBatches(RecordAccumulator.java:716) at org.apache.kafka.clients.producer.internals.Sender.run(Sender.java:278) at java.base/java.lang.Thread.run(Thread.java:834) [2019-07-10 00:05:52,716] DEBUG [simple-conn|task-2] WorkerSourceTask{id=simple-conn-2} Failed record: SourceRecord{sourcePartition={task.id=simple-conn-2}, sourceOffset={saved=91880}} ConnectRecord{topic='test-topic', kafkaPartition=null, key=key-simple-conn-2-91880, keySchema=Schema{STRING}, value=value-simple-conn-2-91880, valueSchema=Schema{STRING}, timestamp=null, headers=ConnectHeaders(headers=)} (org.apache.kafka.connect.runtime.WorkerSourceTask:331) [2019-07-10 00:05:52,716] DEBUG [simple-conn|task-1] WorkerSourceTask{id=simple-conn-1} Failed record: SourceRecord{sourcePartition={task.id=simple-conn-1}, sourceOffset={saved=65179}} ConnectRecord{topic='test-topic', kafkaPartition=null, key=key-simple-conn-1-65179, keySchema=Schema{STRING}, value=value-simple-conn-1-65179, valueSchema=Schema{STRING}, timestamp=null, headers=ConnectHeaders(headers=)} (org.apache.kafka.connect.runtime.WorkerSourceTask:331) [2019-07-10 00:05:52,716] ERROR [simple-conn|task-2] WorkerSourceTask{id=simple-conn-2} failed to send record to test-topic: (org.apache.kafka.connect.runtime.WorkerSourceTask:330) org.apache.kafka.common.KafkaException: Producer is closed forcefully. at org.apache.kafka.clients.producer.internals.RecordAccumulator.abortBatches(RecordAccumulator.java:729) at org.apache.kafka.clients.producer.internals.RecordAccumulator.abortIncompleteBatches(RecordAccumulator.java:716) at org.apache.kafka.clients.producer.internals.Sender.run(Sender.java:278) at java.base/java.lang.Thread.run(Thread.java:834) [2019-07-10 00:05:52,716] ERROR [simple-conn|task-1] WorkerSourceTask{id=simple-conn-1} failed to send record to test-topic: (org.apache.kafka.connect.runtime.WorkerSourceTask:330) org.apache.kafka.common.KafkaException: Producer is closed forcefully. 
at org.apache.kafka.clients.producer.internals.RecordAccumulator.abortBatches(RecordAccumulator.java:729) at org.apache.kafka.clients.producer.internals.RecordAccumulator.abortIncompleteBatches(RecordAccumulator.java:716) at org.apache.kafka.clients.producer.internals.Sender.run(Sender.java:278) at java.base/java.lang.Thread.run(Thread.java:834) [2019-07-10 00:05:52,717] DEBUG [simple-conn|task-2] WorkerSourceTask{id=simple-conn-2} Failed record: SourceRecord{sourcePartition={task.id=simple-conn-2}, sourceOffset={saved=91881}} ConnectRecord{topic='test-topic', kafkaPartition=null, key=key-simple-conn-2-91881, keySchema=Schema{STRING}, value=value-simple-conn-2-91881, valueSchema=Schema{STRING}, timestamp=null, headers=ConnectHeaders(headers=)} (org.apache.kafka.connect.runtime.WorkerSourceTask:331) [2019-07-10 00:05:52,717] DEBUG [simple-conn|task-1] WorkerSourceTask{id=simple-conn-1} Failed record: SourceRecord{sourcePartition={task.id=simple-conn-1}, sourceOffset={saved=65180}} ConnectRecord{topic='test-topic', kafkaPartition=null, key=key-simple-conn-1-65180, keySchema=Schema{STRING}, value=value-simple-conn-1-65180, valueSchema=Schema{STRING}, timestamp=null, headers=ConnectHeaders(headers=)} (org.apache.kafka.connect.runtime.WorkerSourceTask:331) [2019-07-10 00:05:52,717] ERROR [simple-conn|task-2] WorkerSourceTask{id=simple-conn-2} failed to send record to test-topic: (org.apache.kafka.connect.runtime.WorkerSourceTask:330) org.apache.kafka.common.KafkaException: Producer is closed forcefully. at org.apache.kafka.clients.producer.internals.RecordAccumulator.abortBatches(RecordAccumulator.java:729) at org.apache.kafka.clients.producer.internals.RecordAccumulator.abortIncompleteBatches(RecordAccumulator.java:716) at org.apache.kafka.clients.producer.internals.Sender.run(Sender.java:278) at java.base/java.lang.Thread.run(Thread.java:834) [2019-07-10 00:05:52,717] ERROR [simple-conn|task-1] WorkerSourceTask{id=simple-conn-1} failed to send record to test-topic: (org.apache.kafka.connect.runtime.WorkerSourceTask:330) org.apache.kafka.common.KafkaException: Producer is closed forcefully. 
at org.apache.kafka.clients.producer.internals.RecordAccumulator.abortBatches(RecordAccumulator.java:729) at org.apache.kafka.clients.producer.internals.RecordAccumulator.abortIncompleteBatches(RecordAccumulator.java:716) at org.apache.kafka.clients.producer.internals.Sender.run(Sender.java:278) at java.base/java.lang.Thread.run(Thread.java:834) [2019-07-10 00:05:52,717] DEBUG [simple-conn|task-2] WorkerSourceTask{id=simple-conn-2} Failed record: SourceRecord{sourcePartition={task.id=simple-conn-2}, sourceOffset={saved=91886}} ConnectRecord{topic='test-topic', kafkaPartition=null, key=key-simple-conn-2-91886, keySchema=Schema{STRING}, value=value-simple-conn-2-91886, valueSchema=Schema{STRING}, timestamp=null, headers=ConnectHeaders(headers=)} (org.apache.kafka.connect.runtime.WorkerSourceTask:331) [2019-07-10 00:05:52,717] DEBUG [simple-conn|task-1] WorkerSourceTask{id=simple-conn-1} Failed record: SourceRecord{sourcePartition={task.id=simple-conn-1}, sourceOffset={saved=65187}} ConnectRecord{topic='test-topic', kafkaPartition=null, key=key-simple-conn-1-65187, keySchema=Schema{STRING}, value=value-simple-conn-1-65187, valueSchema=Schema{STRING}, timestamp=null, headers=ConnectHeaders(headers=)} (org.apache.kafka.connect.runtime.WorkerSourceTask:331) [2019-07-10 00:05:52,717] INFO shutting down (org.apache.zookeeper.server.ZooKeeperServer:502) [2019-07-10 00:05:52,717] ERROR [simple-conn|task-2] WorkerSourceTask{id=simple-conn-2} failed to send record to test-topic: (org.apache.kafka.connect.runtime.WorkerSourceTask:330) org.apache.kafka.common.KafkaException: Producer is closed forcefully. at org.apache.kafka.clients.producer.internals.RecordAccumulator.abortBatches(RecordAccumulator.java:729) at org.apache.kafka.clients.producer.internals.RecordAccumulator.abortIncompleteBatches(RecordAccumulator.java:716) at org.apache.kafka.clients.producer.internals.Sender.run(Sender.java:278) at java.base/java.lang.Thread.run(Thread.java:834) [2019-07-10 00:05:52,718] INFO Shutting down (org.apache.zookeeper.server.SessionTrackerImpl:226) [2019-07-10 00:05:52,718] ERROR [simple-conn|task-1] WorkerSourceTask{id=simple-conn-1} failed to send record to test-topic: (org.apache.kafka.connect.runtime.WorkerSourceTask:330) org.apache.kafka.common.KafkaException: Producer is closed forcefully. at org.apache.kafka.clients.producer.internals.RecordAccumulator.abortBatches(RecordAccumulator.java:729) at org.apache.kafka.clients.producer.internals.RecordAccumulator.abortIncompleteBatches(RecordAccumulator.java:716) at org.apache.kafka.clients.producer.internals.Sender.run(Sender.java:278) at java.base/java.lang.Thread.run(Thread.java:834) [2019-07-10 00:05:52,718] INFO Shutting down (org.apache.zookeeper.server.PrepRequestProcessor:769) [2019-07-10 00:05:52,718] DEBUG [simple-conn|task-2] WorkerSourceTask{id=simple-conn-2} Failed record: SourceRecord{sourcePartition={task.id=simple-conn-2}, sourceOffset={saved=91888}} ConnectRecord{topic='test-topic', kafkaPartition=null, key=key-simple-conn-2-91888, keySchema=Schema{STRING}, value=value-simple-conn-2-91888, valueSchema=Schema{STRING}, timestamp=null, headers=ConnectHeaders(headers=)} (org.apache.kafka.connect.runtime.WorkerSourceTask:331) [2019-07-10 00:05:52,719] INFO PrepRequestProcessor exited loop! 
(org.apache.zookeeper.server.PrepRequestProcessor:144) [2019-07-10 00:05:52,719] INFO Shutting down (org.apache.zookeeper.server.SyncRequestProcessor:208) [2019-07-10 00:05:52,719] DEBUG [simple-conn|task-1] WorkerSourceTask{id=simple-conn-1} Failed record: SourceRecord{sourcePartition={task.id=simple-conn-1}, sourceOffset={saved=65188}} ConnectRecord{topic='test-topic', kafkaPartition=null, key=key-simple-conn-1-65188, keySchema=Schema{STRING}, value=value-simple-conn-1-65188, valueSchema=Schema{STRING}, timestamp=null, headers=ConnectHeaders(headers=)} (org.apache.kafka.connect.runtime.WorkerSourceTask:331) [2019-07-10 00:05:52,720] INFO SyncRequestProcessor exited! (org.apache.zookeeper.server.SyncRequestProcessor:186) [2019-07-10 00:05:52,719] ERROR [simple-conn|task-2] WorkerSourceTask{id=simple-conn-2} failed to send record to test-topic: (org.apache.kafka.connect.runtime.WorkerSourceTask:330) org.apache.kafka.common.KafkaException: Producer is closed forcefully. at org.apache.kafka.clients.producer.internals.RecordAccumulator.abortBatches(RecordAccumulator.java:729) at org.apache.kafka.clients.producer.internals.RecordAccumulator.abortIncompleteBatches(RecordAccumulator.java:716) at org.apache.kafka.clients.producer.internals.Sender.run(Sender.java:278) at java.base/java.lang.Thread.run(Thread.java:834) [2019-07-10 00:05:52,720] INFO shutdown of request processor complete (org.apache.zookeeper.server.FinalRequestProcessor:430) [2019-07-10 00:05:52,720] ERROR [simple-conn|task-1] WorkerSourceTask{id=simple-conn-1} failed to send record to test-topic: (org.apache.kafka.connect.runtime.WorkerSourceTask:330) org.apache.kafka.common.KafkaException: Producer is closed forcefully. at org.apache.kafka.clients.producer.internals.RecordAccumulator.abortBatches(RecordAccumulator.java:729) at org.apache.kafka.clients.producer.internals.RecordAccumulator.abortIncompleteBatches(RecordAccumulator.java:716) at org.apache.kafka.clients.producer.internals.Sender.run(Sender.java:278) at java.base/java.lang.Thread.run(Thread.java:834) [2019-07-10 00:05:52,720] DEBUG [simple-conn|task-2] WorkerSourceTask{id=simple-conn-2} Failed record: SourceRecord{sourcePartition={task.id=simple-conn-2}, sourceOffset={saved=91892}} ConnectRecord{topic='test-topic', kafkaPartition=null, key=key-simple-conn-2-91892, keySchema=Schema{STRING}, value=value-simple-conn-2-91892, valueSchema=Schema{STRING}, timestamp=null, headers=ConnectHeaders(headers=)} (org.apache.kafka.connect.runtime.WorkerSourceTask:331) [2019-07-10 00:05:52,720] DEBUG [simple-conn|task-1] WorkerSourceTask{id=simple-conn-1} Failed record: SourceRecord{sourcePartition={task.id=simple-conn-1}, sourceOffset={saved=65190}} ConnectRecord{topic='test-topic', kafkaPartition=null, key=key-simple-conn-1-65190, keySchema=Schema{STRING}, value=value-simple-conn-1-65190, valueSchema=Schema{STRING}, timestamp=null, headers=ConnectHeaders(headers=)} (org.apache.kafka.connect.runtime.WorkerSourceTask:331) [2019-07-10 00:05:52,721] ERROR [simple-conn|task-2] WorkerSourceTask{id=simple-conn-2} failed to send record to test-topic: (org.apache.kafka.connect.runtime.WorkerSourceTask:330) org.apache.kafka.common.KafkaException: Producer is closed forcefully. 
at org.apache.kafka.clients.producer.internals.RecordAccumulator.abortBatches(RecordAccumulator.java:729) at org.apache.kafka.clients.producer.internals.RecordAccumulator.abortIncompleteBatches(RecordAccumulator.java:716) at org.apache.kafka.clients.producer.internals.Sender.run(Sender.java:278) at java.base/java.lang.Thread.run(Thread.java:834) [2019-07-10 00:05:52,721] ERROR [simple-conn|task-1] WorkerSourceTask{id=simple-conn-1} failed to send record to test-topic: (org.apache.kafka.connect.runtime.WorkerSourceTask:330) org.apache.kafka.common.KafkaException: Producer is closed forcefully. at org.apache.kafka.clients.producer.internals.RecordAccumulator.abortBatches(RecordAccumulator.java:729) at org.apache.kafka.clients.producer.internals.RecordAccumulator.abortIncompleteBatches(RecordAccumulator.java:716) at org.apache.kafka.clients.producer.internals.Sender.run(Sender.java:278) at java.base/java.lang.Thread.run(Thread.java:834) [2019-07-10 00:05:52,721] DEBUG [simple-conn|task-2] WorkerSourceTask{id=simple-conn-2} Failed record: SourceRecord{sourcePartition={task.id=simple-conn-2}, sourceOffset={saved=91897}} ConnectRecord{topic='test-topic', kafkaPartition=null, key=key-simple-conn-2-91897, keySchema=Schema{STRING}, value=value-simple-conn-2-91897, valueSchema=Schema{STRING}, timestamp=null, headers=ConnectHeaders(headers=)} (org.apache.kafka.connect.runtime.WorkerSourceTask:331) [2019-07-10 00:05:52,721] DEBUG [simple-conn|task-1] WorkerSourceTask{id=simple-conn-1} Failed record: SourceRecord{sourcePartition={task.id=simple-conn-1}, sourceOffset={saved=65194}} ConnectRecord{topic='test-topic', kafkaPartition=null, key=key-simple-conn-1-65194, keySchema=Schema{STRING}, value=value-simple-conn-1-65194, valueSchema=Schema{STRING}, timestamp=null, headers=ConnectHeaders(headers=)} (org.apache.kafka.connect.runtime.WorkerSourceTask:331) [2019-07-10 00:05:52,721] ERROR [simple-conn|task-2] WorkerSourceTask{id=simple-conn-2} failed to send record to test-topic: (org.apache.kafka.connect.runtime.WorkerSourceTask:330) org.apache.kafka.common.KafkaException: Producer is closed forcefully. at org.apache.kafka.clients.producer.internals.RecordAccumulator.abortBatches(RecordAccumulator.java:729) at org.apache.kafka.clients.producer.internals.RecordAccumulator.abortIncompleteBatches(RecordAccumulator.java:716) at org.apache.kafka.clients.producer.internals.Sender.run(Sender.java:278) at java.base/java.lang.Thread.run(Thread.java:834) [2019-07-10 00:05:52,721] ERROR [simple-conn|task-1] WorkerSourceTask{id=simple-conn-1} failed to send record to test-topic: (org.apache.kafka.connect.runtime.WorkerSourceTask:330) org.apache.kafka.common.KafkaException: Producer is closed forcefully. 
at org.apache.kafka.clients.producer.internals.RecordAccumulator.abortBatches(RecordAccumulator.java:729) at org.apache.kafka.clients.producer.internals.RecordAccumulator.abortIncompleteBatches(RecordAccumulator.java:716) at org.apache.kafka.clients.producer.internals.Sender.run(Sender.java:278) at java.base/java.lang.Thread.run(Thread.java:834) [2019-07-10 00:05:52,722] DEBUG [simple-conn|task-2] WorkerSourceTask{id=simple-conn-2} Failed record: SourceRecord{sourcePartition={task.id=simple-conn-2}, sourceOffset={saved=91898}} ConnectRecord{topic='test-topic', kafkaPartition=null, key=key-simple-conn-2-91898, keySchema=Schema{STRING}, value=value-simple-conn-2-91898, valueSchema=Schema{STRING}, timestamp=null, headers=ConnectHeaders(headers=)} (org.apache.kafka.connect.runtime.WorkerSourceTask:331) [2019-07-10 00:05:52,722] DEBUG [simple-conn|task-1] WorkerSourceTask{id=simple-conn-1} Failed record: SourceRecord{sourcePartition={task.id=simple-conn-1}, sourceOffset={saved=65196}} ConnectRecord{topic='test-topic', kafkaPartition=null, key=key-simple-conn-1-65196, keySchema=Schema{STRING}, value=value-simple-conn-1-65196, valueSchema=Schema{STRING}, timestamp=null, headers=ConnectHeaders(headers=)} (org.apache.kafka.connect.runtime.WorkerSourceTask:331) [2019-07-10 00:05:52,722] ERROR [simple-conn|task-2] WorkerSourceTask{id=simple-conn-2} failed to send record to test-topic: (org.apache.kafka.connect.runtime.WorkerSourceTask:330) org.apache.kafka.common.KafkaException: Producer is closed forcefully. at org.apache.kafka.clients.producer.internals.RecordAccumulator.abortBatches(RecordAccumulator.java:729) at org.apache.kafka.clients.producer.internals.RecordAccumulator.abortIncompleteBatches(RecordAccumulator.java:716) at org.apache.kafka.clients.producer.internals.Sender.run(Sender.java:278) at java.base/java.lang.Thread.run(Thread.java:834) [2019-07-10 00:05:52,722] ERROR [simple-conn|task-1] WorkerSourceTask{id=simple-conn-1} failed to send record to test-topic: (org.apache.kafka.connect.runtime.WorkerSourceTask:330) org.apache.kafka.common.KafkaException: Producer is closed forcefully. 
at org.apache.kafka.clients.producer.internals.RecordAccumulator.abortBatches(RecordAccumulator.java:729) at org.apache.kafka.clients.producer.internals.RecordAccumulator.abortIncompleteBatches(RecordAccumulator.java:716) at org.apache.kafka.clients.producer.internals.Sender.run(Sender.java:278) at java.base/java.lang.Thread.run(Thread.java:834) [2019-07-10 00:05:52,722] INFO NIOServerCnxn factory exited run method (org.apache.zookeeper.server.NIOServerCnxnFactory:249) [2019-07-10 00:05:52,722] DEBUG [simple-conn|task-2] WorkerSourceTask{id=simple-conn-2} Failed record: SourceRecord{sourcePartition={task.id=simple-conn-2}, sourceOffset={saved=91902}} ConnectRecord{topic='test-topic', kafkaPartition=null, key=key-simple-conn-2-91902, keySchema=Schema{STRING}, value=value-simple-conn-2-91902, valueSchema=Schema{STRING}, timestamp=null, headers=ConnectHeaders(headers=)} (org.apache.kafka.connect.runtime.WorkerSourceTask:331) [2019-07-10 00:05:52,722] DEBUG [simple-conn|task-1] WorkerSourceTask{id=simple-conn-1} Failed record: SourceRecord{sourcePartition={task.id=simple-conn-1}, sourceOffset={saved=65197}} ConnectRecord{topic='test-topic', kafkaPartition=null, key=key-simple-conn-1-65197, keySchema=Schema{STRING}, value=value-simple-conn-1-65197, valueSchema=Schema{STRING}, timestamp=null, headers=ConnectHeaders(headers=)} (org.apache.kafka.connect.runtime.WorkerSourceTask:331) [2019-07-10 00:05:52,723] ERROR [simple-conn|task-2] WorkerSourceTask{id=simple-conn-2} failed to send record to test-topic: (org.apache.kafka.connect.runtime.WorkerSourceTask:330) org.apache.kafka.common.KafkaException: Producer is closed forcefully. at org.apache.kafka.clients.producer.internals.RecordAccumulator.abortBatches(RecordAccumulator.java:729) at org.apache.kafka.clients.producer.internals.RecordAccumulator.abortIncompleteBatches(RecordAccumulator.java:716) at org.apache.kafka.clients.producer.internals.Sender.run(Sender.java:278) at java.base/java.lang.Thread.run(Thread.java:834) [2019-07-10 00:05:52,723] ERROR [simple-conn|task-1] WorkerSourceTask{id=simple-conn-1} failed to send record to test-topic: (org.apache.kafka.connect.runtime.WorkerSourceTask:330) org.apache.kafka.common.KafkaException: Producer is closed forcefully. 
at org.apache.kafka.clients.producer.internals.RecordAccumulator.abortBatches(RecordAccumulator.java:729) at org.apache.kafka.clients.producer.internals.RecordAccumulator.abortIncompleteBatches(RecordAccumulator.java:716) at org.apache.kafka.clients.producer.internals.Sender.run(Sender.java:278) at java.base/java.lang.Thread.run(Thread.java:834) [2019-07-10 00:05:52,723] DEBUG [simple-conn|task-2] WorkerSourceTask{id=simple-conn-2} Failed record: SourceRecord{sourcePartition={task.id=simple-conn-2}, sourceOffset={saved=91907}} ConnectRecord{topic='test-topic', kafkaPartition=null, key=key-simple-conn-2-91907, keySchema=Schema{STRING}, value=value-simple-conn-2-91907, valueSchema=Schema{STRING}, timestamp=null, headers=ConnectHeaders(headers=)} (org.apache.kafka.connect.runtime.WorkerSourceTask:331) [2019-07-10 00:05:52,723] DEBUG [simple-conn|task-1] WorkerSourceTask{id=simple-conn-1} Failed record: SourceRecord{sourcePartition={task.id=simple-conn-1}, sourceOffset={saved=65200}} ConnectRecord{topic='test-topic', kafkaPartition=null, key=key-simple-conn-1-65200, keySchema=Schema{STRING}, value=value-simple-conn-1-65200, valueSchema=Schema{STRING}, timestamp=null, headers=ConnectHeaders(headers=)} (org.apache.kafka.connect.runtime.WorkerSourceTask:331) [2019-07-10 00:05:52,723] ERROR [simple-conn|task-2] WorkerSourceTask{id=simple-conn-2} failed to send record to test-topic: (org.apache.kafka.connect.runtime.WorkerSourceTask:330) org.apache.kafka.common.KafkaException: Producer is closed forcefully. at org.apache.kafka.clients.producer.internals.RecordAccumulator.abortBatches(RecordAccumulator.java:729) at org.apache.kafka.clients.producer.internals.RecordAccumulator.abortIncompleteBatches(RecordAccumulator.java:716) at org.apache.kafka.clients.producer.internals.Sender.run(Sender.java:278) at java.base/java.lang.Thread.run(Thread.java:834) [2019-07-10 00:05:52,723] ERROR [simple-conn|task-1] WorkerSourceTask{id=simple-conn-1} failed to send record to test-topic: (org.apache.kafka.connect.runtime.WorkerSourceTask:330) org.apache.kafka.common.KafkaException: Producer is closed forcefully. at org.apache.kafka.clients.producer.internals.RecordAccumulator.abortBatches(RecordAccumulator.java:729) at org.apache.kafka.clients.producer.internals.RecordAccumulator.abortIncompleteBatches(RecordAccumulator.java:716) at org.apache.kafka.clients.producer.internals.Sender.run(Sender.java:278) at java.base/java.lang.Thread.run(Thread.java:834)
Standard Error
Jul 10, 2019 12:04:37 AM org.glassfish.jersey.internal.inject.Providers checkProviderRuntime
WARNING: A provider org.apache.kafka.connect.runtime.rest.resources.RootResource registered in SERVER runtime does not implement any provider interfaces applicable in the SERVER runtime. Due to constraint configuration problems the provider org.apache.kafka.connect.runtime.rest.resources.RootResource will be ignored.
Jul 10, 2019 12:04:37 AM org.glassfish.jersey.internal.inject.Providers checkProviderRuntime
WARNING: A provider org.apache.kafka.connect.runtime.rest.resources.ConnectorsResource registered in SERVER runtime does not implement any provider interfaces applicable in the SERVER runtime. Due to constraint configuration problems the provider org.apache.kafka.connect.runtime.rest.resources.ConnectorsResource will be ignored.
Jul 10, 2019 12:04:37 AM org.glassfish.jersey.internal.inject.Providers checkProviderRuntime
WARNING: A provider org.apache.kafka.connect.runtime.rest.resources.ConnectorPluginsResource registered in SERVER runtime does not implement any provider interfaces applicable in the SERVER runtime. Due to constraint configuration problems the provider org.apache.kafka.connect.runtime.rest.resources.ConnectorPluginsResource will be ignored.
Jul 10, 2019 12:04:37 AM org.glassfish.jersey.internal.Errors logErrors
WARNING: The following warnings have been detected: WARNING: The (sub)resource method createConnector in org.apache.kafka.connect.runtime.rest.resources.ConnectorsResource contains empty path annotation.
WARNING: The (sub)resource method listConnectors in org.apache.kafka.connect.runtime.rest.resources.ConnectorsResource contains empty path annotation.
WARNING: The (sub)resource method listConnectorPlugins in org.apache.kafka.connect.runtime.rest.resources.ConnectorPluginsResource contains empty path annotation.
WARNING: The (sub)resource method serverInfo in org.apache.kafka.connect.runtime.rest.resources.RootResource contains empty path annotation.
Jul 10, 2019 12:04:38 AM org.glassfish.jersey.internal.inject.Providers checkProviderRuntime
WARNING: A provider org.apache.kafka.connect.runtime.rest.resources.ConnectorPluginsResource registered in SERVER runtime does not implement any provider interfaces applicable in the SERVER runtime. Due to constraint configuration problems the provider org.apache.kafka.connect.runtime.rest.resources.ConnectorPluginsResource will be ignored.
Jul 10, 2019 12:04:38 AM org.glassfish.jersey.internal.inject.Providers checkProviderRuntime
WARNING: A provider org.apache.kafka.connect.runtime.rest.resources.ConnectorsResource registered in SERVER runtime does not implement any provider interfaces applicable in the SERVER runtime. Due to constraint configuration problems the provider org.apache.kafka.connect.runtime.rest.resources.ConnectorsResource will be ignored.
Jul 10, 2019 12:04:38 AM org.glassfish.jersey.internal.inject.Providers checkProviderRuntime
WARNING: A provider org.apache.kafka.connect.runtime.rest.resources.RootResource registered in SERVER runtime does not implement any provider interfaces applicable in the SERVER runtime. Due to constraint configuration problems the provider org.apache.kafka.connect.runtime.rest.resources.RootResource will be ignored.
Jul 10, 2019 12:04:38 AM org.glassfish.jersey.internal.Errors logErrors
WARNING: The following warnings have been detected: WARNING: The (sub)resource method createConnector in org.apache.kafka.connect.runtime.rest.resources.ConnectorsResource contains empty path annotation.
WARNING: The (sub)resource method listConnectors in org.apache.kafka.connect.runtime.rest.resources.ConnectorsResource contains empty path annotation.
WARNING: The (sub)resource method listConnectorPlugins in org.apache.kafka.connect.runtime.rest.resources.ConnectorPluginsResource contains empty path annotation.
WARNING: The (sub)resource method serverInfo in org.apache.kafka.connect.runtime.rest.resources.RootResource contains empty path annotation.
Jul 10, 2019 12:04:39 AM org.glassfish.jersey.internal.inject.Providers checkProviderRuntime
WARNING: A provider org.apache.kafka.connect.runtime.rest.resources.RootResource registered in SERVER runtime does not implement any provider interfaces applicable in the SERVER runtime. Due to constraint configuration problems the provider org.apache.kafka.connect.runtime.rest.resources.RootResource will be ignored.
Jul 10, 2019 12:04:39 AM org.glassfish.jersey.internal.inject.Providers checkProviderRuntime
WARNING: A provider org.apache.kafka.connect.runtime.rest.resources.ConnectorsResource registered in SERVER runtime does not implement any provider interfaces applicable in the SERVER runtime. Due to constraint configuration problems the provider org.apache.kafka.connect.runtime.rest.resources.ConnectorsResource will be ignored.
Jul 10, 2019 12:04:39 AM org.glassfish.jersey.internal.inject.Providers checkProviderRuntime
WARNING: A provider org.apache.kafka.connect.runtime.rest.resources.ConnectorPluginsResource registered in SERVER runtime does not implement any provider interfaces applicable in the SERVER runtime. Due to constraint configuration problems the provider org.apache.kafka.connect.runtime.rest.resources.ConnectorPluginsResource will be ignored.
Jul 10, 2019 12:04:39 AM org.glassfish.jersey.internal.Errors logErrors
WARNING: The following warnings have been detected: WARNING: The (sub)resource method createConnector in org.apache.kafka.connect.runtime.rest.resources.ConnectorsResource contains empty path annotation.
WARNING: The (sub)resource method listConnectors in org.apache.kafka.connect.runtime.rest.resources.ConnectorsResource contains empty path annotation.
WARNING: The (sub)resource method listConnectorPlugins in org.apache.kafka.connect.runtime.rest.resources.ConnectorPluginsResource contains empty path annotation.
WARNING: The (sub)resource method serverInfo in org.apache.kafka.connect.runtime.rest.resources.RootResource contains empty path annotation.
Jul 10, 2019 12:05:24 AM org.glassfish.jersey.internal.inject.Providers checkProviderRuntime
WARNING: A provider org.apache.kafka.connect.runtime.rest.resources.RootResource registered in SERVER runtime does not implement any provider interfaces applicable in the SERVER runtime. Due to constraint configuration problems the provider org.apache.kafka.connect.runtime.rest.resources.RootResource will be ignored.
Jul 10, 2019 12:05:24 AM org.glassfish.jersey.internal.inject.Providers checkProviderRuntime
WARNING: A provider org.apache.kafka.connect.runtime.rest.resources.ConnectorPluginsResource registered in SERVER runtime does not implement any provider interfaces applicable in the SERVER runtime. Due to constraint configuration problems the provider org.apache.kafka.connect.runtime.rest.resources.ConnectorPluginsResource will be ignored.
Jul 10, 2019 12:05:24 AM org.glassfish.jersey.internal.inject.Providers checkProviderRuntime
WARNING: A provider org.apache.kafka.connect.runtime.rest.resources.ConnectorsResource registered in SERVER runtime does not implement any provider interfaces applicable in the SERVER runtime. Due to constraint configuration problems the provider org.apache.kafka.connect.runtime.rest.resources.ConnectorsResource will be ignored.
Jul 10, 2019 12:05:24 AM org.glassfish.jersey.internal.Errors logErrors
WARNING: The following warnings have been detected: WARNING: The (sub)resource method createConnector in org.apache.kafka.connect.runtime.rest.resources.ConnectorsResource contains empty path annotation.
WARNING: The (sub)resource method listConnectors in org.apache.kafka.connect.runtime.rest.resources.ConnectorsResource contains empty path annotation.
WARNING: The (sub)resource method listConnectorPlugins in org.apache.kafka.connect.runtime.rest.resources.ConnectorPluginsResource contains empty path annotation.
WARNING: The (sub)resource method serverInfo in org.apache.kafka.connect.runtime.rest.resources.RootResource contains empty path annotation.
Jul 10, 2019 12:05:25 AM org.glassfish.jersey.internal.inject.Providers checkProviderRuntime
WARNING: A provider org.apache.kafka.connect.runtime.rest.resources.RootResource registered in SERVER runtime does not implement any provider interfaces applicable in the SERVER runtime. Due to constraint configuration problems the provider org.apache.kafka.connect.runtime.rest.resources.RootResource will be ignored.
Jul 10, 2019 12:05:25 AM org.glassfish.jersey.internal.inject.Providers checkProviderRuntime
WARNING: A provider org.apache.kafka.connect.runtime.rest.resources.ConnectorPluginsResource registered in SERVER runtime does not implement any provider interfaces applicable in the SERVER runtime. Due to constraint configuration problems the provider org.apache.kafka.connect.runtime.rest.resources.ConnectorPluginsResource will be ignored.
Jul 10, 2019 12:05:25 AM org.glassfish.jersey.internal.inject.Providers checkProviderRuntime
WARNING: A provider org.apache.kafka.connect.runtime.rest.resources.ConnectorsResource registered in SERVER runtime does not implement any provider interfaces applicable in the SERVER runtime. Due to constraint configuration problems the provider org.apache.kafka.connect.runtime.rest.resources.ConnectorsResource will be ignored.
Jul 10, 2019 12:05:25 AM org.glassfish.jersey.internal.Errors logErrors
WARNING: The following warnings have been detected: WARNING: The (sub)resource method createConnector in org.apache.kafka.connect.runtime.rest.resources.ConnectorsResource contains empty path annotation.
WARNING: The (sub)resource method listConnectors in org.apache.kafka.connect.runtime.rest.resources.ConnectorsResource contains empty path annotation.
WARNING: The (sub)resource method listConnectorPlugins in org.apache.kafka.connect.runtime.rest.resources.ConnectorPluginsResource contains empty path annotation.
WARNING: The (sub)resource method serverInfo in org.apache.kafka.connect.runtime.rest.resources.RootResource contains empty path annotation.
Jul 10, 2019 12:05:26 AM org.glassfish.jersey.internal.inject.Providers checkProviderRuntime
WARNING: A provider org.apache.kafka.connect.runtime.rest.resources.ConnectorsResource registered in SERVER runtime does not implement any provider interfaces applicable in the SERVER runtime. Due to constraint configuration problems the provider org.apache.kafka.connect.runtime.rest.resources.ConnectorsResource will be ignored.
Jul 10, 2019 12:05:26 AM org.glassfish.jersey.internal.inject.Providers checkProviderRuntime
WARNING: A provider org.apache.kafka.connect.runtime.rest.resources.ConnectorPluginsResource registered in SERVER runtime does not implement any provider interfaces applicable in the SERVER runtime. Due to constraint configuration problems the provider org.apache.kafka.connect.runtime.rest.resources.ConnectorPluginsResource will be ignored.
Jul 10, 2019 12:05:26 AM org.glassfish.jersey.internal.inject.Providers checkProviderRuntime
WARNING: A provider org.apache.kafka.connect.runtime.rest.resources.RootResource registered in SERVER runtime does not implement any provider interfaces applicable in the SERVER runtime. Due to constraint configuration problems the provider org.apache.kafka.connect.runtime.rest.resources.RootResource will be ignored.
Jul 10, 2019 12:05:26 AM org.glassfish.jersey.internal.Errors logErrors
WARNING: The following warnings have been detected: WARNING: The (sub)resource method createConnector in org.apache.kafka.connect.runtime.rest.resources.ConnectorsResource contains empty path annotation.
WARNING: The (sub)resource method listConnectors in org.apache.kafka.connect.runtime.rest.resources.ConnectorsResource contains empty path annotation.
WARNING: The (sub)resource method listConnectorPlugins in org.apache.kafka.connect.runtime.rest.resources.ConnectorPluginsResource contains empty path annotation.
WARNING: The (sub)resource method serverInfo in org.apache.kafka.connect.runtime.rest.resources.RootResource contains empty path annotation.