[2019-07-05 11:55:44,512] INFO Created server with tickTime 800 minSessionTimeout 1600 maxSessionTimeout 16000 datadir /tmp/kafka-6410362488660751805/version-2 snapdir /tmp/kafka-12548392351584248291/version-2 (org.apache.zookeeper.server.ZooKeeperServer:174) [2019-07-05 11:55:44,513] INFO binding to port /127.0.0.1:0 (org.apache.zookeeper.server.NIOServerCnxnFactory:89) [2019-07-05 11:55:44,521] INFO Initiating client connection, connectString=127.0.0.1:42325 sessionTimeout=6000 watcher=kafka.zookeeper.ZooKeeperClient$ZooKeeperClientWatcher$@782ae0bb (org.apache.zookeeper.ZooKeeper:442) [2019-07-05 11:55:44,523] INFO Opening socket connection to server localhost/127.0.0.1:42325. Will not attempt to authenticate using SASL (unknown error) (org.apache.zookeeper.ClientCnxn:1025) [2019-07-05 11:55:44,523] INFO Socket connection established to localhost/127.0.0.1:42325, initiating session (org.apache.zookeeper.ClientCnxn:879) [2019-07-05 11:55:44,523] INFO Accepted socket connection from /127.0.0.1:38058 (org.apache.zookeeper.server.NIOServerCnxnFactory:222) [2019-07-05 11:55:44,524] INFO Client attempting to establish new session at /127.0.0.1:38058 (org.apache.zookeeper.server.ZooKeeperServer:949) [2019-07-05 11:55:44,525] INFO Creating new log file: log.1 (org.apache.zookeeper.server.persistence.FileTxnLog:216) [2019-07-05 11:55:44,526] INFO Established session 0x105249c73130000 with negotiated timeout 6000 for client /127.0.0.1:38058 (org.apache.zookeeper.server.ZooKeeperServer:694) [2019-07-05 11:55:44,526] INFO Session establishment complete on server localhost/127.0.0.1:42325, sessionid = 0x105249c73130000, negotiated timeout = 6000 (org.apache.zookeeper.ClientCnxn:1299) [2019-07-05 11:55:44,531] INFO Got user-level KeeperException when processing sessionid:0x105249c73130000 type:create cxid:0x2 zxid:0x3 txntype:-1 reqpath:n/a Error Path:/brokers Error:KeeperErrorCode = NoNode for /brokers (org.apache.zookeeper.server.PrepRequestProcessor:653) [2019-07-05 
11:55:44,534] INFO Got user-level KeeperException when processing sessionid:0x105249c73130000 type:create cxid:0x6 zxid:0x7 txntype:-1 reqpath:n/a Error Path:/config Error:KeeperErrorCode = NoNode for /config (org.apache.zookeeper.server.PrepRequestProcessor:653) [2019-07-05 11:55:44,536] INFO Got user-level KeeperException when processing sessionid:0x105249c73130000 type:create cxid:0x9 zxid:0xa txntype:-1 reqpath:n/a Error Path:/admin Error:KeeperErrorCode = NoNode for /admin (org.apache.zookeeper.server.PrepRequestProcessor:653) [2019-07-05 11:55:44,545] INFO Got user-level KeeperException when processing sessionid:0x105249c73130000 type:create cxid:0x15 zxid:0x15 txntype:-1 reqpath:n/a Error Path:/cluster Error:KeeperErrorCode = NoNode for /cluster (org.apache.zookeeper.server.PrepRequestProcessor:653) [2019-07-05 11:55:44,547] WARN [simple-source|task-0] [Producer clientId=connector-producer-simple-source-0] Connection to node 0 (localhost/127.0.0.1:37009) could not be established. Broker may not be available. (org.apache.kafka.clients.NetworkClient:748) [2019-07-05 11:55:44,547] WARN No meta.properties file under dir /tmp/junit4594361372413295796/junit16389345008413442308/meta.properties (kafka.server.BrokerMetadataCheckpoint:70) [2019-07-05 11:55:44,641] WARN [simple-source|task-3] [Producer clientId=connector-producer-simple-source-3] Connection to node 0 (localhost/127.0.0.1:37009) could not be established. Broker may not be available. 
(org.apache.kafka.clients.NetworkClient:748) [2019-07-05 11:55:44,706] WARN No meta.properties file under dir /tmp/junit4594361372413295796/junit16389345008413442308/meta.properties (kafka.server.BrokerMetadataCheckpoint:70) [2019-07-05 11:55:44,774] INFO Kafka version: 2.4.0-SNAPSHOT (org.apache.kafka.common.utils.AppInfoParser:117) [2019-07-05 11:55:44,775] INFO Kafka commitId: af1bf7c90963cfba (org.apache.kafka.common.utils.AppInfoParser:118) [2019-07-05 11:55:44,775] INFO Kafka startTimeMs: 1562327744511 (org.apache.kafka.common.utils.AppInfoParser:119) [2019-07-05 11:55:44,776] INFO ProducerConfig values: acks = 1 batch.size = 16384 bootstrap.servers = [localhost:37774] buffer.memory = 33554432 client.dns.lookup = default client.id = compression.type = none connections.max.idle.ms = 540000 delivery.timeout.ms = 120000 enable.idempotence = false interceptor.classes = [] key.serializer = class org.apache.kafka.common.serialization.ByteArraySerializer linger.ms = 0 max.block.ms = 60000 max.in.flight.requests.per.connection = 5 max.request.size = 1048576 metadata.max.age.ms = 300000 metric.reporters = [] metrics.num.samples = 2 metrics.recording.level = INFO metrics.sample.window.ms = 30000 partitioner.class = class org.apache.kafka.clients.producer.internals.DefaultPartitioner receive.buffer.bytes = 32768 reconnect.backoff.max.ms = 1000 reconnect.backoff.ms = 50 request.timeout.ms = 30000 retries = 2147483647 retry.backoff.ms = 100 sasl.client.callback.handler.class = null sasl.jaas.config = null sasl.kerberos.kinit.cmd = /usr/bin/kinit sasl.kerberos.min.time.before.relogin = 60000 sasl.kerberos.service.name = null sasl.kerberos.ticket.renew.jitter = 0.05 sasl.kerberos.ticket.renew.window.factor = 0.8 sasl.login.callback.handler.class = null sasl.login.class = null sasl.login.refresh.buffer.seconds = 300 sasl.login.refresh.min.period.seconds = 60 sasl.login.refresh.window.factor = 0.8 sasl.login.refresh.window.jitter = 0.05 sasl.mechanism = GSSAPI 
security.protocol = PLAINTEXT send.buffer.bytes = 131072 ssl.cipher.suites = null ssl.enabled.protocols = [TLSv1.2, TLSv1.1, TLSv1] ssl.endpoint.identification.algorithm = https ssl.key.password = null ssl.keymanager.algorithm = SunX509 ssl.keystore.location = null ssl.keystore.password = null ssl.keystore.type = JKS ssl.protocol = TLS ssl.provider = null ssl.secure.random.implementation = null ssl.trustmanager.algorithm = PKIX ssl.truststore.location = null ssl.truststore.password = null ssl.truststore.type = JKS transaction.timeout.ms = 60000 transactional.id = null value.serializer = class org.apache.kafka.common.serialization.ByteArraySerializer (org.apache.kafka.clients.producer.ProducerConfig:347) [2019-07-05 11:55:44,778] INFO Got user-level KeeperException when processing sessionid:0x105249c73130000 type:multi cxid:0x38 zxid:0x1c txntype:-1 reqpath:n/a aborting remaining multi ops. Error Path:/admin/preferred_replica_election Error:KeeperErrorCode = NoNode for /admin/preferred_replica_election (org.apache.zookeeper.server.PrepRequestProcessor:596) [2019-07-05 11:55:44,780] INFO Kafka version: 2.4.0-SNAPSHOT (org.apache.kafka.common.utils.AppInfoParser:117) [2019-07-05 11:55:44,780] INFO Kafka commitId: af1bf7c90963cfba (org.apache.kafka.common.utils.AppInfoParser:118) [2019-07-05 11:55:44,780] INFO Kafka startTimeMs: 1562327744780 (org.apache.kafka.common.utils.AppInfoParser:119) [2019-07-05 11:55:44,781] INFO Starting Connect cluster 'connect-cluster' with 3 workers (org.apache.kafka.connect.util.clusters.EmbeddedConnectCluster:208) [2019-07-05 11:55:44,781] INFO Scanning for plugin classes. This might take a moment ... (org.apache.kafka.connect.cli.ConnectDistributed:90) [2019-07-05 11:55:44,783] INFO SessionTrackerImpl exited loop! 
(org.apache.zookeeper.server.SessionTrackerImpl:163) [2019-07-05 11:55:44,882] INFO [Producer clientId=producer-81] Cluster ID: JmoHbEUISpiH_d7_Pn5qgw (org.apache.kafka.clients.Metadata:266) [2019-07-05 11:55:44,905] WARN [simple-source|task-2] [Producer clientId=connector-producer-simple-source-2] Connection to node 0 (localhost/127.0.0.1:37009) could not be established. Broker may not be available. (org.apache.kafka.clients.NetworkClient:748) [2019-07-05 11:55:45,256] WARN [simple-source|task-1] [Producer clientId=connector-producer-simple-source-1] Connection to node 0 (localhost/127.0.0.1:37009) could not be established. Broker may not be available. (org.apache.kafka.clients.NetworkClient:748) [2019-07-05 11:55:45,406] WARN [error-conn|task-0] [Producer clientId=connector-dlq-producer-error-conn-0] Connection to node 0 (localhost/127.0.0.1:34318) could not be established. Broker may not be available. (org.apache.kafka.clients.NetworkClient:748) [2019-07-05 11:55:45,506] WARN [simple-source|task-3] [Producer clientId=connector-producer-simple-source-3] Connection to node 0 (localhost/127.0.0.1:37009) could not be established. Broker may not be available. 
(org.apache.kafka.clients.NetworkClient:748) [2019-07-05 11:55:45,682] DEBUG Skipping class org.apache.kafka.connect.sink.SinkConnector as it is not concrete implementation (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:333) [2019-07-05 11:55:45,682] DEBUG Skipping class org.apache.kafka.connect.runtime.AbstractHerderTest$BogusSourceConnector as it is not concrete implementation (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:333) [2019-07-05 11:55:45,683] DEBUG Skipping class org.apache.kafka.connect.source.SourceConnector as it is not concrete implementation (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:333) [2019-07-05 11:55:45,684] DEBUG Skipping class org.apache.kafka.connect.runtime.distributed.DistributedHerderTest$BogusSourceConnector as it is not concrete implementation (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:333) [2019-07-05 11:55:45,684] DEBUG Skipping class org.apache.kafka.connect.runtime.standalone.StandaloneHerderTest$BogusSinkConnector as it is not concrete implementation (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:333) [2019-07-05 11:55:45,685] DEBUG Skipping class org.apache.kafka.connect.runtime.WorkerConnectorTest$TestConnector as it is not concrete implementation (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:333) [2019-07-05 11:55:45,685] DEBUG Skipping class org.apache.kafka.connect.runtime.standalone.StandaloneHerderTest$BogusSourceConnector as it is not concrete implementation (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:333) [2019-07-05 11:55:45,686] DEBUG Skipping class org.apache.kafka.connect.runtime.ConnectorConfigTest$TestConnector as it is not concrete implementation (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:333) [2019-07-05 11:55:45,686] DEBUG Skipping class org.apache.kafka.connect.converters.NumberConverter as it is not concrete implementation 
(org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:333) [2019-07-05 11:55:45,687] DEBUG Skipping class org.apache.kafka.connect.converters.NumberConverter as it is not concrete implementation (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:333) [2019-07-05 11:55:45,689] DEBUG Skipping class org.apache.kafka.connect.transforms.MaskField as it is not concrete implementation (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:333) [2019-07-05 11:55:45,689] DEBUG Skipping class org.apache.kafka.connect.transforms.InsertField as it is not concrete implementation (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:333) [2019-07-05 11:55:45,689] DEBUG Skipping class org.apache.kafka.connect.transforms.SetSchemaMetadata as it is not concrete implementation (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:333) [2019-07-05 11:55:45,690] DEBUG Skipping class org.apache.kafka.connect.transforms.TimestampConverter as it is not concrete implementation (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:333) [2019-07-05 11:55:45,690] DEBUG Skipping class org.apache.kafka.connect.transforms.Cast as it is not concrete implementation (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:333) [2019-07-05 11:55:45,690] DEBUG Skipping class org.apache.kafka.connect.transforms.Flatten as it is not concrete implementation (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:333) [2019-07-05 11:55:45,691] DEBUG Skipping class org.apache.kafka.connect.transforms.HoistField as it is not concrete implementation (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:333) [2019-07-05 11:55:45,691] DEBUG Skipping class org.apache.kafka.connect.transforms.ExtractField as it is not concrete implementation (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:333) [2019-07-05 11:55:45,692] DEBUG Skipping class org.apache.kafka.connect.transforms.ReplaceField 
as it is not concrete implementation (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:333) [2019-07-05 11:55:45,695] INFO Registered loader: jdk.internal.loader.ClassLoaders$AppClassLoader@2c13da15 (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:245) [2019-07-05 11:55:45,695] INFO Added plugin 'org.apache.kafka.connect.runtime.WorkerTest$WorkerTestConnector' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:174) [2019-07-05 11:55:45,695] INFO Added plugin 'org.apache.kafka.connect.tools.MockConnector' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:174) [2019-07-05 11:55:45,696] INFO Added plugin 'org.apache.kafka.connect.tools.SchemaSourceConnector' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:174) [2019-07-05 11:55:45,696] INFO Added plugin 'org.apache.kafka.connect.tools.MockSourceConnector' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:174) [2019-07-05 11:55:45,696] INFO Added plugin 'org.apache.kafka.connect.integration.MonitorableSourceConnector' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:174) [2019-07-05 11:55:45,697] INFO Added plugin 'org.apache.kafka.connect.runtime.TestSourceConnector' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:174) [2019-07-05 11:55:45,697] INFO Added plugin 'org.apache.kafka.connect.integration.MonitorableSinkConnector' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:174) [2019-07-05 11:55:45,697] INFO Added plugin 'org.apache.kafka.connect.tools.VerifiableSinkConnector' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:174) [2019-07-05 11:55:45,698] INFO Added plugin 'org.apache.kafka.connect.tools.VerifiableSourceConnector' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:174) [2019-07-05 11:55:45,698] INFO Added plugin 'org.apache.kafka.connect.runtime.TestSinkConnector' 
(org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:174) [2019-07-05 11:55:45,699] INFO Added plugin 'org.apache.kafka.connect.tools.MockSinkConnector' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:174) [2019-07-05 11:55:45,699] INFO Added plugin 'org.apache.kafka.connect.runtime.rest.resources.ConnectorPluginsResourceTest$ConnectorPluginsResourceTestConnector' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:174) [2019-07-05 11:55:45,699] INFO Added plugin 'org.apache.kafka.connect.converters.ShortConverter' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:174) [2019-07-05 11:55:45,700] INFO Added plugin 'org.apache.kafka.connect.storage.StringConverter' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:174) [2019-07-05 11:55:45,700] INFO Added plugin 'org.apache.kafka.connect.converters.DoubleConverter' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:174) [2019-07-05 11:55:45,700] INFO Added plugin 'org.apache.kafka.connect.json.JsonConverter' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:174) [2019-07-05 11:55:45,701] INFO Added plugin 'org.apache.kafka.connect.runtime.isolation.PluginsTest$TestInternalConverter' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:174) [2019-07-05 11:55:45,701] INFO Added plugin 'org.apache.kafka.connect.runtime.ErrorHandlingTaskTest$FaultyConverter' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:174) [2019-07-05 11:55:45,701] INFO Added plugin 'org.apache.kafka.connect.runtime.WorkerTest$TestConfigurableConverter' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:174) [2019-07-05 11:55:45,702] INFO Added plugin 'org.apache.kafka.connect.converters.LongConverter' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:174) [2019-07-05 11:55:45,702] INFO Added plugin 'org.apache.kafka.connect.runtime.WorkerTest$TestConverter' 
(org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:174) [2019-07-05 11:55:45,702] INFO Added plugin 'org.apache.kafka.connect.converters.IntegerConverter' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:174) [2019-07-05 11:55:45,703] INFO Added plugin 'org.apache.kafka.connect.converters.ByteArrayConverter' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:174) [2019-07-05 11:55:45,703] INFO Added plugin 'org.apache.kafka.connect.converters.FloatConverter' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:174) [2019-07-05 11:55:45,703] INFO Added plugin 'org.apache.kafka.connect.runtime.isolation.PluginsTest$TestConverter' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:174) [2019-07-05 11:55:45,704] INFO Added plugin 'org.apache.kafka.connect.storage.SimpleHeaderConverter' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:174) [2019-07-05 11:55:45,704] INFO Added plugin 'org.apache.kafka.connect.runtime.isolation.PluginsTest$TestHeaderConverter' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:174) [2019-07-05 11:55:45,705] INFO Added plugin 'org.apache.kafka.connect.transforms.Cast$Value' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:174) [2019-07-05 11:55:45,705] INFO Added plugin 'org.apache.kafka.connect.transforms.InsertField$Value' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:174) [2019-07-05 11:55:45,705] INFO Added plugin 'org.apache.kafka.connect.transforms.MaskField$Value' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:174) [2019-07-05 11:55:45,706] INFO Added plugin 'org.apache.kafka.connect.transforms.Flatten$Value' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:174) [2019-07-05 11:55:45,706] INFO Added plugin 'org.apache.kafka.connect.transforms.ValueToKey' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:174) [2019-07-05 11:55:45,707] INFO 
Added plugin 'org.apache.kafka.connect.transforms.MaskField$Key' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:174) [2019-07-05 11:55:45,707] INFO Added plugin 'org.apache.kafka.connect.transforms.SetSchemaMetadata$Key' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:174) [2019-07-05 11:55:45,708] INFO Added plugin 'org.apache.kafka.connect.transforms.ReplaceField$Key' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:174) [2019-07-05 11:55:45,708] INFO Added plugin 'org.apache.kafka.connect.runtime.ErrorHandlingTaskTest$FaultyPassthrough' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:174) [2019-07-05 11:55:45,708] INFO Added plugin 'org.apache.kafka.connect.transforms.SetSchemaMetadata$Value' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:174) [2019-07-05 11:55:45,709] INFO Added plugin 'org.apache.kafka.connect.transforms.ExtractField$Value' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:174) [2019-07-05 11:55:45,709] INFO Added plugin 'org.apache.kafka.connect.transforms.ReplaceField$Value' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:174) [2019-07-05 11:55:45,710] INFO Added plugin 'org.apache.kafka.connect.transforms.HoistField$Key' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:174) [2019-07-05 11:55:45,710] INFO Added plugin 'org.apache.kafka.connect.transforms.HoistField$Value' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:174) [2019-07-05 11:55:45,707] WARN [simple-source|task-0] [Producer clientId=connector-producer-simple-source-0] Connection to node 0 (localhost/127.0.0.1:37009) could not be established. Broker may not be available. 
(org.apache.kafka.clients.NetworkClient:748) [2019-07-05 11:55:45,711] INFO Added plugin 'org.apache.kafka.connect.transforms.InsertField$Key' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:174) [2019-07-05 11:55:45,711] INFO Added plugin 'org.apache.kafka.connect.transforms.Cast$Key' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:174) [2019-07-05 11:55:45,712] INFO Added plugin 'org.apache.kafka.connect.transforms.TimestampConverter$Key' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:174) [2019-07-05 11:55:45,712] INFO Added plugin 'org.apache.kafka.connect.transforms.TimestampConverter$Value' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:174) [2019-07-05 11:55:45,713] INFO Added plugin 'org.apache.kafka.connect.transforms.Flatten$Key' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:174) [2019-07-05 11:55:45,713] INFO Added plugin 'org.apache.kafka.connect.runtime.AbstractHerderTest$SampleTransformation' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:174) [2019-07-05 11:55:45,713] INFO Added plugin 'org.apache.kafka.connect.transforms.RegexRouter' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:174) [2019-07-05 11:55:45,714] INFO Added plugin 'org.apache.kafka.connect.runtime.ConnectorConfigTest$SimpleTransformation' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:174) [2019-07-05 11:55:45,714] INFO Added plugin 'org.apache.kafka.connect.integration.ErrorHandlingIntegrationTest$FaultyPassthrough' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:174) [2019-07-05 11:55:45,714] INFO Added plugin 'org.apache.kafka.connect.transforms.TimestampRouter' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:174) [2019-07-05 11:55:45,715] INFO Added plugin 'org.apache.kafka.connect.transforms.ExtractField$Key' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:174) [2019-07-05 
11:55:45,715] INFO Added plugin 'org.apache.kafka.common.config.provider.FileConfigProvider' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:174) [2019-07-05 11:55:45,716] INFO Added plugin 'org.apache.kafka.connect.runtime.isolation.PluginsTest$TestConnectRestExtension' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:174) [2019-07-05 11:55:45,716] INFO Added plugin 'org.apache.kafka.connect.connector.policy.AllConnectorClientConfigOverridePolicy' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:174) [2019-07-05 11:55:45,717] INFO Added plugin 'org.apache.kafka.connect.connector.policy.PrincipalConnectorClientConfigOverridePolicy' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:174) [2019-07-05 11:55:45,717] INFO Added plugin 'org.apache.kafka.connect.connector.policy.NoneConnectorClientConfigOverridePolicy' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:174) [2019-07-05 11:55:45,718] INFO Added aliases 'MonitorableSinkConnector' and 'MonitorableSink' to plugin 'org.apache.kafka.connect.integration.MonitorableSinkConnector' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:397) [2019-07-05 11:55:45,718] INFO Added aliases 'MonitorableSourceConnector' and 'MonitorableSource' to plugin 'org.apache.kafka.connect.integration.MonitorableSourceConnector' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:397) [2019-07-05 11:55:45,719] INFO Added aliases 'TestSinkConnector' and 'TestSink' to plugin 'org.apache.kafka.connect.runtime.TestSinkConnector' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:397) [2019-07-05 11:55:45,719] INFO Added aliases 'TestSourceConnector' and 'TestSource' to plugin 'org.apache.kafka.connect.runtime.TestSourceConnector' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:397) [2019-07-05 11:55:45,720] INFO Added aliases 'WorkerTestConnector' and 'WorkerTest' to plugin 
'org.apache.kafka.connect.runtime.WorkerTest$WorkerTestConnector' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:397) [2019-07-05 11:55:45,720] INFO Added aliases 'ConnectorPluginsResourceTestConnector' and 'ConnectorPluginsResourceTest' to plugin 'org.apache.kafka.connect.runtime.rest.resources.ConnectorPluginsResourceTest$ConnectorPluginsResourceTestConnector' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:397) [2019-07-05 11:55:45,721] INFO Added aliases 'MockConnector' and 'Mock' to plugin 'org.apache.kafka.connect.tools.MockConnector' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:397) [2019-07-05 11:55:45,721] INFO Added aliases 'MockSinkConnector' and 'MockSink' to plugin 'org.apache.kafka.connect.tools.MockSinkConnector' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:397) [2019-07-05 11:55:45,721] INFO Added aliases 'MockSourceConnector' and 'MockSource' to plugin 'org.apache.kafka.connect.tools.MockSourceConnector' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:397) [2019-07-05 11:55:45,722] INFO Added aliases 'SchemaSourceConnector' and 'SchemaSource' to plugin 'org.apache.kafka.connect.tools.SchemaSourceConnector' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:397) [2019-07-05 11:55:45,722] INFO Added aliases 'VerifiableSinkConnector' and 'VerifiableSink' to plugin 'org.apache.kafka.connect.tools.VerifiableSinkConnector' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:397) [2019-07-05 11:55:45,723] INFO Added aliases 'VerifiableSourceConnector' and 'VerifiableSource' to plugin 'org.apache.kafka.connect.tools.VerifiableSourceConnector' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:397) [2019-07-05 11:55:45,723] INFO Added aliases 'ByteArrayConverter' and 'ByteArray' to plugin 'org.apache.kafka.connect.converters.ByteArrayConverter' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:397) 
[2019-07-05 11:55:45,724] INFO Added aliases 'DoubleConverter' and 'Double' to plugin 'org.apache.kafka.connect.converters.DoubleConverter' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:397) [2019-07-05 11:55:45,724] INFO Added aliases 'FloatConverter' and 'Float' to plugin 'org.apache.kafka.connect.converters.FloatConverter' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:397) [2019-07-05 11:55:45,724] INFO Added aliases 'IntegerConverter' and 'Integer' to plugin 'org.apache.kafka.connect.converters.IntegerConverter' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:397) [2019-07-05 11:55:45,725] INFO Added aliases 'LongConverter' and 'Long' to plugin 'org.apache.kafka.connect.converters.LongConverter' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:397) [2019-07-05 11:55:45,725] INFO Added aliases 'ShortConverter' and 'Short' to plugin 'org.apache.kafka.connect.converters.ShortConverter' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:397) [2019-07-05 11:55:45,726] INFO Added aliases 'JsonConverter' and 'Json' to plugin 'org.apache.kafka.connect.json.JsonConverter' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:397) [2019-07-05 11:55:45,726] INFO Added aliases 'FaultyConverter' and 'Faulty' to plugin 'org.apache.kafka.connect.runtime.ErrorHandlingTaskTest$FaultyConverter' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:397) [2019-07-05 11:55:45,726] INFO Added aliases 'TestConfigurableConverter' and 'TestConfigurable' to plugin 'org.apache.kafka.connect.runtime.WorkerTest$TestConfigurableConverter' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:397) [2019-07-05 11:55:45,727] INFO Added aliases 'TestInternalConverter' and 'TestInternal' to plugin 'org.apache.kafka.connect.runtime.isolation.PluginsTest$TestInternalConverter' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:397) [2019-07-05 11:55:45,727] 
INFO Added aliases 'StringConverter' and 'String' to plugin 'org.apache.kafka.connect.storage.StringConverter' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:397) [2019-07-05 11:55:45,728] INFO Added aliases 'ByteArrayConverter' and 'ByteArray' to plugin 'org.apache.kafka.connect.converters.ByteArrayConverter' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:397) [2019-07-05 11:55:45,728] INFO Added aliases 'DoubleConverter' and 'Double' to plugin 'org.apache.kafka.connect.converters.DoubleConverter' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:397) [2019-07-05 11:55:45,729] INFO Added aliases 'FloatConverter' and 'Float' to plugin 'org.apache.kafka.connect.converters.FloatConverter' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:397) [2019-07-05 11:55:45,729] INFO Added aliases 'IntegerConverter' and 'Integer' to plugin 'org.apache.kafka.connect.converters.IntegerConverter' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:397) [2019-07-05 11:55:45,729] INFO Added aliases 'LongConverter' and 'Long' to plugin 'org.apache.kafka.connect.converters.LongConverter' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:397) [2019-07-05 11:55:45,730] INFO Added aliases 'ShortConverter' and 'Short' to plugin 'org.apache.kafka.connect.converters.ShortConverter' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:397) [2019-07-05 11:55:45,730] INFO Added aliases 'JsonConverter' and 'Json' to plugin 'org.apache.kafka.connect.json.JsonConverter' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:397) [2019-07-05 11:55:45,730] INFO Added aliases 'FaultyConverter' and 'Faulty' to plugin 'org.apache.kafka.connect.runtime.ErrorHandlingTaskTest$FaultyConverter' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:397) [2019-07-05 11:55:45,731] INFO Added alias 'TestHeaderConverter' to plugin 
'org.apache.kafka.connect.runtime.isolation.PluginsTest$TestHeaderConverter' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:394)
[2019-07-05 11:55:45,731] INFO Added aliases 'TestInternalConverter' and 'TestInternal' to plugin 'org.apache.kafka.connect.runtime.isolation.PluginsTest$TestInternalConverter' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:397)
[2019-07-05 11:55:45,732] INFO Added alias 'SimpleHeaderConverter' to plugin 'org.apache.kafka.connect.storage.SimpleHeaderConverter' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:394)
[2019-07-05 11:55:45,732] INFO Added aliases 'StringConverter' and 'String' to plugin 'org.apache.kafka.connect.storage.StringConverter' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:397)
[2019-07-05 11:55:45,733] INFO Added aliases 'SampleTransformation' and 'Sample' to plugin 'org.apache.kafka.connect.runtime.AbstractHerderTest$SampleTransformation' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:397)
[2019-07-05 11:55:45,733] INFO Added aliases 'SimpleTransformation' and 'Simple' to plugin 'org.apache.kafka.connect.runtime.ConnectorConfigTest$SimpleTransformation' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:397)
[2019-07-05 11:55:45,734] INFO Added alias 'RegexRouter' to plugin 'org.apache.kafka.connect.transforms.RegexRouter' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:394)
[2019-07-05 11:55:45,734] INFO Added alias 'TimestampRouter' to plugin 'org.apache.kafka.connect.transforms.TimestampRouter' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:394)
[2019-07-05 11:55:45,734] INFO Added alias 'ValueToKey' to plugin 'org.apache.kafka.connect.transforms.ValueToKey' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:394)
[2019-07-05 11:55:45,735] INFO Added aliases 'TestConnectRestExtension' and 'Test' to plugin 'org.apache.kafka.connect.runtime.isolation.PluginsTest$TestConnectRestExtension' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:397)
[2019-07-05 11:55:45,735] INFO Added aliases 'AllConnectorClientConfigOverridePolicy' and 'All' to plugin 'org.apache.kafka.connect.connector.policy.AllConnectorClientConfigOverridePolicy' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:397)
[2019-07-05 11:55:45,736] INFO Added aliases 'NoneConnectorClientConfigOverridePolicy' and 'None' to plugin 'org.apache.kafka.connect.connector.policy.NoneConnectorClientConfigOverridePolicy' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:397)
[2019-07-05 11:55:45,736] INFO Added aliases 'PrincipalConnectorClientConfigOverridePolicy' and 'Principal' to plugin 'org.apache.kafka.connect.connector.policy.PrincipalConnectorClientConfigOverridePolicy' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:397)
[2019-07-05 11:55:45,737] INFO DistributedConfig values: access.control.allow.methods = access.control.allow.origin = bootstrap.servers = [localhost:37774] client.dns.lookup = default client.id = config.providers = [] config.storage.replication.factor = 1 config.storage.topic = connect-config-topic-connect-cluster connect.protocol = compatible connections.max.idle.ms = 540000 connector.client.config.override.policy = None group.id = connect-integration-test-connect-cluster header.converter = class org.apache.kafka.connect.storage.SimpleHeaderConverter heartbeat.interval.ms = 3000 internal.key.converter = class org.apache.kafka.connect.json.JsonConverter internal.value.converter = class org.apache.kafka.connect.json.JsonConverter key.converter = class org.apache.kafka.connect.storage.StringConverter listeners = null metadata.max.age.ms = 300000 metric.reporters = [] metrics.num.samples = 2 metrics.recording.level = INFO metrics.sample.window.ms = 30000 offset.flush.interval.ms = 5000 offset.flush.timeout.ms = 5000 offset.storage.partitions = 25 offset.storage.replication.factor = 1 offset.storage.topic = connect-offset-topic-connect-cluster plugin.path = null rebalance.timeout.ms = 60000 receive.buffer.bytes = 32768 reconnect.backoff.max.ms = 1000 reconnect.backoff.ms = 50 request.timeout.ms = 40000 rest.advertised.host.name = null rest.advertised.listener = null rest.advertised.port = null rest.extension.classes = [] rest.host.name = localhost rest.port = 0 retry.backoff.ms = 100 sasl.client.callback.handler.class = null sasl.jaas.config = null sasl.kerberos.kinit.cmd = /usr/bin/kinit sasl.kerberos.min.time.before.relogin = 60000 sasl.kerberos.service.name = null sasl.kerberos.ticket.renew.jitter = 0.05 sasl.kerberos.ticket.renew.window.factor = 0.8 sasl.login.callback.handler.class = null sasl.login.class = null sasl.login.refresh.buffer.seconds = 300 sasl.login.refresh.min.period.seconds = 60 sasl.login.refresh.window.factor = 0.8 sasl.login.refresh.window.jitter = 0.05 sasl.mechanism = GSSAPI scheduled.rebalance.max.delay.ms = 300000 security.protocol = PLAINTEXT send.buffer.bytes = 131072 session.timeout.ms = 10000 ssl.cipher.suites = null ssl.client.auth = none ssl.enabled.protocols = [TLSv1.2, TLSv1.1, TLSv1] ssl.endpoint.identification.algorithm = https ssl.key.password = null ssl.keymanager.algorithm = SunX509 ssl.keystore.location = null ssl.keystore.password = null ssl.keystore.type = JKS ssl.protocol = TLS ssl.provider = null ssl.secure.random.implementation = null ssl.trustmanager.algorithm = PKIX ssl.truststore.location = null ssl.truststore.password = null ssl.truststore.type = JKS status.storage.partitions = 5 status.storage.replication.factor = 1 status.storage.topic = connect-storage-topic-connect-cluster task.shutdown.graceful.timeout.ms = 5000 value.converter = class org.apache.kafka.connect.storage.StringConverter worker.sync.timeout.ms = 3000 worker.unsync.backoff.ms = 300000 (org.apache.kafka.connect.runtime.distributed.DistributedConfig:347)
[2019-07-05 11:55:45,738] INFO Creating Kafka admin client (org.apache.kafka.connect.util.ConnectUtils:43)
[2019-07-05 11:55:45,739] INFO AdminClientConfig values: bootstrap.servers = [localhost:37774] client.dns.lookup = default client.id = connections.max.idle.ms = 300000 metadata.max.age.ms = 300000 metric.reporters = [] metrics.num.samples = 2 metrics.recording.level = INFO metrics.sample.window.ms = 30000 receive.buffer.bytes = 65536 reconnect.backoff.max.ms = 1000 reconnect.backoff.ms = 50 request.timeout.ms = 120000 retries = 5 retry.backoff.ms = 100 sasl.client.callback.handler.class = null sasl.jaas.config = null sasl.kerberos.kinit.cmd = /usr/bin/kinit sasl.kerberos.min.time.before.relogin = 60000 sasl.kerberos.service.name = null sasl.kerberos.ticket.renew.jitter = 0.05 sasl.kerberos.ticket.renew.window.factor = 0.8 sasl.login.callback.handler.class = null sasl.login.class = null sasl.login.refresh.buffer.seconds = 300 sasl.login.refresh.min.period.seconds = 60 sasl.login.refresh.window.factor = 0.8 sasl.login.refresh.window.jitter = 0.05 sasl.mechanism = GSSAPI security.protocol = PLAINTEXT send.buffer.bytes = 131072 ssl.cipher.suites = null ssl.enabled.protocols = [TLSv1.2, TLSv1.1, TLSv1] ssl.endpoint.identification.algorithm = https ssl.key.password = null ssl.keymanager.algorithm = SunX509 ssl.keystore.location = null ssl.keystore.password = null ssl.keystore.type = JKS ssl.protocol = TLS ssl.provider = null ssl.secure.random.implementation = null ssl.trustmanager.algorithm = PKIX ssl.truststore.location = null ssl.truststore.password = null ssl.truststore.type = JKS (org.apache.kafka.clients.admin.AdminClientConfig:347)
[2019-07-05 11:55:45,740] WARN The configuration 'config.storage.topic' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig:355)
[2019-07-05 11:55:45,741] WARN The configuration 'group.id' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig:355)
[2019-07-05 11:55:45,741] WARN The configuration 'status.storage.topic' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig:355)
[2019-07-05 11:55:45,741] WARN The configuration 'rest.host.name' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig:355)
[2019-07-05 11:55:45,742] WARN The configuration 'offset.flush.interval.ms' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig:355)
[2019-07-05 11:55:45,742] WARN The configuration 'rest.port' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig:355)
[2019-07-05 11:55:45,742] WARN The configuration 'config.storage.replication.factor' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig:355)
[2019-07-05 11:55:45,752] WARN The configuration 'status.storage.replication.factor' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig:355)
[2019-07-05 11:55:45,753] WARN The configuration 'offset.storage.replication.factor' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig:355)
[2019-07-05 11:55:45,753] WARN The configuration 'offset.storage.topic' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig:355)
[2019-07-05 11:55:45,753] WARN The configuration 'value.converter' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig:355)
[2019-07-05 11:55:45,754] WARN The configuration 'key.converter' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig:355)
[2019-07-05 11:55:45,754] INFO Kafka version: 2.4.0-SNAPSHOT (org.apache.kafka.common.utils.AppInfoParser:117)
[2019-07-05 11:55:45,754] INFO Kafka commitId: af1bf7c90963cfba (org.apache.kafka.common.utils.AppInfoParser:118)
[2019-07-05 11:55:45,755] INFO Kafka startTimeMs: 1562327745754 (org.apache.kafka.common.utils.AppInfoParser:119)
[2019-07-05 11:55:45,755] DEBUG Looking up Kafka cluster ID (org.apache.kafka.connect.util.ConnectUtils:50)
[2019-07-05 11:55:45,756] DEBUG Fetching Kafka cluster ID (org.apache.kafka.connect.util.ConnectUtils:57)
[2019-07-05 11:55:45,761] INFO Kafka cluster ID: JmoHbEUISpiH_d7_Pn5qgw (org.apache.kafka.connect.util.ConnectUtils:59)
[2019-07-05 11:55:45,763] DEBUG Kafka cluster ID: JmoHbEUISpiH_d7_Pn5qgw (org.apache.kafka.connect.cli.ConnectDistributed:96)
[2019-07-05 11:55:45,764] INFO Added connector for http://localhost:0 (org.apache.kafka.connect.runtime.rest.RestServer:124)
[2019-07-05 11:55:45,764] INFO Initializing REST server (org.apache.kafka.connect.runtime.rest.RestServer:168)
[2019-07-05 11:55:45,765] INFO jetty-9.4.18.v20190429; built: 2019-04-29T20:42:08.989Z; git: e1bc35120a6617ee3df052294e433f3a25ce7097; jvm 11.0.1+13-LTS (org.eclipse.jetty.server.Server:370)
[2019-07-05 11:55:45,769] INFO Started http_localhost0@258aadd{HTTP/1.1,[http/1.1]}{localhost:36148} (org.eclipse.jetty.server.AbstractConnector:292)
[2019-07-05 11:55:45,770] INFO Started @192965ms (org.eclipse.jetty.server.Server:410)
[2019-07-05 11:55:45,770] INFO Advertised URI: http://localhost:36148/ (org.apache.kafka.connect.runtime.rest.RestServer:285)
[2019-07-05 11:55:45,771] INFO REST server listening at http://localhost:36148/, advertising URL http://localhost:36148/ (org.apache.kafka.connect.runtime.rest.RestServer:183)
[2019-07-05 11:55:45,771] INFO Advertised URI: http://localhost:36148/ (org.apache.kafka.connect.runtime.rest.RestServer:285)
[2019-07-05 11:55:45,772] INFO Setting up None Policy for ConnectorClientConfigOverride. This will disallow any client configuration to be overridden (org.apache.kafka.connect.connector.policy.NoneConnectorClientConfigOverridePolicy:45)
[2019-07-05 11:55:45,772] DEBUG Registering Connect metrics with JMX for worker 'localhost:36148' (org.apache.kafka.connect.runtime.ConnectMetrics:83)
[2019-07-05 11:55:45,773] INFO Kafka version: 2.4.0-SNAPSHOT (org.apache.kafka.common.utils.AppInfoParser:117)
[2019-07-05 11:55:45,773] INFO Kafka commitId: af1bf7c90963cfba (org.apache.kafka.common.utils.AppInfoParser:118)
[2019-07-05 11:55:45,773] INFO Kafka startTimeMs: 1562327745773 (org.apache.kafka.common.utils.AppInfoParser:119)
[2019-07-05 11:55:45,775] DEBUG Configuring the key converter with configuration keys: [] (org.apache.kafka.connect.runtime.isolation.Plugins:254)
[2019-07-05 11:55:45,775] INFO JsonConverterConfig values: converter.type = key schemas.cache.size = 1000 schemas.enable = false (org.apache.kafka.connect.json.JsonConverterConfig:347)
[2019-07-05 11:55:45,776] DEBUG Configuring the value converter with configuration keys: [] (org.apache.kafka.connect.runtime.isolation.Plugins:254)
[2019-07-05 11:55:45,776] INFO JsonConverterConfig values: converter.type = value schemas.cache.size = 1000 schemas.enable = false (org.apache.kafka.connect.json.JsonConverterConfig:347)
[2019-07-05 11:55:45,778] INFO Kafka version: 2.4.0-SNAPSHOT (org.apache.kafka.common.utils.AppInfoParser:117)
[2019-07-05 11:55:45,779] INFO Kafka commitId: af1bf7c90963cfba (org.apache.kafka.common.utils.AppInfoParser:118)
[2019-07-05 11:55:45,779] INFO Kafka startTimeMs: 1562327745778 (org.apache.kafka.common.utils.AppInfoParser:119)
[2019-07-05 11:55:45,779] DEBUG [Worker clientId=connect-25, groupId=connect-integration-test-connect-cluster] Connect group member created (org.apache.kafka.connect.runtime.distributed.WorkerGroupMember:139)
[2019-07-05 11:55:45,780] DEBUG Kafka Connect instance created (org.apache.kafka.connect.runtime.Connect:42)
[2019-07-05 11:55:45,780] INFO Kafka Connect distributed worker initialization took 999ms (org.apache.kafka.connect.cli.ConnectDistributed:128)
[2019-07-05 11:55:45,780] INFO Kafka Connect starting (org.apache.kafka.connect.runtime.Connect:50)
[2019-07-05 11:55:45,781] INFO Initializing REST resources (org.apache.kafka.connect.runtime.rest.RestServer:187)
[2019-07-05 11:55:45,781] INFO [Worker clientId=connect-25, groupId=connect-integration-test-connect-cluster] Herder starting (org.apache.kafka.connect.runtime.distributed.DistributedHerder:238)
[2019-07-05 11:55:45,781] INFO Worker starting (org.apache.kafka.connect.runtime.Worker:182)
[2019-07-05 11:55:45,782] INFO Starting KafkaOffsetBackingStore (org.apache.kafka.connect.storage.KafkaOffsetBackingStore:108)
[2019-07-05 11:55:45,782] INFO Starting KafkaBasedLog with topic connect-offset-topic-connect-cluster (org.apache.kafka.connect.util.KafkaBasedLog:125)
[2019-07-05 11:55:45,782] DEBUG Creating admin client to manage Connect internal offset topic (org.apache.kafka.connect.storage.KafkaOffsetBackingStore:97)
[2019-07-05 11:55:45,782] INFO AdminClientConfig values: bootstrap.servers = [localhost:37774] client.dns.lookup = default client.id = connections.max.idle.ms = 300000 metadata.max.age.ms = 300000 metric.reporters = [] metrics.num.samples = 2 metrics.recording.level = INFO metrics.sample.window.ms = 30000 receive.buffer.bytes = 65536 reconnect.backoff.max.ms = 1000 reconnect.backoff.ms = 50 request.timeout.ms = 120000 retries = 5 retry.backoff.ms = 100 sasl.client.callback.handler.class = null sasl.jaas.config = null sasl.kerberos.kinit.cmd = /usr/bin/kinit sasl.kerberos.min.time.before.relogin = 60000 sasl.kerberos.service.name = null sasl.kerberos.ticket.renew.jitter = 0.05 sasl.kerberos.ticket.renew.window.factor = 0.8 sasl.login.callback.handler.class = null sasl.login.class = null sasl.login.refresh.buffer.seconds = 300 sasl.login.refresh.min.period.seconds = 60 sasl.login.refresh.window.factor = 0.8 sasl.login.refresh.window.jitter = 0.05 sasl.mechanism = GSSAPI security.protocol = PLAINTEXT send.buffer.bytes = 131072 ssl.cipher.suites = null ssl.enabled.protocols = [TLSv1.2, TLSv1.1, TLSv1] ssl.endpoint.identification.algorithm = https ssl.key.password = null ssl.keymanager.algorithm = SunX509 ssl.keystore.location = null ssl.keystore.password = null ssl.keystore.type = JKS ssl.protocol = TLS ssl.provider = null ssl.secure.random.implementation = null ssl.trustmanager.algorithm = PKIX ssl.truststore.location = null ssl.truststore.password = null ssl.truststore.type = JKS (org.apache.kafka.clients.admin.AdminClientConfig:347)
[2019-07-05 11:55:45,784] WARN The configuration 'config.storage.topic' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig:355)
[2019-07-05 11:55:45,784] WARN The configuration 'group.id' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig:355)
[2019-07-05 11:55:45,784] WARN The configuration 'status.storage.topic' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig:355)
[2019-07-05 11:55:45,785] WARN The configuration 'rest.host.name' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig:355)
[2019-07-05 11:55:45,785] WARN The configuration 'offset.flush.interval.ms' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig:355)
[2019-07-05 11:55:45,785] INFO DefaultSessionIdManager workerName=node0 (org.eclipse.jetty.server.session:365)
[2019-07-05 11:55:45,785] WARN The configuration 'rest.port' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig:355)
[2019-07-05 11:55:45,786] WARN The configuration 'config.storage.replication.factor' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig:355)
[2019-07-05 11:55:45,785] INFO No SessionScavenger set, using defaults (org.eclipse.jetty.server.session:370)
[2019-07-05 11:55:45,786] WARN The configuration 'status.storage.replication.factor' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig:355)
[2019-07-05 11:55:45,786] WARN The configuration 'offset.storage.replication.factor' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig:355)
[2019-07-05 11:55:45,786] INFO node0 Scavenging every 660000ms (org.eclipse.jetty.server.session:149)
[2019-07-05 11:55:45,787] WARN The configuration 'offset.storage.topic' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig:355)
[2019-07-05 11:55:45,787] WARN The configuration 'value.converter' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig:355)
[2019-07-05 11:55:45,788] WARN The configuration 'key.converter' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig:355)
[2019-07-05 11:55:45,788] INFO Kafka version: 2.4.0-SNAPSHOT (org.apache.kafka.common.utils.AppInfoParser:117)
[2019-07-05 11:55:45,788] INFO Kafka commitId: af1bf7c90963cfba (org.apache.kafka.common.utils.AppInfoParser:118)
[2019-07-05 11:55:45,788] INFO Kafka startTimeMs: 1562327745788 (org.apache.kafka.common.utils.AppInfoParser:119)
[2019-07-05 11:55:45,797] INFO Got user-level KeeperException when processing sessionid:0x105249c73130000 type:setData cxid:0x3e zxid:0x1d txntype:-1 reqpath:n/a Error Path:/config/topics/connect-offset-topic-connect-cluster Error:KeeperErrorCode = NoNode for /config/topics/connect-offset-topic-connect-cluster (org.apache.zookeeper.server.PrepRequestProcessor:653)
[2019-07-05 11:55:45,886] INFO Started o.e.j.s.ServletContextHandler@54d3c7cf{/,null,AVAILABLE} (org.eclipse.jetty.server.handler.ContextHandler:855)
[2019-07-05 11:55:45,886] INFO REST resources initialized; server is started and ready to handle requests (org.apache.kafka.connect.runtime.rest.RestServer:233)
[2019-07-05 11:55:45,887] INFO Kafka Connect started (org.apache.kafka.connect.runtime.Connect:56)
[2019-07-05 11:55:45,887] INFO Started worker WorkerHandle{workerName='connect-worker-0'workerURL='http://localhost:36148/'} (org.apache.kafka.connect.util.clusters.EmbeddedConnectCluster:163)
[2019-07-05 11:55 ...[truncated 51289505 chars]... s=ConnectHeaders(headers=)} (org.apache.kafka.connect.runtime.WorkerSourceTask:331)
[2019-07-05 11:56:43,735] ERROR [simple-conn|task-0] WorkerSourceTask{id=simple-conn-0} failed to send record to test-topic: (org.apache.kafka.connect.runtime.WorkerSourceTask:330)
org.apache.kafka.common.KafkaException: Producer is closed forcefully.
    at org.apache.kafka.clients.producer.internals.RecordAccumulator.abortBatches(RecordAccumulator.java:729)
    at org.apache.kafka.clients.producer.internals.RecordAccumulator.abortIncompleteBatches(RecordAccumulator.java:716)
    at org.apache.kafka.clients.producer.internals.Sender.run(Sender.java:278)
    at java.base/java.lang.Thread.run(Thread.java:834)
[2019-07-05 11:56:43,735] DEBUG [simple-conn|task-0] WorkerSourceTask{id=simple-conn-0} Failed record: SourceRecord{sourcePartition={task.id=simple-conn-0}, sourceOffset={saved=26306}} ConnectRecord{topic='test-topic', kafkaPartition=null, key=key-simple-conn-0-26306, keySchema=Schema{STRING}, value=value-simple-conn-0-26306, valueSchema=Schema{STRING}, timestamp=null, headers=ConnectHeaders(headers=)} (org.apache.kafka.connect.runtime.WorkerSourceTask:331)
[2019-07-05 11:56:43,735] ERROR [simple-conn|task-0] WorkerSourceTask{id=simple-conn-0} failed to send record to test-topic: (org.apache.kafka.connect.runtime.WorkerSourceTask:330)
org.apache.kafka.common.KafkaException: Producer is closed forcefully.
    at org.apache.kafka.clients.producer.internals.RecordAccumulator.abortBatches(RecordAccumulator.java:729)
    at org.apache.kafka.clients.producer.internals.RecordAccumulator.abortIncompleteBatches(RecordAccumulator.java:716)
    at org.apache.kafka.clients.producer.internals.Sender.run(Sender.java:278)
    at java.base/java.lang.Thread.run(Thread.java:834)
at org.apache.kafka.clients.producer.internals.RecordAccumulator.abortBatches(RecordAccumulator.java:729) at org.apache.kafka.clients.producer.internals.RecordAccumulator.abortIncompleteBatches(RecordAccumulator.java:716) at org.apache.kafka.clients.producer.internals.Sender.run(Sender.java:278) at java.base/java.lang.Thread.run(Thread.java:834) [2019-07-05 11:56:43,742] DEBUG [simple-conn|task-0] WorkerSourceTask{id=simple-conn-0} Failed record: SourceRecord{sourcePartition={task.id=simple-conn-0}, sourceOffset={saved=26351}} ConnectRecord{topic='test-topic', kafkaPartition=null, key=key-simple-conn-0-26351, keySchema=Schema{STRING}, value=value-simple-conn-0-26351, valueSchema=Schema{STRING}, timestamp=null, headers=ConnectHeaders(headers=)} (org.apache.kafka.connect.runtime.WorkerSourceTask:331) [2019-07-05 11:56:43,742] ERROR [simple-conn|task-0] WorkerSourceTask{id=simple-conn-0} failed to send record to test-topic: (org.apache.kafka.connect.runtime.WorkerSourceTask:330) org.apache.kafka.common.KafkaException: Producer is closed forcefully. 
at org.apache.kafka.clients.producer.internals.RecordAccumulator.abortBatches(RecordAccumulator.java:729) at org.apache.kafka.clients.producer.internals.RecordAccumulator.abortIncompleteBatches(RecordAccumulator.java:716) at org.apache.kafka.clients.producer.internals.Sender.run(Sender.java:278) at java.base/java.lang.Thread.run(Thread.java:834) [2019-07-05 11:56:43,742] DEBUG [simple-conn|task-0] WorkerSourceTask{id=simple-conn-0} Failed record: SourceRecord{sourcePartition={task.id=simple-conn-0}, sourceOffset={saved=26353}} ConnectRecord{topic='test-topic', kafkaPartition=null, key=key-simple-conn-0-26353, keySchema=Schema{STRING}, value=value-simple-conn-0-26353, valueSchema=Schema{STRING}, timestamp=null, headers=ConnectHeaders(headers=)} (org.apache.kafka.connect.runtime.WorkerSourceTask:331) [2019-07-05 11:56:43,743] ERROR [simple-conn|task-0] WorkerSourceTask{id=simple-conn-0} failed to send record to test-topic: (org.apache.kafka.connect.runtime.WorkerSourceTask:330) org.apache.kafka.common.KafkaException: Producer is closed forcefully. 
at org.apache.kafka.clients.producer.internals.RecordAccumulator.abortBatches(RecordAccumulator.java:729) at org.apache.kafka.clients.producer.internals.RecordAccumulator.abortIncompleteBatches(RecordAccumulator.java:716) at org.apache.kafka.clients.producer.internals.Sender.run(Sender.java:278) at java.base/java.lang.Thread.run(Thread.java:834) [2019-07-05 11:56:43,743] DEBUG [simple-conn|task-0] WorkerSourceTask{id=simple-conn-0} Failed record: SourceRecord{sourcePartition={task.id=simple-conn-0}, sourceOffset={saved=26360}} ConnectRecord{topic='test-topic', kafkaPartition=null, key=key-simple-conn-0-26360, keySchema=Schema{STRING}, value=value-simple-conn-0-26360, valueSchema=Schema{STRING}, timestamp=null, headers=ConnectHeaders(headers=)} (org.apache.kafka.connect.runtime.WorkerSourceTask:331) [2019-07-05 11:56:43,743] ERROR [simple-conn|task-0] WorkerSourceTask{id=simple-conn-0} failed to send record to test-topic: (org.apache.kafka.connect.runtime.WorkerSourceTask:330) org.apache.kafka.common.KafkaException: Producer is closed forcefully. 
at org.apache.kafka.clients.producer.internals.RecordAccumulator.abortBatches(RecordAccumulator.java:729) at org.apache.kafka.clients.producer.internals.RecordAccumulator.abortIncompleteBatches(RecordAccumulator.java:716) at org.apache.kafka.clients.producer.internals.Sender.run(Sender.java:278) at java.base/java.lang.Thread.run(Thread.java:834) [2019-07-05 11:56:43,743] DEBUG [simple-conn|task-0] WorkerSourceTask{id=simple-conn-0} Failed record: SourceRecord{sourcePartition={task.id=simple-conn-0}, sourceOffset={saved=26363}} ConnectRecord{topic='test-topic', kafkaPartition=null, key=key-simple-conn-0-26363, keySchema=Schema{STRING}, value=value-simple-conn-0-26363, valueSchema=Schema{STRING}, timestamp=null, headers=ConnectHeaders(headers=)} (org.apache.kafka.connect.runtime.WorkerSourceTask:331) [2019-07-05 11:56:43,744] ERROR [simple-conn|task-0] WorkerSourceTask{id=simple-conn-0} failed to send record to test-topic: (org.apache.kafka.connect.runtime.WorkerSourceTask:330) org.apache.kafka.common.KafkaException: Producer is closed forcefully. 
at org.apache.kafka.clients.producer.internals.RecordAccumulator.abortBatches(RecordAccumulator.java:729) at org.apache.kafka.clients.producer.internals.RecordAccumulator.abortIncompleteBatches(RecordAccumulator.java:716) at org.apache.kafka.clients.producer.internals.Sender.run(Sender.java:278) at java.base/java.lang.Thread.run(Thread.java:834) [2019-07-05 11:56:43,744] DEBUG [simple-conn|task-0] WorkerSourceTask{id=simple-conn-0} Failed record: SourceRecord{sourcePartition={task.id=simple-conn-0}, sourceOffset={saved=26366}} ConnectRecord{topic='test-topic', kafkaPartition=null, key=key-simple-conn-0-26366, keySchema=Schema{STRING}, value=value-simple-conn-0-26366, valueSchema=Schema{STRING}, timestamp=null, headers=ConnectHeaders(headers=)} (org.apache.kafka.connect.runtime.WorkerSourceTask:331) [2019-07-05 11:56:43,744] ERROR [simple-conn|task-0] WorkerSourceTask{id=simple-conn-0} failed to send record to test-topic: (org.apache.kafka.connect.runtime.WorkerSourceTask:330) org.apache.kafka.common.KafkaException: Producer is closed forcefully. 
at org.apache.kafka.clients.producer.internals.RecordAccumulator.abortBatches(RecordAccumulator.java:729) at org.apache.kafka.clients.producer.internals.RecordAccumulator.abortIncompleteBatches(RecordAccumulator.java:716) at org.apache.kafka.clients.producer.internals.Sender.run(Sender.java:278) at java.base/java.lang.Thread.run(Thread.java:834) [2019-07-05 11:56:43,744] DEBUG [simple-conn|task-0] WorkerSourceTask{id=simple-conn-0} Failed record: SourceRecord{sourcePartition={task.id=simple-conn-0}, sourceOffset={saved=26375}} ConnectRecord{topic='test-topic', kafkaPartition=null, key=key-simple-conn-0-26375, keySchema=Schema{STRING}, value=value-simple-conn-0-26375, valueSchema=Schema{STRING}, timestamp=null, headers=ConnectHeaders(headers=)} (org.apache.kafka.connect.runtime.WorkerSourceTask:331) [2019-07-05 11:56:43,744] ERROR [simple-conn|task-0] WorkerSourceTask{id=simple-conn-0} failed to send record to test-topic: (org.apache.kafka.connect.runtime.WorkerSourceTask:330) org.apache.kafka.common.KafkaException: Producer is closed forcefully. 
	at org.apache.kafka.clients.producer.internals.RecordAccumulator.abortBatches(RecordAccumulator.java:729)
	at org.apache.kafka.clients.producer.internals.RecordAccumulator.abortIncompleteBatches(RecordAccumulator.java:716)
	at org.apache.kafka.clients.producer.internals.Sender.run(Sender.java:278)
	at java.base/java.lang.Thread.run(Thread.java:834)
[2019-07-05 11:56:43,745] DEBUG [simple-conn|task-0] WorkerSourceTask{id=simple-conn-0} Failed record: SourceRecord{sourcePartition={task.id=simple-conn-0}, sourceOffset={saved=26376}} ConnectRecord{topic='test-topic', kafkaPartition=null, key=key-simple-conn-0-26376, keySchema=Schema{STRING}, value=value-simple-conn-0-26376, valueSchema=Schema{STRING}, timestamp=null, headers=ConnectHeaders(headers=)} (org.apache.kafka.connect.runtime.WorkerSourceTask:331)
[2019-07-05 11:56:43,745] ERROR [simple-conn|task-0] WorkerSourceTask{id=simple-conn-0} failed to send record to test-topic: (org.apache.kafka.connect.runtime.WorkerSourceTask:330)
org.apache.kafka.common.KafkaException: Producer is closed forcefully.
	at org.apache.kafka.clients.producer.internals.RecordAccumulator.abortBatches(RecordAccumulator.java:729)
	at org.apache.kafka.clients.producer.internals.RecordAccumulator.abortIncompleteBatches(RecordAccumulator.java:716)
	at org.apache.kafka.clients.producer.internals.Sender.run(Sender.java:278)
	at java.base/java.lang.Thread.run(Thread.java:834)
[2019-07-05 11:56:43,745] DEBUG [simple-conn|task-0] WorkerSourceTask{id=simple-conn-0} Failed record: SourceRecord{sourcePartition={task.id=simple-conn-0}, sourceOffset={saved=26381}} ConnectRecord{topic='test-topic', kafkaPartition=null, key=key-simple-conn-0-26381, keySchema=Schema{STRING}, value=value-simple-conn-0-26381, valueSchema=Schema{STRING}, timestamp=null, headers=ConnectHeaders(headers=)} (org.apache.kafka.connect.runtime.WorkerSourceTask:331)
[2019-07-05 11:56:43,745] ERROR [simple-conn|task-0] WorkerSourceTask{id=simple-conn-0} failed to send record to test-topic: (org.apache.kafka.connect.runtime.WorkerSourceTask:330)
org.apache.kafka.common.KafkaException: Producer is closed forcefully.
	at org.apache.kafka.clients.producer.internals.RecordAccumulator.abortBatches(RecordAccumulator.java:729)
	at org.apache.kafka.clients.producer.internals.RecordAccumulator.abortIncompleteBatches(RecordAccumulator.java:716)
	at org.apache.kafka.clients.producer.internals.Sender.run(Sender.java:278)
	at java.base/java.lang.Thread.run(Thread.java:834)
[2019-07-05 11:56:43,745] DEBUG [simple-conn|task-0] WorkerSourceTask{id=simple-conn-0} Failed record: SourceRecord{sourcePartition={task.id=simple-conn-0}, sourceOffset={saved=26383}} ConnectRecord{topic='test-topic', kafkaPartition=null, key=key-simple-conn-0-26383, keySchema=Schema{STRING}, value=value-simple-conn-0-26383, valueSchema=Schema{STRING}, timestamp=null, headers=ConnectHeaders(headers=)} (org.apache.kafka.connect.runtime.WorkerSourceTask:331)
[2019-07-05 11:56:43,746] ERROR [simple-conn|task-0] WorkerSourceTask{id=simple-conn-0} failed to send record to test-topic: (org.apache.kafka.connect.runtime.WorkerSourceTask:330)
org.apache.kafka.common.KafkaException: Producer is closed forcefully.
	at org.apache.kafka.clients.producer.internals.RecordAccumulator.abortBatches(RecordAccumulator.java:729)
	at org.apache.kafka.clients.producer.internals.RecordAccumulator.abortIncompleteBatches(RecordAccumulator.java:716)
	at org.apache.kafka.clients.producer.internals.Sender.run(Sender.java:278)
	at java.base/java.lang.Thread.run(Thread.java:834)
[2019-07-05 11:56:43,746] DEBUG [simple-conn|task-0] WorkerSourceTask{id=simple-conn-0} Failed record: SourceRecord{sourcePartition={task.id=simple-conn-0}, sourceOffset={saved=26384}} ConnectRecord{topic='test-topic', kafkaPartition=null, key=key-simple-conn-0-26384, keySchema=Schema{STRING}, value=value-simple-conn-0-26384, valueSchema=Schema{STRING}, timestamp=null, headers=ConnectHeaders(headers=)} (org.apache.kafka.connect.runtime.WorkerSourceTask:331)
[2019-07-05 11:56:43,746] ERROR [simple-conn|task-0] WorkerSourceTask{id=simple-conn-0} failed to send record to test-topic: (org.apache.kafka.connect.runtime.WorkerSourceTask:330)
org.apache.kafka.common.KafkaException: Producer is closed forcefully.
	at org.apache.kafka.clients.producer.internals.RecordAccumulator.abortBatches(RecordAccumulator.java:729)
	at org.apache.kafka.clients.producer.internals.RecordAccumulator.abortIncompleteBatches(RecordAccumulator.java:716)
	at org.apache.kafka.clients.producer.internals.Sender.run(Sender.java:278)
	at java.base/java.lang.Thread.run(Thread.java:834)
[2019-07-05 11:56:43,746] DEBUG [simple-conn|task-0] WorkerSourceTask{id=simple-conn-0} Failed record: SourceRecord{sourcePartition={task.id=simple-conn-0}, sourceOffset={saved=26385}} ConnectRecord{topic='test-topic', kafkaPartition=null, key=key-simple-conn-0-26385, keySchema=Schema{STRING}, value=value-simple-conn-0-26385, valueSchema=Schema{STRING}, timestamp=null, headers=ConnectHeaders(headers=)} (org.apache.kafka.connect.runtime.WorkerSourceTask:331)
[2019-07-05 11:56:43,746] ERROR [simple-conn|task-0] WorkerSourceTask{id=simple-conn-0} failed to send record to test-topic: (org.apache.kafka.connect.runtime.WorkerSourceTask:330)
org.apache.kafka.common.KafkaException: Producer is closed forcefully.
	at org.apache.kafka.clients.producer.internals.RecordAccumulator.abortBatches(RecordAccumulator.java:729)
	at org.apache.kafka.clients.producer.internals.RecordAccumulator.abortIncompleteBatches(RecordAccumulator.java:716)
	at org.apache.kafka.clients.producer.internals.Sender.run(Sender.java:278)
	at java.base/java.lang.Thread.run(Thread.java:834)
[2019-07-05 11:56:43,747] DEBUG [simple-conn|task-0] WorkerSourceTask{id=simple-conn-0} Failed record: SourceRecord{sourcePartition={task.id=simple-conn-0}, sourceOffset={saved=26386}} ConnectRecord{topic='test-topic', kafkaPartition=null, key=key-simple-conn-0-26386, keySchema=Schema{STRING}, value=value-simple-conn-0-26386, valueSchema=Schema{STRING}, timestamp=null, headers=ConnectHeaders(headers=)} (org.apache.kafka.connect.runtime.WorkerSourceTask:331)
[2019-07-05 11:56:43,747] ERROR [simple-conn|task-0] WorkerSourceTask{id=simple-conn-0} failed to send record to test-topic: (org.apache.kafka.connect.runtime.WorkerSourceTask:330)
org.apache.kafka.common.KafkaException: Producer is closed forcefully.
	at org.apache.kafka.clients.producer.internals.RecordAccumulator.abortBatches(RecordAccumulator.java:729)
	at org.apache.kafka.clients.producer.internals.RecordAccumulator.abortIncompleteBatches(RecordAccumulator.java:716)
	at org.apache.kafka.clients.producer.internals.Sender.run(Sender.java:278)
	at java.base/java.lang.Thread.run(Thread.java:834)
[2019-07-05 11:56:43,747] DEBUG [simple-conn|task-0] WorkerSourceTask{id=simple-conn-0} Failed record: SourceRecord{sourcePartition={task.id=simple-conn-0}, sourceOffset={saved=26395}} ConnectRecord{topic='test-topic', kafkaPartition=null, key=key-simple-conn-0-26395, keySchema=Schema{STRING}, value=value-simple-conn-0-26395, valueSchema=Schema{STRING}, timestamp=null, headers=ConnectHeaders(headers=)} (org.apache.kafka.connect.runtime.WorkerSourceTask:331)
[2019-07-05 11:56:43,747] ERROR [simple-conn|task-0] WorkerSourceTask{id=simple-conn-0} failed to send record to test-topic: (org.apache.kafka.connect.runtime.WorkerSourceTask:330)
org.apache.kafka.common.KafkaException: Producer is closed forcefully.
	at org.apache.kafka.clients.producer.internals.RecordAccumulator.abortBatches(RecordAccumulator.java:729)
	at org.apache.kafka.clients.producer.internals.RecordAccumulator.abortIncompleteBatches(RecordAccumulator.java:716)
	at org.apache.kafka.clients.producer.internals.Sender.run(Sender.java:278)
	at java.base/java.lang.Thread.run(Thread.java:834)
[2019-07-05 11:56:43,748] DEBUG [simple-conn|task-0] WorkerSourceTask{id=simple-conn-0} Failed record: SourceRecord{sourcePartition={task.id=simple-conn-0}, sourceOffset={saved=26403}} ConnectRecord{topic='test-topic', kafkaPartition=null, key=key-simple-conn-0-26403, keySchema=Schema{STRING}, value=value-simple-conn-0-26403, valueSchema=Schema{STRING}, timestamp=null, headers=ConnectHeaders(headers=)} (org.apache.kafka.connect.runtime.WorkerSourceTask:331)
[2019-07-05 11:56:43,748] ERROR [simple-conn|task-0] WorkerSourceTask{id=simple-conn-0} failed to send record to test-topic: (org.apache.kafka.connect.runtime.WorkerSourceTask:330)
org.apache.kafka.common.KafkaException: Producer is closed forcefully.
	at org.apache.kafka.clients.producer.internals.RecordAccumulator.abortBatches(RecordAccumulator.java:729)
	at org.apache.kafka.clients.producer.internals.RecordAccumulator.abortIncompleteBatches(RecordAccumulator.java:716)
	at org.apache.kafka.clients.producer.internals.Sender.run(Sender.java:278)
	at java.base/java.lang.Thread.run(Thread.java:834)
[2019-07-05 11:56:43,748] DEBUG [simple-conn|task-0] WorkerSourceTask{id=simple-conn-0} Failed record: SourceRecord{sourcePartition={task.id=simple-conn-0}, sourceOffset={saved=26405}} ConnectRecord{topic='test-topic', kafkaPartition=null, key=key-simple-conn-0-26405, keySchema=Schema{STRING}, value=value-simple-conn-0-26405, valueSchema=Schema{STRING}, timestamp=null, headers=ConnectHeaders(headers=)} (org.apache.kafka.connect.runtime.WorkerSourceTask:331)
[2019-07-05 11:56:43,748] ERROR [simple-conn|task-0] WorkerSourceTask{id=simple-conn-0} failed to send record to test-topic: (org.apache.kafka.connect.runtime.WorkerSourceTask:330)
org.apache.kafka.common.KafkaException: Producer is closed forcefully.
	at org.apache.kafka.clients.producer.internals.RecordAccumulator.abortBatches(RecordAccumulator.java:729)
	at org.apache.kafka.clients.producer.internals.RecordAccumulator.abortIncompleteBatches(RecordAccumulator.java:716)
	at org.apache.kafka.clients.producer.internals.Sender.run(Sender.java:278)
	at java.base/java.lang.Thread.run(Thread.java:834)
[2019-07-05 11:56:43,748] DEBUG [simple-conn|task-0] WorkerSourceTask{id=simple-conn-0} Failed record: SourceRecord{sourcePartition={task.id=simple-conn-0}, sourceOffset={saved=26407}} ConnectRecord{topic='test-topic', kafkaPartition=null, key=key-simple-conn-0-26407, keySchema=Schema{STRING}, value=value-simple-conn-0-26407, valueSchema=Schema{STRING}, timestamp=null, headers=ConnectHeaders(headers=)} (org.apache.kafka.connect.runtime.WorkerSourceTask:331)
[2019-07-05 11:56:43,749] ERROR [simple-conn|task-0] WorkerSourceTask{id=simple-conn-0} failed to send record to test-topic: (org.apache.kafka.connect.runtime.WorkerSourceTask:330)
org.apache.kafka.common.KafkaException: Producer is closed forcefully.
	at org.apache.kafka.clients.producer.internals.RecordAccumulator.abortBatches(RecordAccumulator.java:729)
	at org.apache.kafka.clients.producer.internals.RecordAccumulator.abortIncompleteBatches(RecordAccumulator.java:716)
	at org.apache.kafka.clients.producer.internals.Sender.run(Sender.java:278)
	at java.base/java.lang.Thread.run(Thread.java:834)
[2019-07-05 11:56:43,749] INFO shutting down (org.apache.zookeeper.server.ZooKeeperServer:502)
[2019-07-05 11:56:43,749] DEBUG [simple-conn|task-0] WorkerSourceTask{id=simple-conn-0} Failed record: SourceRecord{sourcePartition={task.id=simple-conn-0}, sourceOffset={saved=26408}} ConnectRecord{topic='test-topic', kafkaPartition=null, key=key-simple-conn-0-26408, keySchema=Schema{STRING}, value=value-simple-conn-0-26408, valueSchema=Schema{STRING}, timestamp=null, headers=ConnectHeaders(headers=)} (org.apache.kafka.connect.runtime.WorkerSourceTask:331)
[2019-07-05 11:56:43,749] INFO Shutting down (org.apache.zookeeper.server.SessionTrackerImpl:226)
[2019-07-05 11:56:43,750] ERROR [simple-conn|task-0] WorkerSourceTask{id=simple-conn-0} failed to send record to test-topic: (org.apache.kafka.connect.runtime.WorkerSourceTask:330)
org.apache.kafka.common.KafkaException: Producer is closed forcefully.
	at org.apache.kafka.clients.producer.internals.RecordAccumulator.abortBatches(RecordAccumulator.java:729)
	at org.apache.kafka.clients.producer.internals.RecordAccumulator.abortIncompleteBatches(RecordAccumulator.java:716)
	at org.apache.kafka.clients.producer.internals.Sender.run(Sender.java:278)
	at java.base/java.lang.Thread.run(Thread.java:834)
[2019-07-05 11:56:43,750] INFO Shutting down (org.apache.zookeeper.server.PrepRequestProcessor:769)
[2019-07-05 11:56:43,750] DEBUG [simple-conn|task-0] WorkerSourceTask{id=simple-conn-0} Failed record: SourceRecord{sourcePartition={task.id=simple-conn-0}, sourceOffset={saved=26415}} ConnectRecord{topic='test-topic', kafkaPartition=null, key=key-simple-conn-0-26415, keySchema=Schema{STRING}, value=value-simple-conn-0-26415, valueSchema=Schema{STRING}, timestamp=null, headers=ConnectHeaders(headers=)} (org.apache.kafka.connect.runtime.WorkerSourceTask:331)
[2019-07-05 11:56:43,751] INFO PrepRequestProcessor exited loop! (org.apache.zookeeper.server.PrepRequestProcessor:144)
[2019-07-05 11:56:43,751] INFO Shutting down (org.apache.zookeeper.server.SyncRequestProcessor:208)
[2019-07-05 11:56:43,751] ERROR [simple-conn|task-0] WorkerSourceTask{id=simple-conn-0} failed to send record to test-topic: (org.apache.kafka.connect.runtime.WorkerSourceTask:330)
org.apache.kafka.common.KafkaException: Producer is closed forcefully.
	at org.apache.kafka.clients.producer.internals.RecordAccumulator.abortBatches(RecordAccumulator.java:729)
	at org.apache.kafka.clients.producer.internals.RecordAccumulator.abortIncompleteBatches(RecordAccumulator.java:716)
	at org.apache.kafka.clients.producer.internals.Sender.run(Sender.java:278)
	at java.base/java.lang.Thread.run(Thread.java:834)
[2019-07-05 11:56:43,752] INFO SyncRequestProcessor exited! (org.apache.zookeeper.server.SyncRequestProcessor:186)
[2019-07-05 11:56:43,752] DEBUG [simple-conn|task-0] WorkerSourceTask{id=simple-conn-0} Failed record: SourceRecord{sourcePartition={task.id=simple-conn-0}, sourceOffset={saved=26416}} ConnectRecord{topic='test-topic', kafkaPartition=null, key=key-simple-conn-0-26416, keySchema=Schema{STRING}, value=value-simple-conn-0-26416, valueSchema=Schema{STRING}, timestamp=null, headers=ConnectHeaders(headers=)} (org.apache.kafka.connect.runtime.WorkerSourceTask:331)
[2019-07-05 11:56:43,753] INFO shutdown of request processor complete (org.apache.zookeeper.server.FinalRequestProcessor:430)
[2019-07-05 11:56:43,753] ERROR [simple-conn|task-0] WorkerSourceTask{id=simple-conn-0} failed to send record to test-topic: (org.apache.kafka.connect.runtime.WorkerSourceTask:330)
org.apache.kafka.common.KafkaException: Producer is closed forcefully.
	at org.apache.kafka.clients.producer.internals.RecordAccumulator.abortBatches(RecordAccumulator.java:729)
	at org.apache.kafka.clients.producer.internals.RecordAccumulator.abortIncompleteBatches(RecordAccumulator.java:716)
	at org.apache.kafka.clients.producer.internals.Sender.run(Sender.java:278)
	at java.base/java.lang.Thread.run(Thread.java:834)
[2019-07-05 11:56:43,753] DEBUG [simple-conn|task-0] WorkerSourceTask{id=simple-conn-0} Failed record: SourceRecord{sourcePartition={task.id=simple-conn-0}, sourceOffset={saved=26424}} ConnectRecord{topic='test-topic', kafkaPartition=null, key=key-simple-conn-0-26424, keySchema=Schema{STRING}, value=value-simple-conn-0-26424, valueSchema=Schema{STRING}, timestamp=null, headers=ConnectHeaders(headers=)} (org.apache.kafka.connect.runtime.WorkerSourceTask:331)
[2019-07-05 11:56:43,754] ERROR [simple-conn|task-0] WorkerSourceTask{id=simple-conn-0} failed to send record to test-topic: (org.apache.kafka.connect.runtime.WorkerSourceTask:330)
org.apache.kafka.common.KafkaException: Producer is closed forcefully.
	at org.apache.kafka.clients.producer.internals.RecordAccumulator.abortBatches(RecordAccumulator.java:729)
	at org.apache.kafka.clients.producer.internals.RecordAccumulator.abortIncompleteBatches(RecordAccumulator.java:716)
	at org.apache.kafka.clients.producer.internals.Sender.run(Sender.java:278)
	at java.base/java.lang.Thread.run(Thread.java:834)
[2019-07-05 11:56:43,754] DEBUG [simple-conn|task-0] WorkerSourceTask{id=simple-conn-0} Failed record: SourceRecord{sourcePartition={task.id=simple-conn-0}, sourceOffset={saved=26425}} ConnectRecord{topic='test-topic', kafkaPartition=null, key=key-simple-conn-0-26425, keySchema=Schema{STRING}, value=value-simple-conn-0-26425, valueSchema=Schema{STRING}, timestamp=null, headers=ConnectHeaders(headers=)} (org.apache.kafka.connect.runtime.WorkerSourceTask:331)
[2019-07-05 11:56:43,754] ERROR [simple-conn|task-0] WorkerSourceTask{id=simple-conn-0} failed to send record to test-topic: (org.apache.kafka.connect.runtime.WorkerSourceTask:330)
org.apache.kafka.common.KafkaException: Producer is closed forcefully.
	at org.apache.kafka.clients.producer.internals.RecordAccumulator.abortBatches(RecordAccumulator.java:729)
	at org.apache.kafka.clients.producer.internals.RecordAccumulator.abortIncompleteBatches(RecordAccumulator.java:716)
	at org.apache.kafka.clients.producer.internals.Sender.run(Sender.java:278)
	at java.base/java.lang.Thread.run(Thread.java:834)
[2019-07-05 11:56:43,754] DEBUG [simple-conn|task-0] WorkerSourceTask{id=simple-conn-0} Failed record: SourceRecord{sourcePartition={task.id=simple-conn-0}, sourceOffset={saved=26426}} ConnectRecord{topic='test-topic', kafkaPartition=null, key=key-simple-conn-0-26426, keySchema=Schema{STRING}, value=value-simple-conn-0-26426, valueSchema=Schema{STRING}, timestamp=null, headers=ConnectHeaders(headers=)} (org.apache.kafka.connect.runtime.WorkerSourceTask:331)
[2019-07-05 11:56:43,754] ERROR [simple-conn|task-0] WorkerSourceTask{id=simple-conn-0} failed to send record to test-topic: (org.apache.kafka.connect.runtime.WorkerSourceTask:330)
org.apache.kafka.common.KafkaException: Producer is closed forcefully.
	at org.apache.kafka.clients.producer.internals.RecordAccumulator.abortBatches(RecordAccumulator.java:729)
	at org.apache.kafka.clients.producer.internals.RecordAccumulator.abortIncompleteBatches(RecordAccumulator.java:716)
	at org.apache.kafka.clients.producer.internals.Sender.run(Sender.java:278)
	at java.base/java.lang.Thread.run(Thread.java:834)
[2019-07-05 11:56:43,755] DEBUG [simple-conn|task-0] WorkerSourceTask{id=simple-conn-0} Failed record: SourceRecord{sourcePartition={task.id=simple-conn-0}, sourceOffset={saved=26428}} ConnectRecord{topic='test-topic', kafkaPartition=null, key=key-simple-conn-0-26428, keySchema=Schema{STRING}, value=value-simple-conn-0-26428, valueSchema=Schema{STRING}, timestamp=null, headers=ConnectHeaders(headers=)} (org.apache.kafka.connect.runtime.WorkerSourceTask:331)
[2019-07-05 11:56:43,755] ERROR [simple-conn|task-0] WorkerSourceTask{id=simple-conn-0} failed to send record to test-topic: (org.apache.kafka.connect.runtime.WorkerSourceTask:330)
org.apache.kafka.common.KafkaException: Producer is closed forcefully.
	at org.apache.kafka.clients.producer.internals.RecordAccumulator.abortBatches(RecordAccumulator.java:729)
	at org.apache.kafka.clients.producer.internals.RecordAccumulator.abortIncompleteBatches(RecordAccumulator.java:716)
	at org.apache.kafka.clients.producer.internals.Sender.run(Sender.java:278)
	at java.base/java.lang.Thread.run(Thread.java:834)
[2019-07-05 11:56:43,755] DEBUG [simple-conn|task-0] WorkerSourceTask{id=simple-conn-0} Failed record: SourceRecord{sourcePartition={task.id=simple-conn-0}, sourceOffset={saved=26429}} ConnectRecord{topic='test-topic', kafkaPartition=null, key=key-simple-conn-0-26429, keySchema=Schema{STRING}, value=value-simple-conn-0-26429, valueSchema=Schema{STRING}, timestamp=null, headers=ConnectHeaders(headers=)} (org.apache.kafka.connect.runtime.WorkerSourceTask:331)
[2019-07-05 11:56:43,755] ERROR [simple-conn|task-0] WorkerSourceTask{id=simple-conn-0} failed to send record to test-topic: (org.apache.kafka.connect.runtime.WorkerSourceTask:330)
org.apache.kafka.common.KafkaException: Producer is closed forcefully.
	at org.apache.kafka.clients.producer.internals.RecordAccumulator.abortBatches(RecordAccumulator.java:729)
	at org.apache.kafka.clients.producer.internals.RecordAccumulator.abortIncompleteBatches(RecordAccumulator.java:716)
	at org.apache.kafka.clients.producer.internals.Sender.run(Sender.java:278)
	at java.base/java.lang.Thread.run(Thread.java:834)
[2019-07-05 11:56:43,756] DEBUG [simple-conn|task-0] WorkerSourceTask{id=simple-conn-0} Failed record: SourceRecord{sourcePartition={task.id=simple-conn-0}, sourceOffset={saved=26431}} ConnectRecord{topic='test-topic', kafkaPartition=null, key=key-simple-conn-0-26431, keySchema=Schema{STRING}, value=value-simple-conn-0-26431, valueSchema=Schema{STRING}, timestamp=null, headers=ConnectHeaders(headers=)} (org.apache.kafka.connect.runtime.WorkerSourceTask:331)
[2019-07-05 11:56:43,756] ERROR [simple-conn|task-0] WorkerSourceTask{id=simple-conn-0} failed to send record to test-topic: (org.apache.kafka.connect.runtime.WorkerSourceTask:330)
org.apache.kafka.common.KafkaException: Producer is closed forcefully.
	at org.apache.kafka.clients.producer.internals.RecordAccumulator.abortBatches(RecordAccumulator.java:729)
	at org.apache.kafka.clients.producer.internals.RecordAccumulator.abortIncompleteBatches(RecordAccumulator.java:716)
	at org.apache.kafka.clients.producer.internals.Sender.run(Sender.java:278)
	at java.base/java.lang.Thread.run(Thread.java:834)
[2019-07-05 11:56:43,756] INFO NIOServerCnxn factory exited run method (org.apache.zookeeper.server.NIOServerCnxnFactory:249)
[2019-07-05 11:56:43,756] DEBUG [simple-conn|task-0] WorkerSourceTask{id=simple-conn-0} Failed record: SourceRecord{sourcePartition={task.id=simple-conn-0}, sourceOffset={saved=26432}} ConnectRecord{topic='test-topic', kafkaPartition=null, key=key-simple-conn-0-26432, keySchema=Schema{STRING}, value=value-simple-conn-0-26432, valueSchema=Schema{STRING}, timestamp=null, headers=ConnectHeaders(headers=)} (org.apache.kafka.connect.runtime.WorkerSourceTask:331)
[2019-07-05 11:56:43,756] ERROR [simple-conn|task-0] WorkerSourceTask{id=simple-conn-0} failed to send record to test-topic: (org.apache.kafka.connect.runtime.WorkerSourceTask:330)
org.apache.kafka.common.KafkaException: Producer is closed forcefully.
	at org.apache.kafka.clients.producer.internals.RecordAccumulator.abortBatches(RecordAccumulator.java:729)
	at org.apache.kafka.clients.producer.internals.RecordAccumulator.abortIncompleteBatches(RecordAccumulator.java:716)
	at org.apache.kafka.clients.producer.internals.Sender.run(Sender.java:278)
	at java.base/java.lang.Thread.run(Thread.java:834)
[2019-07-05 11:56:43,757] DEBUG [simple-conn|task-0] WorkerSourceTask{id=simple-conn-0} Failed record: SourceRecord{sourcePartition={task.id=simple-conn-0}, sourceOffset={saved=26433}} ConnectRecord{topic='test-topic', kafkaPartition=null, key=key-simple-conn-0-26433, keySchema=Schema{STRING}, value=value-simple-conn-0-26433, valueSchema=Schema{STRING}, timestamp=null, headers=ConnectHeaders(headers=)} (org.apache.kafka.connect.runtime.WorkerSourceTask:331)
[2019-07-05 11:56:43,757] ERROR [simple-conn|task-0] WorkerSourceTask{id=simple-conn-0} failed to send record to test-topic: (org.apache.kafka.connect.runtime.WorkerSourceTask:330)
org.apache.kafka.common.KafkaException: Producer is closed forcefully.
	at org.apache.kafka.clients.producer.internals.RecordAccumulator.abortBatches(RecordAccumulator.java:729)
	at org.apache.kafka.clients.producer.internals.RecordAccumulator.abortIncompleteBatches(RecordAccumulator.java:716)
	at org.apache.kafka.clients.producer.internals.Sender.run(Sender.java:278)
	at java.base/java.lang.Thread.run(Thread.java:834)
[2019-07-05 11:56:43,757] DEBUG [simple-conn|task-0] WorkerSourceTask{id=simple-conn-0} Failed record: SourceRecord{sourcePartition={task.id=simple-conn-0}, sourceOffset={saved=26436}} ConnectRecord{topic='test-topic', kafkaPartition=null, key=key-simple-conn-0-26436, keySchema=Schema{STRING}, value=value-simple-conn-0-26436, valueSchema=Schema{STRING}, timestamp=null, headers=ConnectHeaders(headers=)} (org.apache.kafka.connect.runtime.WorkerSourceTask:331)
[2019-07-05 11:56:43,757] ERROR [simple-conn|task-0] WorkerSourceTask{id=simple-conn-0} failed to send record to test-topic: (org.apache.kafka.connect.runtime.WorkerSourceTask:330)
org.apache.kafka.common.KafkaException: Producer is closed forcefully.
	at org.apache.kafka.clients.producer.internals.RecordAccumulator.abortBatches(RecordAccumulator.java:729)
	at org.apache.kafka.clients.producer.internals.RecordAccumulator.abortIncompleteBatches(RecordAccumulator.java:716)
	at org.apache.kafka.clients.producer.internals.Sender.run(Sender.java:278)
	at java.base/java.lang.Thread.run(Thread.java:834)
[2019-07-05 11:56:43,757] DEBUG [simple-conn|task-0] WorkerSourceTask{id=simple-conn-0} Failed record: SourceRecord{sourcePartition={task.id=simple-conn-0}, sourceOffset={saved=26439}} ConnectRecord{topic='test-topic', kafkaPartition=null, key=key-simple-conn-0-26439, keySchema=Schema{STRING}, value=value-simple-conn-0-26439, valueSchema=Schema{STRING}, timestamp=null, headers=ConnectHeaders(headers=)} (org.apache.kafka.connect.runtime.WorkerSourceTask:331)