2021/03/06 06:44:11 Starting coyote-tester
2021/03/06 06:44:11 Starting processing group: [ Brokers ]
2021/03/06 06:44:26 Success, command 'bash -c 'for ((i=0;i<60;i++)); do sleep 2; echo dump | nc 127.0.0.1 2181 | grep brokers && { sleep 5; break; }; done'', test 'Wait for broker to get up'. Stdout: "\t/brokers/ids/0\n"
2021/03/06 06:44:29 Success, command 'kafka-topics --zookeeper 127.0.0.1:2181 --topic coyote_basic_1615013066529 --partitions 1 --replication-factor 1 --create --config retention.ms=60000 --config retention.bytes=52428800 ', test 'Create Topic (basic kafka)'. Stdout: "WARNING: Due to limitations in metric names, topics with a period ('.') or underscore ('_') could collide. To avoid issues it is best to use either, but not both.\nCreated topic coyote_basic_1615013066529.\n"
2021/03/06 06:44:31 Success, command 'kafka-topics --zookeeper 127.0.0.1:2181 --list', test 'List Topics (basic kafka)'. Stdout: "__consumer_offsets\n_schemas\ncoyote_basic_1615013066529\n"
2021/03/06 06:44:44 Success, command 'kafka-producer-perf-test --topic coyote_basic_1615013066529 --throughput 100000 --record-size 1000 --num-records 500000 --producer-props bootstrap.servers="127.0.0.1:9092" ', test 'Performance Test (basic kafka)'. Stdout: "158449 records sent, 31689.8 records/sec (30.22 MB/sec), 863.6 ms avg latency, 1336.0 ms max latency.\n292320 records sent, 58464.0 records/sec (55.76 MB/sec), 559.9 ms avg latency, 661.0 ms max latency.\n500000 records sent, 46304.871272 records/sec (44.16 MB/sec), 651.39 ms avg latency, 1336.00 ms max latency, 573 ms 50th, 1113 ms 95th, 1302 ms 99th, 1334 ms 99.9th.\n"
2021/03/06 06:44:44 Starting processing group: [ REST Proxy ]
2021/03/06 06:44:48 Success, command 'bash -c 'for ((i=0;i<90;i++)); do sleep 2; curl "http://127.0.0.1:8082" | grep "{}" && { sleep 2; break; }; done'', test 'Wait for rest proxy to get up'. Stdout: "{}\n"
2021/03/06 06:44:48 Success, command 'curl -vs --stderr - "http://127.0.0.1:8082/topics"', test 'List Topics (rest proxy)'. Stdout: "* Trying 127.0.0.1:8082...\n* Connected to 127.0.0.1 (127.0.0.1) port 8082 (#0)\n> GET /topics HTTP/1.1\r\n> Host: 127.0.0.1:8082\r\n> User-Agent: curl/7.74.0\r\n> Accept: */*\r\n> \r\n* Mark bundle as not supporting multiuse\n< HTTP/1.1 200 OK\r\n< Date: Sat, 06 Mar 2021 06:44:48 GMT\r\n< Content-Type: application/vnd.kafka.v1+json\r\n< Vary: Accept-Encoding, User-Agent\r\n< Content-Length: 139\r\n< \r\n{ [139 bytes data]\n[\"_schemas\",\"backblaze_smart\",\"coyote_basic_1615013066529\",\"nyc_yellow_taxi_trip_data\",\"sea_vessel_position_reports\",\"telecom_italia_data\"]* Connection #0 to host 127.0.0.1 left intact\n"
2021/03/06 06:44:49 Success, command 'curl -vs --stderr - "http://127.0.0.1:8082/topics/coyote_basic_1615013066529"', test 'Topic Information (rest proxy)'. Stdout: "* Trying 127.0.0.1:8082...\n* Connected to 127.0.0.1 (127.0.0.1) port 8082 (#0)\n> GET /topics/coyote_basic_1615013066529 HTTP/1.1\r\n> Host: 127.0.0.1:8082\r\n> User-Agent: curl/7.74.0\r\n> Accept: */*\r\n> \r\n* Mark bundle as not supporting multiuse\n< HTTP/1.1 200 OK\r\n< Date: Sat, 06 Mar 2021 06:44:48 GMT\r\n< Content-Type: application/vnd.kafka.v1+json\r\n< Vary: Accept-Encoding, User-Agent\r\n< Content-Length: 1017\r\n< \r\n{ [1017 bytes data]\n{\"name\":\"coyote_basic_1615013066529\",\"configs\":{\"message.downconversion.enable\":\"true\",\"file.delete.delay.ms\":\"60000\",\"segment.ms\":\"604800000\",\"min.compaction.lag.ms\":\"0\",\"retention.bytes\":\"52428800\",\"segment.index.bytes\":\"10485760\",\"cleanup.policy\":\"delete\",\"max.compaction.lag.ms\":\"9223372036854775807\",\"follower.replication.throttled.replicas\":\"\",\"message.timestamp.difference.max.ms\":\"9223372036854775807\",\"segment.jitter.ms\":\"0\",\"preallocate\":\"false\",\"message.timestamp.type\":\"CreateTime\",\"message.format.version\":\"2.5-IV0\",\"segment.bytes\":\"104857600\",\"unclean.leader.election.enable\":\"false\",\"max.message.bytes\":\"1048588\",\"retention.ms\":\"60000\",\"flush.ms\":\"9223372036854775807\",\"delete.retention.ms\":\"86400000\",\"leader.replication.throttled.replicas\":\"\",\"min.insync.replicas\":\"1\",\"flush.messages\":\"9223372036854775807\",\"compression.type\":\"producer\",\"index.interval.bytes\":\"4096\",\"min.cleanable.dirty.ratio\":\"0.5\"},\"partitions\":[{\"partition\":0,\"leader\":0,\"replicas\":[{\"broker\":0,\"leader\":true,\"in_sync\":true}]}]}* Connection #0 to host 127.0.0.1 left intact\n"
2021/03/06 06:44:49 Success, command 'curl -vs --stderr - "http://127.0.0.1:8082/topics/coyote_basic_1615013066529/partitions"', test 'Topic Partitions (rest proxy)'. Stdout: "* Trying 127.0.0.1:8082...\n* Connected to 127.0.0.1 (127.0.0.1) port 8082 (#0)\n> GET /topics/coyote_basic_1615013066529/partitions HTTP/1.1\r\n> Host: 127.0.0.1:8082\r\n> User-Agent: curl/7.74.0\r\n> Accept: */*\r\n> \r\n* Mark bundle as not supporting multiuse\n< HTTP/1.1 200 OK\r\n< Date: Sat, 06 Mar 2021 06:44:49 GMT\r\n< Content-Type: application/vnd.kafka.v1+json\r\n< Vary: Accept-Encoding, User-Agent\r\n< Content-Length: 83\r\n< \r\n{ [83 bytes data]\n[{\"partition\":0,\"leader\":0,\"replicas\":[{\"broker\":0,\"leader\":true,\"in_sync\":true}]}]* Connection #0 to host 127.0.0.1 left intact\n"
2021/03/06 06:44:51 Success, command 'kafka-topics --zookeeper 127.0.0.1:2181 --topic coyote_basic_1615013066529 --delete', test 'Delete Topic (basic kafka)'. Stdout: "Topic coyote_basic_1615013066529 is marked for deletion.\nNote: This will have no impact if delete.topic.enable is not set to true.\n"
2021/03/06 06:44:52 Success, command 'curl -vs --stderr - -XPOST -H "Content-Type: application/vnd.kafka.avro.v1+json" --data '{"value_schema": "{\"type\": \"record\", \"name\": \"User\", \"fields\": [{\"name\": \"name\", \"type\": \"string\"}]}", "records": [{"value": {"name": "testUser"}}]}' "http://127.0.0.1:8082/topics/coyote-test-avro" ', test 'Produce Avro Message (rest proxy, schema registry)'. Stdout: "* Trying 127.0.0.1:8082...\n* Connected to 127.0.0.1 (127.0.0.1) port 8082 (#0)\n> POST /topics/coyote-test-avro HTTP/1.1\r\n> Host: 127.0.0.1:8082\r\n> User-Agent: curl/7.74.0\r\n> Accept: */*\r\n> Content-Type: application/vnd.kafka.avro.v1+json\r\n> Content-Length: 180\r\n> \r\n} [180 bytes data]\n* upload completely sent off: 180 out of 180 bytes\n* Mark bundle as not supporting multiuse\n< HTTP/1.1 200 OK\r\n< Date: Sat, 06 Mar 2021 06:44:51 GMT\r\n< Content-Type: application/vnd.kafka.v1+json\r\n< Vary: Accept-Encoding, User-Agent\r\n< Content-Length: 112\r\n< \r\n{ [112 bytes data]\n{\"offsets\":[{\"partition\":0,\"offset\":0,\"error_code\":null,\"error\":null}],\"key_schema_id\":null,\"value_schema_id\":1}* Connection #0 to host 127.0.0.1 left intact\n"
2021/03/06 06:44:52 Success, command 'curl -vs --stderr - -XPOST -H "Content-Type: application/vnd.kafka.v1+json" --data '{"name": "a-consumer", "format": "avro", "auto.offset.reset": "smallest"}' "http://127.0.0.1:8082/consumers/coyote-avro" ', test 'Create Consumer for Avro data (rest proxy, schema registry)'. Stdout: "* Trying 127.0.0.1:8082...\n* Connected to 127.0.0.1 (127.0.0.1) port 8082 (#0)\n> POST /consumers/coyote-avro HTTP/1.1\r\n> Host: 127.0.0.1:8082\r\n> User-Agent: curl/7.74.0\r\n> Accept: */*\r\n> Content-Type: application/vnd.kafka.v1+json\r\n> Content-Length: 73\r\n> \r\n} [73 bytes data]\n* upload completely sent off: 73 out of 73 bytes\n* Mark bundle as not supporting multiuse\n< HTTP/1.1 200 OK\r\n< Date: Sat, 06 Mar 2021 06:44:52 GMT\r\n< Content-Type: application/vnd.kafka.v1+json\r\n< Vary: Accept-Encoding, User-Agent\r\n< Content-Length: 106\r\n< \r\n{ [106 bytes data]\n{\"instance_id\":\"a-consumer\",\"base_uri\":\"http://127.0.0.1:8082/consumers/coyote-avro/instances/a-consumer\"}* Connection #0 to host 127.0.0.1 left intact\n"
2021/03/06 06:44:57 Success, command 'sleep 5', test ''. Stdout: ""
2021/03/06 06:45:17 Success, command 'curl -vs --stderr - -XGET -H "Accept: application/vnd.kafka.avro.v1+json" "http://127.0.0.1:8082/consumers/coyote-avro/instances/a-consumer/topics/coyote-test-avro?max_bytes=30" ', test 'Consume Avro Message (rest proxy, schema registry)'. Stdout: "* Trying 127.0.0.1:8082...\n* Connected to 127.0.0.1 (127.0.0.1) port 8082 (#0)\n> GET /consumers/coyote-avro/instances/a-consumer/topics/coyote-test-avro?max_bytes=30 HTTP/1.1\r\n> Host: 127.0.0.1:8082\r\n> User-Agent: curl/7.74.0\r\n> Accept: application/vnd.kafka.avro.v1+json\r\n> \r\n* Mark bundle as not supporting multiuse\n< HTTP/1.1 200 OK\r\n< Date: Sat, 06 Mar 2021 06:44:57 GMT\r\n< Content-Type: application/vnd.kafka.avro.v1+json\r\n< Vary: Accept-Encoding, User-Agent\r\n< Content-Length: 94\r\n< \r\n{ [94 bytes data]\n[{\"topic\":\"coyote-test-avro\",\"key\":null,\"value\":{\"name\":\"testUser\"},\"partition\":0,\"offset\":0}]* Connection #0 to host 127.0.0.1 left intact\n"
2021/03/06 06:45:19 Success, command 'curl -vs --stderr - -X DELETE "http://127.0.0.1:8082/consumers/coyote-avro/instances/a-consumer"', test 'Delete Avro Consumer (rest proxy, schema registry)'. Stdout: "* Trying 127.0.0.1:8082...\n* Connected to 127.0.0.1 (127.0.0.1) port 8082 (#0)\n> DELETE /consumers/coyote-avro/instances/a-consumer HTTP/1.1\r\n> Host: 127.0.0.1:8082\r\n> User-Agent: curl/7.74.0\r\n> Accept: */*\r\n> \r\n* Mark bundle as not supporting multiuse\n< HTTP/1.1 204 No Content\r\n< Date: Sat, 06 Mar 2021 06:45:17 GMT\r\n< \r\n* Connection #0 to host 127.0.0.1 left intact\n"
2021/03/06 06:45:24 Success, command 'sleep 5', test ''. Stdout: ""
2021/03/06 06:45:29 Success, command 'kafka-topics --zookeeper 127.0.0.1:2181 --topic coyote-test-avro --delete', test ''. Stdout: "Topic coyote-test-avro is marked for deletion.\nNote: This will have no impact if delete.topic.enable is not set to true.\n"
2021/03/06 06:45:29 Success, command 'curl -vs --stderr - -XPOST -H "Content-Type: application/vnd.kafka.json.v1+json" --data '{"records":[{"value":{"foo":"bar"}}]}' "http://127.0.0.1:8082/topics/coyote-test-json" ', test 'Produce JSON Message (rest proxy)'. Stdout: "* Trying 127.0.0.1:8082...\n* Connected to 127.0.0.1 (127.0.0.1) port 8082 (#0)\n> POST /topics/coyote-test-json HTTP/1.1\r\n> Host: 127.0.0.1:8082\r\n> User-Agent: curl/7.74.0\r\n> Accept: */*\r\n> Content-Type: application/vnd.kafka.json.v1+json\r\n> Content-Length: 37\r\n> \r\n} [37 bytes data]\n* upload completely sent off: 37 out of 37 bytes\n* Mark bundle as not supporting multiuse\n< HTTP/1.1 200 OK\r\n< Date: Sat, 06 Mar 2021 06:45:29 GMT\r\n< Content-Type: application/vnd.kafka.v1+json\r\n< Vary: Accept-Encoding, User-Agent\r\n< Content-Length: 115\r\n< \r\n{ [115 bytes data]\n{\"offsets\":[{\"partition\":0,\"offset\":0,\"error_code\":null,\"error\":null}],\"key_schema_id\":null,\"value_schema_id\":null}* Connection #0 to host 127.0.0.1 left intact\n"
2021/03/06 06:45:29 Success, command 'curl -vs --stderr - -XPOST -H "Content-Type: application/vnd.kafka.v1+json" --data '{"name": "a-consumer", "format": "json", "auto.offset.reset": "smallest"}' "http://127.0.0.1:8082/consumers/coyote-json" ', test 'Create Consumer for JSON data (rest proxy)'. Stdout: "* Trying 127.0.0.1:8082...\n* Connected to 127.0.0.1 (127.0.0.1) port 8082 (#0)\n> POST /consumers/coyote-json HTTP/1.1\r\n> Host: 127.0.0.1:8082\r\n> User-Agent: curl/7.74.0\r\n> Accept: */*\r\n> Content-Type: application/vnd.kafka.v1+json\r\n> Content-Length: 73\r\n> \r\n} [73 bytes data]\n* upload completely sent off: 73 out of 73 bytes\n* Mark bundle as not supporting multiuse\n< HTTP/1.1 200 OK\r\n< Date: Sat, 06 Mar 2021 06:45:29 GMT\r\n< Content-Type: application/vnd.kafka.v1+json\r\n< Vary: Accept-Encoding, User-Agent\r\n< Content-Length: 106\r\n< \r\n{ [106 bytes data]\n{\"instance_id\":\"a-consumer\",\"base_uri\":\"http://127.0.0.1:8082/consumers/coyote-json/instances/a-consumer\"}* Connection #0 to host 127.0.0.1 left intact\n"
2021/03/06 06:45:31 Success, command 'sleep 2', test ''. Stdout: ""
2021/03/06 06:45:51 Success, command 'curl -vs --stderr - -XGET -H "Accept: application/vnd.kafka.json.v1+json" "http://127.0.0.1:8082/consumers/coyote-json/instances/a-consumer/topics/coyote-test-json?max_bytes=15" ', test 'Consume JSON Message (rest proxy)'. Stdout: "* Trying 127.0.0.1:8082...\n* Connected to 127.0.0.1 (127.0.0.1) port 8082 (#0)\n> GET /consumers/coyote-json/instances/a-consumer/topics/coyote-test-json?max_bytes=15 HTTP/1.1\r\n> Host: 127.0.0.1:8082\r\n> User-Agent: curl/7.74.0\r\n> Accept: application/vnd.kafka.json.v1+json\r\n> \r\n* Mark bundle as not supporting multiuse\n< HTTP/1.1 200 OK\r\n< Date: Sat, 06 Mar 2021 06:45:31 GMT\r\n< Content-Type: application/vnd.kafka.json.v1+json\r\n< Vary: Accept-Encoding, User-Agent\r\n< Content-Length: 88\r\n< \r\n{ [88 bytes data]\n[{\"topic\":\"coyote-test-json\",\"key\":null,\"value\":{\"foo\":\"bar\"},\"partition\":0,\"offset\":0}]* Connection #0 to host 127.0.0.1 left intact\n"
2021/03/06 06:45:52 Success, command 'curl -vs --stderr - -X DELETE "http://127.0.0.1:8082/consumers/coyote-json/instances/a-consumer"', test 'Delete JSON Consumer (rest proxy)'. Stdout: "* Trying 127.0.0.1:8082...\n* Connected to 127.0.0.1 (127.0.0.1) port 8082 (#0)\n> DELETE /consumers/coyote-json/instances/a-consumer HTTP/1.1\r\n> Host: 127.0.0.1:8082\r\n> User-Agent: curl/7.74.0\r\n> Accept: */*\r\n> \r\n* Mark bundle as not supporting multiuse\n< HTTP/1.1 204 No Content\r\n< Date: Sat, 06 Mar 2021 06:45:51 GMT\r\n< \r\n* Connection #0 to host 127.0.0.1 left intact\n"
2021/03/06 06:45:54 Success, command 'sleep 2', test ''. Stdout: ""
2021/03/06 06:45:56 Success, command 'kafka-topics --zookeeper 127.0.0.1:2181 --topic coyote-test-json --delete', test ''. Stdout: "Topic coyote-test-json is marked for deletion.\nNote: This will have no impact if delete.topic.enable is not set to true.\n"
2021/03/06 06:45:56 Success, command 'curl -vs --stderr - -XPOST -H "Content-Type: application/vnd.kafka.binary.v1+json" --data '{"records":[{"value":"S2Fma2E="}]}' "http://127.0.0.1:8082/topics/coyote-test-binary" ', test 'Produce Binary Message (rest proxy)'. Stdout: "* Trying 127.0.0.1:8082...\n* Connected to 127.0.0.1 (127.0.0.1) port 8082 (#0)\n> POST /topics/coyote-test-binary HTTP/1.1\r\n> Host: 127.0.0.1:8082\r\n> User-Agent: curl/7.74.0\r\n> Accept: */*\r\n> Content-Type: application/vnd.kafka.binary.v1+json\r\n> Content-Length: 34\r\n> \r\n} [34 bytes data]\n* upload completely sent off: 34 out of 34 bytes\n* Mark bundle as not supporting multiuse\n< HTTP/1.1 200 OK\r\n< Date: Sat, 06 Mar 2021 06:45:56 GMT\r\n< Content-Type: application/vnd.kafka.v1+json\r\n< Vary: Accept-Encoding, User-Agent\r\n< Content-Length: 115\r\n< \r\n{ [115 bytes data]\n{\"offsets\":[{\"partition\":0,\"offset\":0,\"error_code\":null,\"error\":null}],\"key_schema_id\":null,\"value_schema_id\":null}* Connection #0 to host 127.0.0.1 left intact\n"
2021/03/06 06:45:56 Success, command 'curl -vs --stderr - -XPOST -H "Content-Type: application/vnd.kafka.v1+json" --data '{"name": "a-consumer", "format": "binary", "auto.offset.reset": "smallest"}' "http://127.0.0.1:8082/consumers/coyote-binary" ', test 'Create Consumer for Binary data (rest proxy)'. Stdout: "* Trying 127.0.0.1:8082...\n* Connected to 127.0.0.1 (127.0.0.1) port 8082 (#0)\n> POST /consumers/coyote-binary HTTP/1.1\r\n> Host: 127.0.0.1:8082\r\n> User-Agent: curl/7.74.0\r\n> Accept: */*\r\n> Content-Type: application/vnd.kafka.v1+json\r\n> Content-Length: 75\r\n> \r\n} [75 bytes data]\n* upload completely sent off: 75 out of 75 bytes\n* Mark bundle as not supporting multiuse\n< HTTP/1.1 200 OK\r\n< Date: Sat, 06 Mar 2021 06:45:56 GMT\r\n< Content-Type: application/vnd.kafka.v1+json\r\n< Vary: Accept-Encoding, User-Agent\r\n< Content-Length: 108\r\n< \r\n{ [108 bytes data]\n{\"instance_id\":\"a-consumer\",\"base_uri\":\"http://127.0.0.1:8082/consumers/coyote-binary/instances/a-consumer\"}* Connection #0 to host 127.0.0.1 left intact\n"
2021/03/06 06:45:58 Success, command 'sleep 2', test ''. Stdout: ""
2021/03/06 06:46:18 Success, command 'curl -vs --stderr - -XGET -H "Accept: application/vnd.kafka.binary.v1+json" "http://127.0.0.1:8082/consumers/coyote-binary/instances/a-consumer/topics/coyote-test-binary?max_bytes=10" ', test 'Consume Binary Message (rest proxy)'. Stdout: "* Trying 127.0.0.1:8082...\n* Connected to 127.0.0.1 (127.0.0.1) port 8082 (#0)\n> GET /consumers/coyote-binary/instances/a-consumer/topics/coyote-test-binary?max_bytes=10 HTTP/1.1\r\n> Host: 127.0.0.1:8082\r\n> User-Agent: curl/7.74.0\r\n> Accept: application/vnd.kafka.binary.v1+json\r\n> \r\n* Mark bundle as not supporting multiuse\n< HTTP/1.1 200 OK\r\n< Date: Sat, 06 Mar 2021 06:45:58 GMT\r\n< Content-Type: application/vnd.kafka.binary.v1+json\r\n< Vary: Accept-Encoding, User-Agent\r\n< Content-Length: 87\r\n< \r\n{ [87 bytes data]\n[{\"topic\":\"coyote-test-binary\",\"key\":null,\"value\":\"S2Fma2E=\",\"partition\":0,\"offset\":0}]* Connection #0 to host 127.0.0.1 left intact\n"
2021/03/06 06:46:18 Success, command 'curl -vs --stderr - -XDELETE "http://127.0.0.1:8082/consumers/coyote-binary/instances/a-consumer"', test 'Delete Binary Consumer (rest proxy)'. Stdout: "* Trying 127.0.0.1:8082...\n* Connected to 127.0.0.1 (127.0.0.1) port 8082 (#0)\n> DELETE /consumers/coyote-binary/instances/a-consumer HTTP/1.1\r\n> Host: 127.0.0.1:8082\r\n> User-Agent: curl/7.74.0\r\n> Accept: */*\r\n> \r\n* Mark bundle as not supporting multiuse\n< HTTP/1.1 204 No Content\r\n< Date: Sat, 06 Mar 2021 06:46:18 GMT\r\n< \r\n* Connection #0 to host 127.0.0.1 left intact\n"
2021/03/06 06:46:23 Success, command 'sleep 5', test ''. Stdout: ""
2021/03/06 06:46:26 Success, command 'kafka-topics --zookeeper 127.0.0.1:2181 --topic coyote-test-binary --delete', test ''. Stdout: "Topic coyote-test-binary is marked for deletion.\nNote: This will have no impact if delete.topic.enable is not set to true.\n"
2021/03/06 06:46:26 Starting processing group: [ Schema Registry ]
2021/03/06 06:46:26 Success, command 'curl -vs --stderr - -XPOST -i -H "Content-Type: application/vnd.schemaregistry.v1+json" --data '{"schema": "{\"type\": \"string\"}"}' "http://127.0.0.1:8081/subjects/coyote_basic/versions" ', test 'Register a new Schema version (schema registry)'. Stdout: "* Trying 127.0.0.1:8081...\n* Connected to 127.0.0.1 (127.0.0.1) port 8081 (#0)\n> POST /subjects/coyote_basic/versions HTTP/1.1\r\n> Host: 127.0.0.1:8081\r\n> User-Agent: curl/7.74.0\r\n> Accept: */*\r\n> Content-Type: application/vnd.schemaregistry.v1+json\r\n> Content-Length: 36\r\n> \r\n} [36 bytes data]\n* upload completely sent off: 36 out of 36 bytes\n* Mark bundle as not supporting multiuse\n< HTTP/1.1 200 OK\r\nHTTP/1.1 200 OK\r\n< Date: Sat, 06 Mar 2021 06:46:26 GMT\r\nDate: Sat, 06 Mar 2021 06:46:26 GMT\r\n< Content-Type: application/vnd.schemaregistry.v1+json\r\nContent-Type: application/vnd.schemaregistry.v1+json\r\n< Vary: Accept-Encoding, User-Agent\r\nVary: Accept-Encoding, User-Agent\r\n< Content-Length: 8\r\nContent-Length: 8\r\n\r\n< \r\n{ [8 bytes data]\n{\"id\":8}* Connection #0 to host 127.0.0.1 left intact\n"
2021/03/06 06:46:26 Success, command 'curl -vs --stderr - -XGET -i "http://127.0.0.1:8081/subjects"', test 'List subjects (schema registry)'. Stdout: "* Trying 127.0.0.1:8081...\n* Connected to 127.0.0.1 (127.0.0.1) port 8081 (#0)\n> GET /subjects HTTP/1.1\r\n> Host: 127.0.0.1:8081\r\n> User-Agent: curl/7.74.0\r\n> Accept: */*\r\n> \r\n* Mark bundle as not supporting multiuse\n< HTTP/1.1 200 OK\r\nHTTP/1.1 200 OK\r\n< Date: Sat, 06 Mar 2021 06:46:26 GMT\r\nDate: Sat, 06 Mar 2021 06:46:26 GMT\r\n< Content-Type: application/vnd.schemaregistry.v1+json\r\nContent-Type: application/vnd.schemaregistry.v1+json\r\n< Vary: Accept-Encoding, User-Agent\r\nVary: Accept-Encoding, User-Agent\r\n< Content-Length: 273\r\nContent-Length: 273\r\n\r\n< \r\n{ [273 bytes data]\n[\"coyote-test-avro-value\",\"telecom_italia_data-key\",\"telecom_italia_grid-key\",\"telecom_italia_data-value\",\"coyote_basic\",\"nyc_yellow_taxi_trip_data-value\",\"sea_vessel_position_reports-key\",\"sea_vessel_position_reports-value\",\"logs_broker-value\",\"telecom_italia_grid-value\"]* Connection #0 to host 127.0.0.1 left intact\n"
2021/03/06 06:46:26 Success, command 'curl -vs --stderr - -XGET -i "http://127.0.0.1:8081/subjects/coyote_basic/versions"', test 'List Schema versions (schema registry)'. Stdout: "* Trying 127.0.0.1:8081...\n* Connected to 127.0.0.1 (127.0.0.1) port 8081 (#0)\n> GET /subjects/coyote_basic/versions HTTP/1.1\r\n> Host: 127.0.0.1:8081\r\n> User-Agent: curl/7.74.0\r\n> Accept: */*\r\n> \r\n* Mark bundle as not supporting multiuse\n< HTTP/1.1 200 OK\r\nHTTP/1.1 200 OK\r\n< Date: Sat, 06 Mar 2021 06:46:26 GMT\r\nDate: Sat, 06 Mar 2021 06:46:26 GMT\r\n< Content-Type: application/vnd.schemaregistry.v1+json\r\nContent-Type: application/vnd.schemaregistry.v1+json\r\n< Vary: Accept-Encoding, User-Agent\r\nVary: Accept-Encoding, User-Agent\r\n< Content-Length: 3\r\nContent-Length: 3\r\n\r\n< \r\n{ [3 bytes data]\n[1]* Connection #0 to host 127.0.0.1 left intact\n"
2021/03/06 06:46:26 Success, command 'curl -vs --stderr - -XGET -i "http://127.0.0.1:8081/subjects/coyote_basic/versions/1"', test 'Fetch Schema by name and version (schema registry)'. Stdout: "* Trying 127.0.0.1:8081...\n* Connected to 127.0.0.1 (127.0.0.1) port 8081 (#0)\n> GET /subjects/coyote_basic/versions/1 HTTP/1.1\r\n> Host: 127.0.0.1:8081\r\n> User-Agent: curl/7.74.0\r\n> Accept: */*\r\n> \r\n* Mark bundle as not supporting multiuse\n< HTTP/1.1 200 OK\r\nHTTP/1.1 200 OK\r\n< Date: Sat, 06 Mar 2021 06:46:26 GMT\r\nDate: Sat, 06 Mar 2021 06:46:26 GMT\r\n< Content-Type: application/vnd.schemaregistry.v1+json\r\nContent-Type: application/vnd.schemaregistry.v1+json\r\n< Vary: Accept-Encoding, User-Agent\r\nVary: Accept-Encoding, User-Agent\r\n< Content-Length: 67\r\nContent-Length: 67\r\n\r\n< \r\n{ [67 bytes data]\n{\"subject\":\"coyote_basic\",\"version\":1,\"id\":8,\"schema\":\"\\\"string\\\"\"}* Connection #0 to host 127.0.0.1 left intact\n"
2021/03/06 06:46:26 Success, command 'curl -vs --stderr - -XPOST -i -H "Content-Type: application/vnd.schemaregistry.v1+json" --data '{"schema": "{\"type\": \"record\", \"name\": \"User\", \"fields\": [{\"name\": \"name\", \"type\": \"string\"}]}"}' "http://127.0.0.1:8081/subjects/coyote_test_02/versions" ', test 'Register Complex Schema (schema registry)'. Stdout: "* Trying 127.0.0.1:8081...\n* Connected to 127.0.0.1 (127.0.0.1) port 8081 (#0)\n> POST /subjects/coyote_test_02/versions HTTP/1.1\r\n> Host: 127.0.0.1:8081\r\n> User-Agent: curl/7.74.0\r\n> Accept: */*\r\n> Content-Type: application/vnd.schemaregistry.v1+json\r\n> Content-Length: 114\r\n> \r\n} [114 bytes data]\n* upload completely sent off: 114 out of 114 bytes\n* Mark bundle as not supporting multiuse\n< HTTP/1.1 200 OK\r\nHTTP/1.1 200 OK\r\n< Date: Sat, 06 Mar 2021 06:46:26 GMT\r\nDate: Sat, 06 Mar 2021 06:46:26 GMT\r\n< Content-Type: application/vnd.schemaregistry.v1+json\r\nContent-Type: application/vnd.schemaregistry.v1+json\r\n< Vary: Accept-Encoding, User-Agent\r\nVary: Accept-Encoding, User-Agent\r\n< Content-Length: 8\r\nContent-Length: 8\r\n\r\n< \r\n{ [8 bytes data]\n{\"id\":1}* Connection #0 to host 127.0.0.1 left intact\n"
2021/03/06 06:46:26 Success, command 'curl -vs --stderr - -XPOST -i -H "Content-Type: application/vnd.schemaregistry.v1+json" --data '{"schema": "{\"type\": \"record\", \"name\": \"User\", \"fields\": [{\"name\": \"name\", \"type\": \"string\"}, {\"name\": \"address\", \"type\": \"string\"}]}"}' "http://127.0.0.1:8081/compatibility/subjects/coyote_test_02/versions/latest" ', test 'Test Schema Compatibility (schema registry)'. Stdout: "* Trying 127.0.0.1:8081...\n* Connected to 127.0.0.1 (127.0.0.1) port 8081 (#0)\n> POST /compatibility/subjects/coyote_test_02/versions/latest HTTP/1.1\r\n> Host: 127.0.0.1:8081\r\n> User-Agent: curl/7.74.0\r\n> Accept: */*\r\n> Content-Type: application/vnd.schemaregistry.v1+json\r\n> Content-Length: 161\r\n> \r\n} [161 bytes data]\n* upload completely sent off: 161 out of 161 bytes\n* Mark bundle as not supporting multiuse\n< HTTP/1.1 200 OK\r\nHTTP/1.1 200 OK\r\n< Date: Sat, 06 Mar 2021 06:46:26 GMT\r\nDate: Sat, 06 Mar 2021 06:46:26 GMT\r\n< Content-Type: application/vnd.schemaregistry.v1+json\r\nContent-Type: application/vnd.schemaregistry.v1+json\r\n< Vary: Accept-Encoding, User-Agent\r\nVary: Accept-Encoding, User-Agent\r\n< Content-Length: 23\r\nContent-Length: 23\r\n\r\n< \r\n{ [23 bytes data]\n{\"is_compatible\":false}* Connection #0 to host 127.0.0.1 left intact\n"
2021/03/06 06:46:26 Success, command 'curl -vs --stderr - -XGET -i "http://127.0.0.1:8081/config"', test 'Get Schema Registry Configuration (schema registry)'. Stdout: "* Trying 127.0.0.1:8081...\n* Connected to 127.0.0.1 (127.0.0.1) port 8081 (#0)\n> GET /config HTTP/1.1\r\n> Host: 127.0.0.1:8081\r\n> User-Agent: curl/7.74.0\r\n> Accept: */*\r\n> \r\n* Mark bundle as not supporting multiuse\n< HTTP/1.1 200 OK\r\nHTTP/1.1 200 OK\r\n< Date: Sat, 06 Mar 2021 06:46:26 GMT\r\nDate: Sat, 06 Mar 2021 06:46:26 GMT\r\n< Content-Type: application/vnd.schemaregistry.v1+json\r\nContent-Type: application/vnd.schemaregistry.v1+json\r\n< Vary: Accept-Encoding, User-Agent\r\nVary: Accept-Encoding, User-Agent\r\n< Content-Length: 33\r\nContent-Length: 33\r\n\r\n< \r\n{ [33 bytes data]\n{\"compatibilityLevel\":\"BACKWARD\"}* Connection #0 to host 127.0.0.1 left intact\n"
2021/03/06 06:46:26 Starting processing group: [ Connect ]
2021/03/06 06:46:36 Success, command 'bash -c 'for ((i=0;i<60;i++)); do sleep 5; curl "http://127.0.0.1:8083/connectors" && { sleep 5; break; }; done'', test 'Wait for connect to get up'. Stdout: "[\"logs-broker\"]"
2021/03/06 06:46:36 Success, command 'curl -vs --stderr - -XGET -i "http://127.0.0.1:8083/connectors"', test 'Get list of Connectors (connect distributed)'. Stdout: "* Trying 127.0.0.1:8083...\n* Connected to 127.0.0.1 (127.0.0.1) port 8083 (#0)\n> GET /connectors HTTP/1.1\r\n> Host: 127.0.0.1:8083\r\n> User-Agent: curl/7.74.0\r\n> Accept: */*\r\n> \r\n* Mark bundle as not supporting multiuse\n< HTTP/1.1 200 OK\r\nHTTP/1.1 200 OK\r\n< Date: Sat, 06 Mar 2021 06:46:36 GMT\r\nDate: Sat, 06 Mar 2021 06:46:36 GMT\r\n< Content-Type: application/json\r\nContent-Type: application/json\r\n< Content-Length: 15\r\nContent-Length: 15\r\n< Server: Jetty(9.4.24.v20191120)\r\nServer: Jetty(9.4.24.v20191120)\r\n\r\n< \r\n{ [15 bytes data]\n[\"logs-broker\"]* Connection #0 to host 127.0.0.1 left intact\n"
2021/03/06 06:46:37 Success, command 'curl -vs --stderr - -X POST -H "Content-Type: application/json" --data '{ "name": "coyote_test_console_source-1615013196744", "config": {"connector.class":"org.apache.kafka.connect.file.FileStreamSourceConnector","tasks.max":"1","topic":"coyote_cd-1615013196744","file":"/etc/fstab"}}' "http://127.0.0.1:8083/connectors" ', test 'Create a Console Connector (connect distributed)'. Stdout: "* Trying 127.0.0.1:8083...\n* Connected to 127.0.0.1 (127.0.0.1) port 8083 (#0)\n> POST /connectors HTTP/1.1\r\n> Host: 127.0.0.1:8083\r\n> User-Agent: curl/7.74.0\r\n> Accept: */*\r\n> Content-Type: application/json\r\n> Content-Length: 219\r\n> \r\n} [219 bytes data]\n* upload completely sent off: 219 out of 219 bytes\n* Mark bundle as not supporting multiuse\n< HTTP/1.1 201 Created\r\n< Date: Sat, 06 Mar 2021 06:46:36 GMT\r\n< Location: http://127.0.0.1:8083/connectors/coyote_test_console_source-1615013196744\r\n< Content-Type: application/json\r\n< Content-Length: 285\r\n< Server: Jetty(9.4.24.v20191120)\r\n< \r\n{ [285 bytes data]\n{\"name\":\"coyote_test_console_source-1615013196744\",\"config\":{\"connector.class\":\"org.apache.kafka.connect.file.FileStreamSourceConnector\",\"tasks.max\":\"1\",\"topic\":\"coyote_cd-1615013196744\",\"file\":\"/etc/fstab\",\"name\":\"coyote_test_console_source-1615013196744\"},\"tasks\":[],\"type\":\"source\"}* Connection #0 to host 127.0.0.1 left intact\n"
2021/03/06 06:46:47 Success, command 'sleep 10', test 'Sleep a bit to let the connector spawn and work'. Stdout: ""
2021/03/06 06:46:47 Success, command 'curl -vs --stderr - -XGET -i "http://127.0.0.1:8083/connectors/coyote_test_console_source-1615013196744"', test 'Get Connector's Configuration (connect distributed)'. Stdout: "* Trying 127.0.0.1:8083...\n* Connected to 127.0.0.1 (127.0.0.1) port 8083 (#0)\n> GET /connectors/coyote_test_console_source-1615013196744 HTTP/1.1\r\n> Host: 127.0.0.1:8083\r\n> User-Agent: curl/7.74.0\r\n> Accept: */*\r\n> \r\n* Mark bundle as not supporting multiuse\n< HTTP/1.1 200 OK\r\nHTTP/1.1 200 OK\r\n< Date: Sat, 06 Mar 2021 06:46:47 GMT\r\nDate: Sat, 06 Mar 2021 06:46:47 GMT\r\n< Content-Type: application/json\r\nContent-Type: application/json\r\n< Content-Length: 350\r\nContent-Length: 350\r\n< Server: Jetty(9.4.24.v20191120)\r\nServer: Jetty(9.4.24.v20191120)\r\n\r\n< \r\n{ [350 bytes data]\n{\"name\":\"coyote_test_console_source-1615013196744\",\"config\":{\"connector.class\":\"org.apache.kafka.connect.file.FileStreamSourceConnector\",\"file\":\"/etc/fstab\",\"tasks.max\":\"1\",\"name\":\"coyote_test_console_source-1615013196744\",\"topic\":\"coyote_cd-1615013196744\"},\"tasks\":[{\"connector\":\"coyote_test_console_source-1615013196744\",\"task\":0}],\"type\":\"source\"}* Connection #0 to host 127.0.0.1 left intact\n"
2021/03/06 06:46:56 Success, command 'timeout 10 kafka-console-consumer --bootstrap-server 127.0.0.1:9092 --topic coyote_cd-1615013196744 --from-beginning --timeout-ms 5000 ', test 'Run Console Consumer to fix Kafka's transient state (basic kafka)'. Stdout: "\x00\x00\x00\x00\bZ/dev/cdrom\t/media/cdrom\tiso9660\tnoauto,ro 0 0\n\x00\x00\x00\x00\bT/dev/usbdisk\t/media/usb\tvfat\tnoauto,ro 0 0\n"
2021/03/06 06:47:04 Success, command 'timeout 10 kafka-console-consumer --bootstrap-server 127.0.0.1:9092 --topic coyote_cd-1615013196744 --from-beginning --timeout-ms 5000 ', test 'Run Console Consumer (basic kafka)'. Stdout: "\x00\x00\x00\x00\bZ/dev/cdrom\t/media/cdrom\tiso9660\tnoauto,ro 0 0\n\x00\x00\x00\x00\bT/dev/usbdisk\t/media/usb\tvfat\tnoauto,ro 0 0\n"
2021/03/06 06:47:05 Success, command 'curl -vs --stderr - -XDELETE "http://127.0.0.1:8083/connectors/coyote_test_console_source-1615013196744"', test 'Delete connector'. Stdout: "* Trying 127.0.0.1:8083...\n* Connected to 127.0.0.1 (127.0.0.1) port 8083 (#0)\n> DELETE /connectors/coyote_test_console_source-1615013196744 HTTP/1.1\r\n> Host: 127.0.0.1:8083\r\n> User-Agent: curl/7.74.0\r\n> Accept: */*\r\n> \r\n* Mark bundle as not supporting multiuse\n< HTTP/1.1 204 No Content\r\n< Date: Sat, 06 Mar 2021 06:47:04 GMT\r\n< Server: Jetty(9.4.24.v20191120)\r\n< \r\n* Connection #0 to host 127.0.0.1 left intact\n"
2021/03/06 06:47:10 Success, command 'sleep 5', test ''. Stdout: ""
2021/03/06 06:47:12 Success, command 'kafka-topics --zookeeper 127.0.0.1:2181 --topic coyote_cd-1615013196744 --delete', test 'Delete Connect Distributes Test Topic (basic kafka)'. Stdout: "Topic coyote_cd-1615013196744 is marked for deletion.\nNote: This will have no impact if delete.topic.enable is not set to true.\n"
2021/03/06 06:47:12 Success, command 'rm -rf coyote_test.sqlite coyote_sqlite_connector.properties coyote_connect_standalone.properties coyote_connect.offset', test ''. Stdout: ""
2021/03/06 06:47:13 Success, command 'sqlite3 coyote_test.sqlite', test 'Create and Init SQLite database'. Stdout: ""
2021/03/06 06:47:13 Success, command 'tee coyote_sqlite_connector.properties', test 'Create coyote_sqlite_connector.properties'. Stdout: "name=coyote-ca-1615013233026\nconnector.class=io.confluent.connect.jdbc.JdbcSourceConnector\ntasks.max=1\nconnection.url=jdbc:sqlite:coyote_test.sqlite\nmode=incrementing\nincrementing.column.name=id\ntopic.prefix=coyote-ca-\n"
2021/03/06 06:47:13 Success, command 'tee coyote_connect_standalone.properties', test 'Create coyote_connect_standalone.properties'. Stdout: "bootstrap.servers=127.0.0.1:9092\nkey.converter=io.confluent.connect.avro.AvroConverter\nkey.converter.schema.registry.url=http://127.0.0.1:8081\nvalue.converter=io.confluent.connect.avro.AvroConverter\nvalue.converter.schema.registry.url=http://127.0.0.1:8081\ninternal.key.converter=org.apache.kafka.connect.json.JsonConverter\ninternal.value.converter=org.apache.kafka.connect.json.JsonConverter\ninternal.key.converter.schemas.enable=false\ninternal.value.converter.schemas.enable=false\noffset.storage.file.filename=coyote_connect.offset\noffset.flush.interval.ms=5000\nrest.port=38783\nport=38783\nplugin.path=/opt/landoop/connectors/third-party\n"
2021/03/06 06:47:58 Success, command 'timeout -k 5 45 connect-standalone coyote_connect_standalone.properties coyote_sqlite_connector.properties', test 'Read SQLite into Topic (connect standalone)'. Stdout: "[2021-03-06 06:47:15,936] INFO Kafka Connect standalone worker initializing ... (org.apache.kafka.connect.cli.ConnectStandalone:69)\n[2021-03-06 06:47:15,947] INFO WorkerInfo values: \n\tjvm.args = -Xmx640M, -Xms128M, -XX:+UseG1GC, -XX:MaxGCPauseMillis=20, -XX:InitiatingHeapOccupancyPercent=35, -XX:+ExplicitGCInvokesConcurrent, -XX:MaxInlineLevel=15, -Djava.awt.headless=true, -Dcom.sun.management.jmxremote, -Dcom.sun.management.jmxremote.authenticate=false, -Dcom.sun.management.jmxremote.ssl=false, -Dcom.sun.management.jmxremote.local.only=false, -Djava.rmi.server.hostname=127.0.0.1, -Dcom.sun.management.jmxremote.rmi.port=9584, -Dkafka.logs.dir=/opt/landoop/kafka/logs, -Dlog4j.configuration=file:/var/run/connect/connect-log4j.properties\n\tjvm.spec = IcedTea, OpenJDK 64-Bit Server VM, 1.8.0_275, 25.275-b01\n\tjvm.classpath = /opt/landoop/kafka/share/java/kafka/activation-1.1.1.jar:/opt/landoop/kafka/share/java/kafka/aopalliance-repackaged-2.5.0.jar:/opt/landoop/kafka/share/java/kafka/argparse4j-0.7.0.jar:/opt/landoop/kafka/share/java/kafka/audience-annotations-0.5.0.jar:/opt/landoop/kafka/share/java/kafka/commons-cli-1.4.jar:/opt/landoop/kafka/share/java/kafka/commons-lang3-3.8.1.jar:/opt/landoop/kafka/share/java/kafka/connect-api-2.5.1-L0.jar:/opt/landoop/kafka/share/java/kafka/connect-basic-auth-extension-2.5.1-L0.jar:/opt/landoop/kafka/share/java/kafka/connect-file-2.5.1-L0.jar:/opt/landoop/kafka/share/java/kafka/connect-json-2.5.1-L0.jar:/opt/landoop/kafka/share/java/kafka/connect-mirror-2.5.1-L0.jar:/opt/landoop/kafka/share/java/kafka/connect-mirror-client-2.5.1-L0.jar:/opt/landoop/kafka/share/java/kafka/connect-runtime-2.5.1-L0.jar:/opt/landoop/kafka/share/java/kafka/connect-transforms-2.5.1-L0.jar:/opt/landoop/kafka/share/java/kafka/hk2-api-2.5.0.jar:/opt/landoop/kafka/share/java/kafka/hk2-locator-2.5.0.jar:/opt/landoop/kafka/share/java/kafka/hk2-utils-2.5.0.jar:/opt/landoop/kafka/share/java/kafka/jackson-annotations-2.10.2.jar:/opt/landoop/kafka/share/java/kafka/jackson-core-2.10.2.jar:/opt/landoop/kafka/share/java/kafka/jackson-databind-2.10.2.jar:/opt/landoop/kafka/share/java/kafka/jackson-dataformat-csv-2.10.2.jar:/opt/landoop/kafka/share/java/kafka/jackson-datatype-jdk8-2.10.2.jar:/opt/landoop/kafka/share/java/kafka/jackson-jaxrs-base-2.10.2.jar:/opt/landoop/kafka/share/java/kafka/jackson-jaxrs-json-provider-2.10.2.jar:/opt/landoop/kafka/share/java/kafka/jackson-module-jaxb-annotations-2.10.2.jar:/opt/landoop/kafka/share/java/kafka/jackson-module-paranamer-2.10.2.jar:/opt/landoop/kafka/share/java/kafka/jackson-module-scala_2.12-2.10.2.jar:/opt/landoop/kafka/share/java/kafka/jakarta.activation-api-1.2.1.jar:/opt/landoop/kafka/share/java/kafka/jakarta.annotation-api-1.3.4.jar:/opt/landoop/kafka/share/java/kafka/jakarta.inject-2.5.0.jar:/opt/landoop/kafka/share/java/kafka/jakar
ta.ws.rs-api-2.1.5.jar:/opt/landoop/kafka/share/java/kafka/jakarta.xml.bind-api-2.3.2.jar:/opt/landoop/kafka/share/java/kafka/javassist-3.22.0-CR2.jar:/opt/landoop/kafka/share/java/kafka/javassist-3.26.0-GA.jar:/opt/landoop/kafka/share/java/kafka/javax.servlet-api-3.1.0.jar:/opt/landoop/kafka/share/java/kafka/javax.ws.rs-api-2.1.1.jar:/opt/landoop/kafka/share/java/kafka/jaxb-api-2.3.0.jar:/opt/landoop/kafka/share/java/kafka/jersey-client-2.28.jar:/opt/landoop/kafka/share/java/kafka/jersey-common-2.28.jar:/opt/landoop/kafka/share/java/kafka/jersey-container-servlet-2.28.jar:/opt/landoop/kafka/share/java/kafka/jersey-container-servlet-core-2.28.jar:/opt/landoop/kafka/share/java/kafka/jersey-hk2-2.28.jar:/opt/landoop/kafka/share/java/kafka/jersey-media-jaxb-2.28.jar:/opt/landoop/kafka/share/java/kafka/jersey-server-2.28.jar:/opt/landoop/kafka/share/java/kafka/jetty-client-9.4.24.v20191120.jar:/opt/landoop/kafka/share/java/kafka/jetty-continuation-9.4.24.v20191120.jar:/opt/landoop/kafka/share/java/kafka/jetty-http-9.4.24.v20191120.jar:/opt/landoop/kafka/share/java/kafka/jetty-io-9.4.24.v20191120.jar:/opt/landoop/kafka/share/java/kafka/jetty-security-9.4.24.v20191120.jar:/opt/landoop/kafka/share/java/kafka/jetty-server-9.4.24.v20191120.jar:/opt/landoop/kafka/share/java/kafka/jetty-servlet-9.4.24.v20191120.jar:/opt/landoop/kafka/share/java/kafka/jetty-servlets-9.4.24.v20191120.jar:/opt/landoop/kafka/share/java/kafka/jetty-util-9.4.24.v20191120.jar:/opt/landoop/kafka/share/java/kafka/jopt-simple-5.0.4.jar:/opt/landoop/kafka/share/java/kafka/kafka-clients-2.5.1-L0.jar:/opt/landoop/kafka/share/java/kafka/kafka-custom-principal-builder-1.0-SNAPSHOT.jar:/opt/landoop/kafka/share/java/kafka/kafka-log4j-appender-2.5.1-L0.jar:/opt/landoop/kafka/share/java/kafka/kafka-streams-2.5.1-L0.jar:/opt/landoop/kafka/share/java/kafka/kafka-streams-examples-2.5.1-L0.jar:/opt/landoop/kafka/share/java/kafka/kafka-streams-scala_2.12-2.5.1-L0.jar:/opt/landoop/kafka/share/java/kafka/kafka-streams-
test-utils-2.5.1-L0.jar:/opt/landoop/kafka/share/java/kafka/kafka-tools-2.5.1-L0.jar:/opt/landoop/kafka/share/java/kafka/kafka_2.12-2.5.1-L0-sources.jar:/opt/landoop/kafka/share/java/kafka/kafka_2.12-2.5.1-L0.jar:/opt/landoop/kafka/share/java/kafka/log4j-1.2.17.jar:/opt/landoop/kafka/share/java/kafka/lz4-java-1.7.1.jar:/opt/landoop/kafka/share/java/kafka/maven-artifact-3.6.3.jar:/opt/landoop/kafka/share/java/kafka/metrics-core-2.2.0.jar:/opt/landoop/kafka/share/java/kafka/netty-buffer-4.1.50.Final.jar:/opt/landoop/kafka/share/java/kafka/netty-codec-4.1.50.Final.jar:/opt/landoop/kafka/share/java/kafka/netty-common-4.1.50.Final.jar:/opt/landoop/kafka/share/java/kafka/netty-handler-4.1.50.Final.jar:/opt/landoop/kafka/share/java/kafka/netty-resolver-4.1.50.Final.jar:/opt/landoop/kafka/share/java/kafka/netty-transport-4.1.50.Final.jar:/opt/landoop/kafka/share/java/kafka/netty-transport-native-epoll-4.1.50.Final.jar:/opt/landoop/kafka/share/java/kafka/netty-transport-native-unix-common-4.1.50.Final.jar:/opt/landoop/kafka/share/java/kafka/osgi-resource-locator-1.0.1.jar:/opt/landoop/kafka/share/java/kafka/paranamer-2.8.jar:/opt/landoop/kafka/share/java/kafka/plexus-utils-3.2.1.jar:/opt/landoop/kafka/share/java/kafka/reflections-0.9.12.jar:/opt/landoop/kafka/share/java/kafka/rocksdbjni-5.18.3.jar:/opt/landoop/kafka/share/java/kafka/scala-collection-compat_2.12-2.1.3.jar:/opt/landoop/kafka/share/java/kafka/scala-java8-compat_2.12-0.9.0.jar:/opt/landoop/kafka/share/java/kafka/scala-library-2.12.10.jar:/opt/landoop/kafka/share/java/kafka/scala-logging_2.12-3.9.2.jar:/opt/landoop/kafka/share/java/kafka/scala-reflect-2.12.10.jar:/opt/landoop/kafka/share/java/kafka/slf4j-api-1.7.30.jar:/opt/landoop/kafka/share/java/kafka/slf4j-log4j12-1.7.30.jar:/opt/landoop/kafka/share/java/kafka/snappy-java-1.1.7.3.jar:/opt/landoop/kafka/share/java/kafka/validation-api-2.0.1.Final.jar:/opt/landoop/kafka/share/java/kafka/zookeeper-3.5.8.jar:/opt/landoop/kafka/share/java/kafka/zookeeper-jute-3.5.
8.jar:/opt/landoop/kafka/share/java/kafka/zstd-jni-1.4.4-7.jar:/opt/landoop/kafka/share/java/kafka-serde-tools/animal-sniffer-annotations-1.14.jar:/opt/landoop/kafka/share/java/kafka-serde-tools/annotations-13.0.jar:/opt/landoop/kafka/share/java/kafka-serde-tools/avro-1.9.2.jar:/opt/landoop/kafka/share/java/kafka-serde-tools/checker-compat-qual-2.0.0.jar:/opt/landoop/kafka/share/java/kafka-serde-tools/classgraph-4.8.21.jar:/opt/landoop/kafka/share/java/kafka-serde-tools/commons-collections-3.2.2.jar:/opt/landoop/kafka/share/java/kafka-serde-tools/commons-compress-1.19.jar:/opt/landoop/kafka/share/java/kafka-serde-tools/commons-digester-1.8.1.jar:/opt/landoop/kafka/share/java/kafka-serde-tools/commons-logging-1.2.jar:/opt/landoop/kafka/share/java/kafka-serde-tools/commons-validator-1.6.jar:/opt/landoop/kafka/share/java/kafka-serde-tools/error_prone_annotations-2.3.4.jar:/opt/landoop/kafka/share/java/kafka-serde-tools/gson-2.8.5.jar:/opt/landoop/kafka/share/java/kafka-serde-tools/guava-24.1.1-jre.jar:/opt/landoop/kafka/share/java/kafka-serde-tools/handy-uri-templates-2.1.8.jar:/opt/landoop/kafka/share/java/kafka-serde-tools/j2objc-annotations-1.1.jar:/opt/landoop/kafka/share/java/kafka-serde-tools/jackson-annotations-2.10.5.jar:/opt/landoop/kafka/share/java/kafka-serde-tools/jackson-core-2.10.5.jar:/opt/landoop/kafka/share/java/kafka-serde-tools/jackson-databind-2.10.5.1.jar:/opt/landoop/kafka/share/java/kafka-serde-tools/jackson-datatype-guava-2.10.5.jar:/opt/landoop/kafka/share/java/kafka-serde-tools/jackson-datatype-jdk8-2.10.5.jar:/opt/landoop/kafka/share/java/kafka-serde-tools/jackson-datatype-joda-2.10.5.jar:/opt/landoop/kafka/share/java/kafka-serde-tools/jackson-datatype-jsr310-2.10.5.jar:/opt/landoop/kafka/share/java/kafka-serde-tools/jackson-module-parameter-names-2.10.5.jar:/opt/landoop/kafka/share/java/kafka-serde-tools/jakarta.annotation-api-1.3.5.jar:/opt/landoop/kafka/share/java/kafka-serde-tools/jakarta.inject-2.6.1.jar:/opt/landoop/kafka/share/java/kaf
ka-serde-tools/jakarta.ws.rs-api-2.1.6.jar:/opt/landoop/kafka/share/java/kafka-serde-tools/jersey-common-2.31.jar:/opt/landoop/kafka/share/java/kafka-serde-tools/joda-time-2.9.9.jar:/opt/landoop/kafka/share/java/kafka-serde-tools/json-20190722.jar:/opt/landoop/kafka/share/java/kafka-serde-tools/jsr305-1.3.9.jar:/opt/landoop/kafka/share/java/kafka-serde-tools/kafka-avro-serializer-5.5.3.jar:/opt/landoop/kafka/share/java/kafka-serde-tools/kafka-connect-avro-converter-5.5.3.jar:/opt/landoop/kafka/share/java/kafka-serde-tools/kafka-connect-avro-data-5.5.3.jar:/opt/landoop/kafka/share/java/kafka-serde-tools/kafka-connect-json-schema-converter-5.5.3.jar:/opt/landoop/kafka/share/java/kafka-serde-tools/kafka-connect-protobuf-converter-5.5.3.jar:/opt/landoop/kafka/share/java/kafka-serde-tools/kafka-json-schema-provider-5.5.3.jar:/opt/landoop/kafka/share/java/kafka-serde-tools/kafka-json-schema-serializer-5.5.3.jar:/opt/landoop/kafka/share/java/kafka-serde-tools/kafka-json-serializer-5.5.3.jar:/opt/landoop/kafka/share/java/kafka-serde-tools/kafka-protobuf-provider-5.5.3.jar:/opt/landoop/kafka/share/java/kafka-serde-tools/kafka-protobuf-serializer-5.5.3.jar:/opt/landoop/kafka/share/java/kafka-serde-tools/kafka-schema-registry-client-5.5.3.jar:/opt/landoop/kafka/share/java/kafka-serde-tools/kafka-schema-serializer-5.5.3.jar:/opt/landoop/kafka/share/java/kafka-serde-tools/kafka-streams-5.5.3-ccs.jar:/opt/landoop/kafka/share/java/kafka-serde-tools/kafka-streams-avro-serde-5.5.3.jar:/opt/landoop/kafka/share/java/kafka-serde-tools/kafka-streams-json-schema-serde-5.5.3.jar:/opt/landoop/kafka/share/java/kafka-serde-tools/kafka-streams-protobuf-serde-5.5.3.jar:/opt/landoop/kafka/share/java/kafka-serde-tools/kotlin-reflect-1.3.50.jar:/opt/landoop/kafka/share/java/kafka-serde-tools/kotlin-script-runtime-1.3.50.jar:/opt/landoop/kafka/share/java/kafka-serde-tools/kotlin-scripting-common-1.3.50.jar:/opt/landoop/kafka/share/java/kafka-serde-tools/kotlin-scripting-compiler-embeddable-1.3.50.
jar:/opt/landoop/kafka/share/java/kafka-serde-tools/kotlin-scripting-compiler-impl-embeddable-1.3.50.jar:/opt/landoop/kafka/share/java/kafka-serde-tools/kotlin-scripting-jvm-1.3.50.jar:/opt/landoop/kafka/share/java/kafka-serde-tools/kotlin-stdlib-1.4.0.jar:/opt/landoop/kafka/share/java/kafka-serde-tools/kotlin-stdlib-common-1.3.71.jar:/opt/landoop/kafka/share/java/kafka-serde-tools/kotlin-stdlib-jdk7-1.3.71.jar:/opt/landoop/kafka/share/java/kafka-serde-tools/kotlin-stdlib-jdk8-1.3.71.jar:/opt/landoop/kafka/share/java/kafka-serde-tools/kotlinx-coroutines-core-1.1.1.jar:/opt/landoop/kafka/share/java/kafka-serde-tools/kotlinx-coroutines-core-common-1.1.1.jar:/opt/landoop/kafka/share/java/kafka-serde-tools/mbknor-jackson-jsonschema_2.12-1.0.39.jar:/opt/landoop/kafka/share/java/kafka-serde-tools/okio-2.5.0.jar:/opt/landoop/kafka/share/java/kafka-serde-tools/org.everit.json.schema-1.12.1.jar:/opt/landoop/kafka/share/java/kafka-serde-tools/osgi-resource-locator-1.0.3.jar:/opt/landoop/kafka/share/java/kafka-serde-tools/protobuf-java-3.11.4.jar:/opt/landoop/kafka/share/java/kafka-serde-tools/protobuf-java-util-3.11.4.jar:/opt/landoop/kafka/share/java/kafka-serde-tools/re2j-1.3.jar:/opt/landoop/kafka/share/java/kafka-serde-tools/rocksdbjni-5.18.3.jar:/opt/landoop/kafka/share/java/kafka-serde-tools/scala-library-2.12.10.jar:/opt/landoop/kafka/share/java/kafka-serde-tools/swagger-annotations-1.6.2.jar:/opt/landoop/kafka/share/java/kafka-serde-tools/validation-api-2.0.1.Final.jar:/opt/landoop/kafka/share/java/kafka-serde-tools/wire-runtime-3.2.2.jar:/opt/landoop/kafka/share/java/kafka-serde-tools/wire-schema-3.2.2.jar:/opt/landoop/kafka/share/java/confluent-common/build-tools-5.5.3.jar:/opt/landoop/kafka/share/java/confluent-common/common-config-5.5.3.jar:/opt/landoop/kafka/share/java/confluent-common/common-metrics-5.5.3.jar:/opt/landoop/kafka/share/java/confluent-common/common-utils-5.5.3.jar:/opt/landoop/kafka/share/java/confluent-common/slf4j-api-1.7.26.jar:/opt/landoop/kafk
a/share/java/landoop-common/aggdesigner-algorithm-6.0.jar:/opt/landoop/kafka/share/java/landoop-common/annotations-13.0.jar:/opt/landoop/kafka/share/java/landoop-common/antlr4-runtime-4.7.jar:/opt/landoop/kafka/share/java/landoop-common/aopalliance-repackaged-2.6.1.jar:/opt/landoop/kafka/share/java/landoop-common/asm-7.2.jar:/opt/landoop/kafka/share/java/landoop-common/asm-analysis-7.2.jar:/opt/landoop/kafka/share/java/landoop-common/asm-commons-7.2.jar:/opt/landoop/kafka/share/java/landoop-common/asm-tree-7.2.jar:/opt/landoop/kafka/share/java/landoop-common/avatica-core-1.9.0.jar:/opt/landoop/kafka/share/java/landoop-common/avatica-metrics-1.9.0.jar:/opt/landoop/kafka/share/java/landoop-common/avro-1.9.2.jar:/opt/landoop/kafka/share/java/landoop-common/avro4s-core_2.12-1.7.0.jar:/opt/landoop/kafka/share/java/landoop-common/avro4s-macros_2.12-1.7.0.jar:/opt/landoop/kafka/share/java/landoop-common/calcite-core-1.12.0.jar:/opt/landoop/kafka/share/java/landoop-common/calcite-linq4j-1.8.0.jar:/opt/landoop/kafka/share/java/landoop-common/classgraph-4.8.21.jar:/opt/landoop/kafka/share/java/landoop-common/classmate-1.3.4.jar:/opt/landoop/kafka/share/java/landoop-common/common-config-5.5.0.jar:/opt/landoop/kafka/share/java/landoop-common/common-metrics-5.5.0.jar:/opt/landoop/kafka/share/java/landoop-common/common-utils-5.5.0.jar:/opt/landoop/kafka/share/java/landoop-common/commons-collections-3.2.2.jar:/opt/landoop/kafka/share/java/landoop-common/commons-compiler-2.7.6.jar:/opt/landoop/kafka/share/java/landoop-common/commons-compress-1.19.jar:/opt/landoop/kafka/share/java/landoop-common/commons-dbcp-1.4.jar:/opt/landoop/kafka/share/java/landoop-common/commons-digester-1.8.1.jar:/opt/landoop/kafka/share/java/landoop-common/commons-lang-2.4.jar:/opt/landoop/kafka/share/java/landoop-common/commons-lang3-3.2.1.jar:/opt/landoop/kafka/share/java/landoop-common/commons-logging-1.2.jar:/opt/landoop/kafka/share/java/landoop-common/commons-pool-1.5.4.jar:/opt/landoop/kafka/share/java
/landoop-common/commons-validator-1.6.jar:/opt/landoop/kafka/share/java/landoop-common/connect-api-2.5.0.jar:/opt/landoop/kafka/share/java/landoop-common/connect-json-2.5.0.jar:/opt/landoop/kafka/share/java/landoop-common/error_prone_annotations-2.3.4.jar:/opt/landoop/kafka/share/java/landoop-common/gson-2.8.6.jar:/opt/landoop/kafka/share/java/landoop-common/guava-19.0.jar:/opt/landoop/kafka/share/java/landoop-common/handy-uri-templates-2.1.8.jar:/opt/landoop/kafka/share/java/landoop-common/hibernate-validator-6.0.17.Final.jar:/opt/landoop/kafka/share/java/landoop-common/hk2-api-2.6.1.jar:/opt/landoop/kafka/share/java/landoop-common/hk2-locator-2.6.1.jar:/opt/landoop/kafka/share/java/landoop-common/hk2-utils-2.6.1.jar:/opt/landoop/kafka/share/java/landoop-common/jackson-dataformat-yaml-2.4.5.jar:/opt/landoop/kafka/share/java/landoop-common/jackson-datatype-guava-2.10.2.jar:/opt/landoop/kafka/share/java/landoop-common/jackson-datatype-joda-2.4.5.jar:/opt/landoop/kafka/share/java/landoop-common/jackson-datatype-jsr310-2.10.2.jar:/opt/landoop/kafka/share/java/landoop-common/jackson-module-parameter-names-2.10.2.jar:/opt/landoop/kafka/share/java/landoop-common/jakarta.annotation-api-1.3.5.jar:/opt/landoop/kafka/share/java/landoop-common/jakarta.el-3.0.2.jar:/opt/landoop/kafka/share/java/landoop-common/jakarta.el-api-3.0.3.jar:/opt/landoop/kafka/share/java/landoop-common/jakarta.inject-2.6.1.jar:/opt/landoop/kafka/share/java/landoop-common/jakarta.validation-api-2.0.2.jar:/opt/landoop/kafka/share/java/landoop-common/jakarta.ws.rs-api-2.1.6.jar:/opt/landoop/kafka/share/java/landoop-common/janino-2.7.6.jar:/opt/landoop/kafka/share/java/landoop-common/javassist-3.25.0-GA.jar:/opt/landoop/kafka/share/java/landoop-common/javax-websocket-client-impl-9.4.24.v20191120.jar:/opt/landoop/kafka/share/java/landoop-common/javax-websocket-server-impl-9.4.24.v20191120.jar:/opt/landoop/kafka/share/java/landoop-common/javax.annotation-api-1.3.jar:/opt/landoop/kafka/share/java/landoop-comm
on/javax.websocket-api-1.0.jar:/opt/landoop/kafka/share/java/landoop-common/javax.websocket-client-api-1.0.jar:/opt/landoop/kafka/share/java/landoop-common/jboss-logging-3.3.2.Final.jar:/opt/landoop/kafka/share/java/landoop-common/jersey-bean-validation-2.30.jar:/opt/landoop/kafka/share/java/landoop-common/jersey-client-2.30.jar:/opt/landoop/kafka/share/java/landoop-common/jersey-common-2.30.jar:/opt/landoop/kafka/share/java/landoop-common/jersey-container-servlet-2.30.jar:/opt/landoop/kafka/share/java/landoop-common/jersey-container-servlet-core-2.30.jar:/opt/landoop/kafka/share/java/landoop-common/jersey-hk2-2.30.jar:/opt/landoop/kafka/share/java/landoop-common/jersey-media-jaxb-2.30.jar:/opt/landoop/kafka/share/java/landoop-common/jersey-server-2.30.jar:/opt/landoop/kafka/share/java/landoop-common/jetty-annotations-9.4.24.v20191120.jar:/opt/landoop/kafka/share/java/landoop-common/jetty-jaas-9.4.24.v20191120.jar:/opt/landoop/kafka/share/java/landoop-common/jetty-jmx-9.4.24.v20191120.jar:/opt/landoop/kafka/share/java/landoop-common/jetty-jndi-9.4.24.v20191120.jar:/opt/landoop/kafka/share/java/landoop-common/jetty-plus-9.4.24.v20191120.jar:/opt/landoop/kafka/share/java/landoop-common/jetty-webapp-9.4.24.v20191120.jar:/opt/landoop/kafka/share/java/landoop-common/jetty-xml-9.4.24.v20191120.jar:/opt/landoop/kafka/share/java/landoop-common/joda-time-2.10.2.jar:/opt/landoop/kafka/share/java/landoop-common/json-20190722.jar:/opt/landoop/kafka/share/java/landoop-common/json-sql_2.12-2.0.0.jar:/opt/landoop/kafka/share/java/landoop-common/json4s-ast_2.12-3.6.7.jar:/opt/landoop/kafka/share/java/landoop-common/json4s-core_2.12-3.6.7.jar:/opt/landoop/kafka/share/java/landoop-common/json4s-jackson_2.12-3.6.7.jar:/opt/landoop/kafka/share/java/landoop-common/json4s-native_2.12-3.6.7.jar:/opt/landoop/kafka/share/java/landoop-common/json4s-scalap_2.12-3.6.7.jar:/opt/landoop/kafka/share/java/landoop-common/jsr305-3.0.1.jar:/opt/landoop/kafka/share/java/landoop-common/kafka-avro-seria
lizer-5.5.0.jar:/opt/landoop/kafka/share/java/landoop-common/kafka-clients-5.5.0-ccs.jar:/opt/landoop/kafka/share/java/landoop-common/kafka-connect-avro-converter-5.5.0.jar:/opt/landoop/kafka/share/java/landoop-common/kafka-connect-avro-data-5.5.0.jar:/opt/landoop/kafka/share/java/landoop-common/kafka-connect-smt_2.12-2.0.0.jar:/opt/landoop/kafka/share/java/landoop-common/kafka-json-schema-provider-5.5.0.jar:/opt/landoop/kafka/share/java/landoop-common/kafka-protobuf-provider-5.5.0.jar:/opt/landoop/kafka/share/java/landoop-common/kafka-schema-registry-5.5.0.jar:/opt/landoop/kafka/share/java/landoop-common/kafka-schema-registry-client-5.5.0.jar:/opt/landoop/kafka/share/java/landoop-common/kafka-schema-serializer-5.5.0.jar:/opt/landoop/kafka/share/java/landoop-common/kafka_2.12-5.5.0-ccs.jar:/opt/landoop/kafka/share/java/landoop-common/kcql-2.8.7.jar:/opt/landoop/kafka/share/java/landoop-common/kotlin-reflect-1.3.50.jar:/opt/landoop/kafka/share/java/landoop-common/kotlin-script-runtime-1.3.50.jar:/opt/landoop/kafka/share/java/landoop-common/kotlin-scripting-common-1.3.50.jar:/opt/landoop/kafka/share/java/landoop-common/kotlin-scripting-compiler-embeddable-1.3.50.jar:/opt/landoop/kafka/share/java/landoop-common/kotlin-scripting-compiler-impl-embeddable-1.3.50.jar:/opt/landoop/kafka/share/java/landoop-common/kotlin-scripting-jvm-1.3.50.jar:/opt/landoop/kafka/share/java/landoop-common/kotlin-stdlib-1.3.61.jar:/opt/landoop/kafka/share/java/landoop-common/kotlin-stdlib-common-1.3.61.jar:/opt/landoop/kafka/share/java/landoop-common/kotlin-stdlib-jdk7-1.3.61.jar:/opt/landoop/kafka/share/java/landoop-common/kotlin-stdlib-jdk8-1.3.61.jar:/opt/landoop/kafka/share/java/landoop-common/kotlinx-coroutines-core-1.1.1.jar:/opt/landoop/kafka/share/java/landoop-common/macro-compat_2.12-1.1.1.jar:/opt/landoop/kafka/share/java/landoop-common/mbknor-jackson-jsonschema_2.12-1.0.36.jar:/opt/landoop/kafka/share/java/landoop-common/netty-buffer-4.1.45.Final.jar:/opt/landoop/kafka/share/java/l
andoop-common/netty-codec-4.1.45.Final.jar:/opt/landoop/kafka/share/java/landoop-common/netty-common-4.1.45.Final.jar:/opt/landoop/kafka/share/java/landoop-common/netty-handler-4.1.45.Final.jar:/opt/landoop/kafka/share/java/landoop-common/netty-resolver-4.1.45.Final.jar:/opt/landoop/kafka/share/java/landoop-common/netty-transport-4.1.45.Final.jar:/opt/landoop/kafka/share/java/landoop-common/netty-transport-native-epoll-4.1.45.Final.jar:/opt/landoop/kafka/share/java/landoop-common/netty-transport-native-unix-common-4.1.45.Final.jar:/opt/landoop/kafka/share/java/landoop-common/okio-jvm-2.4.3.jar:/opt/landoop/kafka/share/java/landoop-common/org.everit.json.schema-1.12.1.jar:/opt/landoop/kafka/share/java/landoop-common/osgi-resource-locator-1.0.3.jar:/opt/landoop/kafka/share/java/landoop-common/protobuf-java-3.11.4.jar:/opt/landoop/kafka/share/java/landoop-common/protobuf-java-util-3.11.4.jar:/opt/landoop/kafka/share/java/landoop-common/re2j-1.3.jar:/opt/landoop/kafka/share/java/landoop-common/rest-utils-5.5.0.jar:/opt/landoop/kafka/share/java/landoop-common/shapeless_2.12-2.3.2.jar:/opt/landoop/kafka/share/java/landoop-common/slf4j-log4j12-1.7.26.jar:/opt/landoop/kafka/share/java/landoop-common/snakeyaml-1.12.jar:/opt/landoop/kafka/share/java/landoop-common/sql-core_2.12-2.0.jar:/opt/landoop/kafka/share/java/landoop-common/swagger-annotations-1.5.22.jar:/opt/landoop/kafka/share/java/landoop-common/swagger-core-1.5.3.jar:/opt/landoop/kafka/share/java/landoop-common/swagger-models-1.5.3.jar:/opt/landoop/kafka/share/java/landoop-common/websocket-api-9.4.24.v20191120.jar:/opt/landoop/kafka/share/java/landoop-common/websocket-client-9.4.24.v20191120.jar:/opt/landoop/kafka/share/java/landoop-common/websocket-common-9.4.24.v20191120.jar:/opt/landoop/kafka/share/java/landoop-common/websocket-server-9.4.24.v20191120.jar:/opt/landoop/kafka/share/java/landoop-common/websocket-servlet-9.4.24.v20191120.jar:/opt/landoop/kafka/share/java/landoop-common/wire-runtime-jvm-3.1.0.jar:/opt
/landoop/kafka/share/java/landoop-common/wire-schema-jvm-3.1.0.jar:/opt/landoop/kafka/share/java/landoop-common/zkclient-0.11.jar:/opt/landoop/kafka/share/java/landoop-common/zookeeper-3.5.7.jar:/opt/landoop/kafka/share/java/landoop-common/zookeeper-jute-3.5.7.jar\n\tos.spec = Linux, amd64, 3.10.0-693.17.1.el7.x86_64\n\tos.vcpus = 32\n (org.apache.kafka.connect.runtime.WorkerInfo:71)\n[2021-03-06 06:47:15,949] INFO Scanning for plugin classes. This might take a moment ... (org.apache.kafka.connect.cli.ConnectStandalone:78)\n[2021-03-06 06:47:15,974] INFO Loading plugin from: /opt/landoop/connectors/third-party/kafka-connect-couchbase (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:239)\n[2021-03-06 06:47:16,821] INFO Registered loader: PluginClassLoader{pluginLocation=file:/opt/landoop/connectors/third-party/kafka-connect-couchbase/} (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:262)\n[2021-03-06 06:47:16,822] INFO Added plugin 'com.couchbase.connect.kafka.CouchbaseSinkConnector' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:191)\n[2021-03-06 06:47:16,822] INFO Added plugin 'com.couchbase.connect.kafka.CouchbaseSourceConnector' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:191)\n[2021-03-06 06:47:16,823] INFO Added plugin 'org.apache.kafka.connect.connector.policy.AllConnectorClientConfigOverridePolicy' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:191)\n[2021-03-06 06:47:16,823] INFO Added plugin 'org.apache.kafka.connect.connector.policy.PrincipalConnectorClientConfigOverridePolicy' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:191)\n[2021-03-06 06:47:16,823] INFO Added plugin 'org.apache.kafka.connect.connector.policy.NoneConnectorClientConfigOverridePolicy' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:191)\n[2021-03-06 06:47:16,851] INFO Loading plugin from: /opt/landoop/connectors/third-party/kafka-connect-debezium-mongodb 
(org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:239)\n[2021-03-06 06:47:17,106] INFO Registered loader: PluginClassLoader{pluginLocation=file:/opt/landoop/connectors/third-party/kafka-connect-debezium-mongodb/} (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:262)\n[2021-03-06 06:47:17,106] INFO Added plugin 'io.debezium.connector.mongodb.MongoDbConnector' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:191)\n[2021-03-06 06:47:17,106] INFO Added plugin 'io.debezium.converters.ByteBufferConverter' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:191)\n[2021-03-06 06:47:17,106] INFO Added plugin 'io.debezium.transforms.ExtractNewRecordState' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:191)\n[2021-03-06 06:47:17,107] INFO Added plugin 'io.debezium.transforms.outbox.EventRouter' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:191)\n[2021-03-06 06:47:17,121] INFO Added plugin 'io.debezium.transforms.UnwrapFromEnvelope' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:191)\n[2021-03-06 06:47:17,122] INFO Added plugin 'io.debezium.connector.mongodb.transforms.ExtractNewDocumentState' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:191)\n[2021-03-06 06:47:17,122] INFO Added plugin 'io.debezium.connector.mongodb.transforms.UnwrapFromMongoDbEnvelope' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:191)\n[2021-03-06 06:47:17,122] INFO Added plugin 'io.debezium.transforms.ByLogicalTableRouter' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:191)\n[2021-03-06 06:47:17,135] INFO Loading plugin from: /opt/landoop/connectors/third-party/kafka-connect-debezium-mysql (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:239)\n[2021-03-06 06:47:17,594] INFO Registered loader: PluginClassLoader{pluginLocation=file:/opt/landoop/connectors/third-party/kafka-connect-debezium-mysql/} 
(org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:262)\n[2021-03-06 06:47:17,594] INFO Added plugin 'io.debezium.connector.mysql.MySqlConnector' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:191)\n[2021-03-06 06:47:17,602] INFO Loading plugin from: /opt/landoop/connectors/third-party/kafka-connect-debezium-postgres (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:239)\n[2021-03-06 06:47:17,814] INFO Registered loader: PluginClassLoader{pluginLocation=file:/opt/landoop/connectors/third-party/kafka-connect-debezium-postgres/} (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:262)\n[2021-03-06 06:47:17,814] INFO Added plugin 'io.debezium.connector.postgresql.PostgresConnector' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:191)\n[2021-03-06 06:47:17,822] INFO Loading plugin from: /opt/landoop/connectors/third-party/kafka-connect-debezium-sqlserver (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:239)\n[2021-03-06 06:47:18,017] INFO Registered loader: PluginClassLoader{pluginLocation=file:/opt/landoop/connectors/third-party/kafka-connect-debezium-sqlserver/} (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:262)\n[2021-03-06 06:47:18,018] INFO Added plugin 'io.debezium.connector.sqlserver.SqlServerConnector' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:191)\n[2021-03-06 06:47:18,045] INFO Loading plugin from: /opt/landoop/connectors/third-party/kafka-connect-elasticsearch (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:239)\n[2021-03-06 06:47:18,174] INFO Registered loader: PluginClassLoader{pluginLocation=file:/opt/landoop/connectors/third-party/kafka-connect-elasticsearch/} (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:262)\n[2021-03-06 06:47:18,174] INFO Added plugin 'io.confluent.connect.elasticsearch.ElasticsearchSinkConnector' 
(org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:191)\n[2021-03-06 06:47:18,176] INFO Loading plugin from: /opt/landoop/connectors/third-party/kafka-connect-hdfs (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:239)\n[2021-03-06 06:47:20,802] INFO Registered loader: PluginClassLoader{pluginLocation=file:/opt/landoop/connectors/third-party/kafka-connect-hdfs/} (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:262)\n[2021-03-06 06:47:20,803] INFO Added plugin 'io.confluent.connect.hdfs.HdfsSinkConnector' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:191)\n[2021-03-06 06:47:20,803] INFO Added plugin 'io.confluent.connect.storage.tools.SchemaSourceConnector' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:191)\n[2021-03-06 06:47:20,803] INFO Added plugin 'io.confluent.connect.hdfs.tools.SchemaSourceConnector' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:191)\n[2021-03-06 06:47:20,803] INFO Added plugin 'org.apache.kafka.common.config.provider.FileConfigProvider' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:191)\n[2021-03-06 06:47:20,890] INFO Loading plugin from: /opt/landoop/connectors/third-party/kafka-connect-jdbc (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:239)\n[2021-03-06 06:47:21,010] INFO Registered loader: PluginClassLoader{pluginLocation=file:/opt/landoop/connectors/third-party/kafka-connect-jdbc/} (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:262)\n[2021-03-06 06:47:21,010] INFO Added plugin 'io.confluent.connect.jdbc.JdbcSinkConnector' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:191)\n[2021-03-06 06:47:21,010] INFO Added plugin 'io.confluent.connect.jdbc.JdbcSourceConnector' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:191)\n[2021-03-06 06:47:21,023] INFO Loading plugin from: /opt/landoop/connectors/third-party/kafka-connect-s3 
(org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:239)\n[2021-03-06 06:47:21,889] INFO Registered loader: PluginClassLoader{pluginLocation=file:/opt/landoop/connectors/third-party/kafka-connect-s3/} (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:262)\n[2021-03-06 06:47:21,889] INFO Added plugin 'io.confluent.connect.s3.S3SinkConnector' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:191)\n[2021-03-06 06:47:21,891] INFO Loading plugin from: /opt/landoop/connectors/third-party/kafka-connect-splunk (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:239)\n[2021-03-06 06:47:22,093] INFO Registered loader: PluginClassLoader{pluginLocation=file:/opt/landoop/connectors/third-party/kafka-connect-splunk/} (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:262)\n[2021-03-06 06:47:22,093] INFO Added plugin 'com.splunk.kafka.connect.SplunkSinkConnector' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:191)\n[2021-03-06 06:47:22,093] INFO Added plugin 'org.apache.kafka.connect.storage.StringConverter' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:191)\n[2021-03-06 06:47:22,093] INFO Added plugin 'org.apache.kafka.connect.storage.SimpleHeaderConverter' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:191)\n[2021-03-06 06:47:22,095] INFO Loading plugin from: /opt/landoop/connectors/third-party/kafka-connect-twitter (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:239)\n[2021-03-06 06:47:23,175] INFO Registered loader: PluginClassLoader{pluginLocation=file:/opt/landoop/connectors/third-party/kafka-connect-twitter/} (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:262)\n[2021-03-06 06:47:23,175] INFO Added plugin 'com.eneco.trading.kafka.connect.twitter.TwitterSourceConnector' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:191)\n[2021-03-06 06:47:23,176] INFO Added plugin 
'com.eneco.trading.kafka.connect.twitter.TwitterSinkConnector' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:191)\n[2021-03-06 06:47:25,456] INFO Registered loader: sun.misc.Launcher$AppClassLoader@764c12b6 (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:262)\n[2021-03-06 06:47:25,456] INFO Added plugin 'org.apache.kafka.connect.mirror.MirrorSourceConnector' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:191)\n[2021-03-06 06:47:25,456] INFO Added plugin 'org.apache.kafka.connect.file.FileStreamSinkConnector' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:191)\n[2021-03-06 06:47:25,456] INFO Added plugin 'org.apache.kafka.connect.file.FileStreamSourceConnector' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:191)\n[2021-03-06 06:47:25,457] INFO Added plugin 'org.apache.kafka.connect.tools.MockSourceConnector' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:191)\n[2021-03-06 06:47:25,457] INFO Added plugin 'org.apache.kafka.connect.tools.SchemaSourceConnector' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:191)\n[2021-03-06 06:47:25,457] INFO Added plugin 'org.apache.kafka.connect.mirror.MirrorCheckpointConnector' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:191)\n[2021-03-06 06:47:25,457] INFO Added plugin 'org.apache.kafka.connect.tools.VerifiableSinkConnector' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:191)\n[2021-03-06 06:47:25,457] INFO Added plugin 'org.apache.kafka.connect.tools.VerifiableSourceConnector' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:191)\n[2021-03-06 06:47:25,458] INFO Added plugin 'org.apache.kafka.connect.tools.MockSinkConnector' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:191)\n[2021-03-06 06:47:25,458] INFO Added plugin 'org.apache.kafka.connect.mirror.MirrorHeartbeatConnector' 
(org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:191)\n[2021-03-06 06:47:25,458] INFO Added plugin 'org.apache.kafka.connect.tools.MockConnector' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:191)\n[2021-03-06 06:47:25,458] INFO Added plugin 'org.apache.kafka.connect.converters.DoubleConverter' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:191)\n[2021-03-06 06:47:25,458] INFO Added plugin 'org.apache.kafka.connect.converters.FloatConverter' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:191)\n[2021-03-06 06:47:25,459] INFO Added plugin 'io.confluent.connect.avro.AvroConverter' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:191)\n[2021-03-06 06:47:25,459] INFO Added plugin 'org.apache.kafka.connect.converters.ByteArrayConverter' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:191)\n[2021-03-06 06:47:25,459] INFO Added plugin 'org.apache.kafka.connect.converters.IntegerConverter' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:191)\n[2021-03-06 06:47:25,459] INFO Added plugin 'org.apache.kafka.connect.converters.LongConverter' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:191)\n[2021-03-06 06:47:25,459] INFO Added plugin 'io.confluent.connect.protobuf.ProtobufConverter' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:191)\n[2021-03-06 06:47:25,459] INFO Added plugin 'io.confluent.connect.json.JsonSchemaConverter' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:191)\n[2021-03-06 06:47:25,459] INFO Added plugin 'org.apache.kafka.connect.json.JsonConverter' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:191)\n[2021-03-06 06:47:25,459] INFO Added plugin 'org.apache.kafka.connect.converters.ShortConverter' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:191)\n[2021-03-06 06:47:25,460] INFO Added plugin 
'org.apache.kafka.connect.transforms.ReplaceField$Value' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:191)\n[2021-03-06 06:47:25,460] INFO Added plugin 'org.apache.kafka.connect.transforms.ReplaceField$Key' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:191)\n[2021-03-06 06:47:25,460] INFO Added plugin 'org.apache.kafka.connect.transforms.SetSchemaMetadata$Value' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:191)\n[2021-03-06 06:47:25,460] INFO Added plugin 'org.apache.kafka.connect.transforms.InsertField$Value' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:191)\n[2021-03-06 06:47:25,460] INFO Added plugin 'com.landoop.connect.sql.Transformation' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:191)\n[2021-03-06 06:47:25,460] INFO Added plugin 'org.apache.kafka.connect.transforms.TimestampConverter$Key' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:191)\n[2021-03-06 06:47:25,460] INFO Added plugin 'org.apache.kafka.connect.transforms.MaskField$Value' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:191)\n[2021-03-06 06:47:25,460] INFO Added plugin 'org.apache.kafka.connect.transforms.TimestampRouter' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:191)\n[2021-03-06 06:47:25,460] INFO Added plugin 'org.apache.kafka.connect.transforms.RegexRouter' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:191)\n[2021-03-06 06:47:25,461] INFO Added plugin 'org.apache.kafka.connect.transforms.HoistField$Value' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:191)\n[2021-03-06 06:47:25,461] INFO Added plugin 'org.apache.kafka.connect.transforms.ValueToKey' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:191)\n[2021-03-06 06:47:25,461] INFO Added plugin 'org.apache.kafka.connect.transforms.MaskField$Key' 
(org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:191)\n[2021-03-06 06:47:25,461] INFO Added plugin 'org.apache.kafka.connect.transforms.Cast$Key' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:191)\n[2021-03-06 06:47:25,461] INFO Added plugin 'org.apache.kafka.connect.transforms.Cast$Value' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:191)\n[2021-03-06 06:47:25,461] INFO Added plugin 'org.apache.kafka.connect.transforms.ExtractField$Key' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:191)\n[2021-03-06 06:47:25,461] INFO Added plugin 'org.apache.kafka.connect.transforms.Flatten$Value' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:191)\n[2021-03-06 06:47:25,461] INFO Added plugin 'org.apache.kafka.connect.transforms.InsertField$Key' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:191)\n[2021-03-06 06:47:25,462] INFO Added plugin 'org.apache.kafka.connect.transforms.Flatten$Key' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:191)\n[2021-03-06 06:47:25,462] INFO Added plugin 'org.apache.kafka.connect.transforms.SetSchemaMetadata$Key' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:191)\n[2021-03-06 06:47:25,462] INFO Added plugin 'org.apache.kafka.connect.transforms.ExtractField$Value' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:191)\n[2021-03-06 06:47:25,462] INFO Added plugin 'org.apache.kafka.connect.transforms.TimestampConverter$Value' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:191)\n[2021-03-06 06:47:25,462] INFO Added plugin 'org.apache.kafka.connect.transforms.HoistField$Key' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:191)\n[2021-03-06 06:47:25,462] INFO Added plugin 'org.apache.kafka.connect.rest.basic.auth.extension.BasicAuthSecurityRestExtension' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:191)\n[2021-03-06 06:47:25,464] 
INFO Added aliases 'CouchbaseSinkConnector' and 'CouchbaseSink' to plugin 'com.couchbase.connect.kafka.CouchbaseSinkConnector' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:419)\n[2021-03-06 06:47:25,465] INFO Added aliases 'CouchbaseSourceConnector' and 'CouchbaseSource' to plugin 'com.couchbase.connect.kafka.CouchbaseSourceConnector' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:419)\n[2021-03-06 06:47:25,465] INFO Added aliases 'TwitterSinkConnector' and 'TwitterSink' to plugin 'com.eneco.trading.kafka.connect.twitter.TwitterSinkConnector' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:419)\n[2021-03-06 06:47:25,465] INFO Added aliases 'TwitterSourceConnector' and 'TwitterSource' to plugin 'com.eneco.trading.kafka.connect.twitter.TwitterSourceConnector' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:419)\n[2021-03-06 06:47:25,466] INFO Added aliases 'SplunkSinkConnector' and 'SplunkSink' to plugin 'com.splunk.kafka.connect.SplunkSinkConnector' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:419)\n[2021-03-06 06:47:25,466] INFO Added aliases 'ElasticsearchSinkConnector' and 'ElasticsearchSink' to plugin 'io.confluent.connect.elasticsearch.ElasticsearchSinkConnector' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:419)\n[2021-03-06 06:47:25,466] INFO Added aliases 'HdfsSinkConnector' and 'HdfsSink' to plugin 'io.confluent.connect.hdfs.HdfsSinkConnector' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:419)\n[2021-03-06 06:47:25,466] INFO Added aliases 'JdbcSinkConnector' and 'JdbcSink' to plugin 'io.confluent.connect.jdbc.JdbcSinkConnector' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:419)\n[2021-03-06 06:47:25,466] INFO Added aliases 'JdbcSourceConnector' and 'JdbcSource' to plugin 'io.confluent.connect.jdbc.JdbcSourceConnector' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:419)\n[2021-03-06 
06:47:25,467] INFO Added aliases 'S3SinkConnector' and 'S3Sink' to plugin 'io.confluent.connect.s3.S3SinkConnector' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:419)\n[2021-03-06 06:47:25,467] INFO Added aliases 'MongoDbConnector' and 'MongoDb' to plugin 'io.debezium.connector.mongodb.MongoDbConnector' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:419)\n[2021-03-06 06:47:25,467] INFO Added aliases 'MySqlConnector' and 'MySql' to plugin 'io.debezium.connector.mysql.MySqlConnector' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:419)\n[2021-03-06 06:47:25,467] INFO Added aliases 'PostgresConnector' and 'Postgres' to plugin 'io.debezium.connector.postgresql.PostgresConnector' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:419)\n[2021-03-06 06:47:25,467] INFO Added aliases 'SqlServerConnector' and 'SqlServer' to plugin 'io.debezium.connector.sqlserver.SqlServerConnector' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:419)\n[2021-03-06 06:47:25,468] INFO Added aliases 'FileStreamSinkConnector' and 'FileStreamSink' to plugin 'org.apache.kafka.connect.file.FileStreamSinkConnector' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:419)\n[2021-03-06 06:47:25,468] INFO Added aliases 'FileStreamSourceConnector' and 'FileStreamSource' to plugin 'org.apache.kafka.connect.file.FileStreamSourceConnector' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:419)\n[2021-03-06 06:47:25,468] INFO Added aliases 'MirrorCheckpointConnector' and 'MirrorCheckpoint' to plugin 'org.apache.kafka.connect.mirror.MirrorCheckpointConnector' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:419)\n[2021-03-06 06:47:25,468] INFO Added aliases 'MirrorHeartbeatConnector' and 'MirrorHeartbeat' to plugin 'org.apache.kafka.connect.mirror.MirrorHeartbeatConnector' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:419)\n[2021-03-06 06:47:25,468] INFO 
Added aliases 'MirrorSourceConnector' and 'MirrorSource' to plugin 'org.apache.kafka.connect.mirror.MirrorSourceConnector' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:419)\n[2021-03-06 06:47:25,468] INFO Added aliases 'MockConnector' and 'Mock' to plugin 'org.apache.kafka.connect.tools.MockConnector' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:419)\n[2021-03-06 06:47:25,469] INFO Added aliases 'MockSinkConnector' and 'MockSink' to plugin 'org.apache.kafka.connect.tools.MockSinkConnector' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:419)\n[2021-03-06 06:47:25,469] INFO Added aliases 'MockSourceConnector' and 'MockSource' to plugin 'org.apache.kafka.connect.tools.MockSourceConnector' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:419)\n[2021-03-06 06:47:25,470] INFO Added aliases 'VerifiableSinkConnector' and 'VerifiableSink' to plugin 'org.apache.kafka.connect.tools.VerifiableSinkConnector' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:419)\n[2021-03-06 06:47:25,470] INFO Added aliases 'VerifiableSourceConnector' and 'VerifiableSource' to plugin 'org.apache.kafka.connect.tools.VerifiableSourceConnector' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:419)\n[2021-03-06 06:47:25,470] INFO Added aliases 'AvroConverter' and 'Avro' to plugin 'io.confluent.connect.avro.AvroConverter' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:419)\n[2021-03-06 06:47:25,470] INFO Added aliases 'JsonSchemaConverter' and 'JsonSchema' to plugin 'io.confluent.connect.json.JsonSchemaConverter' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:419)\n[2021-03-06 06:47:25,470] INFO Added aliases 'ProtobufConverter' and 'Protobuf' to plugin 'io.confluent.connect.protobuf.ProtobufConverter' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:419)\n[2021-03-06 06:47:25,471] INFO Added aliases 'ByteBufferConverter' and 
'ByteBuffer' to plugin 'io.debezium.converters.ByteBufferConverter' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:419)\n[2021-03-06 06:47:25,471] INFO Added aliases 'ByteArrayConverter' and 'ByteArray' to plugin 'org.apache.kafka.connect.converters.ByteArrayConverter' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:419)\n[2021-03-06 06:47:25,471] INFO Added aliases 'DoubleConverter' and 'Double' to plugin 'org.apache.kafka.connect.converters.DoubleConverter' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:419)\n[2021-03-06 06:47:25,472] INFO Added aliases 'FloatConverter' and 'Float' to plugin 'org.apache.kafka.connect.converters.FloatConverter' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:419)\n[2021-03-06 06:47:25,472] INFO Added aliases 'IntegerConverter' and 'Integer' to plugin 'org.apache.kafka.connect.converters.IntegerConverter' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:419)\n[2021-03-06 06:47:25,472] INFO Added aliases 'LongConverter' and 'Long' to plugin 'org.apache.kafka.connect.converters.LongConverter' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:419)\n[2021-03-06 06:47:25,472] INFO Added aliases 'ShortConverter' and 'Short' to plugin 'org.apache.kafka.connect.converters.ShortConverter' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:419)\n[2021-03-06 06:47:25,472] INFO Added aliases 'JsonConverter' and 'Json' to plugin 'org.apache.kafka.connect.json.JsonConverter' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:419)\n[2021-03-06 06:47:25,472] INFO Added aliases 'StringConverter' and 'String' to plugin 'org.apache.kafka.connect.storage.StringConverter' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:419)\n[2021-03-06 06:47:25,472] INFO Added aliases 'ByteBufferConverter' and 'ByteBuffer' to plugin 'io.debezium.converters.ByteBufferConverter' 
(org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:419)\n[2021-03-06 06:47:25,473] INFO Added aliases 'ByteArrayConverter' and 'ByteArray' to plugin 'org.apache.kafka.connect.converters.ByteArrayConverter' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:419)\n[2021-03-06 06:47:25,473] INFO Added aliases 'DoubleConverter' and 'Double' to plugin 'org.apache.kafka.connect.converters.DoubleConverter' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:419)\n[2021-03-06 06:47:25,473] INFO Added aliases 'FloatConverter' and 'Float' to plugin 'org.apache.kafka.connect.converters.FloatConverter' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:419)\n[2021-03-06 06:47:25,473] INFO Added aliases 'IntegerConverter' and 'Integer' to plugin 'org.apache.kafka.connect.converters.IntegerConverter' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:419)\n[2021-03-06 06:47:25,473] INFO Added aliases 'LongConverter' and 'Long' to plugin 'org.apache.kafka.connect.converters.LongConverter' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:419)\n[2021-03-06 06:47:25,473] INFO Added aliases 'ShortConverter' and 'Short' to plugin 'org.apache.kafka.connect.converters.ShortConverter' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:419)\n[2021-03-06 06:47:25,473] INFO Added aliases 'JsonConverter' and 'Json' to plugin 'org.apache.kafka.connect.json.JsonConverter' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:419)\n[2021-03-06 06:47:25,473] INFO Added alias 'SimpleHeaderConverter' to plugin 'org.apache.kafka.connect.storage.SimpleHeaderConverter' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:416)\n[2021-03-06 06:47:25,474] INFO Added aliases 'StringConverter' and 'String' to plugin 'org.apache.kafka.connect.storage.StringConverter' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:419)\n[2021-03-06 06:47:25,474] INFO Added 
alias 'Transformation' to plugin 'com.landoop.connect.sql.Transformation' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:416)\n[2021-03-06 06:47:25,474] INFO Added alias 'ExtractNewDocumentState' to plugin 'io.debezium.connector.mongodb.transforms.ExtractNewDocumentState' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:416)\n[2021-03-06 06:47:25,474] INFO Added alias 'UnwrapFromMongoDbEnvelope' to plugin 'io.debezium.connector.mongodb.transforms.UnwrapFromMongoDbEnvelope' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:416)\n[2021-03-06 06:47:25,474] INFO Added alias 'ByLogicalTableRouter' to plugin 'io.debezium.transforms.ByLogicalTableRouter' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:416)\n[2021-03-06 06:47:25,475] INFO Added alias 'ExtractNewRecordState' to plugin 'io.debezium.transforms.ExtractNewRecordState' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:416)\n[2021-03-06 06:47:25,475] INFO Added alias 'UnwrapFromEnvelope' to plugin 'io.debezium.transforms.UnwrapFromEnvelope' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:416)\n[2021-03-06 06:47:25,475] INFO Added alias 'EventRouter' to plugin 'io.debezium.transforms.outbox.EventRouter' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:416)\n[2021-03-06 06:47:25,476] INFO Added alias 'RegexRouter' to plugin 'org.apache.kafka.connect.transforms.RegexRouter' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:416)\n[2021-03-06 06:47:25,476] INFO Added alias 'TimestampRouter' to plugin 'org.apache.kafka.connect.transforms.TimestampRouter' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:416)\n[2021-03-06 06:47:25,476] INFO Added alias 'ValueToKey' to plugin 'org.apache.kafka.connect.transforms.ValueToKey' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:416)\n[2021-03-06 06:47:25,476] INFO Added alias 'BasicAuthSecurityRestExtension' 
to plugin 'org.apache.kafka.connect.rest.basic.auth.extension.BasicAuthSecurityRestExtension' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:416)\n[2021-03-06 06:47:25,476] INFO Added aliases 'AllConnectorClientConfigOverridePolicy' and 'All' to plugin 'org.apache.kafka.connect.connector.policy.AllConnectorClientConfigOverridePolicy' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:419)\n[2021-03-06 06:47:25,477] INFO Added aliases 'NoneConnectorClientConfigOverridePolicy' and 'None' to plugin 'org.apache.kafka.connect.connector.policy.NoneConnectorClientConfigOverridePolicy' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:419)\n[2021-03-06 06:47:25,477] INFO Added aliases 'PrincipalConnectorClientConfigOverridePolicy' and 'Principal' to plugin 'org.apache.kafka.connect.connector.policy.PrincipalConnectorClientConfigOverridePolicy' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:419)\n[2021-03-06 06:47:25,499] INFO StandaloneConfig values: \n\taccess.control.allow.methods = \n\taccess.control.allow.origin = \n\tadmin.listeners = null\n\tbootstrap.servers = [127.0.0.1:9092]\n\tclient.dns.lookup = default\n\tconfig.providers = []\n\tconnector.client.config.override.policy = None\n\theader.converter = class org.apache.kafka.connect.storage.SimpleHeaderConverter\n\tinternal.key.converter = class org.apache.kafka.connect.json.JsonConverter\n\tinternal.value.converter = class org.apache.kafka.connect.json.JsonConverter\n\tkey.converter = class io.confluent.connect.avro.AvroConverter\n\tlisteners = null\n\tmetric.reporters = []\n\tmetrics.num.samples = 2\n\tmetrics.recording.level = INFO\n\tmetrics.sample.window.ms = 30000\n\toffset.flush.interval.ms = 5000\n\toffset.flush.timeout.ms = 5000\n\toffset.storage.file.filename = coyote_connect.offset\n\tplugin.path = [/opt/landoop/connectors/third-party]\n\trest.advertised.host.name = null\n\trest.advertised.listener = null\n\trest.advertised.port = 
null\n\trest.extension.classes = []\n\trest.host.name = null\n\trest.port = 38783\n\tssl.cipher.suites = null\n\tssl.client.auth = none\n\tssl.enabled.protocols = [TLSv1.2]\n\tssl.endpoint.identification.algorithm = https\n\tssl.key.password = null\n\tssl.keymanager.algorithm = SunX509\n\tssl.keystore.location = null\n\tssl.keystore.password = null\n\tssl.keystore.type = JKS\n\tssl.protocol = TLSv1.2\n\tssl.provider = null\n\tssl.secure.random.implementation = null\n\tssl.trustmanager.algorithm = PKIX\n\tssl.truststore.location = null\n\tssl.truststore.password = null\n\tssl.truststore.type = JKS\n\ttask.shutdown.graceful.timeout.ms = 5000\n\ttopic.tracking.allow.reset = true\n\ttopic.tracking.enable = true\n\tvalue.converter = class io.confluent.connect.avro.AvroConverter\n (org.apache.kafka.connect.runtime.standalone.StandaloneConfig:347)\n[2021-03-06 06:47:25,499] INFO Worker configuration property 'internal.key.converter' is deprecated and may be removed in an upcoming release. The specified value 'org.apache.kafka.connect.json.JsonConverter' matches the default, so this property can be safely removed from the worker configuration. (org.apache.kafka.connect.runtime.WorkerConfig:363)\n[2021-03-06 06:47:25,500] INFO Worker configuration property 'internal.key.converter.schemas.enable' (along with all configuration for 'internal.key.converter') is deprecated and may be removed in an upcoming release. The specified value 'false' matches the default, so this property can be safely removed from the worker configuration. (org.apache.kafka.connect.runtime.WorkerConfig:363)\n[2021-03-06 06:47:25,500] INFO Worker configuration property 'internal.value.converter' is deprecated and may be removed in an upcoming release. The specified value 'org.apache.kafka.connect.json.JsonConverter' matches the default, so this property can be safely removed from the worker configuration. 
(org.apache.kafka.connect.runtime.WorkerConfig:363)\n[2021-03-06 06:47:25,500] INFO Worker configuration property 'internal.value.converter.schemas.enable' (along with all configuration for 'internal.value.converter') is deprecated and may be removed in an upcoming release. The specified value 'false' matches the default, so this property can be safely removed from the worker configuration. (org.apache.kafka.connect.runtime.WorkerConfig:363)\n[2021-03-06 06:47:25,501] INFO Creating Kafka admin client (org.apache.kafka.connect.util.ConnectUtils:43)\n[2021-03-06 06:47:25,504] INFO AdminClientConfig values: \n\tbootstrap.servers = [127.0.0.1:9092]\n\tclient.dns.lookup = default\n\tclient.id = \n\tconnections.max.idle.ms = 300000\n\tdefault.api.timeout.ms = 60000\n\tmetadata.max.age.ms = 300000\n\tmetric.reporters = []\n\tmetrics.num.samples = 2\n\tmetrics.recording.level = INFO\n\tmetrics.sample.window.ms = 30000\n\treceive.buffer.bytes = 65536\n\treconnect.backoff.max.ms = 1000\n\treconnect.backoff.ms = 50\n\trequest.timeout.ms = 30000\n\tretries = 2147483647\n\tretry.backoff.ms = 100\n\tsasl.client.callback.handler.class = null\n\tsasl.jaas.config = null\n\tsasl.kerberos.kinit.cmd = /usr/bin/kinit\n\tsasl.kerberos.min.time.before.relogin = 60000\n\tsasl.kerberos.service.name = null\n\tsasl.kerberos.ticket.renew.jitter = 0.05\n\tsasl.kerberos.ticket.renew.window.factor = 0.8\n\tsasl.login.callback.handler.class = null\n\tsasl.login.class = null\n\tsasl.login.refresh.buffer.seconds = 300\n\tsasl.login.refresh.min.period.seconds = 60\n\tsasl.login.refresh.window.factor = 0.8\n\tsasl.login.refresh.window.jitter = 0.05\n\tsasl.mechanism = GSSAPI\n\tsecurity.protocol = PLAINTEXT\n\tsecurity.providers = null\n\tsend.buffer.bytes = 131072\n\tssl.cipher.suites = null\n\tssl.enabled.protocols = [TLSv1.2]\n\tssl.endpoint.identification.algorithm = https\n\tssl.key.password = null\n\tssl.keymanager.algorithm = SunX509\n\tssl.keystore.location = null\n\tssl.keystore.password = 
null\n\tssl.keystore.type = JKS\n\tssl.protocol = TLSv1.2\n\tssl.provider = null\n\tssl.secure.random.implementation = null\n\tssl.trustmanager.algorithm = PKIX\n\tssl.truststore.location = null\n\tssl.truststore.password = null\n\tssl.truststore.type = JKS\n (org.apache.kafka.clients.admin.AdminClientConfig:347)\n[2021-03-06 06:47:25,571] WARN The configuration 'plugin.path' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig:355)\n[2021-03-06 06:47:25,571] WARN The configuration 'internal.key.converter.schemas.enable' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig:355)\n[2021-03-06 06:47:25,571] WARN The configuration 'rest.port' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig:355)\n[2021-03-06 06:47:25,571] WARN The configuration 'offset.flush.interval.ms' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig:355)\n[2021-03-06 06:47:25,571] WARN The configuration 'value.converter.schema.registry.url' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig:355)\n[2021-03-06 06:47:25,572] WARN The configuration 'internal.key.converter' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig:355)\n[2021-03-06 06:47:25,572] WARN The configuration 'offset.storage.file.filename' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig:355)\n[2021-03-06 06:47:25,572] WARN The configuration 'port' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig:355)\n[2021-03-06 06:47:25,572] WARN The configuration 'internal.value.converter.schemas.enable' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig:355)\n[2021-03-06 06:47:25,572] WARN The configuration 'internal.value.converter' was supplied but isn't a known config. 
(org.apache.kafka.clients.admin.AdminClientConfig:355)\n[2021-03-06 06:47:25,572] WARN The configuration 'value.converter' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig:355)\n[2021-03-06 06:47:25,572] WARN The configuration 'key.converter' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig:355)\n[2021-03-06 06:47:25,572] WARN The configuration 'key.converter.schema.registry.url' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig:355)\n[2021-03-06 06:47:25,573] INFO Kafka version: 2.5.1-L0 (org.apache.kafka.common.utils.AppInfoParser:117)\n[2021-03-06 06:47:25,573] INFO Kafka commitId: 0efa8fb0f4c73d92 (org.apache.kafka.common.utils.AppInfoParser:118)\n[2021-03-06 06:47:25,573] INFO Kafka startTimeMs: 1615013245572 (org.apache.kafka.common.utils.AppInfoParser:119)\n[2021-03-06 06:47:26,186] INFO Kafka cluster ID: HfUmm1LuTOSlFvzO39QCXQ (org.apache.kafka.connect.util.ConnectUtils:59)\n[2021-03-06 06:47:26,220] INFO Logging initialized @11055ms to org.eclipse.jetty.util.log.Slf4jLog (org.eclipse.jetty.util.log:169)\n[2021-03-06 06:47:26,304] INFO Added connector for http://:38783 (org.apache.kafka.connect.runtime.rest.RestServer:131)\n[2021-03-06 06:47:26,305] INFO Initializing REST server (org.apache.kafka.connect.runtime.rest.RestServer:203)\n[2021-03-06 06:47:26,314] INFO jetty-9.4.24.v20191120; built: 2019-11-20T21:37:49.771Z; git: 363d5f2df3a8a28de40604320230664b9c793c16; jvm 1.8.0_275-b01 (org.eclipse.jetty.server.Server:359)\n[2021-03-06 06:47:26,353] INFO Started http_38783@2b4954a4{HTTP/1.1,[http/1.1]}{0.0.0.0:38783} (org.eclipse.jetty.server.AbstractConnector:330)\n[2021-03-06 06:47:26,353] INFO Started @11189ms (org.eclipse.jetty.server.Server:399)\n[2021-03-06 06:47:26,379] INFO Advertised URI: http://172.17.0.2:38783/ (org.apache.kafka.connect.runtime.rest.RestServer:365)\n[2021-03-06 06:47:26,379] INFO REST server listening at 
http://172.17.0.2:38783/, advertising URL http://172.17.0.2:38783/ (org.apache.kafka.connect.runtime.rest.RestServer:218)\n[2021-03-06 06:47:26,379] INFO Advertised URI: http://172.17.0.2:38783/ (org.apache.kafka.connect.runtime.rest.RestServer:365)\n[2021-03-06 06:47:26,379] INFO REST admin endpoints at http://172.17.0.2:38783/ (org.apache.kafka.connect.runtime.rest.RestServer:219)\n[2021-03-06 06:47:26,380] INFO Advertised URI: http://172.17.0.2:38783/ (org.apache.kafka.connect.runtime.rest.RestServer:365)\n[2021-03-06 06:47:26,380] INFO Setting up None Policy for ConnectorClientConfigOverride. This will disallow any client configuration to be overridden (org.apache.kafka.connect.connector.policy.NoneConnectorClientConfigOverridePolicy:45)\n[2021-03-06 06:47:26,394] INFO Kafka version: 2.5.1-L0 (org.apache.kafka.common.utils.AppInfoParser:117)\n[2021-03-06 06:47:26,394] INFO Kafka commitId: 0efa8fb0f4c73d92 (org.apache.kafka.common.utils.AppInfoParser:118)\n[2021-03-06 06:47:26,394] INFO Kafka startTimeMs: 1615013246394 (org.apache.kafka.common.utils.AppInfoParser:119)\n[2021-03-06 06:47:26,568] INFO JsonConverterConfig values: \n\tconverter.type = key\n\tdecimal.format = BASE64\n\tschemas.cache.size = 1000\n\tschemas.enable = false\n (org.apache.kafka.connect.json.JsonConverterConfig:347)\n[2021-03-06 06:47:26,569] INFO JsonConverterConfig values: \n\tconverter.type = value\n\tdecimal.format = BASE64\n\tschemas.cache.size = 1000\n\tschemas.enable = false\n (org.apache.kafka.connect.json.JsonConverterConfig:347)\n[2021-03-06 06:47:26,578] INFO Kafka Connect standalone worker initialization took 10639ms (org.apache.kafka.connect.cli.ConnectStandalone:100)\n[2021-03-06 06:47:26,579] INFO Kafka Connect starting (org.apache.kafka.connect.runtime.Connect:51)\n[2021-03-06 06:47:26,580] INFO Herder starting (org.apache.kafka.connect.runtime.standalone.StandaloneHerder:93)\n[2021-03-06 06:47:26,580] INFO Worker starting 
(org.apache.kafka.connect.runtime.Worker:184)\n[2021-03-06 06:47:26,580] INFO Starting FileOffsetBackingStore with file coyote_connect.offset (org.apache.kafka.connect.storage.FileOffsetBackingStore:58)\n[2021-03-06 06:47:26,584] INFO Worker started (org.apache.kafka.connect.runtime.Worker:191)\n[2021-03-06 06:47:26,584] INFO Herder started (org.apache.kafka.connect.runtime.standalone.StandaloneHerder:95)\n[2021-03-06 06:47:26,584] INFO Initializing REST resources (org.apache.kafka.connect.runtime.rest.RestServer:223)\n[2021-03-06 06:47:26,628] INFO Adding admin resources to main listener (org.apache.kafka.connect.runtime.rest.RestServer:240)\n[2021-03-06 06:47:26,714] INFO DefaultSessionIdManager workerName=node0 (org.eclipse.jetty.server.session:333)\n[2021-03-06 06:47:26,714] INFO No SessionScavenger set, using defaults (org.eclipse.jetty.server.session:338)\n[2021-03-06 06:47:26,716] INFO node0 Scavenging every 600000ms (org.eclipse.jetty.server.session:140)\n[2021-03-06 06:47:27,453] INFO HV000001: Hibernate Validator 6.0.17.Final (org.hibernate.validator.internal.util.Version:21)\n[2021-03-06 06:47:27,811] INFO Started o.e.j.s.ServletContextHandler@10e4cc6{/,null,AVAILABLE} (org.eclipse.jetty.server.handler.ContextHandler:825)\n[2021-03-06 06:47:27,811] INFO REST resources initialized; server is started and ready to handle requests (org.apache.kafka.connect.runtime.rest.RestServer:313)\n[2021-03-06 06:47:27,811] INFO Kafka Connect started (org.apache.kafka.connect.runtime.Connect:57)\n[2021-03-06 06:47:27,836] INFO AbstractConfig values: \n\tbatch.max.rows = 100\n\tcatalog.pattern = null\n\tconnection.attempts = 3\n\tconnection.backoff.ms = 10000\n\tconnection.password = null\n\tconnection.url = jdbc:sqlite:coyote_test.sqlite\n\tconnection.user = null\n\tdb.timezone = UTC\n\tdialect.name = \n\tincrementing.column.name = id\n\tmode = incrementing\n\tnumeric.mapping = null\n\tnumeric.precision.mapping = false\n\tpoll.interval.ms = 5000\n\tquery = 
\n\tquery.suffix = \n\tquote.sql.identifiers = ALWAYS\n\tschema.pattern = null\n\ttable.blacklist = []\n\ttable.poll.interval.ms = 60000\n\ttable.types = [TABLE]\n\ttable.whitelist = []\n\ttimestamp.column.name = []\n\ttimestamp.delay.interval.ms = 0\n\ttimestamp.initial = null\n\ttopic.prefix = coyote-ca-\n\tvalidate.non.null = true\n (org.apache.kafka.common.config.AbstractConfig:347)\n[2021-03-06 06:47:27,934] INFO AbstractConfig values: \n (org.apache.kafka.common.config.AbstractConfig:347)\n[2021-03-06 06:47:27,942] INFO ConnectorConfig values: \n\tconfig.action.reload = restart\n\tconnector.class = io.confluent.connect.jdbc.JdbcSourceConnector\n\terrors.log.enable = false\n\terrors.log.include.messages = false\n\terrors.retry.delay.max.ms = 60000\n\terrors.retry.timeout = 0\n\terrors.tolerance = none\n\theader.converter = null\n\tkey.converter = null\n\tname = coyote-ca-1615013233026\n\ttasks.max = 1\n\ttransforms = []\n\tvalue.converter = null\n (org.apache.kafka.connect.runtime.ConnectorConfig:347)\n[2021-03-06 06:47:27,943] INFO EnrichedConnectorConfig values: \n\tconfig.action.reload = restart\n\tconnector.class = io.confluent.connect.jdbc.JdbcSourceConnector\n\terrors.log.enable = false\n\terrors.log.include.messages = false\n\terrors.retry.delay.max.ms = 60000\n\terrors.retry.timeout = 0\n\terrors.tolerance = none\n\theader.converter = null\n\tkey.converter = null\n\tname = coyote-ca-1615013233026\n\ttasks.max = 1\n\ttransforms = []\n\tvalue.converter = null\n (org.apache.kafka.connect.runtime.ConnectorConfig$EnrichedConnectorConfig:347)\n[2021-03-06 06:47:27,943] INFO Creating connector coyote-ca-1615013233026 of type io.confluent.connect.jdbc.JdbcSourceConnector (org.apache.kafka.connect.runtime.Worker:253)\n[2021-03-06 06:47:27,946] INFO Instantiated connector coyote-ca-1615013233026 with version 5.5.3 of type class io.confluent.connect.jdbc.JdbcSourceConnector (org.apache.kafka.connect.runtime.Worker:256)\n[2021-03-06 06:47:27,947] INFO Starting 
JDBC Source Connector (io.confluent.connect.jdbc.JdbcSourceConnector:69)\n[2021-03-06 06:47:27,947] INFO JdbcSourceConnectorConfig values: \n\tbatch.max.rows = 100\n\tcatalog.pattern = null\n\tconnection.attempts = 3\n\tconnection.backoff.ms = 10000\n\tconnection.password = null\n\tconnection.url = jdbc:sqlite:coyote_test.sqlite\n\tconnection.user = null\n\tdb.timezone = UTC\n\tdialect.name = \n\tincrementing.column.name = id\n\tmode = incrementing\n\tnumeric.mapping = null\n\tnumeric.precision.mapping = false\n\tpoll.interval.ms = 5000\n\tquery = \n\tquery.suffix = \n\tquote.sql.identifiers = ALWAYS\n\tschema.pattern = null\n\ttable.blacklist = []\n\ttable.poll.interval.ms = 60000\n\ttable.types = [TABLE]\n\ttable.whitelist = []\n\ttimestamp.column.name = []\n\ttimestamp.delay.interval.ms = 0\n\ttimestamp.initial = null\n\ttopic.prefix = coyote-ca-\n\tvalidate.non.null = true\n (io.confluent.connect.jdbc.source.JdbcSourceConnectorConfig:347)\n[2021-03-06 06:47:27,948] INFO Attempting to open connection #1 to Sqlite (io.confluent.connect.jdbc.util.CachedConnectionProvider:92)\n[2021-03-06 06:47:27,951] INFO Starting thread to monitor tables. 
(io.confluent.connect.jdbc.source.TableMonitorThread:73)\n[2021-03-06 06:47:27,952] INFO Finished creating connector coyote-ca-1615013233026 (org.apache.kafka.connect.runtime.Worker:275)\n[2021-03-06 06:47:27,955] INFO SourceConnectorConfig values: \n\tconfig.action.reload = restart\n\tconnector.class = io.confluent.connect.jdbc.JdbcSourceConnector\n\terrors.log.enable = false\n\terrors.log.include.messages = false\n\terrors.retry.delay.max.ms = 60000\n\terrors.retry.timeout = 0\n\terrors.tolerance = none\n\theader.converter = null\n\tkey.converter = null\n\tname = coyote-ca-1615013233026\n\ttasks.max = 1\n\ttransforms = []\n\tvalue.converter = null\n (org.apache.kafka.connect.runtime.SourceConnectorConfig:347)\n[2021-03-06 06:47:27,956] INFO EnrichedConnectorConfig values: \n\tconfig.action.reload = restart\n\tconnector.class = io.confluent.connect.jdbc.JdbcSourceConnector\n\terrors.log.enable = false\n\terrors.log.include.messages = false\n\terrors.retry.delay.max.ms = 60000\n\terrors.retry.timeout = 0\n\terrors.tolerance = none\n\theader.converter = null\n\tkey.converter = null\n\tname = coyote-ca-1615013233026\n\ttasks.max = 1\n\ttransforms = []\n\tvalue.converter = null\n (org.apache.kafka.connect.runtime.ConnectorConfig$EnrichedConnectorConfig:347)\n[2021-03-06 06:47:27,971] INFO Creating task coyote-ca-1615013233026-0 (org.apache.kafka.connect.runtime.Worker:421)\n[2021-03-06 06:47:27,973] INFO ConnectorConfig values: \n\tconfig.action.reload = restart\n\tconnector.class = io.confluent.connect.jdbc.JdbcSourceConnector\n\terrors.log.enable = false\n\terrors.log.include.messages = false\n\terrors.retry.delay.max.ms = 60000\n\terrors.retry.timeout = 0\n\terrors.tolerance = none\n\theader.converter = null\n\tkey.converter = null\n\tname = coyote-ca-1615013233026\n\ttasks.max = 1\n\ttransforms = []\n\tvalue.converter = null\n (org.apache.kafka.connect.runtime.ConnectorConfig:347)\n[2021-03-06 06:47:27,973] INFO EnrichedConnectorConfig values: 
\n\tconfig.action.reload = restart\n\tconnector.class = io.confluent.connect.jdbc.JdbcSourceConnector\n\terrors.log.enable = false\n\terrors.log.include.messages = false\n\terrors.retry.delay.max.ms = 60000\n\terrors.retry.timeout = 0\n\terrors.tolerance = none\n\theader.converter = null\n\tkey.converter = null\n\tname = coyote-ca-1615013233026\n\ttasks.max = 1\n\ttransforms = []\n\tvalue.converter = null\n (org.apache.kafka.connect.runtime.ConnectorConfig$EnrichedConnectorConfig:347)\n[2021-03-06 06:47:27,975] INFO TaskConfig values: \n\ttask.class = class io.confluent.connect.jdbc.source.JdbcSourceTask\n (org.apache.kafka.connect.runtime.TaskConfig:347)\n[2021-03-06 06:47:27,975] INFO Instantiated task coyote-ca-1615013233026-0 with version 5.5.3 of type io.confluent.connect.jdbc.source.JdbcSourceTask (org.apache.kafka.connect.runtime.Worker:436)\n[2021-03-06 06:47:27,983] INFO AvroConverterConfig values: \n\tbearer.auth.token = [hidden]\n\tschema.registry.ssl.truststore.type = JKS\n\tschema.reflection = false\n\tauto.register.schemas = true\n\tbasic.auth.credentials.source = URL\n\tschema.registry.ssl.keystore.password = [hidden]\n\tschema.registry.ssl.provider = \n\tschema.registry.ssl.endpoint.identification.algorithm = https\n\tschema.registry.ssl.truststore.location = \n\tvalue.subject.name.strategy = class io.confluent.kafka.serializers.subject.TopicNameStrategy\n\tschema.registry.url = [http://127.0.0.1:8081]\n\tschema.registry.ssl.keystore.location = \n\tschema.registry.ssl.trustmanager.algorithm = PKIX\n\tschema.registry.ssl.key.password = [hidden]\n\tschema.registry.ssl.keystore.type = JKS\n\tproxy.port = -1\n\tschema.registry.ssl.secure.random.implementation = \n\tschema.registry.ssl.cipher.suites = []\n\tmax.schemas.per.subject = 1000\n\tschema.registry.ssl.truststore.password = [hidden]\n\tbasic.auth.user.info = [hidden]\n\tproxy.host = \n\tuse.latest.version = false\n\tschema.registry.ssl.enabled.protocols = [TLSv1.2]\n\tschema.registry.ssl.protocol 
= TLSv1.2\n\tschema.registry.basic.auth.user.info = [hidden]\n\tbearer.auth.credentials.source = STATIC_TOKEN\n\tschema.registry.ssl.keymanager.algorithm = SunX509\n\tkey.subject.name.strategy = class io.confluent.kafka.serializers.subject.TopicNameStrategy\n (io.confluent.connect.avro.AvroConverterConfig:179)\n[2021-03-06 06:47:28,022] INFO KafkaAvroSerializerConfig values: \n\tbearer.auth.token = [hidden]\n\tschema.registry.ssl.truststore.type = JKS\n\tschema.reflection = false\n\tauto.register.schemas = true\n\tbasic.auth.credentials.source = URL\n\tschema.registry.ssl.keystore.password = [hidden]\n\tschema.registry.ssl.provider = \n\tschema.registry.ssl.endpoint.identification.algorithm = https\n\tschema.registry.ssl.truststore.location = \n\tvalue.subject.name.strategy = class io.confluent.kafka.serializers.subject.TopicNameStrategy\n\tschema.registry.url = [http://127.0.0.1:8081]\n\tschema.registry.ssl.keystore.location = \n\tschema.registry.ssl.trustmanager.algorithm = PKIX\n\tschema.registry.ssl.key.password = [hidden]\n\tschema.registry.ssl.keystore.type = JKS\n\tproxy.port = -1\n\tschema.registry.ssl.secure.random.implementation = \n\tschema.registry.ssl.cipher.suites = []\n\tmax.schemas.per.subject = 1000\n\tschema.registry.ssl.truststore.password = [hidden]\n\tbasic.auth.user.info = [hidden]\n\tproxy.host = \n\tuse.latest.version = false\n\tschema.registry.ssl.enabled.protocols = [TLSv1.2]\n\tschema.registry.ssl.protocol = TLSv1.2\n\tschema.registry.basic.auth.user.info = [hidden]\n\tbearer.auth.credentials.source = STATIC_TOKEN\n\tschema.registry.ssl.keymanager.algorithm = SunX509\n\tkey.subject.name.strategy = class io.confluent.kafka.serializers.subject.TopicNameStrategy\n (io.confluent.kafka.serializers.KafkaAvroSerializerConfig:179)\n[2021-03-06 06:47:28,027] INFO KafkaAvroDeserializerConfig values: \n\tbearer.auth.token = [hidden]\n\tschema.registry.ssl.truststore.type = JKS\n\tschema.reflection = false\n\tauto.register.schemas = 
true\n\tbasic.auth.credentials.source = URL\n\tschema.registry.ssl.keystore.password = [hidden]\n\tschema.registry.ssl.provider = \n\tschema.registry.ssl.endpoint.identification.algorithm = https\n\tschema.registry.ssl.truststore.location = \n\tspecific.avro.reader = false\n\tvalue.subject.name.strategy = class io.confluent.kafka.serializers.subject.TopicNameStrategy\n\tschema.registry.url = [http://127.0.0.1:8081]\n\tschema.registry.ssl.keystore.location = \n\tschema.registry.ssl.trustmanager.algorithm = PKIX\n\tschema.registry.ssl.key.password = [hidden]\n\tschema.registry.ssl.keystore.type = JKS\n\tproxy.port = -1\n\tschema.registry.ssl.secure.random.implementation = \n\tschema.registry.ssl.cipher.suites = []\n\tmax.schemas.per.subject = 1000\n\tschema.registry.ssl.truststore.password = [hidden]\n\tbasic.auth.user.info = [hidden]\n\tproxy.host = \n\tuse.latest.version = false\n\tschema.registry.ssl.enabled.protocols = [TLSv1.2]\n\tschema.registry.ssl.protocol = TLSv1.2\n\tschema.registry.basic.auth.user.info = [hidden]\n\tbearer.auth.credentials.source = STATIC_TOKEN\n\tschema.registry.ssl.keymanager.algorithm = SunX509\n\tkey.subject.name.strategy = class io.confluent.kafka.serializers.subject.TopicNameStrategy\n (io.confluent.kafka.serializers.KafkaAvroDeserializerConfig:179)\n[2021-03-06 06:47:28,065] INFO AvroDataConfig values: \n\tconnect.meta.data = true\n\tenhanced.avro.schema.support = false\n\tschemas.cache.config = 1000\n (io.confluent.connect.avro.AvroDataConfig:347)\n[2021-03-06 06:47:28,066] INFO Set up the key converter class io.confluent.connect.avro.AvroConverter for task coyote-ca-1615013233026-0 using the worker config (org.apache.kafka.connect.runtime.Worker:449)\n[2021-03-06 06:47:28,066] INFO AvroConverterConfig values: \n\tbearer.auth.token = [hidden]\n\tschema.registry.ssl.truststore.type = JKS\n\tschema.reflection = false\n\tauto.register.schemas = true\n\tbasic.auth.credentials.source = URL\n\tschema.registry.ssl.keystore.password = 
[hidden]\n\tschema.registry.ssl.provider = \n\tschema.registry.ssl.endpoint.identification.algorithm = https\n\tschema.registry.ssl.truststore.location = \n\tvalue.subject.name.strategy = class io.confluent.kafka.serializers.subject.TopicNameStrategy\n\tschema.registry.url = [http://127.0.0.1:8081]\n\tschema.registry.ssl.keystore.location = \n\tschema.registry.ssl.trustmanager.algorithm = PKIX\n\tschema.registry.ssl.key.password = [hidden]\n\tschema.registry.ssl.keystore.type = JKS\n\tproxy.port = -1\n\tschema.registry.ssl.secure.random.implementation = \n\tschema.registry.ssl.cipher.suites = []\n\tmax.schemas.per.subject = 1000\n\tschema.registry.ssl.truststore.password = [hidden]\n\tbasic.auth.user.info = [hidden]\n\tproxy.host = \n\tuse.latest.version = false\n\tschema.registry.ssl.enabled.protocols = [TLSv1.2]\n\tschema.registry.ssl.protocol = TLSv1.2\n\tschema.registry.basic.auth.user.info = [hidden]\n\tbearer.auth.credentials.source = STATIC_TOKEN\n\tschema.registry.ssl.keymanager.algorithm = SunX509\n\tkey.subject.name.strategy = class io.confluent.kafka.serializers.subject.TopicNameStrategy\n (io.confluent.connect.avro.AvroConverterConfig:179)\n[2021-03-06 06:47:28,066] INFO KafkaAvroSerializerConfig values: \n\tbearer.auth.token = [hidden]\n\tschema.registry.ssl.truststore.type = JKS\n\tschema.reflection = false\n\tauto.register.schemas = true\n\tbasic.auth.credentials.source = URL\n\tschema.registry.ssl.keystore.password = [hidden]\n\tschema.registry.ssl.provider = \n\tschema.registry.ssl.endpoint.identification.algorithm = https\n\tschema.registry.ssl.truststore.location = \n\tvalue.subject.name.strategy = class io.confluent.kafka.serializers.subject.TopicNameStrategy\n\tschema.registry.url = [http://127.0.0.1:8081]\n\tschema.registry.ssl.keystore.location = \n\tschema.registry.ssl.trustmanager.algorithm = PKIX\n\tschema.registry.ssl.key.password = [hidden]\n\tschema.registry.ssl.keystore.type = JKS\n\tproxy.port = 
-1\n\tschema.registry.ssl.secure.random.implementation = \n\tschema.registry.ssl.cipher.suites = []\n\tmax.schemas.per.subject = 1000\n\tschema.registry.ssl.truststore.password = [hidden]\n\tbasic.auth.user.info = [hidden]\n\tproxy.host = \n\tuse.latest.version = false\n\tschema.registry.ssl.enabled.protocols = [TLSv1.2]\n\tschema.registry.ssl.protocol = TLSv1.2\n\tschema.registry.basic.auth.user.info = [hidden]\n\tbearer.auth.credentials.source = STATIC_TOKEN\n\tschema.registry.ssl.keymanager.algorithm = SunX509\n\tkey.subject.name.strategy = class io.confluent.kafka.serializers.subject.TopicNameStrategy\n (io.confluent.kafka.serializers.KafkaAvroSerializerConfig:179)\n[2021-03-06 06:47:28,067] INFO KafkaAvroDeserializerConfig values: \n\tbearer.auth.token = [hidden]\n\tschema.registry.ssl.truststore.type = JKS\n\tschema.reflection = false\n\tauto.register.schemas = true\n\tbasic.auth.credentials.source = URL\n\tschema.registry.ssl.keystore.password = [hidden]\n\tschema.registry.ssl.provider = \n\tschema.registry.ssl.endpoint.identification.algorithm = https\n\tschema.registry.ssl.truststore.location = \n\tspecific.avro.reader = false\n\tvalue.subject.name.strategy = class io.confluent.kafka.serializers.subject.TopicNameStrategy\n\tschema.registry.url = [http://127.0.0.1:8081]\n\tschema.registry.ssl.keystore.location = \n\tschema.registry.ssl.trustmanager.algorithm = PKIX\n\tschema.registry.ssl.key.password = [hidden]\n\tschema.registry.ssl.keystore.type = JKS\n\tproxy.port = -1\n\tschema.registry.ssl.secure.random.implementation = \n\tschema.registry.ssl.cipher.suites = []\n\tmax.schemas.per.subject = 1000\n\tschema.registry.ssl.truststore.password = [hidden]\n\tbasic.auth.user.info = [hidden]\n\tproxy.host = \n\tuse.latest.version = false\n\tschema.registry.ssl.enabled.protocols = [TLSv1.2]\n\tschema.registry.ssl.protocol = TLSv1.2\n\tschema.registry.basic.auth.user.info = [hidden]\n\tbearer.auth.credentials.source = 
STATIC_TOKEN\n\tschema.registry.ssl.keymanager.algorithm = SunX509\n\tkey.subject.name.strategy = class io.confluent.kafka.serializers.subject.TopicNameStrategy\n (io.confluent.kafka.serializers.KafkaAvroDeserializerConfig:179)\n[2021-03-06 06:47:28,067] INFO AvroDataConfig values: \n\tconnect.meta.data = true\n\tenhanced.avro.schema.support = false\n\tschemas.cache.config = 1000\n (io.confluent.connect.avro.AvroDataConfig:347)\n[2021-03-06 06:47:28,067] INFO Set up the value converter class io.confluent.connect.avro.AvroConverter for task coyote-ca-1615013233026-0 using the worker config (org.apache.kafka.connect.runtime.Worker:455)\n[2021-03-06 06:47:28,067] INFO Set up the header converter class org.apache.kafka.connect.storage.SimpleHeaderConverter for task coyote-ca-1615013233026-0 using the worker config (org.apache.kafka.connect.runtime.Worker:462)\n[2021-03-06 06:47:28,072] INFO Initializing: org.apache.kafka.connect.runtime.TransformationChain{} (org.apache.kafka.connect.runtime.Worker:516)\n[2021-03-06 06:47:28,081] INFO ProducerConfig values: \n\tacks = -1\n\tbatch.size = 16384\n\tbootstrap.servers = [127.0.0.1:9092]\n\tbuffer.memory = 33554432\n\tclient.dns.lookup = default\n\tclient.id = connector-producer-coyote-ca-1615013233026-0\n\tcompression.type = none\n\tconnections.max.idle.ms = 540000\n\tdelivery.timeout.ms = 2147483647\n\tenable.idempotence = false\n\tinterceptor.classes = []\n\tkey.serializer = class org.apache.kafka.common.serialization.ByteArraySerializer\n\tlinger.ms = 0\n\tmax.block.ms = 9223372036854775807\n\tmax.in.flight.requests.per.connection = 1\n\tmax.request.size = 1048576\n\tmetadata.max.age.ms = 300000\n\tmetadata.max.idle.ms = 300000\n\tmetric.reporters = []\n\tmetrics.num.samples = 2\n\tmetrics.recording.level = INFO\n\tmetrics.sample.window.ms = 30000\n\tpartitioner.class = class org.apache.kafka.clients.producer.internals.DefaultPartitioner\n\treceive.buffer.bytes = 32768\n\treconnect.backoff.max.ms = 
1000\n\treconnect.backoff.ms = 50\n\trequest.timeout.ms = 2147483647\n\tretries = 2147483647\n\tretry.backoff.ms = 100\n\tsasl.client.callback.handler.class = null\n\tsasl.jaas.config = null\n\tsasl.kerberos.kinit.cmd = /usr/bin/kinit\n\tsasl.kerberos.min.time.before.relogin = 60000\n\tsasl.kerberos.service.name = null\n\tsasl.kerberos.ticket.renew.jitter = 0.05\n\tsasl.kerberos.ticket.renew.window.factor = 0.8\n\tsasl.login.callback.handler.class = null\n\tsasl.login.class = null\n\tsasl.login.refresh.buffer.seconds = 300\n\tsasl.login.refresh.min.period.seconds = 60\n\tsasl.login.refresh.window.factor = 0.8\n\tsasl.login.refresh.window.jitter = 0.05\n\tsasl.mechanism = GSSAPI\n\tsecurity.protocol = PLAINTEXT\n\tsecurity.providers = null\n\tsend.buffer.bytes = 131072\n\tssl.cipher.suites = null\n\tssl.enabled.protocols = [TLSv1.2]\n\tssl.endpoint.identification.algorithm = https\n\tssl.key.password = null\n\tssl.keymanager.algorithm = SunX509\n\tssl.keystore.location = null\n\tssl.keystore.password = null\n\tssl.keystore.type = JKS\n\tssl.protocol = TLSv1.2\n\tssl.provider = null\n\tssl.secure.random.implementation = null\n\tssl.trustmanager.algorithm = PKIX\n\tssl.truststore.location = null\n\tssl.truststore.password = null\n\tssl.truststore.type = JKS\n\ttransaction.timeout.ms = 60000\n\ttransactional.id = null\n\tvalue.serializer = class org.apache.kafka.common.serialization.ByteArraySerializer\n (org.apache.kafka.clients.producer.ProducerConfig:347)\n[2021-03-06 06:47:28,107] INFO Kafka version: 2.5.1-L0 (org.apache.kafka.common.utils.AppInfoParser:117)\n[2021-03-06 06:47:28,107] INFO Kafka commitId: 0efa8fb0f4c73d92 (org.apache.kafka.common.utils.AppInfoParser:118)\n[2021-03-06 06:47:28,107] INFO Kafka startTimeMs: 1615013248107 (org.apache.kafka.common.utils.AppInfoParser:119)\n[2021-03-06 06:47:28,118] INFO Starting JDBC source task (io.confluent.connect.jdbc.source.JdbcSourceTask:85)\n[2021-03-06 06:47:28,119] INFO Created connector coyote-ca-1615013233026 
(org.apache.kafka.connect.cli.ConnectStandalone:112)\n[2021-03-06 06:47:28,121] INFO JdbcSourceTaskConfig values: \n\tbatch.max.rows = 100\n\tcatalog.pattern = null\n\tconnection.attempts = 3\n\tconnection.backoff.ms = 10000\n\tconnection.password = null\n\tconnection.url = jdbc:sqlite:coyote_test.sqlite\n\tconnection.user = null\n\tdb.timezone = UTC\n\tdialect.name = \n\tincrementing.column.name = id\n\tmode = incrementing\n\tnumeric.mapping = null\n\tnumeric.precision.mapping = false\n\tpoll.interval.ms = 5000\n\tquery = \n\tquery.suffix = \n\tquote.sql.identifiers = ALWAYS\n\tschema.pattern = null\n\ttable.blacklist = []\n\ttable.poll.interval.ms = 60000\n\ttable.types = [TABLE]\n\ttable.whitelist = []\n\ttables = [\"accounts\"]\n\ttimestamp.column.name = []\n\ttimestamp.delay.interval.ms = 0\n\ttimestamp.initial = null\n\ttopic.prefix = coyote-ca-\n\tvalidate.non.null = true\n (io.confluent.connect.jdbc.source.JdbcSourceTaskConfig:347)\n[2021-03-06 06:47:28,121] INFO Using JDBC dialect Sqlite (io.confluent.connect.jdbc.source.JdbcSourceTask:102)\n[2021-03-06 06:47:28,126] INFO [Producer clientId=connector-producer-coyote-ca-1615013233026-0] Cluster ID: HfUmm1LuTOSlFvzO39QCXQ (org.apache.kafka.clients.Metadata:277)\n[2021-03-06 06:47:28,159] INFO Attempting to open connection #1 to Sqlite (io.confluent.connect.jdbc.util.CachedConnectionProvider:92)\n[2021-03-06 06:47:28,175] INFO Started JDBC source task (io.confluent.connect.jdbc.source.JdbcSourceTask:261)\n[2021-03-06 06:47:28,175] INFO WorkerSourceTask{id=coyote-ca-1615013233026-0} Source task finished initialization and start (org.apache.kafka.connect.runtime.WorkerSourceTask:216)\n[2021-03-06 06:47:28,179] INFO Begin using SQL query: SELECT * FROM \"accounts\" WHERE \"accounts\".\"id\" > ? 
ORDER BY \"accounts\".\"id\" ASC (io.confluent.connect.jdbc.source.TableQuerier:164)\n[2021-03-06 06:47:28,374] WARN [Producer clientId=connector-producer-coyote-ca-1615013233026-0] Error while fetching metadata with correlation id 3 : {coyote-ca-accounts=LEADER_NOT_AVAILABLE} (org.apache.kafka.clients.NetworkClient:1077)\n[2021-03-06 06:47:33,117] INFO WorkerSourceTask{id=coyote-ca-1615013233026-0} Committing offsets (org.apache.kafka.connect.runtime.WorkerSourceTask:426)\n[2021-03-06 06:47:33,118] INFO WorkerSourceTask{id=coyote-ca-1615013233026-0} flushing 0 outstanding messages for offset commit (org.apache.kafka.connect.runtime.WorkerSourceTask:443)\n[2021-03-06 06:47:33,133] INFO WorkerSourceTask{id=coyote-ca-1615013233026-0} Finished commitOffsets successfully in 14 ms (org.apache.kafka.connect.runtime.WorkerSourceTask:525)\n[2021-03-06 06:47:38,134] INFO WorkerSourceTask{id=coyote-ca-1615013233026-0} Committing offsets (org.apache.kafka.connect.runtime.WorkerSourceTask:426)\n[2021-03-06 06:47:38,134] INFO WorkerSourceTask{id=coyote-ca-1615013233026-0} flushing 0 outstanding messages for offset commit (org.apache.kafka.connect.runtime.WorkerSourceTask:443)\n[2021-03-06 06:47:43,135] INFO WorkerSourceTask{id=coyote-ca-1615013233026-0} Committing offsets (org.apache.kafka.connect.runtime.WorkerSourceTask:426)\n[2021-03-06 06:47:43,136] INFO WorkerSourceTask{id=coyote-ca-1615013233026-0} flushing 0 outstanding messages for offset commit (org.apache.kafka.connect.runtime.WorkerSourceTask:443)\n[2021-03-06 06:47:48,137] INFO WorkerSourceTask{id=coyote-ca-1615013233026-0} Committing offsets (org.apache.kafka.connect.runtime.WorkerSourceTask:426)\n[2021-03-06 06:47:48,138] INFO WorkerSourceTask{id=coyote-ca-1615013233026-0} flushing 0 outstanding messages for offset commit (org.apache.kafka.connect.runtime.WorkerSourceTask:443)\n[2021-03-06 06:47:53,139] INFO WorkerSourceTask{id=coyote-ca-1615013233026-0} Committing offsets 
(org.apache.kafka.connect.runtime.WorkerSourceTask:426)\n[2021-03-06 06:47:53,139] INFO WorkerSourceTask{id=coyote-ca-1615013233026-0} flushing 0 outstanding messages for offset commit (org.apache.kafka.connect.runtime.WorkerSourceTask:443)\n[2021-03-06 06:47:58,044] INFO Kafka Connect stopping (org.apache.kafka.connect.runtime.Connect:67)\n[2021-03-06 06:47:58,046] INFO Stopping REST server (org.apache.kafka.connect.runtime.rest.RestServer:321)\n[2021-03-06 06:47:58,069] INFO Stopped http_38783@2b4954a4{HTTP/1.1,[http/1.1]}{0.0.0.0:38783} (org.eclipse.jetty.server.AbstractConnector:380)\n[2021-03-06 06:47:58,071] INFO node0 Stopped scavenging (org.eclipse.jetty.server.session:158)\n[2021-03-06 06:47:58,074] INFO REST server stopped (org.apache.kafka.connect.runtime.rest.RestServer:338)\n[2021-03-06 06:47:58,075] INFO Herder stopping (org.apache.kafka.connect.runtime.standalone.StandaloneHerder:100)\n[2021-03-06 06:47:58,077] INFO Stopping task coyote-ca-1615013233026-0 (org.apache.kafka.connect.runtime.Worker:706)\n[2021-03-06 06:47:58,077] INFO Stopping JDBC source task (io.confluent.connect.jdbc.source.JdbcSourceTask:317)\n[2021-03-06 06:47:58,109] INFO Closing resources for JDBC source task (io.confluent.connect.jdbc.source.JdbcSourceTask:324)\n[2021-03-06 06:47:58,109] INFO Closing connection #1 to Sqlite (io.confluent.connect.jdbc.util.CachedConnectionProvider:118)\n[2021-03-06 06:47:58,110] INFO WorkerSourceTask{id=coyote-ca-1615013233026-0} Committing offsets (org.apache.kafka.connect.runtime.WorkerSourceTask:426)\n[2021-03-06 06:47:58,111] INFO WorkerSourceTask{id=coyote-ca-1615013233026-0} flushing 0 outstanding messages for offset commit (org.apache.kafka.connect.runtime.WorkerSourceTask:443)\n[2021-03-06 06:47:58,111] INFO [Producer clientId=connector-producer-coyote-ca-1615013233026-0] Closing the Kafka producer with timeoutMillis = 30000 ms. 
(org.apache.kafka.clients.producer.KafkaProducer:1182)\n[2021-03-06 06:47:58,123] INFO Stopping connector coyote-ca-1615013233026 (org.apache.kafka.connect.runtime.Worker:360)\n[2021-03-06 06:47:58,124] INFO Stopping table monitoring thread (io.confluent.connect.jdbc.JdbcSourceConnector:174)\n[2021-03-06 06:47:58,124] INFO Shutting down thread monitoring tables. (io.confluent.connect.jdbc.source.TableMonitorThread:134)\n[2021-03-06 06:47:58,125] INFO Closing connection #1 to Sqlite (io.confluent.connect.jdbc.util.CachedConnectionProvider:118)\n[2021-03-06 06:47:58,128] INFO Stopped connector coyote-ca-1615013233026 (org.apache.kafka.connect.runtime.Worker:376)\n[2021-03-06 06:47:58,129] INFO Worker stopping (org.apache.kafka.connect.runtime.Worker:198)\n[2021-03-06 06:47:58,130] INFO Stopped FileOffsetBackingStore (org.apache.kafka.connect.storage.FileOffsetBackingStore:66)\n[2021-03-06 06:47:58,131] INFO Worker stopped (org.apache.kafka.connect.runtime.Worker:219)\n[2021-03-06 06:47:58,134] INFO Herder stopped (org.apache.kafka.connect.runtime.standalone.StandaloneHerder:117)\n[2021-03-06 06:47:58,134] INFO Kafka Connect stopped (org.apache.kafka.connect.runtime.Connect:72)\n" 2021/03/06 06:48:02 Success, command 'timeout 10 kafka-console-consumer --bootstrap-server 127.0.0.1:9092 --topic coyote-ca-accounts --from-beginning --timeout-ms 10000 --max-messages 2 ', test 'Run Console Consumer (basic kafka)'. Stdout: "\x00\x00\x00\x00\t\x02\x02\nalice\n\x00\x00\x00\x00\t\x04\x02\x06bob\n" 2021/03/06 06:48:04 Success, command 'kafka-topics --zookeeper 127.0.0.1:2181 --topic coyote-ca-accounts --delete', test 'Delete Connect Standalone Test Topic (basic kafka)'. Stdout: "Topic coyote-ca-accounts is marked for deletion.\nNote: This will have no impact if delete.topic.enable is not set to true.\n" 2021/03/06 06:48:04 Success, command 'rm -rf coyote_test.sqlite coyote_sqlite_connector.properties coyote_connect_standalone.properties coyote_connect.offset', test ''. 
Stdout: "" 2021/03/06 06:48:04 no errors
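
For reference, the connector this run created (and whose properties file the final cleanup step deleted) can be approximated from the JdbcSourceConnectorConfig dump in the log above. A minimal coyote_sqlite_connector.properties along these lines would produce the same logged values — this is a reconstruction from the log, not the original file:

```properties
# Reconstructed from the JdbcSourceConnectorConfig / ConnectorConfig values
# printed by the worker; the actual file was removed by the cleanup step.
name=coyote-ca-1615013233026
connector.class=io.confluent.connect.jdbc.JdbcSourceConnector
tasks.max=1
connection.url=jdbc:sqlite:coyote_test.sqlite
mode=incrementing
incrementing.column.name=id
topic.prefix=coyote-ca-
```

Note that tables=["accounts"] appears only in the derived JdbcSourceTaskConfig: with no table.whitelist set, the table monitor thread discovered the table on its own.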
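
The console-consumer stdout above ("\x00\x00\x00\x00\t\x02\x02\nalice\n...") is Confluent-framed Avro, not plain text: one magic byte (0x00), a 4-byte big-endian Schema Registry ID, then the Avro-encoded body. A minimal decoding sketch follows — it assumes the value schema is a record of a long `id` plus a nullable-string `name` (consistent with the connector's incrementing `id` column and with these bytes, but not stated anywhere in the log):

```python
import struct

def parse_confluent_frame(buf: bytes):
    """Split one Confluent-framed message into (schema_id, avro_body).

    Wire format: magic byte 0x00, then a 4-byte big-endian Schema Registry
    ID, then the Avro binary encoding of the record.
    """
    if buf[0] != 0:
        raise ValueError("not a Confluent-framed message")
    schema_id = struct.unpack(">I", buf[1:5])[0]
    return schema_id, buf[5:]

def zigzag_varint(buf: bytes, pos: int = 0):
    """Decode one Avro zig-zag varint; return (value, next_pos)."""
    shift, acc = 0, 0
    while True:
        b = buf[pos]
        pos += 1
        acc |= (b & 0x7F) << shift
        if not b & 0x80:
            break
        shift += 7
    return (acc >> 1) ^ -(acc & 1), pos

# First record from the console-consumer output above:
frame = b"\x00\x00\x00\x00\t\x02\x02\nalice"
schema_id, body = parse_confluent_frame(frame)   # schema_id 9 per the frame
# Assumed schema: record { id: long, name: ["null", "string"] }
record_id, pos = zigzag_varint(body)             # id field
_branch, pos = zigzag_varint(body, pos)          # union branch (1 = string)
length, pos = zigzag_varint(body, pos)           # string byte length
name = body[pos:pos + length].decode()           # "alice"
```

In practice kafka-avro-console-consumer does this lookup against the Schema Registry automatically; the sketch only illustrates why the raw consumer prints control bytes around the field values.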