Kafka Development Environment
Docker container powered by Lenses.io



Kafka Topics UI
browse topics and data


Kafka Connect UI
set up & manage connectors


RUNNING SERVICES (fast-data-dev)



Services Ports
Coyote Health Checks

Your setup is verified using the Coyote integration testing tool, which also generates working examples.



Find out more about Coyote, our awesome testing tool, here.

Fast Data News
  • Kafka 2.5.1

    Fast Data Dev with Apache Kafka 2.5.1 and Confluent components 5.5.1 is now available.

  • Kafka 2.4.1

    Fast Data Dev with Apache Kafka 2.4.1 and Confluent components 5.4.2 is now available.

  • Kafka 2.3.2-pre.200523

    You may have noticed a lack of updates for Kafka. Unfortunately, Kafka 2.3.1 has a serious bug: the log cleaner does not work, so compacted topics just keep growing. Although the fix has been in for many months, upstream hasn't tagged 2.3.2 yet, so we had to make a snapshot release based on the state of the 2.3 branch as of 23 May 2020.

  • Landoop is now Lenses.io

    Our name may have changed, but our commitment to developers has not. We still pledge to make the best tools and assist developers on their path to Streaming Data and DataOps.

    You can find fast-data-dev docker images as both lensesio/fast-data-dev and landoop/fast-data-dev. Also have a look at our new website, lenses.io! You might also want to give lensesio/box a look, fast-data-dev with our Lenses software on top. Claim your free license here.

  • Kafka 2.2 takes the latest tag

    The latest tag will now bring Kafka 2.2.1.

    If you are eager to test Kafka 2.3.0, you may use the 2.3 and 2.3.0 tags.

  • Kafka 2.0 and Stream Reactor 1.2.0 take the latest tag

    The latest tag will now bring Kafka 2.0.1. It will also bring the latest Stream Reactor release, 1.2.0, which includes enhancements, fixes and a brand new connector we know you will love: the Hive connector.

    If you are eager to test Kafka 2.1.0, you may use the 2.1 and 2.1.0 tags.

  • Kafka 1.1 earns the latest tag

    Kafka 1.1 was only available as a tag until now. With the latest 1.1.1 release, we have enough confidence to promote it to the master branch and latest tag; the default Kafka version for fast-data-dev.

  • Fast Data Dev: Total Revamp

    This is the largest update we've done to fast-data-dev since its inception. We are very excited and also curious to see how the new features will be used.

    First of all, we updated to Kafka 1.0, but with a twist. We replaced the Confluent OSS distribution with our very own Kafka distribution. We build everything (Kafka, Connect, Schema Registry, 3rd party connectors) from source without any changes. We do alter the file hierarchy and the startup scripts to make them work better, be cleaner, and offer a few new options, such as starting a vanilla Kafka + Connect without Avro support, just like the release you would download from the Apache Kafka site.

    This change will help us provide better support, update faster, and add new features. Anyone interested in LKD (Landoop's Kafka Distribution) can find more information in the documentation. Please remember that we are committed to Apache Kafka and we support all vendors, or no vendors at all.

    The internals of fast-data-dev also saw a major refactor. For users, the main differences are that everything starts much faster and more predictably, the RAM requirement is lower (you can get away with 3GB or less), and you can now fine-tune any option for Kafka, Connect, and the rest of the components via environment variables. This also makes more advanced setups possible. Finally, we added new connectors from Couchbase, Debezium, and DbVisit, and made it possible to disable services.
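    As an illustration of the environment-variable fine-tuning mentioned above, an invocation could look like the sketch below. The variable name and ports here are assumptions for demonstration; check the fast-data-dev README for the exact mapping your image version supports.

    ```shell
    # Illustrative sketch: override a broker option via an environment variable.
    # KAFKA_LOG_RETENTION_HOURS (broker option log.retention.hours) is an assumed
    # mapping; consult the README for the variables your image actually reads.
    docker run --rm -it \
      -p 2181:2181 -p 3030:3030 -p 8081-8083:8081-8083 -p 9092:9092 \
      -e KAFKA_LOG_RETENTION_HOURS=48 \
      lensesio/fast-data-dev
    ```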

    With big changes sometimes come big new bugs. If you find one, please report it so we can squash it!

    For developers who have forked fast-data-dev, this will probably be a breaking change, as too many things changed in the image. On the other hand, the code is now much cleaner, more robust, easier to read and change, and easier to extend with new features or services. We do not plan more changes this big anytime soon, so once you pick up the new codebase you can be sure it will remain stable.

  • Stream Reactor 0.4.0 and 1.0.0

    Landoop Stream Reactor hit versions 0.4.0 (targeting Kafka 0.11.0) and 1.0.0 (targeting Kafka 1.0). This is our biggest release ever, with 25 connectors in total. You can find our release notes and builds here.

    An industry first in this release is our Apache Kafka - Apache Pulsar connector! You can try the sink today; the source is a work in progress. As a leading streaming company, we at Landoop are committed to helping you make the most out of your streams.

  • Schema Registry UI and Kafka Connect UI reach 0.9.4

    A twin release by our frontend team! Schema Registry UI now supports schema deletion. Kafka Connect UI adds new connectors and even better support for Kafka 0.11.0 and 1.0. Find out more in the Schema Registry UI 0.9.4 release notes and the Kafka Connect UI 0.9.4 release notes.

  • Datamountaineer and Stream Reactor Join Landoop!

    We are happy to announce that Datamountaineer became part of Landoop.

    We are excited to work together to create new products that will empower Kafka developers and administrators. A platform is its users; Landoop’s mission is to help all Kafka users get the most out of it.

    Datamountaineer are the authors of Stream Reactor, the largest collection of open source Kafka connectors. We are committed to keeping it that way and further improving it.

  • Kafka Connect UI 0.9.3 and Kafka Upgrade

    The newer Kafka Connect UI adds author and documentation support for connectors. This is also the first tagged release where you can name your clusters when using our docker image. The Kafka update is a bugfix release and a recommended upgrade.

  • Kafka Topics UI 0.9.3 Upgrade

    The new version solves a couple of bugs that were most visible in the Reddit posts sample topic. If you use the Kafka Topics UI docker image, new configuration options are available, as well as access logs and better support for k8s and rancher.

  • New JSON Dataset

    We added a small subset of BackBlaze's SMART dataset for Q1 2017. For this dataset we went with JSON, as all our datasets until now were in Avro. As usual, you can set the environment variable RUNNING_SAMPLEDATA=1 and the data will keep coming (cycling over the same dataset).

  • Stream Reactor 0.3.0 for Kafka 0.11.0

    We are proud of this release, as well as of the 0.2.6 sister release which targets Kafka 0.10.2. There are new connectors, like the FTP source connector, the Elasticsearch 5 sink connector, and the MQTT sink connector to complement the MQTT source, whilst the existing connectors got even better with improvements, fixes, and KCQL 2. Some configuration options were renamed for consistency, so check the docs and our kafka connector tests repository.

  • Confluent Platform OSS 3.3.0

    We updated to CP 3.3.0 and the Kafka version it bundles. It is an exciting release with new features such as exactly-once delivery and transactional messaging. A very nice (and needed) feature is plugin.path for Connect, which permits library isolation between connectors. It comes with a price, though: Connect needs a few minutes to load all the libraries and start up.

  • New Datasets and Running Data

    We added new data for you to play with: reddit posts and NYC yellow taxis trip data. Even better, if you set the environment variable RUNNING_SAMPLEDATA=1 these data will keep coming (cycling over the same datasets) at variable, low rates. We set the retention size to 25MB per partition, so your docker disk won't get full. Enjoy!
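    To try the running-data mode described above, enable the environment variable when starting the container. The port mappings shown are the usual fast-data-dev defaults; adjust them to your setup.

    ```shell
    # Sample topics keep receiving data at variable, low rates,
    # cycling over the same datasets; retention caps disk usage.
    docker run --rm -it \
      -p 2181:2181 -p 3030:3030 -p 8081-8083:8081-8083 -p 9092:9092 \
      -e RUNNING_SAMPLEDATA=1 \
      lensesio/fast-data-dev
    ```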

  • Confluent Platform OSS 3.2.2

    We updated to CP 3.2.2 and the Kafka version it bundles.

  • Kafka Bash Completion 0.1

    We added our new project, kafka-autocomplete to fast-data-dev. What does this mean? Open a bash terminal inside the container:
    $ docker exec -it [ID|NAME] bash

    And try something like:
    $ kafka-topics --config <TAB><TAB>
    $ kafka-console-consumer -- <TAB><TAB>
    $ kafka-topics --zookeeper localhost --topic <TAB><TAB>

    If you like it, don't hesitate to drop us some stars! :)

  • Kafka Connect UI upgraded to 0.9.2

    Listening to your feedback, we improved the real time validation engine to better suit your workflow. Thank you for helping us adjust the UIs to real needs.

  • Kafka Topics UI upgraded to 0.9.1

    This release switched to the V2 APIs of Kafka REST, so going forward we will only support CP 3.2.x or greater. What do you get in return? Seeking within messages, a partition view, and more. Many smaller improvements are also included.

  • Schema Registry UI upgraded to 0.9.1

    Bug fixes for Firefox and Safari and other enhancements.

  • Kafka Connect UI upgraded to 0.9.1

    The upgraded version has many improvements and fixes, both visual and under the hood. Most importantly, it supports the newer Connect release. This means fast-data-dev:cp3.2 will soon get out of beta.

  • Logs accessible from browser

    We made the various logs available through the web interface. You may find them here.

  • Delete Topics Enabled

    Now you can delete Kafka topics. Actually, you should have been able to from day one, but somehow it eluded us. Thanks @simplesteph for bringing it to our attention and providing a PR.

  • Stream Reactor 0.2.5

    We updated Stream Reactor to 0.2.5. This release contains many fixes, new connectors, and support for the latest Kafka release.

  • Sample Data

    We added a topic with sample AIS messages (sea vessel position reports) for you to play with. It is named position-reports and you can find it here, about one minute after fast-data-dev is spawned. We plan to add more topics for you to experiment with.

  • Kafka Topics UI upgraded to 0.8.3

    We upgraded Kafka Topics UI to 0.8.3 which contains the new table view, as well as bug fixes, better handling of binary key types and other enhancements.

  • Web server fixes for UIs

    We got a couple of reports about the UIs not working properly on some OSes (Windows, Fedora) and certain setups. We made a change to the web server (Caddy) configuration: instead of accessing the REST endpoints at localhost, the UIs will try to access them at the same address they were loaded from. If this setting causes you any trouble, please don't hesitate to open an issue. Thanks @jischebeck for helping on this.

  • SSL support hit master

    We added support for an SSL port on the broker that requires authentication. Upon startup, fast-data-dev will create a CA and key-certificate pairs for Kafka and clients. Learn how easy it is to enable and use this feature!

  • JMX support added

    We now enable JMX by default for the Kafka components and make it available at ports 9581-9584. You can optionally disable it via -e DISABLE_JMX=1.
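    For example, to attach a local JMX tool such as jconsole, publish the JMX ports alongside the broker port. The port numbers are the ones announced above; the rest of the command is a typical invocation.

    ```shell
    # Expose the JMX ports 9581-9584 so monitoring tools can attach.
    docker run --rm -it \
      -p 9092:9092 -p 9581-9584:9581-9584 \
      lensesio/fast-data-dev
    # Then, on the host, connect with e.g.: jconsole localhost:9581
    ```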

  • Kafka does not run as root anymore

    This is important! Fast Data Dev’s behaviour changed. Kafka is now running by default as user nobody instead of the superuser (root). If you need the old behaviour use -e RUN_AS_ROOT=1 on your docker command line.
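    If you rely on the old behaviour (for example, for volume permissions in an existing setup), the flag from this announcement restores it:

    ```shell
    # Opt back into running Kafka as root (the old default behaviour).
    docker run --rm -it -e RUN_AS_ROOT=1 lensesio/fast-data-dev
    ```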

Learn more about fast-data-dev from our README.

Report Issues & Stars!
