What is Apache Kafka? Part 3
Understanding the Apache Kafka Ecosystem
The Confluent Schema Registry stores and versions data schemas for Apache Kafka and ensures that producers and consumers remain compatible with each other as those schemas evolve. It supports the Apache Avro, Protobuf, and JSON Schema data formats.
A data schema defines the expected fields of your data, their names, and their value types.
Without a schema registry, producers and consumers risk breaking whenever the data schema changes.
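To make this concrete, here is a minimal sketch of what an Avro schema looks like. The record and field names are invented for illustration; they are not part of any real topic:

```json
{
  "type": "record",
  "name": "Purchase",
  "namespace": "com.example",
  "fields": [
    { "name": "item", "type": "string" },
    { "name": "quantity", "type": "int" },
    { "name": "discount", "type": ["null", "double"], "default": null }
  ]
}
```

A field like `discount`, declared with a default value, is an example of a backward-compatible addition: the Schema Registry can accept the new schema version while consumers reading with the old one keep working.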
Conduktor is a friendly, all-in-one interface and platform for working with the Apache Kafka ecosystem. It lets developers interact with every part of that ecosystem: Brokers, Topics, Consumers, Producers, Kafka Connect, Kafka Streams, the Confluent Schema Registry, and ksqlDB.
ksqlDB is a stream processing database that provides a SQL-like interface to transform Kafka topics and perform common database-like operations such as joins, aggregates, filtering, and other forms of data manipulation on streaming data.
Behind the scenes, the ksqlDB webserver translates the SQL commands into a series of Kafka Streams applications.
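For a flavor of what such SQL looks like, here is a small hypothetical example; the topic, stream, and column names are invented for illustration. It declares a stream over a Kafka topic, then builds a continuously updated aggregate from it:

```sql
-- Declare a stream over an existing Kafka topic (names are illustrative)
CREATE STREAM purchases (item VARCHAR, quantity INT)
  WITH (KAFKA_TOPIC = 'purchases', VALUE_FORMAT = 'AVRO');

-- Continuously aggregate: total quantity sold per item
CREATE TABLE items_sold AS
  SELECT item, SUM(quantity) AS total_quantity
  FROM purchases
  GROUP BY item;
```

Each statement above would be compiled by ksqlDB into a Kafka Streams application that runs continuously, updating `items_sold` as new records arrive on the `purchases` topic.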
We hope this page gave you a high-level understanding of Apache Kafka and its ecosystem.
If you would like to start using Apache Kafka, we recommend you simply keep on reading these lessons in the order outlined and let us guide you in your Apache Kafka journey!
Happy learning :)