Kafka is a distributed event streaming platform that has seen rapid adoption in recent years as a backbone for real-time data pipelines and streaming applications.

In a MongoDB-Kafka architecture, MongoDB can be configured as both a Sink and a Source. As a sink, MongoDB ingests events from your Kafka topics directly into MongoDB collections, exposing the data to your services for efficient querying, enrichment, and analytics. As a source, MongoDB publishes its data changes into Kafka topics, streaming them to consuming applications.
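As a sketch of what those two roles look like in practice, the snippets below register a sink and a source connector through the Kafka Connect REST API using the official MongoDB Kafka Connector classes. The hostnames, ports, topic names, and database/collection names are placeholder assumptions for illustration; adjust them to your environment.

```shell
# Register MongoDB as a SINK: events from the Kafka topic "orders"
# are written into the mongo collection demo.orders.
# Assumes Kafka Connect is reachable at localhost:8083 and MongoDB at mongodb:27017.
curl -s -X POST http://localhost:8083/connectors \
  -H "Content-Type: application/json" \
  -d '{
    "name": "mongo-sink-orders",
    "config": {
      "connector.class": "com.mongodb.kafka.connect.MongoSinkConnector",
      "connection.uri": "mongodb://mongodb:27017",
      "database": "demo",
      "collection": "orders",
      "topics": "orders"
    }
  }'

# Register MongoDB as a SOURCE: change events on demo.orders are
# published to a Kafka topic (named with the prefix, e.g. "mongo.demo.orders").
curl -s -X POST http://localhost:8083/connectors \
  -H "Content-Type: application/json" \
  -d '{
    "name": "mongo-source-orders",
    "config": {
      "connector.class": "com.mongodb.kafka.connect.MongoSourceConnector",
      "connection.uri": "mongodb://mongodb:27017",
      "database": "demo",
      "collection": "orders",
      "topic.prefix": "mongo"
    }
  }'
```

The source connector relies on MongoDB change streams, so it requires the deployment to be a replica set or sharded cluster.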

We are going to cover both scenarios (MongoDB as Sink and as Source), demonstrating how to efficiently connect each datastore to the other. Along the way, we will cover use cases and best practices for running Kafka and MongoDB together.
