I want to get real-time updates about MongoDB database changes in Node.js.
A single MongoDB change stream delivers update notifications almost instantly. But when I open many (10+) streams at once, there are massive delays (up to several minutes) between a database write and the arrival of the corresponding notification.
This is how I set up a change stream:
let cursor = collection.watch([
  { $match: { "fullDocument.room": roomId } },
]);
cursor.stream().on("data", doc => {...});
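For context, the pipeline can be factored into a small helper; the surrounding connection code is only a sketch (the `mongodb` driver, connection string, and the database/collection names are placeholders, not taken from the question):

```javascript
// Hypothetical helper that builds the room-scoped pipeline used above;
// roomId is the same identifier as in the question's snippet.
function roomPipeline(roomId) {
  return [{ $match: { "fullDocument.room": roomId } }];
}

// A full setup would look roughly like this (assumes the official `mongodb`
// Node.js driver and a reachable deployment, so it is left as a comment sketch):
//
//   const { MongoClient } = require("mongodb");
//   const client = await MongoClient.connect("mongodb://localhost:27017");
//   const collection = client.db("mydb").collection("messages");
//   const cursor = collection.watch(roomPipeline("room-1"));
//   cursor.stream().on("data", doc => { /* handle change */ });
```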
I tried an alternative way to set up a stream, but it's just as slow:
let cursor = collection.aggregate([
  { $changeStream: {} },
  { $match: { "fullDocument.room": roomId } },
]);
cursor.forEach(doc => {...});
An automated process inserts tiny documents into the collection while collecting performance data.
Some additional details:
- Open change stream cursors: 50
- Write rate: 100 docs/second (batches of 10 via insertMany)
- Runtime: 100 seconds
- Average delay: 7.1 seconds
- Largest delay: 205 seconds (not a typo, over three minutes)
- MongoDB version: 3.6.2
- Cluster setup #1: MongoDB Atlas M10 (3-member replica set)
- Cluster setup #2: DigitalOcean Ubuntu droplet + single-instance MongoDB in Docker
- Node.js process CPU usage: <1%
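The delay numbers above could be gathered with a harness along these lines: each inserted document carries its wall-clock write time, and the listener records the write-to-notification gap. This is an illustrative sketch; the names (`delayStats`, `sentAt`, `samples`) are mine, not from the question:

```javascript
// Computes average and maximum write-to-notification delay in seconds
// from recorded { sentAt, receivedAt } millisecond timestamp pairs.
function delayStats(samples) {
  const delays = samples.map(s => (s.receivedAt - s.sentAt) / 1000);
  const avg = delays.reduce((a, b) => a + b, 0) / delays.length;
  return { avgSeconds: avg, maxSeconds: Math.max(...delays) };
}

// Writer side (sketch, assumes `collection` from the mongodb driver):
//   const batch = Array.from({ length: 10 }, () => ({ room: roomId, sentAt: Date.now() }));
//   await collection.insertMany(batch);
//
// Listener side, inside the change stream handler:
//   samples.push({ sentAt: ev.fullDocument.sentAt, receivedAt: Date.now() });
```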
Both setups produce the same issue. What could be going on here?
Question from: https://stackoverflow.com/questions/48411897/severe-performance-drop-with-mongodb-change-streams