Change Streams With MongoDB

Let's look at how to achieve integration between applications with the help of Change Streams, and what MongoDB has to do with it.

By Akshat Thakar · Updated Sep. 22, 18 · Tutorial


MongoDB is always one step ahead of other database solutions in providing user-friendly support, with advanced features rolled out to ease operations. The OpLog was used extensively by MongoDB connectors to pull out data updates and generate a stream of changes, and it relies on MongoDB's internal replication mechanism. While the feature was highly useful, it was complex and necessarily meant tailing logs.

To simplify things, Change Streams were introduced as a subscriber to all insert, update, and delete operations on a MongoDB collection, which fits well with Node.js's event-based architecture.

Change Streams can be leveraged to integrate data producer and data consumer applications.


Below are the steps to achieve integration with the help of Change Streams.

Configure the MongoDB cluster with a replica set and start the data servers:

mongod --shardsvr --replSet "rs0" --port 27018 --dbpath=D:\mongo\shard0\s0 --logpath=D:\mongo\shard0\log\s0.log --logappend

mongod --shardsvr --replSet "rs0" --port 27019 --dbpath=D:\mongo\shard0\s4 --logpath=D:\mongo\shard0\log\s4.log --logappend

mongod --shardsvr --replSet "rs0" --port 27020 --dbpath=D:\mongo\shard0\s5 --logpath=D:\mongo\shard0\log\s5.log --logappend 

Ensure that the dbpath and logpath directories exist before you execute the shell commands.
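
If the directories do not exist yet, you can create them up front. A minimal sketch for Windows, assuming the same D:\mongo paths used in the commands above and in the later steps:

mkdir D:\mongo\shard0\s0 D:\mongo\shard0\s4 D:\mongo\shard0\s5 D:\mongo\shard0\log

mkdir D:\mongo\shard\config1 D:\mongo\shard\config2 D:\mongo\shard\config3

mkdir D:\mongo\shard\log1 D:\mongo\shard\log2 D:\mongo\shard\log3 D:\mongo\shard\log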

Connect to one of the servers launched above and initialize the replica set:

mongo --host localhost --port 27020

> rs.initiate( {
... _id : "rs0",
... members: [
... { _id: 0, host: "localhost:27018" },
... { _id: 1, host: "localhost:27019" },
... { _id: 2, host: "localhost:27020" }
... ]
... })
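
Optionally, before moving on, you can confirm that the data replica set has elected a primary. This quick sanity check is not part of the original steps:

rs0:PRIMARY> rs.status().members.forEach(function (m) { print(m.name + " : " + m.stateStr); })

Once the election completes, one member should report PRIMARY and the other two SECONDARY.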

Start the MongoDB configuration server replica set (three servers):

mongod --configsvr --replSet configset --port 27101 --dbpath=D:\mongo\shard\config1 --logpath=D:\mongo\shard\log1\config.log --logappend

mongod --configsvr --replSet configset --port 27102 --dbpath=D:\mongo\shard\config2 --logpath=D:\mongo\shard\log2\config.log --logappend

mongod --configsvr --replSet configset --port 27103 --dbpath=D:\mongo\shard\config3 --logpath=D:\mongo\shard\log3\config.log --logappend

Initialize the configuration replica set by connecting to one of the config servers:

mongo --host localhost --port 27101

> rs.initiate( {
...    _id: "configset",
...    configsvr: true,
...    members: [
...       { _id: 0, host: "localhost:27101" },
...       { _id: 1, host: "localhost:27102" },
...       { _id: 2, host: "localhost:27103" }
...    ]
... } )

Start the query router server, pointing it at all the configuration servers, then connect to the query router and add the replica set as a shard:

mongos --configdb configset/localhost:27103,localhost:27102,localhost:27101 --port 27030 --logpath=D:\mongo\shard\log\route.log

mongo --host localhost --port 27030

mongos> sh.addShard( "rs0/localhost:27018,localhost:27019,localhost:27020" )
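
As another quick check, not part of the original write-up, sh.status() should now list rs0 under shards; if you also want to shard collections in the example database used later, you can enable sharding for it here (Change Streams themselves do not require it, so the second command is optional):

mongos> sh.status()

mongos> sh.enableSharding("test_db")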

Now, with MongoDB replication in place, use the Node.js module and code below to subscribe to MongoDB collection updates.

Install the mongodb driver with npm:

npm install mongodb --save

Place the code block below in a Node.js application to subscribe to changes in the form of a stream.

const MongoClient = require('mongodb').MongoClient;

// exclude the documentKey field from change events
const pipeline = [
  {
    $project: { documentKey: false }
  }
];

MongoClient.connect("mongodb://localhost:27018,localhost:27019,localhost:27020/?replicaSet=rs0")
  .then(client => {
    console.log("Connected correctly to server");
    // specify db and collection
    const db = client.db("test_db");
    const collection = db.collection("test_collection");

    // open a change stream on the collection
    const changeStream = collection.watch(pipeline);
    // start listening to changes
    changeStream.on("change", function (change) {
      console.log(JSON.stringify(change));
    });
  })
  .catch(err => console.error("Connection failed", err));
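
With the Node.js process running, you can trigger a change event from any other connection, for example, through the query router started earlier (test_db and test_collection are simply the names assumed in the code above):

mongo --host localhost --port 27030

mongos> use test_db

mongos> db.test_collection.insertOne({ item: "sample", qty: 1 })

The change stream listener should then log a change document whose operationType is insert; updates and deletes produce corresponding events.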

Conclusion

MongoDB Change Streams are a wonderful addition to the MEAN development stack, and I thank the MongoDB team for bringing in this feature. Integration between producer front-end applications and consumer analytics applications can be done seamlessly by leveraging it.

MongoDB Stream (computing)

Opinions expressed by DZone contributors are their own.
