Kubernetes: Orchestrate Your Node.js (Micro)Services
Get a breakdown of microservices architectures, how they apply to Node.js apps, and how to make the most of them with containers and orchestration.
Nowadays, everybody talks about orchestration, microservices, containers, and how Kubernetes is changing the orchestration world. What does this mean for deploying your Node.js application in production? Why should you care anyway?
This blog post will touch on this subject and give you the first steps to join the conversation and apply these learnings straight in production. First, let's talk about the whole microservice-oriented architecture and how it applies to Node.js.
Demystifying Node.js Microservices
Instead of writing an in-depth essay about the demystification of microservices, I'll give you a quick summary before we jump into the sweet stuff.
To understand why everybody wants to talk about the cool guy in town (microservices), we need to understand the precursor in application development. The monolith.
The Monolith
If you have a monolithic application, then all of the code for a system is in a single codebase that runs together and produces one point of entry.
Bringing up the application is just a simple npm start. It keeps everything tidily together, and it's still a great option for application development and being agile. The downside is that when you want to release often, you have to redeploy the whole monolith, not just the parts that changed.
Microservices
Another approach to developing an application is to break up your codebase into smaller chunks. Each chunk is a service. Together, those services provide the end user with the same functionality as the monolith.
Bringing up the application is NOT just a simple npm start. You need to start each service independently using its own npm start.
There is no definition of how micro your microservice should be. When you break up your monolith into smaller chunks, or when your application is composed of different services, you are doing microservices.
As for the upside, microservices allow you to release often and to redeploy only the parts of the system that you made changes to. The downside is how to orchestrate this.
I wrote an essay about orchestration in general, and to quote it: everything is orchestration. What this means is that the things applied to microservices in this blog post also apply to monoliths.
The Orchestration Problem of a Non-Monolith Approach
If your Node.js application uses a non-monolith approach — multiple services composed into a single application — you need a way to orchestrate this. Containers for the win here! With containers, you can isolate each process (= service = npm start) and give each process its own event loop. Let's talk a bit about Node's event loop.
Under the Hood of Node's Event Loop
All the magic Node gives us happens in Node's event loop, which uses the event-driven paradigm to do its work.
In computer programming, event-driven programming is a programming paradigm in which the flow of the program is determined by events such as user actions (mouse clicks, key presses), sensor outputs, or messages from other programs/threads. Event-driven programming is the dominant paradigm used in graphical user interfaces and other applications (like Node) that are centered on performing certain actions in response to (user) input.
The general implementation is to have a central mechanism that listens for events and calls a callback function once an event has been detected. That's the basic principle behind Node's event loop.
Here's some pseudo-code to demonstrate what Node's event loop looks like:
while (queue.waitForMessage()) {
  queue.processNextMessage();
}
Node's event loop is a single-threaded event loop. This can cause performance issues: keep an eye out when you are under load or have a workload that does a lot of CPU-intensive processing. How do we make our Node application more performant? Microservices and containers to the rescue!
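To make that concrete, here is a minimal sketch using Node's built-in http module (the port and the loop bound are arbitrary choices for illustration). The request callback is event-driven: it only runs when the event loop picks up an incoming request. But while the synchronous loop inside it runs, the single thread is busy and no other request on this process can be served.
var http = require('http');

http.createServer(function (req, res) {
  // CPU-bound, synchronous work: nothing yields back to the event loop,
  // so no other request on this process is handled until the loop finishes
  var total = 0;
  for (var i = 0; i < 1e9; i++) {
    total += i;
  }
  res.end('done: ' + total + '\n');
}).listen(3000);
Splitting work like this into its own service (and its own container) means one busy event loop no longer drags down the rest of the application.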
Containerize Each Service
The first step to running your microservices in production is to containerize each service. We already wrote a bunch of articles about this.
Take this Express.js app as an example:
// call the packages we need
var express = require('express'); // call express
var app = express(); // define our app using express
var bodyParser = require('body-parser');
// this will let us get the data from a POST
app.use(bodyParser.urlencoded({ extended: true }));
app.use(bodyParser.json());
var port = process.env.PORT || 8080; // set our port
// routes our api
var router = express.Router(); // get an instance of the express Router
router.get('/', function(req, res) {
res.json({ message: 'hooray! welcome to our api!' });
});
// all of our routes will be prefixed with /api
app.use('/api', router);
// start the API server
app.listen(port);
console.log('API runs on port ' + port);
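The Dockerfile below ends with npm start, so the service needs a start script. A minimal package.json sketch could look like this; the package name, the entry-point file name (server.js), and the version ranges are assumptions, so adjust them to your project:
{
  "name": "my-micro-service",
  "version": "1.0.0",
  "main": "server.js",
  "scripts": {
    "start": "node server.js"
  },
  "dependencies": {
    "body-parser": "^1.15.2",
    "express": "^4.14.0"
  }
}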
To run this in an isolated process, we need a Dockerfile to containerize this puppy:
FROM node:6.7.0
ENV APP_HOME /app
RUN mkdir $APP_HOME
WORKDIR $APP_HOME
ADD . $APP_HOME
ENV NODE_ENV production
ENV NPM_CONFIG_LOGLEVEL warn
RUN npm install
CMD npm start
Now you can build a container image: $ docker build -t my_micro_service .
Then run it as an isolated process: $ docker run -p 8080:8080 my_micro_service
Since all routes are prefixed with /api, you can now hit http://localhost:8080/api.
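For example, a quick check with curl (assuming the container is running locally) should return the JSON defined in the route handler:
$ curl http://localhost:8080/api
{"message":"hooray! welcome to our api!"}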
What I really like about containers is that you can scale your service, and with it Node's event loop, just by starting more containers and orchestrating them across your cluster. If, for example, parts of your API are under heavy load, you can scale up that piece and utilize your resources more efficiently. When the pressure is back to normal, you can simply scale down again.
The Solution
As we discussed before, the nice thing about breaking away from a monolith (= single thread) into a (micro) service-oriented architecture is that you get an isolated event loop for each service, because every Node process runs isolated in its own container!
When you want to succeed with Node.js in this new chapter in IT, you need to streamline the following four phases:
- Coding your service and committing changes (obvious).
- Building (= container images) and testing your service.
- Provisioning your infrastructure (= your cluster).
- Deploying, running and maintaining your service in production.
What Does Kubernetes Bring to the Table?
Kubernetes is an orchestrator. It takes away all the pain of running your workload on top of your cluster of servers. Kubernetes makes sure your services are always up and running, handles scaling, and performs failure recovery when a server catches fire.
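As a taste of what that looks like in practice, here is a minimal sketch of a Kubernetes Deployment for the image we built above; the resource names, the replica count, and the assumption that the image is reachable by your cluster (e.g. pushed to a registry) are all placeholders:
apiVersion: apps/v1
kind: Deployment
metadata:
  name: my-micro-service
spec:
  replicas: 3                      # Kubernetes keeps three copies running and replaces them on failure
  selector:
    matchLabels:
      app: my-micro-service
  template:
    metadata:
      labels:
        app: my-micro-service
    spec:
      containers:
        - name: my-micro-service
          image: my_micro_service  # the image built with docker build above
          ports:
            - containerPort: 8080  # the port the Express app listens on
Applying it with kubectl apply -f deployment.yaml brings the service up, and kubectl scale deployment my-micro-service --replicas=5 covers the scale-up (and scale-down) scenario described earlier.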