Hands-on With Node.js Streams: Examples and Approach

By Shital Agarwal · Feb. 05, 20 · Tutorial


What Are Streams?

A stream is an abstract interface for working with data continuously. Every stream is an EventEmitter that implements a set of methods, and you can use streams to perform a variety of tasks, such as reading, writing, and transforming data.

In Node.js, streams are used to work with streaming data. Node.js provides an API for creating and consuming the streaming interface. The data is received in parts and is read in parts as well, rather than being loaded into memory all at once.
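
Here is a minimal sketch of that idea: the data of a file arrives in chunks as events rather than as one complete buffer (the file name below is only an assumption for illustration):

JavaScript
 
const fs = require('fs');

const stream = fs.createReadStream('big-file.txt');

// Each 'data' event carries only one chunk of the file
stream.on('data', (chunk) => {
  console.log(`Received ${chunk.length} bytes`);
});

// 'end' fires once the whole file has been consumed
stream.on('end', () => {
  console.log('No more data.');
});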

Piping Streams

As the name suggests, piping is a mechanism that keeps data flowing without any hindrance. Piping feeds the output of one stream as the input of another stream, which keeps the workflow smooth. How long the piping continues depends on how much data the source stream provides.

Here is how you can set up a piping mechanism:

Begin by creating a file named main.js with the following code:

JavaScript
 
var fs = require("fs");

// Create a readable stream
var readerStream = fs.createReadStream('input.txt');

// Create a writable stream
var writerStream = fs.createWriteStream('output.txt');

// Pipe the read and write operations:
// read input.txt and write its data to output.txt
readerStream.pipe(writerStream);

console.log("Program Ended");

After that, run main.js to see the output:

Shell
 
$ node main.js

Verify that the result is what you expected: output.txt should now contain the data from input.txt.



You can use additional logic to switch between modes automatically. Using piping, users can connect streams together to perform various functions. By default, the piping mechanism stops as soon as the source stream ends, because pipe() calls end() on the destination. You can bypass this by passing an optional configuration object with the end option set to false as the second argument to pipe().
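
Here is a small sketch of that option in action, assuming two hypothetical input files, part1.txt and part2.txt, that should both end up in output.txt:

JavaScript
 
var fs = require("fs");

var writerStream = fs.createWriteStream('output.txt');
var firstPart = fs.createReadStream('part1.txt');

// end: false keeps the destination open after the first source finishes
firstPart.pipe(writerStream, { end: false });

firstPart.on('end', function() {
    // The destination has not been closed, so a second source can be piped in
    var secondPart = fs.createReadStream('part2.txt');
    secondPart.pipe(writerStream);   // this pipe ends writerStream when done
});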

Chaining Streams

Chaining is similar to piping and is used to perform multiple tasks sequentially. It connects the output of one stream to the input of the next, so that several operations run together as a single pipeline. Here is an example of how it works:

Create a file named main.js with the following code:

JavaScript
 
var fs = require("fs");
var zlib = require('zlib');

// Compress the file Detox.txt to Detox.txt.gz
fs.createReadStream('Detox.txt')
   .pipe(zlib.createGzip())
   .pipe(fs.createWriteStream('Detox.txt.gz'));

console.log("File Compressed.");



After adding the code, run main.js to see the output:

Shell
 
$ node main.js
File Compressed.

Verify the outcome: the file has successfully been compressed.



You will find that Detox.txt has been compressed and a new file, Detox.txt.gz, has been created in the current directory. You can decompress the same file using the following code:

JavaScript
 
var fs = require("fs");
var zlib = require('zlib');

// Decompress the file Detox.txt.gz to Detox.txt
fs.createReadStream('Detox.txt.gz')
   .pipe(zlib.createGunzip())
   .pipe(fs.createWriteStream('Detox.txt'));

console.log("File Decompressed.");



As we did before, we can run our file to see the output: 

Shell
 
$ node main.js
File Decompressed.

Verify the outcome.



Types of Streams 

Streams perform different functions depending on the category they fall under. We can divide streams into the following categories:

Readable Streams

A readable stream, as the name suggests, allows users to read data. Readable streams operate in one of two modes, paused and flowing. All readable streams start in paused mode by default, which means the user has to explicitly request data from the stream. In flowing mode, data flows from the stream continuously and is delivered through events.

The fs.createReadStream() function creates a readable stream from a file. In paused mode, you call read() repeatedly until all of the data has been consumed; to switch the stream into flowing mode, you need an additional bit of code, such as attaching a 'data' handler. Here is an example of a readable stream in paused mode:

JavaScript
 
var fs = require('fs');
var readableStream = fs.createReadStream('file.txt');
var data = '';
var chunk;

readableStream.on('readable', function() {
    while ((chunk = readableStream.read()) != null) {
        data += chunk;
    }
});

readableStream.on('end', function() {
    console.log(data);
});



In the above example, the read() function reads data from the stream's internal buffer and returns it to the user. As soon as there is no more data to read, it returns null, which terminates the loop.

The most important events on a readable stream are the data event and the end event. The data event is emitted whenever the stream passes a chunk of data to the consumer. The end event is emitted when there is no more data left to be consumed from the stream.
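
As a sketch of those two events, here is the same file read in flowing mode (assuming, as above, a file named file.txt):

JavaScript
 
var fs = require('fs');
var readableStream = fs.createReadStream('file.txt');
var data = '';

readableStream.setEncoding('utf8');

// 'data' fires for every chunk the stream emits
readableStream.on('data', function(chunk) {
    data += chunk;
});

// 'end' fires once there is nothing left to consume
readableStream.on('end', function() {
    console.log(data);
});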

Writable Streams 

Writable streams, which are also EventEmitters, allow users to write data to a chosen destination. We use the write() function to send data to a writable stream. The API here is simpler and favors methods over events, so writable streams are easy to pick up. Here is a basic example:

JavaScript
 
var fs = require('fs');
var readableStream = fs.createReadStream('file1.txt');
var writableStream = fs.createWriteStream('file2.txt');

readableStream.setEncoding('utf8');

readableStream.on('data', function(chunk) {
    writableStream.write(chunk);
});



The above example is pretty standard. We use a readable stream to read the input, and write() sends each chunk to the designated destination. write() returns a Boolean: true means the chunk was handled and you can keep writing, while false means the stream's internal buffer is full and you should wait for the drain event before writing more.

Two significant events are usually attached to a writable stream: drain and finish. The drain event indicates that the stream is able to receive more data, while the finish event signifies that all data has been flushed to the underlying system.
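
Here is a minimal backpressure sketch that uses both events, assuming the same hypothetical file1.txt and file2.txt as above:

JavaScript
 
var fs = require('fs');
var readableStream = fs.createReadStream('file1.txt');
var writableStream = fs.createWriteStream('file2.txt');

readableStream.on('data', function(chunk) {
    // write() returning false means the internal buffer is full
    if (!writableStream.write(chunk)) {
        readableStream.pause();
        // Resume reading once the buffer has drained
        writableStream.once('drain', function() {
            readableStream.resume();
        });
    }
});

readableStream.on('end', function() {
    writableStream.end();   // flush remaining data and trigger 'finish'
});

writableStream.on('finish', function() {
    console.log('All data has been written.');
});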

Duplex Streams 

The first two stream types each perform a single function. With duplex streams, you can perform both of those functions in one stream. It's almost like a child inheriting genes from both the mother and the father. Internally, a duplex stream consists of two independent channels, one for data flowing in and one for data flowing out. Below is an example of a basic duplex stream:

JavaScript
 
var net = require('net');

// An echo server: each connection's socket is a duplex stream
net.createServer(socket => {
    socket.pipe(socket);
}).listen(8001);



In the given example, each connection's socket is piped to itself. A socket is a duplex stream: the pipe reads from its readable side and writes straight back to its writable side, so the server echoes back whatever it receives. If you connect with netcat and send some data, the server writes the same data back to you.
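
As a usage sketch, here is a small client for the echo server above; a net.Socket is itself a duplex stream, so the same object is both written to and read from:

JavaScript
 
var net = require('net');

var socket = net.connect(8001, 'localhost', function() {
    socket.write('hello\n');               // writable side: send data
});

socket.on('data', function(chunk) {        // readable side: receive the echo
    console.log('Echoed back: ' + chunk.toString());
    socket.end();
});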

Transform Streams

A transform stream is a more complex kind of duplex stream in which the output is computed from the input: what you read out of the stream is a transformed version of what was written into it. Unlike a plain duplex stream, where the two sides are independent, the readable side here depends directly on the writable side. You create one with the Transform class from the stream module, or use a built-in transform such as zlib.createGzip().

Here is a simple transform stream example:

JavaScript
 
const fs = require('fs');
const zlib = require('zlib');

const readableStream = fs.createReadStream('file');
const transformStream = zlib.createGzip();
const writableStream = fs.createWriteStream('file.gz');

readableStream.pipe(transformStream).pipe(writableStream);



The above transform stream will gzip a file when we run it. Transform streams such as those from the zlib module are useful when the output is either much larger or much smaller than the input; in this case, it has been used to create a smaller output.
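
If the built-in transforms don't fit, you can write your own with the Transform class. Here is a minimal sketch of an uppercasing transform; the file names are only an assumption for illustration:

JavaScript
 
const fs = require('fs');
const { Transform } = require('stream');

// A transform stream that uppercases every chunk passing through it
const upperCase = new Transform({
    transform(chunk, encoding, callback) {
        // Hand the transformed chunk to the readable side
        callback(null, chunk.toString().toUpperCase());
    }
});

fs.createReadStream('file.txt')
    .pipe(upperCase)
    .pipe(fs.createWriteStream('file-upper.txt'));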

Streams Compatibility With Async Generators and Async Iterators

With the help of async generators, we can create a Node.js readable stream. We need to use the Readable.from() function, as given in the example below:

JavaScript
 
const { Readable } = require('stream');

async function * generate() {
  yield 'a';
  yield 'b';
  yield 'c';
}

const readable = Readable.from(generate());

readable.on('data', (chunk) => {
  console.log(chunk);
});



We can also consume a readable stream as an async iterator, using a for await...of loop inside an async function. Here is an example:

JavaScript
 
(async function() {
  for await (const chunk of readable) {
    console.log(chunk);
  }
})();



Async iterators register a permanent error handler on the stream, which prevents unhandled errors after the stream is destroyed.

We can also write to writable streams from async iterators, but we must be careful with backpressure and errors; stream.pipeline() abstracts both away.
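
Here is a sketch of that approach with stream.pipeline(); passing an async generator as the source requires a reasonably recent Node.js version, and the output file name is only an assumption:

JavaScript
 
const fs = require('fs');
const { pipeline } = require('stream');

async function * generate() {
  yield 'a';
  yield 'b';
  yield 'c';
}

// pipeline() handles backpressure and cleans up on errors
pipeline(generate(), fs.createWriteStream('letters.txt'), (err) => {
  if (err) {
    console.error('Pipeline failed.', err);
  } else {
    console.log('Pipeline succeeded.');
  }
});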

Benefits of Streams

Streams are used everywhere, so there must be some benefits attached to them. Beyond the fact that even a beginner can pick them up, here are some of the main benefits of using streams:

Time Efficiency

What is the benefit of a chain? It ensures that the person at the back travels along with the person at the front. In a streaming setup, piping does the same: the output of one stream is transferred as the input of another, so massive amounts of data are processed in a timely way because they keep flowing. Piping lets multiple stages process data at the same time, which cuts out unnecessary waiting.

Spatial Efficiency

What do you do when you have a small buffer but a larger input file? You create a stream and display the data as soon as it arrives, so the buffer stays free for the next chunk. Suppose you want to read a file that is around 35 MB in size and display its contents, but the buffer is limited to 25 MB. What do you do in such a situation?

To avert the crisis, create a readable stream. As soon as a part of the input is read, it lands in the buffer, you display it, and the buffer is cleared to make space for the next chunk. This ensures that memory is not exhausted and the data is still fully processed.
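
A sketch of the contrast, with a hypothetical large file:

JavaScript
 
const fs = require('fs');

// Buffered approach: loads the whole file into memory at once, which
// fails or wastes memory when the file is bigger than you can buffer.
// fs.readFile('large-input.txt', (err, data) => { /* ... */ });

// Streaming approach: memory use is bounded by the chunk size,
// not by the size of the file.
fs.createReadStream('large-input.txt')
    .on('data', (chunk) => process.stdout.write(chunk))
    .on('end', () => console.log('\nDone.'));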

Conclusion

Streams are an integral part of Node.js and have helped simplify code for developers. With the help of streams, developers can build data-processing code in far less time than before. With so many other environments offering similar capabilities, streams are one of the reasons many people have stayed with Node.js. This article should have given you a fair idea of what streams are and how they operate.


Published at DZone with permission of Shital Agarwal. See the original article here.

Opinions expressed by DZone contributors are their own.
