1. Overview

In this article, we will learn how to create and use streams in Nodejs with examples: why streams are useful and when to use them.

1.1 What is a Stream and Why Use Streams?

Streams are one of the powerful features provided by Nodejs, used to deal with large amounts of data. For example, suppose you are fetching 10 crore (100 million) records from a database: if all that data is loaded into RAM at once, your Node application will crash due to lack of memory. Streams were introduced to overcome this issue.

Streams are the solution for handling operations on large data efficiently. With streams, we can manipulate and transform data without bringing all of it into memory.

1.2 Types of Streams In Nodejs

Basically, there are 4 types of streams:

  • Readable – Stream used for read operations.
  • Writable – Stream used for write operations.
  • Duplex – Stream used for both read and write operations.
  • Transform – Stream which produces output based on its input. Transform streams are a kind of duplex stream.

Each type of stream is an instance of EventEmitter, which means streams emit events. These are the most commonly used stream events:

  • data – Emitted when a chunk of data is available to read.
  • end – Emitted when there is no more data to read; it marks the end of the data.
  • finish – Emitted on a writable stream after end() has been called and all data has been flushed.
  • error – Emitted when an error occurs, for example when the underlying stream is unable to generate or write data due to an internal failure.

2 Nodejs Stream Examples

2.1 Readable Stream Example

Create a file named datasource.txt and place some text content in it. I have placed some product data, so my datasource.txt file is now about 30 MB in size. Let's read data from it without bringing the whole file into memory.


Create a file named index.js and put the below code.

const fs = require('fs');
const rr = fs.createReadStream('datasource.txt');

//Called when data is available
rr.on('data', (data) => {
  console.log(`data: ${data}`);
});

//Called when no more data is available
rr.on('end', () => {    
  console.log('end');
});

// Called when an error occurs
rr.on('error', (err) => {
    console.log(err);
})

In the above code, we read the datasource.txt file using the fs module. The fs module provides the createReadStream method, which returns a ReadStream object. This stream then emits data and end events, plus an error event in case of any failure, which is handled by the error listener written above.

To Run the file:

node index.js

Output:

The terminal prints the contents of datasource.txt chunk by chunk, followed by end.

2.2 Write Stream Example

Create a file named index.js. In this example, we will write some data to the datasource.txt file.

const fs = require('fs');

const data = 'We are javadeveloperzone';

// Create a writable stream
const ws = fs.createWriteStream('datasource.txt');

// Called when 'finish' is emitted
ws.on('finish', function() {
   console.log("Completed!!");
});

// Called when an error occurs
ws.on('error', function(err) {
   console.log(err.stack);
});

// Write the data to the stream with utf8 encoding
ws.write(data, 'utf8');

// Signal the end of the data; this triggers the 'finish' event
ws.end();

We open datasource.txt in write mode and write data to it with the write function; calling end() signals that no more data will be written.

To Run the file:

node index.js

Output:

Completed!!

Now you can check datasource.txt and check the content.

2.3 Piping In Stream Example

As the name suggests, piping means connecting two or more streams: the output of one stream is provided as the input of another. Now let's read data from datasource1.txt and write it to datasource2.txt.

Create a file named index.js and place the below content.

const fs = require('fs');

// Create a readable stream
const rr = fs.createReadStream('datasource1.txt');
 
// Create a writable stream
const ws = fs.createWriteStream('datasource2.txt');

// Here we provide the output of the read stream as the input of the write stream
rr.pipe(ws);

// Called when 'finish' is emitted
ws.on('finish', function() {
   console.log("Completed!!");
});

// Called when an error occurs
ws.on('error', function(err) {
   console.log(err.stack);
});

In this example, we created a read stream from datasource1.txt and wrote its data to datasource2.txt. We can listen for events like finish and error to track the process.

To Run the file: 

node index.js

Output:

Completed!!

Now you can check datasource2.txt and check the content.

2.4 Chaining Streams Example

By using pipe in sequence we can create a pipe chain, where the output of the previous pipe becomes the input of the next. The main purpose of chaining pipes is to process data at different stages.

Let’s create a file named index.js and place the below content.

const fs = require('fs');
const { Transform } = require('stream');

// Create a readable stream
const rr = fs.createReadStream('datasource1.txt');
 
// Create a writable stream
const ws = fs.createWriteStream('datasource2.txt');

const upperCase = new Transform({
  transform(chunk, encoding, callback) {
    this.push(chunk.toString().toUpperCase());
    callback();
  }
});

// Before writing the output, we transform it to upper case
rr.pipe(upperCase).pipe(ws);

// Called when 'finish' is emitted
ws.on('finish', function() {
   console.log("Completed!!");
});

// Called when an error occurs
ws.on('error', function(err) {
   console.log(err.stack);
});

Here we are converting the output of the read stream to uppercase before writing to the write stream.

To Run the file: 

node index.js

Output:

Completed!!

3 Conclusion

In this article, we learned how to use streams, stream piping, and stream chaining. The Nodejs stream module provides many more useful methods and classes that you can explore to master it.
