NodeJS

Building streams in NodeJS



Node.js streams are powerful tools for handling data. They process data far more efficiently, especially for large data sets, by breaking it into small pieces and handling one piece at a time. Streams are also useful for working with files, network requests, and other I/O tasks in a memory-efficient way.

What are Streams in NodeJS?

Streams are objects in NodeJS that let you read data from a source or write data to a destination continuously. Instead of loading an entire dataset into memory, a stream processes it incrementally, which makes streams efficient for handling large amounts of data.
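
For example, a large file can be served over HTTP without ever being held fully in memory. Below is a minimal sketch of this idea; the file name large-file.txt and port 8080 are placeholders, not part of the original example.

const fs = require('fs');
const http = require('http');

// Each chunk is sent to the client as soon as it is read from disk,
// so memory usage stays flat regardless of the file size.
http.createServer((req, res) => {
  fs.createReadStream('large-file.txt').pipe(res);
}).listen(8080);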

 

Types of Streams

NodeJS provides four main types of streams (a sketch pairing each type with a built-in example follows this list):

  1. Readable Streams: Used to read data.

    • Example: Reading from a file or an HTTP response.

  2. Writable Streams: Used to write data.

    • Example: Writing to a file or sending an HTTP request.

  3. Duplex Streams: Used for both reading and writing.

    • Example: Sockets.

  4. Transform Streams: A type of Duplex stream that modifies or transforms the data as it is read or written.

    • Example: Compression or encryption.
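
Each of these types already ships with built-in implementations in NodeJS. The sketch below pairs each type with one common built-in; the file names are placeholders and input.txt is assumed to exist.

const fs = require('fs');
const net = require('net');
const zlib = require('zlib');

const readable = fs.createReadStream('input.txt');  // Readable: data can be read from it
const writable = fs.createWriteStream('copy.txt');  // Writable: data can be written to it
const socket = new net.Socket();                    // Duplex: both readable and writable (a TCP socket)
const gzip = zlib.createGzip();                     // Transform: modifies data as it passes through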

Building Streams in NodeJS

The stream module in NodeJS allows us to create custom streams.

 

Readable Stream

const fs = require('fs');

// Reading data from a file
const readableStream = fs.createReadStream('sample.txt', { encoding: 'utf-8' });

readableStream.on('data', (chunkData) => {
  console.log('Chunk received:', chunkData);
});

readableStream.on('end', () => {
  console.log('No more data.');
});

 

Writable Stream

const fs = require('fs');

// Writing data to a file
const writableStream = fs.createWriteStream('output.txt');

writableStream.write('Hello, ');
writableStream.write('world!');
writableStream.end(' Goodbye.');

writableStream.on('finish', () => {
  console.log('Data written successfully.');
});

 

Duplex Stream

const { Duplex } = require('stream');

const duplexStream = new Duplex({
  read(size) {
    this.push('Data from readable side');
    this.push(null); // No more data
  },
  write(chunk, encoding, callback) {
    console.log('Writable received:', chunk.toString());
    callback();
  },
});

duplexStream.write('Hello Duplex!');
duplexStream.on('data', (chunk) => console.log(chunk.toString()));
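
As noted in the list above, sockets are the most common duplex streams in practice: each TCP connection can be read from and written to. A minimal sketch of an echo server follows; the port 3000 is a placeholder.

const net = require('net');

// Every incoming connection is a duplex stream.
const server = net.createServer((socket) => {
  socket.write('Echo server ready\n');
  socket.pipe(socket); // Whatever the client sends is echoed straight back
});

server.listen(3000, () => console.log('Echo server listening on port 3000'));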

 

Transform Stream

const { Transform } = require('stream');

// Transform stream to convert input to uppercase
const transformStream = new Transform({
  transform(chunk, encoding, callback) {
    this.push(chunk.toString().toUpperCase());
    callback();
  },
});

transformStream.write('hello ');
transformStream.write('world!');
transformStream.end();

transformStream.on('data', (chunk) => console.log(chunk.toString()));
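
In practice, a transform stream usually sits in the middle of a pipe chain rather than being written to directly. The following minimal sketch uppercases everything typed into standard input and echoes it to standard output.

const { Transform } = require('stream');

const upperCase = new Transform({
  transform(chunk, encoding, callback) {
    callback(null, chunk.toString().toUpperCase()); // pass the transformed chunk downstream
  },
});

// Pipe stdin through the transform and out to stdout.
process.stdin.pipe(upperCase).pipe(process.stdout);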

Error Handling in Streams

const fs = require('fs');

const readable = fs.createReadStream('invalid.txt');

readable.on('error', (err) => {
  console.error(`Error occurred: ${err.message}`);
});

When to Use Streams

  • Handling large files.
  • Streaming media content.
  • Real-time data processing (e.g., chat applications).
  • Building pipelines (e.g., data transformation), as shown in the sketch after this list.
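
The pipeline case is common enough that NodeJS ships a helper, pipeline() from the stream module, which connects streams together and forwards errors from any stage to a single callback. Below is a minimal sketch that gzips a file; the file names are placeholders.

const fs = require('fs');
const zlib = require('zlib');
const { pipeline } = require('stream');

// Read a file, compress it, and write the compressed copy.
// An error in any of the three streams reaches the single callback below.
pipeline(
  fs.createReadStream('input.txt'),
  zlib.createGzip(),
  fs.createWriteStream('input.txt.gz'),
  (err) => {
    if (err) {
      console.error('Pipeline failed:', err.message);
    } else {
      console.log('Pipeline succeeded.');
    }
  }
);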

Conclusion

Streams in NodeJS are a highly efficient way of handling data. Through readable, writable, duplex, and transform streams, you can design scalable applications capable of processing data in real time. Built-in and custom streams in NodeJS empower developers to manage data efficiently and write powerful applications.

Go ahead and experiment with streams in your projects!

 

Ready to transform your business with our technology solution? Contact Us today to Leverage Our NodeJS Expertise.
