Streams are one of the fundamental concepts that power applications based on Node.js. They are a method of handling data that lets us read input and write output sequentially. Streams are used to manage reading and writing files, network communications, or any kind of end-to-end data exchange in an efficient, well-organized manner.
So, what makes streams so unique? Instead of following the traditional approach, in which a program reads a file into memory all at once, streams read data piece by piece, processing each chunk without keeping the whole payload in memory. This makes streams very powerful when handling large amounts of data. For example, if a file is larger than the available memory, it is impossible to read the complete file into memory before processing it. That's where streams come into action!
Let's understand how streams work with an example. Most of us are familiar with 'streaming' services such as YouTube, Amazon Prime, or Netflix. Instead of making us download the video and audio feed all at once, these services let the browser receive the video as a continuous flow of chunks, allowing the viewer to begin watching or listening almost instantly.
However, that doesn't mean streams only work with big data or media. They also give our code the strength of 'composability'. Designing with composability in mind means combining several smaller components in a certain manner to produce a larger result. Using streams in Node.js makes it feasible to build powerful pieces of code by piping data to and from other, smaller pieces of code.
Compared with other methods of data handling, streams provide two major advantages:

- Memory efficiency: we don't need to load large amounts of data into memory before we can start processing it.
- Time efficiency: we can start processing data as soon as the first chunk arrives, instead of waiting until the entire payload is available.
Because of these advantages, many of Node.js's built-in modules offer native stream handling capabilities. Some of them are listed below:

- http: requests and responses are streams.
- fs: fs.createReadStream() and fs.createWriteStream() create file streams.
- zlib: compression and decompression are exposed as streams (e.g. zlib.createGzip()).
- crypto: e.g. crypto.createCipheriv() returns a stream.
- net: TCP sockets are streams.
- process: process.stdin, process.stdout, and process.stderr are streams.
- child_process: the stdin, stdout, and stderr of a spawned child process are streams.
Streams in Node.js are classified into four categories:

- Readable: a stream we can read data from but not write to (e.g. fs.createReadStream()).
- Writable: a stream we can write data to but not read from (e.g. fs.createWriteStream()).
- Duplex: a stream that is both readable and writable (e.g. a TCP socket from the net module).
- Transform: a Duplex stream that can modify the data as it is written and read (e.g. zlib.createGzip()).
Since we have already worked a lot with Node.js, we have come across streams many times. For example, when we create a web server in Node.js, the request is a readable stream and the response is a writable stream. We have also used the fs module, which allows us to work with readable and writable file streams. Moreover, Express uses streams to interact with the client, and every database connection driver we work with uses streams as well, because TCP sockets, the TLS stack, and other connections in Node.js are based on streams.
We are required to import the stream module to create a readable stream and initialize the base ‘Readable’ object:
Now, we have to implement the _read() method:
We can also implement _read() by passing it as the read option to the Readable constructor:
Now that the stream is set up, we can send data to it with push():
To create a Writable Stream, we are required to initialize the base ‘Writable’ object:
Now, we have to implement the _write() method:
Now, we can pipe a readable stream in:
To read data from a readable stream, we can pipe it into a writable stream. Every chunk the readable stream pushes then flows into the writable stream's _write() implementation and is printed to the console.
However, there is a more direct method to consume a readable stream: listening for the readable event and calling read() inside the handler. In this case the chunks come back as raw Buffer objects, and read() returns null once the internal buffer has been drained.
We can use the write() method to write data to a writable stream:
We can signal to a writable stream that we have finished writing by calling the end() method:
This creates a new file named 'myfile.txt' containing the text we wrote. Once end() has been called, no further data can be written to the stream.