Streams - Express.js Guide: The Comprehensive Book on Express.js (2014)

The Express.js request and response objects are readable and writable Node.js streams, respectively. For those only vaguely familiar with streams: they are powerful tools for processing chunks of data before a particular process (reading, receiving, writing, sending) has actually finished. This makes streams useful when dealing with large data such as audio or video. Another use case for streams is performing big database migrations.



For more information on how to use streams, there are amazing resources by substack: stream-handbook and stream-adventure.

Here is an example of piping a stream into a plain HTTP response, from expressjsguide/streams-http-res.js:

var http = require('http');
var fs = require('fs');
var server = http.createServer(function (req, res) {
  fs.createReadStream('users.csv').pipe(res);
});
server.listen(3000);

The GET request with curl from the terminal looks like this:

$ curl http://localhost:3000

The line above will cause the server to output the content of the file users.csv, e.g.,

...
Stanton Botsford,,619
Dolly Feeney,,670
Oma Beahan,,250
Darrion Johnson,,255
Garth Huels V,,835
Chet Hills II,,951
Margarette Littel,,781
Alexandrine Schiller,,779
Concepcion Emmerich,,518
Mrs. Johnpaul Brown,,315
Aniyah Barrows,,193
Okey Kohler PhD,,831
Bill Murray,,962
Allen O'Reilly,,448
Ms. Bud Hoeger,,997
Kathryn Hettinger,,566
...

The result of running stream response from the users.csv file.

If you want to create your own test file such as users.csv, you can install the Faker module (GitHub) and run the seed-users.js file:

$ npm install Faker
$ node seed-users.js

The Express.js implementation is strikingly similar; here is expressjsguide/stream-express-res.js:

var fs = require('fs');
var express = require('express');

var app = express();

app.get('*', function (req, res) {
  fs.createReadStream('users.csv').pipe(res);
});

app.listen(3000);

Keeping in mind that the request is a readable stream, and the response is a writable one, we can implement a server that saves POST requests into a file. Here is the content of expressjsguide/stream-http-req.js:

var http = require('http');
var fs = require('fs');
var server = http.createServer(function (req, res) {
  if (req.method === 'POST') {
    req.pipe(fs.createWriteStream('ips.txt'));
  }
  res.end('\n');
});
server.listen(3000);

We call Faker to generate test data consisting of names, domains, IP addresses, latitudes, and longitudes. This time, we won't save the data to a file; we'll pipe it to curl instead.

Here is the Faker script that writes a JSON array of 1,000 records to stdout, from expressjsguide/seed-ips.js:

var Faker = require('Faker');
var body = [];

for (var i = 0; i < 1000; i++) {
  body.push({
    'name': Faker.Name.findName(),
    'domain': Faker.Internet.domainName(),
    'ip': Faker.Internet.ip(),
    'latitude': Faker.Address.latitude(),
    'longitude': Faker.Address.longitude()
  });
}
process.stdout.write(JSON.stringify(body));

To test our stream-http-req.js, let's run:

$ node seed-ips.js | curl -d@- http://localhost:3000

The beginning of the file written by the Node.js server.

Once more, let's convert this example into an Express.js app, in expressjsguide/stream-express-req.js:

var fs = require('fs');
var express = require('express');
var app = express();

app.post('*', function (req, res) {
  req.pipe(fs.createWriteStream('ips.txt'));
  res.end('\n');
});

app.listen(3000);



In some cases, it's good to have pass-through logic that doesn't consume too many resources. For that, check out the through module (GitHub). Another useful module is concat-stream (GitHub), which allows concatenation of streams.