
Node.js Piping the same readable stream into multiple (writable) targets


You have to create a duplicate of the stream by piping it into two streams. You can create a simple copy with a PassThrough stream; it simply passes its input through to its output.

const spawn = require('child_process').spawn;
const PassThrough = require('stream').PassThrough;

const a = spawn('echo', ['hi user']);
const b = new PassThrough();
const c = new PassThrough();

a.stdout.pipe(b);
a.stdout.pipe(c);

let count = 0;
b.on('data', function (chunk) {
  count += chunk.length;
});
b.on('end', function () {
  console.log(count);
  c.pipe(process.stdout);
});

Output:

8
hi user


The first answer only works if the streams take roughly the same amount of time to process the data. If one takes significantly longer, the faster one will request new data, overwriting the data still being used by the slower one (I ran into this problem after trying to solve it with a duplicate stream).

The following pattern worked very well for me. It uses Streamz, a library built on Streams2, together with Promises to synchronize async streams via a callback. Using the familiar example from the first answer:

var spawn = require('child_process').spawn;
var pass = require('stream').PassThrough;
var streamz = require('streamz').PassThrough;
var Promise = require('bluebird');

var a = spawn('echo', ['hi user']);
var b = new pass();
var c = new pass();

a.stdout.pipe(streamz(combineStreamOperations));

function combineStreamOperations(data, next) {
  Promise.join(b, c, function (b, c) {
    // perform n operations on the same data
    next(); // request more
  });
}

var count = 0;
b.on('data', function (chunk) { count += chunk.length; });
b.on('end', function () { console.log(count); c.pipe(process.stdout); });


You can use this small npm package I created:

readable-stream-clone

With it you can reuse a readable stream as many times as you need.
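A minimal usage sketch, assuming the package exports a constructor that takes the source readable stream and returns an independent readable copy (the file names here are hypothetical; check the package README for the exact API):

const fs = require('fs');
const ReadableStreamClone = require('readable-stream-clone');

// Hypothetical source file.
const source = fs.createReadStream('input.txt');

// Each clone can be consumed independently of the others.
const clone1 = new ReadableStreamClone(source);
const clone2 = new ReadableStreamClone(source);

clone1.pipe(fs.createWriteStream('copy-1.txt'));
clone2.pipe(fs.createWriteStream('copy-2.txt'));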