Combine an array of streams into a single duplex stream using pump and duplexify. If one of the streams closes or errors, all streams in the pipeline will be destroyed.
```
npm install pumpify
```
Pass the streams you want to pipe together to pumpify, `pipeline = pumpify(s1, s2, s3, ...)`. `pipeline` is a duplex stream that writes to the first stream and reads from the last one. Streams are piped together using pump, so if one of them closes, all streams will be destroyed.
```js
var pumpify = require('pumpify')
var tar = require('tar-fs')
var zlib = require('zlib')
var fs = require('fs')

var untar = pumpify(zlib.createGunzip(), tar.extract('output-folder'))
// you can also pass an array instead
// var untar = pumpify([zlib.createGunzip(), tar.extract('output-folder')])

fs.createReadStream('some-gzipped-tarball.tgz').pipe(untar)
```
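Because the streams are joined with pump, a failure in any stage tears down the whole pipeline and is emitted as an `'error'` event on the combined stream. Continuing the example above, a minimal sketch (the handler itself is illustrative, not part of the pumpify API):

```js
// if gunzipping or extraction fails, every stream in the pipeline
// is destroyed and the error surfaces on the combined duplex stream
untar.on('error', function (err) {
  console.error('pipeline failed:', err.message)
})
```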
If you are pumping object streams together, use `pipeline = pumpify.obj(s1, s2, ...)`. Call `pipeline.destroy()` to destroy the pipeline (including the streams passed to pumpify).
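A minimal object-mode sketch, assuming two hypothetical transforms (`double` and `label` below are illustrative helpers, not part of pumpify):

```js
var pumpify = require('pumpify')
var stream = require('stream')

// doubles each incoming number
var double = new stream.Transform({
  objectMode: true,
  transform: function (n, enc, cb) {
    cb(null, n * 2)
  }
})

// turns each doubled number into a labelled string
var label = new stream.Transform({
  objectMode: true,
  transform: function (n, enc, cb) {
    cb(null, 'result: ' + n)
  }
})

var pipeline = pumpify.obj(double, label)

pipeline.on('data', function (msg) {
  console.log(msg) // logs 'result: 42'
})

pipeline.write(21)
pipeline.end()
// pipeline.destroy() would tear down double and label as well
```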
Similar to duplexify, you can also define the pipeline asynchronously using `setPipeline(s1, s2, ...)`:
```js
var untar = pumpify()

setTimeout(function () {
  // will start draining the input now
  untar.setPipeline(zlib.createGunzip(), tar.extract('output-folder'))
}, 1000)

fs.createReadStream('some-gzipped-tarball.tgz').pipe(untar)
```
License: MIT

`pumpify` is part of the mississippi stream utility collection, which includes more useful stream modules similar to this one.