How to wrap a buffer as a stream2 Readable stream?
Solution 1
The easiest way is probably to create a new PassThrough stream instance, and simply push your data into it. When you pipe it to other streams, the data will be pulled out of the first stream.
var stream = require('stream');
// Initiate the source
var bufferStream = new stream.PassThrough();
// Write your buffer
bufferStream.end(Buffer.from('Test data.'));
// Pipe it to something else (e.g. stdout)
bufferStream.pipe(process.stdout);
Solution 2
As natevw suggested, it's even more idiomatic to use a stream.PassThrough, and end it with the buffer:
var buffer = Buffer.from( 'foo' );
var bufferStream = new stream.PassThrough();
bufferStream.end( buffer );
bufferStream.pipe( process.stdout );
This is also how buffers are converted/piped in vinyl-fs.
Solution 3
A modern simple approach that is usable everywhere you would use fs.createReadStream() but without having to first write the file to a path.
const {Duplex} = require('stream'); // Native Node Module
function bufferToStream(myBuffer) {
let tmp = new Duplex();
tmp.push(myBuffer);
tmp.push(null);
return tmp;
}
const myReadableStream = bufferToStream(your_buffer);
- bufferToStream() is re-usable: call it again on the same buffer whenever a fresh stream is needed (a stream itself can only be consumed once).
- The buffer and the stream exist only in memory, without writing to local storage.
- I use this approach often when the actual file is stored at some cloud service and our API acts as a go-between. Files are never written to local storage.
- I have found this to be very reliable no matter the buffer size (up to 10 MB) or the destination that accepts a Readable stream. Larger files should implement
Author: Jerome WAGNER
Updated on July 05, 2022
Comments
-
Jerome WAGNER almost 2 years: How can I transform a node.js Buffer into a Readable stream using the stream2 interface?
I already found this answer and the stream-buffers module, but that module is based on the stream1 interface.
-
natevw about 10 years: Unless node.js does so internally, this solution doesn't slice up the buffer into smaller chunks, so it might not be ideal for some pipe destinations. But if you look, neither does the streamifier library from the accepted answer. So +1 for keeping it simple.
-
natevw about 10 years: I do wonder if using
var bufferStream = stream.PassThrough();
might make the intent clearer to later readers of the code, though?
-
natevw about 10 years: Also, note that if your destination expects the stream to finish at some point, you'll likely need to call bufferStream.end().
-
Gabriel Llamas about 10 years: @natevw There's no need to slice the buffer because the internal code of streams2 takes care of it (search "fromList", here). Actually, if you slice the buffer, performance will be worse: if the stream needs to read more bytes than one slice's length, streams2 will concat them again (here).
-
Startec about 9 years: Why would you end with the entire buffer? And why does end come after pipe here?
-
morris4 about 9 years: end( buffer ) is just write( buffer ) and then end(). I end the stream because it is not needed anymore. The order of end/pipe does not matter here, because PassThrough only starts emitting data when there's some handler for data events, like a pipe.
-
binki over 7 years: This requires two steps while streamifier only requires one.
-
binki over 7 years: @Startec Not slicing up the buffer means less overhead. If your consumer cannot handle large chunks, guard it with something that splits them.
-
Admin almost 7 years: How would one test or set the chunk size?
-
Shaik Syed Ali about 5 years: I am not able to call events on the bufferStream object. For example:
bufferStream.on('readable', function () {
  var chunk;
  while (null !== (chunk = bufferStream.read())) {
    // Do something
  }
}).on('end', function () {
  // Do something
});
-
Boaz almost 4 years: Note that using the Buffer constructor has been deprecated. Use the Buffer.from('Test data.') method instead.