Converting a Buffer into a ReadableStream in Node.js
Solution 1
You can create a ReadableStream using Node Stream Buffers like so:
const streamBuffers = require('stream-buffers');

// Initialize stream
const myReadableStreamBuffer = new streamBuffers.ReadableStreamBuffer({
  frequency: 10, // in milliseconds.
  chunkSize: 2048 // in bytes.
});

// With a buffer
myReadableStreamBuffer.put(aBuffer);

// Or with a string
myReadableStreamBuffer.put("A String", "utf8");
The frequency cannot be 0, so this will introduce a certain delay.
Solution 2
For Node.js 10.17.0 and up:
const { Readable } = require('stream');
const stream = Readable.from(myBuffer.toString());
Solution 3
Something like this:
import { Readable } from 'stream'

const buffer = Buffer.from(img_string, 'base64') // new Buffer(...) is deprecated
const readable = new Readable()
readable._read = () => {} // _read is required but you can noop it
readable.push(buffer)
readable.push(null)
readable.pipe(consumer) // consume the stream
Ordinarily, a readable stream's _read function should collect data from the underlying source and push it incrementally, ensuring you don't harvest a huge source into memory before it's needed. In this case, though, you already have the source in memory, so _read is not required. Pushing the whole buffer just wraps it in the readable stream API.
Solution 4
Node Stream Buffer is obviously designed for use in testing; the inability to avoid a delay makes it a poor choice for production use.
Gabriel Llamas suggests streamifier in this answer: How to wrap a buffer as a stream2 Readable stream?
Solution 5
You can use the standard Node.js stream API for this: stream.Readable.from.
const { Readable } = require('stream');
const stream = Readable.from(buffer);
Note: don't convert a buffer to a string (buffer.toString()) if the buffer contains binary data; it will lead to corrupted binary files.
Masiar
Updated on April 21, 2022

Comments
-
Masiar about 2 years

I have a library that takes as input a ReadableStream, but my input is just a base64-format image. I could convert the data I have in a Buffer like so:

var img = new Buffer(img_string, 'base64');

But I have no idea how to convert it to a ReadableStream, or convert the Buffer I obtained to a ReadableStream. Is there a way to do this?
-
Masiar over 11 yearsThanks, even though a bit late. I don't remember how I solved the problem, but this looks like a nice solution. If anybody confirms this, it would be great. I remember finding ZERO about this conversion.
-
Ramesh Prasad about 11 yearsConfirming that it works - found this when looking up how to turn file buffers into streams.
-
vanthome about 11 yearsIf you deal with files, you should rather open a file read stream straight away with this: nodejs.org/api/fs.html#fs_fs_createreadstream_path_options
-
UpTheCreek over 8 yearsMilliseconds is not a measurement of frequency - I suppose they mean period.
-
vanthome over 8 years@UpTheCreek I cannot change it as this is the property name and the unit IS milliseconds.
-
UpTheCreek over 8 years@vanthome - I wasn't suggesting it was your fault :) Just poor naming on the part of the node-stream-buffer devs.
-
sepehr about 6 years"Node Stream Buffer is obviously designed for use in testing; the inability to avoid a delay makes it a poor choice for production use." — Bryan Larsen
-
broofa over 5 yearsWouldn't it be more correct to push() the buffer inside the _read() method? I.e. readable._read = () => { readable.push(buffer); readable.push(null); }. Not sure it matters, but allowing the stream to manage the timing of when data is fed in seems less likely to run into unexpected behavior. Other than that this should be the accepted answer, as it doesn't rely on 3rd party modules.
-
Mr5o1 over 5 yearsGenerally, you'd be right, but for this specific use case I wouldn't push inside the _read method. Conceptually I think _read should be reserved for "harvesting" data from an underlying source. In this case we not only have the data in memory, but no conversion is required. So for wrapping data in a stream this is how I would do it, but for converting or accumulating data in a stream, that logic would happen in the _read method.
-
Yushin Washio over 4 yearsAnother equivalent alternative is tostream: const toStream = require('tostream'); toStream(new Buffer([97, 98, 99])).pipe(process.stdout);
-
Shwetabh Shekhar over 4 years@YushinWashio Definitely. Plenty of modules are available in Node.
-
Franck Freiburger about 4 yearsYour underlying source is the buffer ;)
-
Mr5o1 about 4 years@FranckFreiburger Yes, but you're not "harvesting" data from that source, it's already in memory and you're always going to consume it all in one go, you're not pulling it in on demand.
-
Alesso over 3 yearsyou're the best ❤️
-
Dmitry Minkovsky over 3 yearsYeah this is the best. I don't think .toString() is necessary though.
-
Philipp Claßen over 3 yearsFor the imports: const { Readable } = require('stream')
-
Jithin about 3 yearsdoes this have any impact on image files?
-
Ihor Sakailiuk over 2 yearsIt will work for the case described by OP, but in case the buffer contains binary data .toString() will corrupt it.
-
Robert G. Schaffrath about 2 yearsI noted that this only works with Node.js 12+. The "readable.push(buffer)" method works with Node.js 8.17.0 and Node.js 10.24.1 in my tests. As I stick to supported Node.js versions, it is a non-issue for me.