Converting a Buffer into a ReadableStream in Node.js

Solution 1

You can create a ReadableStream using Node Stream Buffers like so:

// Initialize stream (requires the stream-buffers package)
var streamBuffers = require('stream-buffers');

var myReadableStreamBuffer = new streamBuffers.ReadableStreamBuffer({
  frequency: 10,      // in milliseconds.
  chunkSize: 2048     // in bytes.
});

// With a buffer
myReadableStreamBuffer.put(aBuffer);

// Or with a string
myReadableStreamBuffer.put("A String", "utf8");

The frequency cannot be 0, so this will introduce a certain delay.

Solution 2

For Node.js 10.17.0 and up:

const { Readable } = require('stream');

const stream = Readable.from(myBuffer.toString());

Solution 3

Something like this:

import { Readable } from 'stream'

const buffer = Buffer.from(img_string, 'base64') // new Buffer() is deprecated
const readable = new Readable()
readable._read = () => {} // _read is required but you can noop it
readable.push(buffer)
readable.push(null)

readable.pipe(consumer) // consume the stream

In the general case, a readable stream's `_read` function should collect data from the underlying source and push it incrementally, ensuring you don't pull a huge source into memory before it's needed.

In this case though you already have the source in memory, so _read is not required.

Pushing the whole buffer just wraps it in the readable stream api.

Solution 4

Node Stream Buffer is obviously designed for use in testing; the inability to avoid a delay makes it a poor choice for production use.

Gabriel Llamas suggests streamifier in this answer: How to wrap a buffer as a stream2 Readable stream?

Solution 5

You can use the standard Node.js stream API for this: stream.Readable.from

const { Readable } = require('stream');
const stream = Readable.from(buffer);

Note: Don't convert a buffer to string (buffer.toString()) if the buffer contains binary data. It will lead to corrupted binary files.

Author: Masiar

Updated on April 21, 2022

Comments

  • Masiar
    Masiar about 2 years

    I have a library that takes as input a ReadableStream, but my input is just a base64 format image. I could convert the data I have in a Buffer like so:

    var img = new Buffer(img_string, 'base64');
    

    But I have no idea how to convert it to a ReadableStream or convert the Buffer I obtained to a ReadableStream.

    Is there a way to do this?

  • Masiar
    Masiar over 11 years
    Thanks, even though a bit late. I don't remember how I solved the problem, but this looks like a nice solution. If anybody can confirm this, it would be great. I remember finding ZERO about this conversion.
  • Ramesh Prasad
    Ramesh Prasad about 11 years
    Confirming that it works - found this when looking up how to turn filebuffers into streams.
  • vanthome
    vanthome about 11 years
    If you are dealing with files, you should rather open a file read stream straight away with this: nodejs.org/api/fs.html#fs_fs_createreadstream_path_options
  • UpTheCreek
    UpTheCreek over 8 years
    Milliseconds is not a measurement of frequency - I suppose they mean period.
  • vanthome
    vanthome over 8 years
    @UpTheCreek I cannot change it as this is the property name and the unit IS milliseconds.
  • UpTheCreek
    UpTheCreek over 8 years
    @vanthome - I wasn't suggesting it was your fault :) Just poor naming on the part of the node-stream-buffer devs.
  • sepehr
    sepehr about 6 years
    "Node Stream Buffer is obviously designed for use in testing; the inability to avoid a delay makes it a poor choice for production use." (Bryan Larsen)
  • broofa
    broofa over 5 years
    Wouldn't it be more correct to push() the buffer inside the _read() method? I.e. readable._read = () => {readable.push(buffer); readable.push(null);} . Not sure it matters, but allowing the stream to manage the timing of when data is fed in seems less likely to run into unexpected behavior. Other than that this should be the accepted answer, as it doesn't rely on 3rd party modules.
  • Mr5o1
    Mr5o1 over 5 years
    Generally, you'd be right, but for this specific use case I wouldn't push inside the read method. Conceptually I think _read should be reserved for "harvesting" data from an underlying source. In this case we not only have the data in memory, but no conversion is required. So for wrapping data in a stream this is how I would do it, but for converting or accumulating data in a stream, that logic would happen in the _read method.
  • Yushin Washio
    Yushin Washio over 4 years
    Another equivalent alternative is tostream: const toStream = require('tostream'); toStream(Buffer.from([97, 98, 99])).pipe(process.stdout);
  • Shwetabh Shekhar
    Shwetabh Shekhar over 4 years
    @YushinWashio Definitely. Plenty of modules are available in Node.
  • Franck Freiburger
    Franck Freiburger about 4 years
    Your underlying source is the buffer ;)
  • Mr5o1
    Mr5o1 about 4 years
    @FranckFreiburger Yes, but you're not "harvesting" data from that source, it's already in memory and you're always going to consume it all in one go, you're not pulling it in on demand.
  • Alesso
    Alesso over 3 years
    you're the best ❤️
  • Dmitry Minkovsky
    Dmitry Minkovsky over 3 years
    Yeah this is the best. I don't think .toString() is necessary though.
  • Philipp Claßen
    Philipp Claßen over 3 years
    For the imports: const { Readable } = require('stream')
  • Jithin
    Jithin about 3 years
    does this have any impact on image files?
  • Ihor Sakailiuk
    Ihor Sakailiuk over 2 years
    It will work for the case described by OP, but in case the buffer contains binary data .toString() will corrupt it
  • Robert G. Schaffrath
    Robert G. Schaffrath about 2 years
    I noted that this only works with Node.js 12+. The "readable.push(buffer)" method works with Node.js 8.17.0 and Node.js 10.24.1 in my tests. As I stick to supported Node.js versions, it is a non-issue for me.