I am trying to implement a stream with the new Node.js streams API that buffers a certain amount of data. When this stream is piped to another stream, or when something consumes `readable` events, it should flush its buffer and then simply become a pass-through. The catch is that this stream will be piped to many other streams, and when each destination stream is attached, the buffer must be flushed to it even if it has already been flushed to another stream.
For example:
1. `BufferStream` implements `stream.Transform`, and keeps a 512KB internal ring buffer.
2. `ReadableStreamA` is piped to an instance of `BufferStream`.
3. `BufferStream` writes to its ring buffer, reading data from `ReadableStreamA` as it comes in. (It doesn't matter if data is lost, as the buffer overwrites old data.)
4. `BufferStream` is piped to `WritableStreamB`.
5. `WritableStreamB` receives the entire 512KB buffer, and continues to get data as it is written from `ReadableStreamA` through `BufferStream`.
6. `BufferStream` is piped to `WritableStreamC`.
7. `WritableStreamC` also receives the entire 512KB buffer, but this buffer is now different than what `WritableStreamB` received, because more data has since been written to `BufferStream`.
Is this possible with the streams API? The only method I can think of would be to create an object with a method that spins up a new `PassThrough` stream for each destination, meaning I couldn't simply pipe to and from it.
For what it's worth, I've done this with the old "flowing" API by simply listening for new handlers on `data` events. When a new function was attached with `.on('data')`, I would call it directly with a copy of the ring buffer.