normalizeInputSource: always pipe stream through a PassThrough stream #17
normalizeInputSource: always pipe stream through a PassThrough stream
This is needed to guarantee that the stream is paused even if it is already flowing: the source is only processed in a later (possibly distant) iteration of the event loop, so a stream that is flowing when it is added to the queue will lose data before then.
This should guarantee maximum compatibility without breaking anything, as far as I can tell. I tried checking the stream's state instead, but couldn't find a reliable way to tell whether it would lose data to early consumption, hence this solution of always pausing.
Piping through a PassThrough stream provides a buffer and guarantees the stream is paused. I first tried simply calling the stream's `pause()` method, as was done in a very old archiver commit (from before the utils repo existed), but that doesn't guarantee the stream stays paused, and it kept losing data in my tests.

A couple of ways to reproduce the problem with archiver:
Spawning child processes:
Processing the response of a request:
This fixes node-archiver issue #364.