Document how to stream JSON arrays #96

Merged · 3 commits · Aug 12, 2023
1 change: 1 addition & 0 deletions package.json
@@ -43,6 +43,7 @@
"@types/node": "^20.2.4",
"ava": "^5.3.0",
"precise-now": "^2.0.0",
"stream-json": "^1.8.0",
"tsd": "^0.28.1",
"xo": "^0.54.2"
}
17 changes: 17 additions & 0 deletions readme.md
@@ -216,6 +216,23 @@ const stream = fs.createReadStream('unicorn.txt');
console.log(new Blob([await getStreamAsArrayBuffer(stream)]));
```

### JSON streaming

[`getStreamAsArray()`](#getstreamasarraystream-options) can be combined with JSON streaming utilities such as [`stream-json`](https://github.com/uhop/stream-json) to parse large JSON arrays incrementally, one element at a time.

```js
import fs from 'node:fs';
import {compose as composeStreams} from 'node:stream';
import {getStreamAsArray} from 'get-stream';
import streamJson from 'stream-json';
import streamJsonArray from 'stream-json/streamers/StreamArray.js';

const stream = fs.createReadStream('big-array-of-objects.json');
console.log(await getStreamAsArray(
	composeStreams(stream, streamJson.parser(), streamJsonArray.streamArray()),
));
```
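Each item emitted by `streamArray()` is a `{key, value}` pair rather than the bare array element (the new test below asserts on `result.at(-1).value`), so a follow-up map is needed when only the plain elements matter. A minimal sketch, reusing the same hypothetical `big-array-of-objects.json` file as the example above:

```js
import fs from 'node:fs';
import {compose as composeStreams} from 'node:stream';
import {getStreamAsArray} from 'get-stream';
import streamJson from 'stream-json';
import streamJsonArray from 'stream-json/streamers/StreamArray.js';

const stream = fs.createReadStream('big-array-of-objects.json');

// `streamArray()` emits one `{key, value}` object per array element,
// where `key` is the element's index in the source array.
const items = await getStreamAsArray(
	composeStreams(stream, streamJson.parser(), streamJsonArray.streamArray()),
);
const values = items.map(({value}) => value);
console.log(values);
```

Note that all parsed elements still end up in memory at once; the gain is that the JSON text is parsed chunk by chunk instead of being buffered into one large string first.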

## Benchmarks

### Node.js stream (100 MB, binary)
17 changes: 16 additions & 1 deletion test.js
@@ -4,9 +4,11 @@ import {spawn} from 'node:child_process';
import {createReadStream} from 'node:fs';
import {open, opendir} from 'node:fs/promises';
import {version as nodeVersion} from 'node:process';
import {Duplex} from 'node:stream';
import {Duplex, compose} from 'node:stream';
import {text, buffer, arrayBuffer, blob} from 'node:stream/consumers';
import test from 'ava';
import streamJson from 'stream-json';
import streamJsonArray from 'stream-json/streamers/StreamArray.js';
import getStream, {getStreamAsBuffer, getStreamAsArrayBuffer, getStreamAsArray, MaxBufferError} from './index.js';

const fixtureString = 'unicorn\n';
@@ -453,3 +455,16 @@ test.serial('getStreamAsArray() behaves like readable.toArray()', async t => {
]);
t.deepEqual(nativeResult, customResult);
});

test.serial('getStreamAsArray() can stream JSON', async t => {
	t.timeout(BIG_TEST_DURATION);
	const bigJson = bigArray.map(byte => ({byte}));
	const bigJsonString = JSON.stringify(bigJson);
	const result = await getStreamAsArray(compose(
		createStream([bigJsonString]),
		streamJson.parser(),
		streamJsonArray.streamArray(),
	));
	t.is(result.length, bigJson.length);
	t.deepEqual(result.at(-1).value, bigJson.at(-1));
});