
Parse large JSON file in Nodejs and handle each object independently


There is a nice module named 'stream-json' that does exactly what you want. From its documentation:

It can parse JSON files far exceeding available memory.

and

StreamArray handles a frequent use case: a huge array of relatively small objects similar to Django-produced database dumps. It streams array components individually, taking care of assembling them automatically.
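For instance, an input file of that shape (a top-level array of small objects; the file name sample.json here is just a placeholder matching the example below) might look like this, and the streamer would emit one {key, value} pair per element:

[
    {"id": 1, "name": "first"},
    {"id": 2, "name": "second"},
    {"id": 3, "name": "third"}
]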

Here is a very basic example:

const StreamArray = require('stream-json/streamers/StreamArray');
const path = require('path');
const fs = require('fs');

const jsonStream = StreamArray.withParser();

// You'll get JSON objects here
// Key is an array index here
jsonStream.on('data', ({key, value}) => {
    console.log(key, value);
});

jsonStream.on('end', () => {
    console.log('All done');
});

const filename = path.join(__dirname, 'sample.json');
fs.createReadStream(filename).pipe(jsonStream.input);
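If you need to do asynchronous work per object (the "handle each object independently" part), one common pattern is to pause the stream while each item is processed and resume it afterwards, so objects don't pile up faster than you can handle them. A minimal sketch, assuming a hypothetical async handler named saveToDatabase that you would replace with your own logic:

const StreamArray = require('stream-json/streamers/StreamArray');
const path = require('path');
const fs = require('fs');

const jsonStream = StreamArray.withParser();

// Hypothetical async handler; replace with your own per-object logic
// (a database insert, an HTTP request, etc.).
async function saveToDatabase(obj) {
    // Simulate async work for the sake of the sketch.
    return new Promise((resolve) => setTimeout(resolve, 10));
}

jsonStream.on('data', ({key, value}) => {
    // Pause so the next 'data' event waits until this object is handled.
    jsonStream.pause();
    saveToDatabase(value)
        .catch((err) => console.error('Failed on item', key, err))
        .finally(() => jsonStream.resume());
});

jsonStream.on('end', () => {
    console.log('All done');
});

fs.createReadStream(path.join(__dirname, 'sample.json')).pipe(jsonStream.input);

Pausing inside the 'data' handler gives you simple one-at-a-time backpressure; if strict ordering doesn't matter, you could instead collect promises and process items concurrently.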