
Large CSV to JSON/Object in Node.js


Check out the node.js csvtojson module, which can be used as a library, a command line tool, or a web server plugin: https://www.npmjs.org/package/csvtojson. The source code can be found at https://github.com/Keyang/node-csvtojson

or install it from the NPM repo:

npm install -g csvtojson

It supports CSV data of any size, typed fields, nested JSON, and a bunch of other features.

Example

    var Converter = require("csvtojson").core.Converter;

    // constructResult: false turns off final result construction in memory,
    // so the converter works as a pure stream.
    // toArrayString: true streams out a normal JSON array.
    var csvConverter = new Converter({ constructResult: false, toArrayString: true });

    var readStream = require("fs").createReadStream("inputData.csv");
    var writeStream = require("fs").createWriteStream("outputData.json");

    readStream.pipe(csvConverter).pipe(writeStream);

You can also use it as a CLI tool:

csvtojson myCSVFile.csv
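
Note that .core.Converter is the older API; newer versions of csvtojson (v2+) expose the converter directly as a stream, so a rough equivalent of the example above would look like the sketch below (this is an assumption based on the current readme; check the version you have installed):

    const csv = require("csvtojson");
    const fs = require("fs");

    fs.createReadStream("inputData.csv")
      .pipe(csv())                                     // parses and emits rows as they stream through
      .pipe(fs.createWriteStream("outputData.json"));  // newline-delimited JSON output by default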


While this is far from a complete answer, you may be able to base your solution on https://github.com/dominictarr/event-stream. Adapted example from the readme:

    var es = require('event-stream')

    es.pipeline(                          //connect streams together with `pipe`
      process.openStdin(),                //open stdin
      es.split(),                         //split stream to break on newlines
      es.map(function (data, callback) {  //turn this async function into a stream
        callback(null, JSON.stringify(parseCSVLine(data)))  // deal with one line of CSV data
      }),
      process.stdout
    )

After that, I expect you'll have one stringified JSON object per line. This then needs to be converted to an array, which you may be able to do by appending a , to the end of every line, removing it from the last one, and then adding [ and ] to the beginning and end of the file.
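
For example, here is a minimal sketch of that wrapping step (the file names and parseCSVLine are placeholders, and it assumes es.map drops an item when its callback is called with no data, as the event-stream readme suggests):

    var fs = require('fs')
    var es = require('event-stream')

    var first = true
    var out = fs.createWriteStream('output.json')

    var mapper = es.map(function (line, callback) {
      if (line === '') return callback()        // es.split() emits a trailing blank line
      var prefix = first ? '[' : ','            // open the array once, then comma-separate
      first = false
      callback(null, prefix + JSON.stringify(parseCSVLine(line)))
    })

    fs.createReadStream('input.csv')
      .pipe(es.split())
      .pipe(mapper)

    mapper.pipe(out, { end: false })            // keep the file open after the rows finish
    mapper.on('end', function () {
      out.end(']')                              // close the JSON array
    })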

The parseCSVLine function must be configured to assign the CSV values to the right object properties. This can be done fairly easily after parsing the first (header) line of the file.
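
A naive parseCSVLine could look something like the sketch below. It splits on plain commas only, so quoted fields and escaped commas would break it; a real implementation should use a proper CSV parser:

    var headers = null

    function parseCSVLine (line) {
      var values = line.split(',')   // naive: does not handle quoted commas
      if (!headers) {
        headers = values             // the first line of the file names the columns
        return undefined             // caller should skip the header row
      }
      var obj = {}
      headers.forEach(function (header, i) {
        obj[header] = values[i]      // assign each value to its header's property
      })
      return obj
    }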

I do notice the library is not tested on Node 0.10 (at least not with Travis), so beware. Maybe run npm test on the source yourself.


I found an even easier way to read CSV data, using csvtojson.

Here's the code:

    var Converter = require("csvtojson").Converter;
    var converter = new Converter({});

    converter.fromFile("sample.csv", function (err, result) {
      var csvData = JSON.stringify([
        { resultdata: result[0] },
        { resultdata: result[1] },
        { resultdata: result[2] },
        { resultdata: result[3] },
        { resultdata: result[4] }
      ]);
      csvData = JSON.parse(csvData);
      console.log(csvData);
    });

or you can easily do this:

    var Converter = require("csvtojson").Converter;
    var converter = new Converter({});

    converter.fromFile("sample.csv", function (err, result) {
      console.log(result);
    });

Here's the result from the first snippet:

    [ { resultdata:
         { 'Header 1': 'A_1',
           'Header 2': 'B_1',
           'Header 3': 'C_1',
           'Header 4': 'D_1',
           'Header 5': 'E_1' } },
      { resultdata:
         { 'Header 1': 'A_2',
           'Header 2': 'B_2',
           'Header 3': 'C_2',
           'Header 4': 'D_2',
           'Header 5': 'E_2' } },
      { resultdata:
         { 'Header 1': 'A_3',
           'Header 2': 'B_3',
           'Header 3': 'C_3',
           'Header 4': 'D_3',
           'Header 5': 'E_3' } },
      { resultdata:
         { 'Header 1': 'A_4',
           'Header 2': 'B_4',
           'Header 3': 'C_4',
           'Header 4': 'D_4',
           'Header 5': 'E_4' } },
      { resultdata:
         { 'Header 1': 'A_5',
           'Header 2': 'B_5',
           'Header 3': 'C_5',
           'Header 4': 'D_5',
           'Header 5': 'E_5' } } ]

The source of this code can be found at: https://www.npmjs.com/package/csvtojson#installation
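
Note that fromFile builds the whole result array in memory, so for a genuinely large CSV you may prefer handling one row at a time. A sketch assuming csvtojson v2's subscribe API (verify against the readme for your installed version):

    const csv = require("csvtojson");

    csv()
      .fromFile("sample.csv")
      .subscribe(function (row) {
        console.log(row);  // each row arrives as a parsed object; nothing accumulates in memory
      });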

I hope this gives you some ideas.