How do I synchronously read a large file line by line in Node.js? [node.js]



For large files, readFileSync can be inconvenient, as it loads the whole file into memory. A different synchronous approach is to iteratively call readSync, reading small chunks of data at a time and processing the lines as they arrive. The following code implements this approach and synchronously processes one line at a time from the file 'test.txt':

var fs = require('fs');
var filename = 'test.txt';
var fd = fs.openSync(filename, 'r');
var bufferSize = 1024;
var buffer = Buffer.alloc(bufferSize); // new Buffer(size) is deprecated
var leftOver = '';
var read, line, idxStart, idx;
while ((read = fs.readSync(fd, buffer, 0, bufferSize, null)) !== 0) {
  leftOver += buffer.toString('utf8', 0, read);
  idxStart = 0;
  while ((idx = leftOver.indexOf("\n", idxStart)) !== -1) {
    line = leftOver.substring(idxStart, idx);
    console.log("one line read: " + line);
    idxStart = idx + 1;
  }
  // keep any partial line for the next chunk
  leftOver = leftOver.substring(idxStart);
}
fs.closeSync(fd);


use https://github.com/nacholibre/node-readlines

var lineByLine = require('n-readlines');
var liner = new lineByLine('./textFile.txt');
var line;
var lineNumber = 0;
while (line = liner.next()) {
    console.log('Line ' + lineNumber + ': ' + line.toString('ascii'));
    lineNumber++;
}
console.log('end of file reached');


Use readFileSync:

fs.readFileSync(filename, [encoding]) Synchronous version of fs.readFile. Returns the contents of filename.

If encoding is specified then this function returns a string. Otherwise it returns a buffer.

On a side note: since you are using Node.js, I'd recommend using the asynchronous functions, so synchronous I/O doesn't block the event loop.