fs.readFileSync seems faster than fs.readFile - is it OK to use for a web app in production?

node.js

No, it is not OK to use a blocking API call in a node server as you describe. Your site's responsiveness to many concurrent connections will take a huge hit. It's also just blatantly violating the #1 principle of node.

The key to node working is that while it is waiting on IO, it is doing CPU/memory processing at the same time. This requires asynchronous calls exclusively. So if you have 100 clients reading 100 JSON files, node can ask the OS to read those 100 files, and while waiting for the OS to return the file data, node can be processing other aspects of those 100 network requests. If you have a single synchronous call in there, ALL of your client processing stops entirely while that operation completes. So client number 100's connection waits with no processing whatsoever while you read files for clients 1, 2, 3, 4 and so on sequentially. This is Failville.

Here's another analogy. If you went to a restaurant and were the only customer, you would probably get faster service if a single person sat you, took your order, cooked it, served it to you, and handled the bill without the coordination overhead of dealing with host/hostess, server, head chef, line cooks, cashiers, etc. However, with 100 customers in the restaurant, the extra coordination means things happen in parallel and overall responsiveness of the restaurant is increased way beyond what it would be if a single person was trying to handle 100 customers on their own.


You are blocking the callback of the asynchronous read with your synchronous read; remember, node runs your JavaScript on a single thread. I understand the time difference still looks amazing, but try it with a file that takes much, much longer to read, and imagine many, many clients doing the same at once; only then will the overhead of the async call pay off. That should answer your question: yes, you will run into trouble if you are serving thousands of requests with blocking IO.


After a lot of time and a lot of learning and practice, I tried this once more, found the answer, and can show an example:

const fs = require('fs');

const syncTest = () => {
    const startTime = Date.now();
    const results = [];
    const files = [];
    for (let i = 0, len = 4; i < len; i++) {
        files.push(fs.readFileSync(`file-${i}.txt`));
    }
    for (let i = 0, len = 360; i < len; i++) results.push(Math.sin(i), Math.cos(i));
    console.log(`Sync version: ${Date.now() - startTime}`);
};

const asyncTest = () => {
    const startTime = Date.now();
    const results = [];
    const files = [];
    for (let i = 0, len = 4; i < len; i++) {
        // Note: the callback receives (err, data), not just the data.
        fs.readFile(`file-${i}.txt`, (err, file) => files.push(file));
    }
    for (let i = 0, len = 360; i < len; i++) results.push(Math.sin(i), Math.cos(i));
    // Note: this timer stops before the async reads have completed; it
    // measures only how long the main thread was blocked, which is the point.
    console.log(`Async version: ${Date.now() - startTime}`);
};

syncTest();
asyncTest();