
JavaScript, Node.js: is Array.forEach asynchronous?


No, it is blocking. Have a look at the specification of the algorithm.
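You can observe the blocking behaviour directly: every callback runs to completion before forEach returns.

```javascript
var order = [];

[1, 2, 3].forEach(function (n) {
    order.push('callback ' + n);
});

// By the time execution reaches this line, all callbacks have already run.
order.push('after forEach');

console.log(order);
// [ 'callback 1', 'callback 2', 'callback 3', 'after forEach' ]
```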

However, MDN gives an implementation that may be easier to understand:

if (!Array.prototype.forEach) {
  Array.prototype.forEach = function (fun /*, thisp */) {
    "use strict";

    if (this === void 0 || this === null)
      throw new TypeError();

    var t = Object(this);
    var len = t.length >>> 0;
    if (typeof fun !== "function")
      throw new TypeError();

    var thisp = arguments[1];
    for (var i = 0; i < len; i++) {
      if (i in t)
        fun.call(thisp, t[i], i, t);
    }
  };
}

If you have to execute a lot of code for each element, you should consider using a different approach:

function processArray(items, process) {
    var todo = items.concat();

    setTimeout(function step() {
        process(todo.shift());
        if (todo.length > 0) {
            setTimeout(step, 25);
        }
    }, 25);
}

(A named function expression is used here instead of the original arguments.callee, which is deprecated and forbidden in strict mode.)

and then call it with:

processArray([/* many, many elements */], function () { /* lots of work to do */ });

This way the processing is non-blocking. The example is taken from High Performance JavaScript.

Another option might be web workers.


If you need an asynchronous-friendly version of Array.forEach and similar, they're available in the Node.js 'async' module: http://github.com/caolan/async ...As a bonus, this module also works in the browser.

async.each(openFiles, saveFile, function (err) {
    // if any of the saves produced an error, err would equal that error
});


There is a common pattern for doing a really heavy computation in Node that may be applicable to you...

Node is single-threaded (as a deliberate design choice, see What is Node.js?); this means that it can only utilize a single core. Modern boxes have 8, 16, or even more cores, so this could leave 90+% of the machine idle. The common pattern for a REST service is to fire up one node process per core, and put these behind a local load balancer like http://nginx.org/.

Forking a child - For what you are trying to do, there is another common pattern, forking off a child process to do the heavy lifting. The upside is that the child process can do heavy computation in the background while your parent process is responsive to other events. The catch is that you can't / shouldn't share memory with this child process (not without a LOT of contortions and some native code); you have to pass messages. This will work beautifully if the size of your input and output data is small compared to the computation that must be performed. You can even fire up a child node.js process and use the same code you were using previously.

For example:

var child_process = require('child_process');

function run_in_child(array, cb) {
    var child = child_process.exec('node libfn.js', function (err, stdout, stderr) {
        var output = JSON.parse(stdout);
        cb(err, output);
    });
    child.stdin.write(JSON.stringify(array), 'utf8');
    child.stdin.end();
}

(The child handle is named child rather than process to avoid shadowing Node's global process object.)
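The child script itself (libfn.js) is not shown, so its contents here are an assumption; the contract implied by the parent is: read a JSON array from stdin, compute, and write a JSON result to stdout. A sketch, with squaring each element as a hypothetical stand-in for the real heavy work:

```javascript
// libfn.js (sketch) -- read a JSON array on stdin, write a JSON result on stdout.

// The actual computation; squaring each element is a hypothetical stand-in.
function heavyComputation(array) {
    return array.map(function (n) { return n * n; });
}

var chunks = [];
process.stdin.setEncoding('utf8');
process.stdin.on('data', function (chunk) { chunks.push(chunk); });
process.stdin.on('end', function () {
    var input = chunks.join('');
    if (!input) return; // nothing was piped in
    process.stdout.write(JSON.stringify(heavyComputation(JSON.parse(input))));
});
```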