JavaScript, Node.js: is Array.forEach asynchronous?
Solution 1
No, it is blocking. Have a look at the specification of the algorithm.
However, a perhaps easier-to-understand implementation is given on MDN:
if (!Array.prototype.forEach) {
  Array.prototype.forEach = function (fun /*, thisp */) {
    "use strict";

    if (this === void 0 || this === null)
      throw new TypeError();

    var t = Object(this);
    var len = t.length >>> 0;
    if (typeof fun !== "function")
      throw new TypeError();

    var thisp = arguments[1];
    for (var i = 0; i < len; i++) {
      if (i in t)
        fun.call(thisp, t[i], i, t);
    }
  };
}
If you have to execute a lot of code for each element, you should consider using a different approach:
function processArray(items, process) {
  var todo = items.concat(); // work on a copy so the original array is untouched

  setTimeout(function () {
    process(todo.shift());
    if (todo.length > 0) {
      // Schedule the next item in 25 ms, letting the event loop breathe in
      // between. Note: arguments.callee is deprecated in strict mode; a named
      // function expression is the modern replacement.
      setTimeout(arguments.callee, 25);
    }
  }, 25);
}
and then call it with:
processArray([many many elements], function () {lots of work to do});
This would then be non-blocking. The example is taken from High Performance JavaScript.
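For comparison, the same chunking idea can be written with promises in modern Node.js. This is only a sketch, not part of the original book example, and the name `processArrayAsync` is invented here:

```javascript
// Promise-based variant of the chunking idea above: each element is
// processed in its own event-loop turn (via setImmediate), so other
// callbacks can run in between.
function processArrayAsync(items, process) {
  return items.reduce(
    (chain, item) =>
      chain.then(() => new Promise((resolve) => {
        setImmediate(() => {
          process(item);
          resolve();
        });
      })),
    Promise.resolve() // start the chain with an already-resolved promise
  );
}
```

Usage would be `processArrayAsync(bigArray, doWork).then(() => console.log('all done'));`.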
Another option might be web workers.
Solution 2
If you need an asynchronous-friendly version of Array.forEach and similar, they're available in the Node.js 'async' module: http://github.com/caolan/async. As a bonus, this module also works in the browser.
async.each(openFiles, saveFile, function(err){
// if any of the saves produced an error, err would equal that error
});
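A rough modern equivalent without the library, assuming `saveFile` returns a promise rather than taking a node-style callback as `async.each` expects:

```javascript
// Run saveFile on every element concurrently and settle once all finish;
// Promise.all rejects with the first error, loosely mirroring async.each.
function saveAll(openFiles, saveFile) {
  return Promise.all(openFiles.map((file) => saveFile(file)));
}
```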
Solution 3
There is a common pattern for doing a really heavy computation in Node that may be applicable to you...
Node is single-threaded (as a deliberate design choice, see What is Node.js?); this means that it can only utilize a single core. Modern boxes have 8, 16, or even more cores, so this could leave 90+% of the machine idle. The common pattern for a REST service is to fire up one node process per core, and put these behind a local load balancer like http://nginx.org/.
Forking a child - For what you are trying to do, there is another common pattern, forking off a child process to do the heavy lifting. The upside is that the child process can do heavy computation in the background while your parent process is responsive to other events. The catch is that you can't / shouldn't share memory with this child process (not without a LOT of contortions and some native code); you have to pass messages. This will work beautifully if the size of your input and output data is small compared to the computation that must be performed. You can even fire up a child node.js process and use the same code you were using previously.
For example:
var child_process = require('child_process');

function run_in_child(array, cb) {
  var process = child_process.exec('node libfn.js', function (err, stdout, stderr) {
    var output = JSON.parse(stdout);
    cb(err, output);
  });
  process.stdin.write(JSON.stringify(array), 'utf8');
  process.stdin.end();
}
Solution 4
Array.forEach is meant for computing stuff, not waiting, and there is nothing to be gained by making computations asynchronous in an event loop (web workers add multiprocessing, if you need multi-core computation). If you want to wait for multiple tasks to end, use a counter, which you can wrap in a semaphore class.
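A minimal sketch of that counter idea; the `Semaphore` name and shape here are illustrative, not from a particular library:

```javascript
// Counts down as tasks report completion; fires the done callback once
// every one of the expected tasks has called signal().
function Semaphore(count, done) {
  this.count = count;
  this.done = done;
}

Semaphore.prototype.signal = function () {
  if (--this.count === 0) this.done();
};
```

Each of the asynchronous tasks would call `sem.signal()` from its own completion callback.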
Solution 5
Edit 2018-10-11: It looks like there is a good chance the standard described below may not go through. Consider pipelining as an alternative (it does not behave exactly the same, but the methods could be implemented in a similar manner).
This is exactly why I am excited about ES7; in the future you will be able to do something like the code below (some of the specs are not complete, so use with caution; I will try to keep this up to date). Basically, using the new :: bind operator, you will be able to run a method on an object as if the object's prototype contained the method, e.g. [Object]::[Method] where normally you would call [Object].[ObjectsMethod].
Note: to do this today (24-July-16) and have it work in all browsers, you will need to transpile your code for the following functionality: import/export, arrow functions, promises, async/await and, most importantly, function bind. The code below could be modified to use only function bind if necessary; all this functionality is neatly available today by using Babel.
YourCode.js (where 'lots of work to do' must simply return a promise, resolving it when the asynchronous work is done):
import { asyncForEach } from './ArrayExtensions.js';
await [many many elements]::asyncForEach(() => lots of work to do);
ArrayExtensions.js
export function asyncForEach(callback) {
  return Promise.resolve(this).then(async (ar) => {
    for (let i = 0; i < ar.length; i++) {
      await callback.call(ar, ar[i], i, ar);
    }
  });
}

export function asyncMap(callback) {
  return Promise.resolve(this).then(async (ar) => {
    const out = [];
    for (let i = 0; i < ar.length; i++) {
      out[i] = await callback.call(ar, ar[i], i, ar);
    }
    return out;
  });
}
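The `::` bind operator never advanced at TC39, but the same extension functions can be used today without transpiling via `Function.prototype.call`; a sketch:

```javascript
// Standalone version of asyncForEach, invoked with .call() instead of ::.
function asyncForEach(callback) {
  return Promise.resolve(this).then(async (ar) => {
    for (let i = 0; i < ar.length; i++) {
      await callback.call(ar, ar[i], i, ar); // awaits each item in order
    }
  });
}

// usage: asyncForEach.call(someArray, async (item) => { /* work */ });
```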
Admin
Updated on July 08, 2022
Comments
-
Admin almost 2 years
I have a question regarding the native Array.forEach implementation of JavaScript: does it behave asynchronously? For example, if I call:
[many many elements].forEach(function () {lots of work to do})
Will this be non-blocking?
-
Admin about 13 years: Thank you for the quick reply! Although I will try myself to find some alternative on the internet, does anyone have an idea how to implement an async version of that function? Would, for instance, a wrapper around setTimeout be enough?
-
Marcello Bastea-Forte about 13 years: If you're using Node.js, also consider using process.nextTick instead of setTimeout
-
Dave Dopson over 12 years: Technically, forEach isn't "blocking", as the CPU never goes to sleep. It's synchronous and CPU-bound, which can feel like "blocking" when you expect the node app to be responsive to events.
-
Brad over 11 years: Just to be clear... Node isn't single-threaded, but the execution of your JavaScript is. IO and what not runs on separate threads.
-
Dave Dopson over 11 years: @Brad - maybe. That's implementation-dependent. With appropriate kernel support, the interface between Node and the kernel can be event-based - kqueue (mac), epoll (linux), IO completion ports (windows). As a fallback, a pool of threads also works. Your basic point is right though. The low-level Node implementation might have multiple threads. But they will NEVER directly expose them to JS userland, as that would break the entire language model.
-
Brad over 11 years: Correct, I'm just clarifying because the concept has confused many.
-
James almost 10 years: async would probably be a more appropriate solution here (in fact, I've just seen someone posted that as an answer!).
-
Giles Williams over 8 years: How is this asynchronous? AFAIK #call will execute immediately?
-
Rax Wunter over 8 years: Of course immediately, but you have a callback function to know when all iterations have completed. Here the "iterator" argument is a node-style async function with a callback. It's similar to the async.each method.
-
dCoder over 8 years: todo.pop() [O(1)] in place of todo.shift() [O(N)] might be more efficient. Ref: (stackoverflow.com/a/22615787/3343029)
-
Richard almost 8 years: I trusted this answer, but it seems to be wrong in some cases. forEach does not block on await statements, for instance, and you should rather use a for loop: stackoverflow.com/questions/37962880/…
-
Felix Kling almost 8 years: @Richard: of course. You can only use await inside async functions. But forEach doesn't know what async functions are. Keep in mind that async functions are just functions returning a promise. Would you expect forEach to handle a promise returned from the callback? forEach completely ignores the return value from the callback. It would only be able to handle an async callback if it was async itself.
-
Felix Kling almost 8 years: @Richard in other words, forEach is synchronous, so you cannot pass an asynchronous callback to it. That never worked and async/await doesn't change that. Using await inside a for loop only works because the containing function is converted to a generator, which can be suspended until the promise is resolved (awaited). This answer isn't wrong; rather, you seem to have wrong expectations of what async/await does.
-
adrianvlupu almost 8 years: I don't see how this is async. call or apply are synchronous. Having a callback doesn't make it async.
-
robskrob almost 7 years: @FelixKling I read the spec that you linked but could not find the part that speaks to forEach being synchronous or asynchronous. Where in the spec does it provide an answer?
-
Felix Kling almost 7 years: @robertjewell: The algorithm itself is synchronous. It describes a loop where the callback is called in each iteration.
-
matpop almost 6 years: If you need to ensure that the async operation is run for only one item at a time (in the order of the collection), you must use eachSeries instead.
-
Sudhanshu Gaur over 5 years: This will happen even if you use async.forEach or any other parallel method, because a for loop is not an IO process; Node.js will always run it synchronously.
-
kipper_t about 5 years: The link to the algorithm specification does not take you to the algorithm specification. It takes you to what looks to be a weird website.
-
Felix Kling about 5 years: @kipper_t: The spec is weird.
-
Xsmael over 4 years: @JohnKennedy I've seen you before!
-
vasilevich over 4 years: In JavaScript, when people say async, they mean that the code execution does not block the main event loop (i.e., it does not make the process get stuck at one line of code). Just putting a callback does not make code async; it has to utilize some form of event-loop releasing such as setTimeout or setInterval, since during the time you wait for those, other code can run without interruptions.
-
steampowered over 4 years: This article from the current maintainer of Mongoose has an excellent section on the use of async/await inside various looping constructs: thecodebarbarian.com/…
-
Sergio A. over 4 years: @Richard thank god you mentioned it. I was about to go crazy over it.
-
Kasra about 3 years: Works perfectly in React Native with PusherJS. What is the role of setTimeout(arguments.callee, 25)? I mean, does it have any effect on the blocking issue?
-
Felix Kling about 3 years: @Kasra: Yes. Instead of processing all elements at once, and potentially blocking the main thread, each element is processed in a separate job, giving the browser time in between to do something else.
-
pmont about 2 years: It's misleading to say that Node.js is single-threaded. There are a lot of technicalities here. The JavaScript interpreter is single-threaded, but the IO subsystem (which is part of Node) is multi-threaded. Async/await (aka promises) invokes parallel threads. Additionally, worker threads allow multiple JavaScript threads to run in parallel.