Is Node.js native Promise.all processing in parallel or sequentially?
Solution 1
Is Promise.all(iterable) executing all promises?
No, promises cannot "be executed". They start their task when they are created - they represent the results only - and you are executing everything in parallel even before passing them to Promise.all.
Promise.all only awaits multiple promises. It doesn't care in what order they resolve, or whether the computations are running in parallel.
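A minimal sketch of this point (the `task` helper and `order` array are invented for illustration): the executor runs synchronously at creation time, so both tasks have started before `Promise.all` is ever called.

```javascript
const order = [];

function task(name) {
  return new Promise((resolve) => {
    order.push(`${name} started`); // runs synchronously, at creation time
    setTimeout(() => resolve(name), 10);
  });
}

const p1 = task("p1");
const p2 = task("p2");

// Both tasks are already running here, before Promise.all is involved.
console.log(order); // ["p1 started", "p2 started"]

Promise.all([p1, p2]).then((results) => {
  console.log(results); // ["p1", "p2"]
});
```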
is there a convenient way to run an iterable sequentially?
If you already have your promises, you can't do much but Promise.all([p1, p2, p3, …]) (which does not have a notion of sequence). But if you do have an iterable of asynchronous functions, you can indeed run them sequentially. Basically you need to get from

[fn1, fn2, fn3, …]

to

fn1().then(fn2).then(fn3).then(…)

and the solution to do that is using Array::reduce:
iterable.reduce((p, fn) => p.then(fn), Promise.resolve())
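A worked sketch of the reduce pattern above (the `delay` helper, the three functions, and the `log` array are invented for illustration). Even though `fn3` has the shortest delay, the chain still runs the functions in order:

```javascript
const log = [];

const delay = (ms) => new Promise((resolve) => setTimeout(resolve, ms));

const fn1 = () => delay(30).then(() => log.push("fn1"));
const fn2 = () => delay(20).then(() => log.push("fn2"));
const fn3 = () => delay(10).then(() => log.push("fn3"));

// Each fn starts only after the previous promise has resolved.
const chain = [fn1, fn2, fn3].reduce((p, fn) => p.then(fn), Promise.resolve());

chain.then(() => console.log(log)); // ["fn1", "fn2", "fn3"]
```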
Solution 2
In parallel
await Promise.all(items.map(async (item) => {
  await fetchItem(item)
}))
Advantages: Faster. All iterations will be started even if one fails later on. However, it will "fail fast". Use Promise.allSettled to complete all iterations in parallel even if some fail.
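A sketch of the Promise.allSettled variant, which never short-circuits: every item runs to completion and you get a status per item. Here `fetchItem` is a made-up stand-in for any async operation.

```javascript
// Stand-in async operation that fails for one particular input.
const fetchItem = (item) =>
  item === "bad"
    ? Promise.reject(new Error(`failed: ${item}`))
    : Promise.resolve(`ok: ${item}`);

const items = ["a", "bad", "c"];

// allSettled resolves once every promise has settled, fulfilled or not.
const settled = Promise.allSettled(items.map((item) => fetchItem(item)));

settled.then((results) => {
  console.log(results.map((r) => r.status)); // ["fulfilled", "rejected", "fulfilled"]
});
```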
In sequence
for (const item of items) {
  await fetchItem(item)
}
Advantages: Variables in the loop can be shared by each iteration. Behaves like normal imperative synchronous code.
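A sketch of the shared-variable advantage (the `fetchItem` stand-in and `run` wrapper are invented): each iteration can read and update state accumulated by earlier ones, exactly like ordinary imperative code.

```javascript
// Stand-in async operation.
const fetchItem = (item) => Promise.resolve(item * 2);

async function run(items) {
  let total = 0; // shared across iterations
  for (const item of items) {
    total += await fetchItem(item); // the next call waits for this one
  }
  return total;
}

run([1, 2, 3]).then((total) => console.log(total)); // 12
```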
Solution 3
NodeJS does not run promises in parallel; it runs them concurrently, since it's a single-threaded event loop architecture. There is a possibility to run things in parallel by creating a new child process to take advantage of a multi-core CPU.
In fact, what Promise.all does is put the promises in the appropriate queue (see event loop architecture), run them concurrently (call P1, P2, ...), then wait for each result, then resolve the Promise.all with all the promises' results.
Promise.all will fail at the first promise which fails, unless you manage the rejection yourself.
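A sketch of managing rejection per promise so Promise.all does not reject on the first failure (the `tasks` array is invented for illustration): catching on each promise converts failures into regular values.

```javascript
const tasks = [
  Promise.resolve("first"),
  Promise.reject(new Error("boom")),
  Promise.resolve("third"),
];

// Each promise handles its own rejection, so Promise.all always fulfills.
const all = Promise.all(
  tasks.map((p) => p.catch((err) => `recovered: ${err.message}`))
);

all.then((results) => {
  console.log(results); // ["first", "recovered: boom", "third"]
});
```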
There is a major difference between parallel and concurrent. Parallel execution runs different computations in separate processes at exactly the same time, each progressing at its own rhythm, while concurrent execution starts the different computations one after another without waiting for the previous one to finish; they progress at the same time without depending on each other.
Finally, to answer your question: Promise.all will execute neither in parallel nor sequentially, but concurrently.
Solution 4
Bergi's answer got me on the right track using Array.reduce.
However, to actually get the functions returning my promises to execute one after another I had to add some more nesting.
My real use case is an array of files that I need to transfer in order one after another due to limits downstream...
Here is what I ended up with:
getAllFiles().then((files) => {
  return files.reduce((p, theFile) => {
    return p.then(() => {
      return transferFile(theFile); // function returns a promise
    });
  }, Promise.resolve()).then(() => {
    console.log("All files transferred");
  });
}).catch((error) => {
  console.log(error);
});
As previous answers suggest, using:
getAllFiles().then((files) => {
  return files.reduce((p, theFile) => {
    return p.then(transferFile(theFile));
  }, Promise.resolve()).then(() => {
    console.log("All files transferred");
  });
}).catch((error) => {
  console.log(error);
});
didn't wait for the transfer to complete before starting another and also the "All files transferred" text came before even the first file transfer was started.
Not sure what I did wrong, but wanted to share what worked for me.
Edit: Since I wrote this post I now understand why the first version didn't work. then() expects a function returning a promise. So, you should pass in the function name without parentheses! Since my function wants an argument, I need to wrap it in an anonymous function taking no argument!
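A sketch of that distinction (`transferFile` here is a stand-in that records when each "transfer" starts): `p.then(transferFile(theFile))` calls the function immediately and hands then() a promise instead of a function, whereas wrapping it defers the call until the previous promise resolves.

```javascript
const started = [];

// Stand-in for a promise-returning transfer function.
const transferFile = (file) =>
  new Promise((resolve) => {
    started.push(file); // records when the "transfer" actually starts
    setTimeout(resolve, 10);
  });

// Correct form: then() receives a function, so "b" starts only after
// "a" has finished.
const chain = Promise.resolve()
  .then(() => transferFile("a"))
  .then(() => transferFile("b"));

chain.then(() => console.log(started)); // ["a", "b"]
```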
Solution 5
You can also process an iterable sequentially with an async function using a recursive function. For example, given an array a to process with the asynchronous function someAsyncFunction():
var a = [1, 2, 3, 4, 5, 6]

function someAsyncFunction(n) {
  return new Promise((resolve, reject) => {
    setTimeout(() => {
      console.log("someAsyncFunction: ", n)
      resolve(n)
    }, Math.random() * 1500)
  })
}
// You can run the array sequentially with:
function sequential(arr, index = 0) {
  if (index >= arr.length) return Promise.resolve()
  return someAsyncFunction(arr[index])
    .then(r => {
      console.log("got value: ", r)
      return sequential(arr, index + 1)
    })
}
sequential(a).then(() => console.log("done"))
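If you also need the resolved values, a variant (a sketch; `asyncStep` is a hypothetical stand-in for any promise-returning function) threads an accumulator array through the chain, so the final promise resolves with all results in order, mirroring Promise.all's output shape but running sequentially:

```javascript
// Hypothetical stand-in for any promise-returning function.
const asyncStep = (n) => Promise.resolve(n * 10);

function sequentialCollect(arr) {
  return arr.reduce(
    // Each step waits for the previous one, then appends its result.
    (p, n) => p.then((acc) => asyncStep(n).then((r) => { acc.push(r); return acc; })),
    Promise.resolve([])
  );
}

sequentialCollect([1, 2, 3]).then((results) => console.log(results)); // [10, 20, 30]
```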
Updated on July 23, 2022

Comments
-
Yanick Rochon almost 2 years
I would like to clarify this point, as the documentation is not too clear about it:

Q1: Is Promise.all(iterable) processing all promises sequentially or in parallel? Or, more specifically, is it the equivalent of running chained promises like p1.then(p2).then(p3).then(p4).then(p5)..., or is it some other kind of algorithm where all of p1, p2, p3, p4, p5, etc. are being called at the same time (in parallel) and results are returned as soon as all resolve (or one rejects)?

Q2: If Promise.all runs in parallel, is there a convenient way to run an iterable sequentially?

Note: I don't want to use Q, or Bluebird, but all native ES6 specs.
-
Amit almost 9 years
Are you asking about the node (V8) implementation, or about the spec?
-
royhowie almost 9 years
I'm pretty sure Promise.all executes them in parallel.
-
Yanick Rochon almost 9 years
@Amit I flagged node.js and io.js as this is where I'm using it. So, yes, the V8 implementation if you will.
-
Bergi almost 9 years
Promises cannot "be executed". They start their task when they are being created - they represent the results only - and you are executing everything in parallel even before passing them to Promise.all.
-
Mateon1 almost 9 years
Promises are executed at the moment of creation (this can be confirmed by running a bit of code). In new Promise(a).then(b); c(); a is executed first, then c, then b. It isn't Promise.all that runs these promises, it just handles when they resolve.
-
Admin almost 8 years
Just for clarification: The only portion of a Promise that gets executed (immediately) is the executor, i.e. the function you pass to the Promise constructor. If Promise.all awaits the resolving of all given Promises (or the rejection of one), it wouldn't make much sense if these were settled sequentially.
-
rocketspacer over 7 years
They are executed in the order they were declared, because most JavaScript environments run single-threaded. So declaring p1 before p2 and calling Promise.all([p2, p1]) wouldn't help.
-
-
James Reategui over 8 years
In this example, is iterable an array of the functions that return a promise that you want to call?
-
Bergi over 8 years
@JamesReategui: Yes, exactly, that's what I meant by "an iterable of asynchronous functions"
-
Yanick Rochon about 8 years
At this time, async/await is only available with a transpiler, or using engines other than Node. Also, you really should not mix async with yield. While they act the same with a transpiler and co, they really are quite different and should not ordinarily substitute for each other. Also, you should mention these restrictions, as your answer is confusing to novice programmers.
-
SSH This almost 8 years
iterable.reduce((p, fn) => p.then(fn), Promise.resolve()); where can you put your code that runs when the last promise resolves?
-
Bergi almost 8 years
@SSHThis: It's exactly like the then sequence - the return value is the promise for the last fn result, and you can chain other callbacks to that.
-
TimoSolo over 7 years
I'm struggling to figure out the reduce. Can you give a real code example with fn1, fn2, fn3 run sequentially on an array of objects?
-
Bergi over 7 years
@TimoSolo: objects.reduce((p, o) => p.then(() => fn(o)), Promise.resolve())
-
wojjas over 7 years
If you need to use the value from fn1 in fn2, for instance, then do it by letting fn1().then(... return a promise by calling fn2, thus keeping the chain. Like so: fn1().then(retValFromF1 => { return p2(retValFromF1) }).then(fn3).catch(...
-
Bergi over 7 years
@wojjas That's exactly equivalent to fn1().then(p2).then(fn3).catch(…? No need to use a function expression.
-
wojjas over 7 years
@Bergi Using a function expression makes it possible to pass variables. Without it there will be no retValFromF1 to pass to p2. Can it be done in some other/better way? The return is not needed if there is only one statement in the function, as in my example. But as soon as there is more than one, a p2(retValFromF1) is needed.
-
Bergi over 7 years
@wojjas Of course the retValFromF1 is passed into p2, that's exactly what p2 does. Sure, if you want to do more (pass additional variables, call multiple functions, etc.) you need to use a function expression, though changing p2 in the array would be easier.
-
Robert Penner about 6 years
Or: for (const item of items) await fetchItem(item);
-
Mateusz Sowiński almost 6 years
Using array.prototype.reduce is much better in terms of performance than a recursive function.
Mark almost 6 years
@MateuszSowiński, there is a 1500ms timeout between each call. Considering that this is doing async calls sequentially, it's hard to see how that's relevant even for a very quick async turnaround.
-
Mateusz Sowiński almost 6 years
Let's say you have to execute 40 really quick async functions after each other - using recursive functions would clog your memory pretty fast.
-
Mark almost 6 years
@MateuszSowiński, the stack doesn't wind up here... we're returning after each call. Compare that with reduce, where you have to build the entire then() chain in one step and then execute it.
-
Mateusz Sowiński almost 6 years
In the 40th call of the sequential function, the first call of the function is still in memory, waiting for the chain of sequential functions to return.
-
Mark almost 6 years
I'll refer you to this thread if you really want to discuss the nuances of this @MateuszSowiński: stackoverflow.com/questions/29925948/…
-
robe007 over 5 years
@Bergi Can I say that: iterable.reduce((p, fn) => p.then(fn), Promise.resolve()) is equivalent to: [fn1, fn2, fn3].reduce((p, fn) => p.then(fn), Promise.resolve())???
-
Bergi over 5 years
@robe007 Yes, I meant that iterable is the [fn1, fn2, fn3, …] array.
-
robe007 over 5 years
@Bergi And ... every fn... is a function that returns a promise???
-
Bergi over 5 years
@robe007 Yes, they're asynchronous functions.
-
Taimoor over 5 years
@david_adler In the parallel example's advantages you said all iterations will be executed even if one fails. If I'm not wrong, this would still fail fast. To change this behaviour one can do something like: await Promise.all(items.map(async item => { return await fetchItem(item).catch(e => e) }))
-
david_adler about 5 years
@Taimoor yes, it does "fail fast" and continues executing code after the Promise.all, but all iterations are still executed: codepen.io/mfbx9da4/pen/BbaaXr
-
HaneTV about 5 years
iterable.reduce((p, fn) => p.then(fn), Promise.resolve()) gave me a headache in nodeJS and seems to have a different behaviour than doing it in 2 lines, e.g.: ps.reduce((p, fn) => { p.then(fn); return Promise.resolve(); });
-
Bergi about 5 years
@HaneTV The comma separates arguments: iterable.reduce((p, fn) => { return p.then(fn); }, Promise.resolve())
-
Giulio Caccin almost 5 years
Is this an answer to the original question?
-
mandarin over 4 years
This approach is better when the async function is an API call and you don't want to DDoS the server. You have better control over the individual results and errors thrown in the execution. Even better, you can decide on which errors to continue and on which to break the loop.
-
Shihab almost 4 years
This is not right. NodeJS can run things in parallel. NodeJS has the concept of worker threads. By default the number of worker threads is 4. For example, if you use the crypto library to hash two values, then you can execute them in parallel. Two worker threads will handle the task. Of course, your CPU has to be multi-core to support parallelism.
-
Adrien De Peretti almost 4 years
Yeah, you're right, it's what I said at the end of the first paragraph, but I talked about child processes; of course they can run workers.
-
david_adler almost 4 years
Note that JavaScript isn't actually executing the asynchronous requests in "parallel" using threads, since JavaScript is single-threaded. developer.mozilla.org/en-US/docs/Web/JavaScript/EventLoop
-
Bernardo Dal Corno over 3 years
You could add a third parameter with an accumulator of type array, and return it in the last Promise.resolve() - it will be in the same format as Promise.all, but sequential.
-
Bernardo Dal Corno over 3 years
One thing I don't like about this way is the need to create an "extra" function (sequential) for it to work, in comparison to the for and reduce solutions. But it could very much be in a tool like lodash, for example.
-
Bernardo Dal Corno over 3 years
For completeness, you should make a an array of promises, and then change someAsyncFunction() to simply arr[index](), without forgetting to use await.
-
twhitehead over 3 years
If you need the results for the parallel version: let results = await Promise.all(items.map(async item => { return await fetchItem(item) })); and re the comment about not being in "parallel": if the requests call an external function such as fetching a file or an API call, then they ARE truly parallel despite the JavaScript engine being single-threaded.
-
david_adler over 3 years
@twhitehead yes, the engine is multithreaded, but ultimately it all comes back to JavaScript's main thread for serialization of the responses. By truly parallel, do you mean new threads are spawned for each request in the engine under the hood? Is that how all engines work? Is that part of the spec? Or is that just the only way things could work? Could you point to something in the spec by any chance?
-
Muhammad Muzammil over 3 years
Best answer so far. I was so confused about how a single-threaded architecture like Node.js could run multiple promises in parallel. Thanks a lot, sir. P.S. I know what worker threads are and how they work, but promises are resolved by the Node.js event loop itself and not by using libuv. So the best Node.js could do is to execute them (promises) concurrently.
-
Dmitriy Mozgovoy about 3 years
The defined async function with await inside the map method is redundant.
-
david_adler almost 3 years
@DmitriyMozgovoy yes it is, but it will be more obvious how to extend it for people new to Promise.all.
-
david_adler over 2 years
Use Promise.allSettled if you want all items to be executed (and not just started) regardless of whether one fails.
-
kehers about 2 years
In the parallel example, is the await in await fetchItem(item) necessary? Why not await Promise.all(items.map(item => fetchItem(item)))?
-
david_adler about 2 years
It's not necessary, but I left it in for illustrative purposes, as it's easy to see how to modify code inside the inner arrow function.
-
matttm about 2 years
@david_adler I actually think using the inner await makes it synchronous.
-
david_adler about 2 years
In what way synchronous? Each inner callback will be executed concurrently with respect to each other.