Warning: this page is an English–Chinese parallel translation of a popular StackOverflow question, provided under the CC BY-SA 4.0 license. You are free to use/share it, but you must do so under the same license, note the original URL, and attribute it to the original authors (not me): StackOverflow
Original: http://stackoverflow.com/questions/26268651/
Node.js: Best way to perform multiple async operations, then do something else?
Asked by dannybrown
In the following code I am trying to make multiple (around 10) HTTP requests and RSS parses in one go. I am using the standard forEach construct on an array of URIs I need to access and parse the result of.
Code:
var articles = []; // must start as an array, or concat below will throw
feedsToFetch.forEach(function (feedUri)
{
feed(feedUri, function(err, feedArticles)
{
if (err)
{
throw err;
}
else
{
articles = articles.concat(feedArticles);
}
});
});
// Code I want to run once all feedUris have been visited
I understand that when calling a function once I should be using a callback. However, the only way I can think of using a callback in this example would be to call a function which counts how many times it has been called and only continues when it has been called the same number of times as feedsToFetch.length, which seems hacky.
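For reference, the counting idea can at least be written once as a reusable helper instead of being scattered through the loop. A minimal sketch (fetchAll and fetchFeed are hypothetical names, with fetchFeed standing in for the real feed function):

```javascript
// Hypothetical helper: run fetchFeed for every URI, collect the results,
// and call done exactly once when the last one finishes (or on first error).
function fetchAll(uris, fetchFeed, done) {
  var articles = [];
  var remaining = uris.length;
  var failed = false;
  if (remaining === 0) return done(null, articles); // handle the empty case
  uris.forEach(function (uri) {
    fetchFeed(uri, function (err, feedArticles) {
      if (failed) return;                 // an earlier feed already errored out
      if (err) { failed = true; return done(err); }
      articles = articles.concat(feedArticles);
      remaining -= 1;                     // one more feed finished
      if (remaining === 0) done(null, articles); // all done
    });
  });
}
```

The guard flag ensures done is never called twice even if several feeds fail.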
So my question is: what is the best way to handle this type of situation in Node.js? Preferably without any form of blocking! (I still want that blazing fast speed.) Is it promises or something else?
Thanks, Danny
Answered by Thank you
No hacks necessary
I would recommend using the async module, as it makes these kinds of things a lot easier.
async provides async.eachSeries as an async replacement for arr.forEach, and lets you pass a done callback function for when it's complete. It will process each item in series, just as forEach does. Also, it will conveniently bubble errors up to your callback so that you don't have to have handler logic inside the loop. If you want/require parallel processing, you can use async.each.
There will be no blocking between the async.eachSeries call and the callback.
async.eachSeries(feedsToFetch, function(feedUri, done) {
// call your async function
feed(feedUri, function(err, feedArticles) {
// if there's an error, "bubble" it to the callback
if (err) return done(err);
// your operation here;
articles = articles.concat(feedArticles);
// this task is done
done();
});
}, function(err) {
// errors generated in the loop above will be accessible here
if (err) throw err;
// we're all done!
console.log("all done!");
});
Alternatively, you could build an array of async operations and pass them to async.series. Series will process your results in a series (not parallel) and call the callback when each function is complete. The only reason to use this over async.eachSeries would be if you prefer the familiar arr.forEach syntax.
// create an array of async tasks
var tasks = [];
feedsToFetch.forEach(function (feedUri) {
// add each task to the task array
// note: each task receives a callback it must call when finished
tasks.push(function(callback) {
// your operations
feed(feedUri, function(err, feedArticles) {
// "bubble" errors to the final callback
if (err) return callback(err);
articles = articles.concat(feedArticles);
// this task is done
callback();
});
});
});
// call async.series with the task array and callback
async.series(tasks, function(err) {
if (err) throw err;
console.log("done !");
});
Or you can Roll Your Own?
Perhaps you're feeling extra ambitious, or maybe you don't want to rely on the async dependency. Maybe you're just bored like I was. Anyway, I purposely copied the API of async.eachSeries to make it easy to understand how this works.
Once we remove the comments here, we have just 9 lines of code that can be reused for any array we want to process asynchronously! It will not modify the original array, errors can be sent to "short circuit" the iteration, and a separate callback can be used. It will also work on empty arrays. Quite a bit of functionality for just 9 lines :)
// void asyncForEach(Array arr, Function iterator, Function callback)
// * iterator(item, done) - done can be called with an err to shortcut to callback
// * callback(err) - receives an error if an iterator sent one
function asyncForEach(arr, iterator, callback) {
// create a cloned queue of arr
var queue = arr.slice(0);
// create a recursive iterator
function next(err) {
// if there's an error, bubble to callback
if (err) return callback(err);
// if the queue is empty, call the callback with no error
if (queue.length === 0) return callback(null);
// call the iterator with the next item in the queue
// we pass `next` here so the iterator can let us know when to move on to the next item
iterator(queue.shift(), next);
}
// start the loop;
next();
}
Now let's create a sample async function to use with it. We'll fake the delay with a setTimeout of 500 ms here.
// void sampleAsync(String uri, Function done)
// * done receives message string after 500 ms
function sampleAsync(uri, done) {
// fake delay of 500 ms
setTimeout(function() {
// our operation
// <= "foo"
// => "async foo !"
var message = ["async", uri, "!"].join(" ");
// call done with our result
done(message);
}, 500);
}
Ok, let's see how they work!
var tasks = ["cat", "hat", "wat"];
asyncForEach(tasks, function(uri, done) {
sampleAsync(uri, function(message) {
console.log(message);
done();
});
}, function() {
console.log("done");
});
Output (500 ms delay before each output)
async cat !
async hat !
async wat !
done
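To see the "short circuit" behavior mentioned above, have the iterator pass an error to done; the remaining queue items are skipped. (asyncForEach is repeated here, without its comments, so the snippet stands alone; the item values are made up.)

```javascript
// same 9-line helper as above, repeated so this snippet is self-contained
function asyncForEach(arr, iterator, callback) {
  var queue = arr.slice(0);
  function next(err) {
    if (err) return callback(err);
    if (queue.length === 0) return callback(null);
    iterator(queue.shift(), next);
  }
  next();
}

var visited = [];
asyncForEach(["a", "bad", "c"], function (item, done) {
  visited.push(item);
  // report an error for the "bad" item; "c" will never be visited
  done(item === "bad" ? new Error("failed on " + item) : null);
}, function (err) {
  console.log(err ? err.message : "all done"); // prints "failed on bad"
});
```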
Answered by aarosil
HACK-FREE SOLUTION
Promises are to be included in the next JavaScript version (ES6).
The popular Promise libraries give you an .all() method for this exact use case (waiting for a bunch of async calls to complete, then doing something else). It's the perfect match for your scenario.
Bluebird also has .map(), which can take an array of values and use it to start a Promise chain.
Here is an example using Bluebird's .map():
var Promise = require('bluebird');
var request = Promise.promisifyAll(require('request'));
function processAllFeeds(feedsToFetch) {
return Promise.map(feedsToFetch, function(feed){
// I renamed your 'feed' fn to 'processFeed'
return processFeed(feed)
})
.then(function(articles){
// 'articles' is now an array w/ results of all 'processFeed' calls
// do something with all the results...
})
.catch(function(e){
// feed server was down, etc
})
}
function processFeed(feed) {
// use the promisified version of 'get'
return request.getAsync(feed.url)...
}
Notice also that you don't need to use a closure here to accumulate the results.
The Bluebird API docs are really well written too, with lots of examples, which makes them easier to pick up.
Once I learned the Promise pattern, it made life so much easier. I can't recommend it enough.
Also, here is a great article about different approaches to dealing with async functions using promises, the async module, and others.
Hope this helps!
Answered by dandavis
Using a copy of the list of URLs as a queue to keep track of arrivals makes it simple (all changes commented):
var q=feedsToFetch.slice(); // dupe to censor upon url arrival (to track progress)
feedsToFetch.forEach(function (feedUri)
{
feed(feedUri, function(err, feedArticles)
{
if (err)
{
throw err;
}
else
{
articles = articles.concat(feedArticles);
}
q.splice(q.indexOf(feedUri),1); //remove this url from list
if(!q.length) done(); // if all urls have been removed, fire needy code
});
});
function done(){
// Code I want to run once all feedUris have been visited
}
In the end, this is not that much "dirtier" than promises, and it gives you the chance to reload un-finished URLs (a counter alone won't tell you which one(s) failed). For this simple parallel download task, implementing Promises will actually add more code to your project than a simple queue, and Promise.all() is not in the most intuitive place to stumble across. Once you get into sub-sub-queries, or want better error handling than a trainwreck, I strongly recommend using Promises, but you don't need a rocket launcher to kill a squirrel...
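The "reload un-finished urls" idea the queue makes possible can be sketched like this. Everything here is hypothetical: feed is a stub that fails exactly once for one URI just to exercise the retry path, and a real version would cap the number of retries rather than recurse until success.

```javascript
// Stub feed: "flaky" fails on its first call, succeeds on retry.
var failures = {};
function feed(uri, cb) {
  if (uri === "flaky" && !failures[uri]) {
    failures[uri] = true;
    return cb(new Error("timeout"));
  }
  cb(null, [uri + ":article"]);
}

function fetchWithRetry(uris, onDone) {
  var articles = [];
  var failed = [];
  var q = uris.slice(); // dupe to censor upon arrival, as above
  uris.forEach(function (uri) {
    feed(uri, function (err, feedArticles) {
      if (err) failed.push(uri);          // remember exactly which URI failed
      else articles = articles.concat(feedArticles);
      q.splice(q.indexOf(uri), 1);        // remove this url from the list
      if (!q.length) {                    // all arrivals accounted for
        if (failed.length) {
          // retry only the failed ones (a real version would cap retries)
          return fetchWithRetry(failed, function (err2, more) {
            onDone(err2, articles.concat(more));
          });
        }
        onDone(null, articles);
      }
    });
  });
}
```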