Limiting asynchronous calls in Node.js

Original question: http://stackoverflow.com/questions/9539886/
Note: this page is a translation of a popular Stack Overflow question and its answers, provided under the CC BY-SA 4.0 license. You are free to use and share it, but you must attribute it to the original authors (not me): StackOverFlow
Asked by Jon Nichols
I've got a Node.js app that gets a list of files locally and uploads them to a server. This list could contain thousands of files.
for (var i = 0; i < files.length; i++) {
    upload_file(files[i]);
}
If I execute this with thousands of files, upload_file will get called thousands of times all at once, and most likely die (or at least struggle). In the synchronous world, we'd create a thread pool and limit it to a certain number of threads. Is there a simple way to limit how many asynchronous calls get executed at once?
Answered by Linus Gustav Larsson Thiel
As usual, I recommend Caolan McMahon's async module.
Make your upload_file function take a callback as its second parameter:
var async = require("async");

function upload_file(file, callback) {
    // Do funky stuff with file
    callback();
}

var queue = async.queue(upload_file, 10); // Run ten simultaneous uploads

queue.drain = function() {
    console.log("All files are uploaded");
};

// Queue your files for upload
queue.push(files);

queue.concurrency = 20; // Increase to twenty simultaneous uploads
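
Note: this answer targets the async API of its day. If you are on async v3 or later (an assumption about your setup), drain is registered by calling it as a method rather than assigning a property:

// async v3+: drain is a method, not an assignable property
queue.drain(function() {
    console.log("All files are uploaded");
});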
Answered by Wes Johnson
The answer above, re: async on NPM, is the best answer, but if you'd like to learn more about control flow:
You should look into control flow patterns. There's a wonderful discussion on control flow patterns in Chapter 7 of Mixu's Node Book. Namely, I'd look at the example in 7.2.3: Limited parallel - an asynchronous, parallel, concurrency limited for loop.
I've adapted his example:
function doUpload(file, callback) {
    // perform file read & upload here, then invoke callback...
}

var files = [...];
var limit = 10;   // concurrent read / upload limit
var running = 0;  // number of running async file operations

function uploader() {
    while (running < limit && files.length > 0) {
        var file = files.shift();
        doUpload(file, function() {
            running--;
            if (files.length > 0)
                uploader();
        });
        running++;
    }
}

uploader();
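
To try the pattern in isolation, here is a minimal doUpload stub (hypothetical, not part of the original answer) that simulates an asynchronous upload with a timer:

function doUpload(file, callback) {
    // Simulated upload; replace with real file read & upload logic.
    setTimeout(function() {
        console.log("uploaded " + file);
        callback();
    }, 100);
}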
Answered by jwueller
You should try queueing. I assume that a callback is fired when upload_file() finishes. Something like this should do the trick (untested):
function upload_files(files, maxSimultaneousUploads, callback) {
    var runningUploads = 0,
        startedUploads = 0,
        finishedUploads = 0;

    function next() {
        runningUploads--;
        finishedUploads++;

        if (finishedUploads == files.length) {
            callback();
        } else {
            // Make sure that we are running at the maximum capacity.
            queue();
        }
    }

    function queue() {
        // Run as many uploads as possible while not exceeding the given limit.
        while (startedUploads < files.length && runningUploads < maxSimultaneousUploads) {
            runningUploads++;
            upload_file(files[startedUploads++], next);
        }
    }

    // Start the upload!
    queue();
}
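
A usage sketch, assuming upload_file(file, callback) invokes its callback when the upload finishes:

upload_files(files, 10, function() {
    console.log("All uploads finished");
});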
Answered by Arun Ghosh
The other answers seem to be outdated. This can be solved easily using parallelLimit from async. Below is how to use it. I haven't tested it.
var async = require("async");

var tasks = files.map(function(f) {
    return function(callback) {
        upload_file(f, callback);
    };
});

async.parallelLimit(tasks, 10, function(err, results) {
    // All uploads finished, or one of them failed with err.
});
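
Note that async.parallelLimit passes (err, results) to its final callback, with results in the same order as the tasks array, so per-file responses can be collected there.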
Answered by Anton Fil
It can be resolved using recursion.
The idea is that initially you send the maximum allowed number of requests, and each of these requests recursively continues the process on its completion.
function batchUpload(files, concurrentRequestsLimit) {
    return new Promise(resolve => {
        var responses = [];
        var index = 0;

        function recursiveUpload() {
            if (index === files.length) {
                return;
            }
            upload_file(files[index++]).then(r => {
                responses.push(r);
                if (responses.length === files.length) {
                    resolve(responses);
                } else {
                    recursiveUpload();
                }
            });
        }

        // Kick off the initial batch of concurrent uploads.
        for (var i = 0; i < concurrentRequestsLimit; i++) {
            recursiveUpload();
        }
    });
}

var files = [
    'file_1',
    'file_2',
    'file_3',
    ...
    'file_100'
];

batchUpload(files, 5).then(responses => {
    console.log(responses);
});
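
Two caveats with this sketch: responses are pushed in completion order rather than input order, and the promise never resolves when files is empty, so guard against an empty list if that can occur.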

