bash - Run multiple curl commands in parallel

Note: this page is a translation of a popular StackOverflow question, provided under the CC BY-SA 4.0 license. You are free to use/share it, but you must follow the same license and attribute it to the original authors (not me). Original: http://stackoverflow.com/questions/46362284/

Date: 2020-09-18 16:28:18  Source: igfitidea

Run multiple curl commands in parallel

Tags: bash, shell, curl

Asked by Saeed Mohtasham

I have the following shell script. The issue is that I want to run the requests in parallel, without waiting for one request to finish before starting the next. For example, if I make 20 requests, I want them all to execute at the same time.

for ((request=1; request<=20; request++))
do
    for ((x=1; x<=20; x++))
    do
        time curl -X POST "http://localhost:5000/example"
    done
done

Any guide?

Answered by anubhava

Using the xargs -P option, you can run any command in parallel:

xargs -I % -P 8 curl -X POST "http://localhost:5000/example" \
    < <(printf '%s\n' {1..400})

This will run the curl command 400 times, with at most 8 jobs in parallel.
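As a sketch of how the -I % placeholder can be put to work, the replacement string can be spliced into any part of the command, for example to give every request its own output file. The echo below is a stand-in for the real curl call, and the file paths are hypothetical:

```shell
#!/usr/bin/env bash
# Sketch: % is replaced by each input token, so it can name per-request
# output files. Replace the echo with your real curl invocation, e.g.
#   curl -X POST "http://localhost:5000/example" -o "resp.%.json"
outdir=$(mktemp -d)

# 8 inputs, at most 4 jobs at a time; % expands inside every argument.
printf '%s\n' {1..8} | xargs -I % -P 4 sh -c "echo 'request %' > '$outdir/req_%.out'"

ls "$outdir"    # req_1.out .. req_8.out
```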

Answered by Saeed Mohtasham

You can use xargs with the -P option to run any command in parallel:

seq 1 200 | xargs -I{} -P10 curl "http://localhost:5000/example"

This will run the curl command 200 times, with at most 10 jobs in parallel.
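One subtlety worth noting: without -I, xargs appends each sequence number to the command line, so curl would receive it as an extra URL. The behaviour is easy to see with echo as a stand-in (a sketch; with -P the output order may vary):

```shell
# Each input token becomes a trailing argument of the command:
seq 1 3 | xargs -n1 -P2 echo "job"
# Prints "job 1", "job 2", "job 3" (in some order).
```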

Answered by Gaétan RYCKEBOER

Background each curl with &, and add wait at the end:

for ((request=1; request<=20; request++))
do
    for ((x=1; x<=20; x++))
    do
        time curl -X POST "http://localhost:5000/example" &
    done
done

wait

They will all write to the same stdout, but you can redirect each job's output (the result of time along with stdout and stderr) to its own named file:

( time curl -X POST "http://localhost:5000/example" ) > "output.${x}.${request}.out" 2>&1 &
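The same background-and-wait pattern can be sketched with a harmless stand-in for curl (sleep plus echo here, writing to hypothetical per-job files):

```shell
#!/usr/bin/env bash
tmp=$(mktemp -d)

for ((x=1; x<=5; x++)); do
    # Each job runs in the background, with its own output file.
    { sleep 0.1; echo "done $x"; } > "$tmp/out.$x" 2>&1 &
done

wait    # block until every background job has finished
cat "$tmp"/out.*
```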

Answered by Aman Garg

Adding to @saeed's answer, I created a generic function that uses its arguments to fire a command a total of N times, with at most M jobs running in parallel:

function conc() {
    # $1 = total number of runs, $2 = max parallel jobs, rest = command
    cmd=("${@:3}")
    # -I{} consumes the sequence number so it is not appended to the command
    seq 1 "$1" | xargs -I{} -P"$2" "${cmd[@]}"
}
$ conc N M cmd
$ conc 10 2 curl --location --request GET 'http://google.com/'

This will fire 10 curl commands with a maximum parallelism of 2.

Adding this function to your .bash_profile (or .bashrc) makes it easier to reuse.
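As a quick sanity check, the function can be exercised with echo in place of curl (a sketch, assuming the garbled positional parameters in the original were $1 for the run count and $2 for the job count):

```shell
#!/usr/bin/env bash
function conc() {
    # $1 = total number of runs, $2 = max parallel jobs, rest = command
    cmd=("${@:3}")
    # -I{} consumes the sequence number so it is not appended to the command
    seq 1 "$1" | xargs -I{} -P"$2" "${cmd[@]}"
}

conc 4 2 echo "hello"    # prints "hello" four times, two jobs at a time
```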