Curl: don't wait for response

Disclaimer: this page is a translation of a popular StackOverflow question, provided under the CC BY-SA 4.0 license. You are free to use/share it, but you must attribute it to the original authors (not me). Original question: http://stackoverflow.com/questions/37731609/



Tags: bash, shell

Asked by tuxlu

I currently have a shell script which relies on a curl command like this:


curl --request POST -u name:pass -H "Content-Type: application/json" \
     --data "{data}" https://url.com --cacert ./my_crt

I don't need the response of the command, and this command is in a big for loop, so waiting for the responses takes a lot of time.


So, is there a way in bash to do exactly the same thing, but without waiting for the response?


Answered by that other guy

If you have a large number of requests you want to issue quickly, and you don't care about the output, there are two things you should do:


  1. Do more requests with the same connection.

For small requests, it's generally much faster to do 10 requests on a single connection than 1 request each on 10 connections. For Henry's HTTP post test server, the difference is 2.5x:


$ time for i in {1..10}; do
    curl -F foo=bar https://posttestserver.com/post.php ;
  done
Successfully dumped 1 post variables.
View it at http://www.posttestserver.com/data/2016/06/09/11.44.48536583865
Post body was 0 chars long.
(...)
real    0m2.429s

vs


$ time  {
    array=();
    for i in {1..10}; do
      array+=(--next -F foo=bar https://posttestserver.com/post.php ) ; 
    done; 
    curl "${array[@]}";
 }
Successfully dumped 1 post variables.
View it at http://www.posttestserver.com/data/2016/06/09/11.45.461371907842
(...)
real    0m1.079s
  2. Process at most N connections in parallel, to avoid DoS'ing the host or your machine.

Here sem from GNU parallel limits the number of parallel connections to 4. This is a better version of backgrounding and waiting, since it will always ensure full capacity.


for i in {1..20}
do 
  sem -j 4 curl -F foo=bar https://posttestserver.com/post.php
done
sem --wait
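
Note: sem is shorthand for parallel --semaphore and ships with GNU parallel; sem --wait blocks until all of the queued jobs have finished.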

The number of parallel requests you want depends on how beefy the host is. A realistic number could be 32+.


Combine the two strategies, and you should see a hefty speedup without DoS'ing yourself.

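For example, here is a minimal sketch of the combination (not in the original answer; the URL is the same test server as above, and the batch size and job count are arbitrary placeholders):

# Sketch: send requests in batches of 10 per curl invocation (--next),
# and run at most 4 curl invocations in parallel (sem).
batch=()
for i in {1..100}; do
  batch+=(--next -F foo=bar https://posttestserver.com/post.php)
  if (( ${#batch[@]} == 40 )); then    # 4 array elements per request, 10 requests
    sem -j 4 curl "${batch[@]}"
    batch=()
  fi
done
(( ${#batch[@]} > 0 )) && sem -j 4 curl "${batch[@]}"   # send any remainder
sem --wait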

回答by razz

You could just background it with &, and to prevent output you can redirect stdout and stderr to /dev/null.


curl --request POST -u name:pass -H "Content-Type: application/json" \
     --data "{data}" https://url.com --cacert ./my_crt > /dev/null 2>&1 &
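
If you background the curls inside the loop, adding a wait at the end keeps the script from exiting while requests are still in flight. A sketch using the question's command (name:pass, {data}, and the URL are the question's placeholders):

for i in {1..20}; do
  curl --request POST -u name:pass -H "Content-Type: application/json" \
       --data "{data}" https://url.com --cacert ./my_crt > /dev/null 2>&1 &
done
wait   # block until every backgrounded curl has finished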