
Disclaimer: this page is a translation of a popular StackOverflow question, provided under the CC BY-SA 4.0 license. You are free to use/share it, but you must attribute it to the original authors (not me). Original question: http://stackoverflow.com/questions/2291524/

Time: 2020-08-03 19:46:52 | Source: igfitidea

Does WGET timeout?

linux, cron, wget

Asked by Click Upvote

I'm running a PHP script via cron using Wget, with the following command:

wget -O - -q -t 1 http://www.example.com/cron/run

The script will take a maximum of 5-6 minutes to do its processing. Will WGet wait for it and give it all the time it needs, or will it time out?

Accepted answer by Pascal MARTIN

According to the man page of wget, there are a couple of options related to timeouts -- and there is a default read timeout of 900s -- so I say that, yes, it could time out.

Here are the options in question:

-T seconds
--timeout=seconds

Set the network timeout to seconds seconds. This is equivalent to specifying --dns-timeout, --connect-timeout, and --read-timeout, all at the same time.

And for those three options:

--dns-timeout=seconds

Set the DNS lookup timeout to seconds seconds.
DNS lookups that don't complete within the specified time will fail.
By default, there is no timeout on DNS lookups, other than that implemented by system libraries.

--connect-timeout=seconds

Set the connect timeout to seconds seconds.
TCP connections that take longer to establish will be aborted.
By default, there is no connect timeout, other than that implemented by system libraries.

--read-timeout=seconds

Set the read (and write) timeout to seconds seconds.
The "time" of this timeout refers to idle time: if, at any point in the download, no data is received for more than the specified number of seconds, reading fails and the download is restarted.
This option does not directly affect the duration of the entire download.

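If you want finer control than the blanket `-T`, the three timeouts can also be set individually. A minimal sketch, wrapping the question's command in a function (the timeout values and URL are illustrative assumptions, not taken from the answer):

```shell
# run_cron: wget wrapper with the three timeouts set individually.
# Illustrative values: DNS lookups fail after 10s, TCP connects are
# aborted after 15s, and the 600s read timeout tolerates a 5-6 minute
# server-side script as long as the connection is not idle that long.
run_cron() {
  wget -O - -q -t 1 \
       --dns-timeout=10 \
       --connect-timeout=15 \
       --read-timeout=600 \
       "$1"
}
```

Called as `run_cron http://www.example.com/cron/run`, this keeps DNS and connect failures fast while still giving the long-running script room to finish.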
I suppose using something like


wget -O - -q -t 1 --timeout=600 http://www.example.com/cron/run

should make sure the timeout is longer than the duration of your script.

(Yeah, that's probably the most brutal solution possible ^^ )

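Since the question runs this from cron, the adjusted command would typically live in a crontab entry along these lines (the schedule, and discarding output so cron does not mail it, are illustrative assumptions):

```crontab
# m h dom mon dow  command -- run every 15 minutes (illustrative schedule)
*/15 * * * * wget -O - -q -t 1 --timeout=600 http://www.example.com/cron/run >/dev/null 2>&1
```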
Answered by hIpPy

The default timeout is 900 seconds. You can specify a different timeout.

-T seconds
--timeout=seconds

The default is to retry 20 times. You can specify a different number of tries.

-t number
--tries=number
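The two flags compose naturally: `--tries` caps how often wget restarts after a timeout, while `--timeout` decides when each attempt gives up. A sketch with illustrative values (three attempts, a 600s read timeout per attempt, and `--waitretry` capping the backoff between retries at 30s; none of these values come from the answer):

```shell
# fetch_with_retries: cap both the retry count and the idle timeout.
# Three tries with linear backoff (up to 30s) between failed attempts.
fetch_with_retries() {
  wget -O - -q \
       --tries=3 \
       --timeout=600 \
       --waitretry=30 \
       "$1"
}
```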

Link: wget man page

Answered by Marco Demaio

Since in your question you said it's a PHP script, maybe the best solution could be to simply add in your script:

ignore_user_abort(TRUE);

In this way, even if wget terminates, the PHP script keeps running, at least until it exceeds the max_execution_time limit (ini directive; 30 seconds by default).

As for wget anyway, you should not need to change its timeout: according to the UNIX manual, the default wget timeout is 900 seconds (15 minutes), which is much larger than the 5-6 minutes you need.

Answered by Dean Rather

Prior to version 1.14, wget's timeout arguments were not honored when downloading over https, due to a bug.