PHP: continue processing after closing the connection
Note: this page is a translation of a popular Stack Overflow question, provided under the CC BY-SA 4.0 license. You are free to use and share it, but you must attribute it to the original authors (not me) on Stack Overflow.
Original URL: http://stackoverflow.com/questions/4806637/
Continue processing after closing connection
Asked by Endophage
Is there a way in PHP to close the connection (essentially tell a browser that there's no more data to come) but continue processing? The specific circumstance I'm thinking of is that I would want to serve up cached data, and if the cache had expired, I would still serve the cached data for a fast response, close the connection, but continue processing to regenerate and cache new data. Essentially the only purpose is to make a site appear more responsive, as there wouldn't be the occasional delay while a user waits for content to be regenerated.
UPDATE:
PLuS has the closest answer to what I was looking for. To clarify for a couple of people, I'm looking for something that enables the following steps:
1. User requests page
2. Connection opens to server
3. PHP checks whether the cache has expired; if it is still fresh, serve the cache and close the connection (END HERE). If expired, continue to 4.
4. Serve the expired cache
5. Close the connection so the browser knows it's not waiting for more data.
6. PHP regenerates fresh data and caches it.
7. PHP shuts down.
UPDATE:
This is important: it must be a purely PHP solution. Installing other software is not an option.
Accepted answer by Endophage
I finally found a solution (thanks to Google, I just had to keep trying different combinations of search terms). Thanks to the comment from arr1 on this page (it's about two thirds of the way down the page).
<?php
ob_end_clean();                  // Discard anything already in the output buffer
header("Connection: close");     // Ask the browser to close the connection after this response
ignore_user_abort(true);         // Keep running even if the client disconnects
ob_start();
echo 'Text the user will see';
$size = ob_get_length();
header("Content-Length: $size"); // Tell the browser exactly how much data to expect
ob_end_flush();                  // All output buffers must be flushed here
flush();                         // Force output to client

// Do processing here
sleep(30);
echo('Text user will never see');
I have yet to actually test this, but in short you send two headers: one that tells the browser exactly how much data to expect, and one that tells it to close the connection (which it will only do after receiving the expected amount of content).
Answered by Adam Jimenez
If running under FastCGI you can use the very nifty:
fastcgi_finish_request();
http://php.net/manual/en/function.fastcgi-finish-request.php
More detailed information is available in a duplicate answer.
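A minimal sketch of how this could be applied to the caching scenario from the question, assuming PHP-FPM (where fastcgi_finish_request() is available); the cache file path and the regeneration step are placeholders:
<?php
// Sketch: serve whatever is cached (possibly stale), complete the response
// for the client, then rebuild the cache without the client waiting.
$cacheFile = sys_get_temp_dir() . '/page.cache';   // placeholder path

if (is_file($cacheFile)) {
    echo file_get_contents($cacheFile);            // fast (possibly stale) response
}

if (function_exists('fastcgi_finish_request')) {
    fastcgi_finish_request();                      // the browser has its answer now
}

// Everything below runs after the connection has been handed back.
$fresh = 'regenerated at ' . date('c');            // stand-in for the expensive work
file_put_contents($cacheFile, $fresh, LOCK_EX);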
Answered by PLuS
You can do that by setting the time limit to unlimited and ignoring user aborts:
<?php
ignore_user_abort(true);   // keep running even if the browser disconnects
set_time_limit(0);         // remove the script execution time limit
see also: http://www.php.net/manual/en/features.connection-handling.php
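For illustration, a minimal sketch of what these two calls buy you, roughly following the connection-handling page linked above; the echoed text and the log message are placeholders:
<?php
ignore_user_abort(true);   // keep running if the browser disconnects
set_time_limit(0);         // lift the execution time limit (often blocked on shared hosts)

echo 'Cached content goes here';
flush();                   // try to push the output towards the client

// The slow regeneration keeps running regardless of the client.
for ($i = 0; $i < 5; $i++) {
    sleep(1);
    // connection_aborted() only updates after PHP tries to send output.
    error_log("still working, step $i, aborted=" . connection_aborted());
}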
Answered by R. van Twisk
PHP doesn't have such persistence (by default). The only way I can think of is to run cron jobs to pre-fill the cache.
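For example, a cron-driven pre-fill could look roughly like this sketch; the schedule, path and generated content are all hypothetical:
<?php
// warm_cache.php - run from cron, e.g. every 5 minutes:
//   */5 * * * * php /path/to/warm_cache.php
// so web requests only ever read an up-to-date cache file.
$cacheFile = '/var/cache/myapp/page.html';                    // placeholder path

$html = '<html><body>built ' . date('c') . '</body></html>'; // stand-in for real rendering

file_put_contents($cacheFile . '.tmp', $html);
rename($cacheFile . '.tmp', $cacheFile);                      // atomic swap: readers never see a partial file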
Answered by Alfred
Can compile and run programs from PHP-CLI (not on shared hosting; VPS or better)
Caching
For caching I would not do it that way. I would use redis as my LRU cache. It is going to be very fast (benchmarks), especially when you use a client library written in C.
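A hedged sketch of that caching idea, assuming the phpredis extension is installed; the key name and TTL are made up:
<?php
$redis = new Redis();
$redis->connect('127.0.0.1', 6379);

$key  = 'page:home';
$page = $redis->get($key);

if ($page === false) {                // cache miss: rebuild and store with a TTL
    $page = 'expensive page content'; // stand-in for real rendering
    $redis->setex($key, 300, $page);  // keep it for 5 minutes
}

echo $page;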
Offline processing
When you install the beanstalkd message queue you can also do delayed puts. But I would use redis brpop/rpush to do the other message-queuing part, because redis is going to be faster, especially if you use the PHP client library (in C user-space).
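Roughly what the rpush/brpop pattern looks like with phpredis; the queue name, payload and the process() helper are illustrative only:
<?php
// Producer side (inside the web request): queue the slow job and return at once.
$redis = new Redis();
$redis->connect('127.0.0.1', 6379);
$redis->rPush('jobs', json_encode(array('task' => 'rebuild-cache', 'page' => '/home')));

// Consumer side would be a separate long-running CLI worker, roughly:
//   $job = $redis->brPop(array('jobs'), 0);   // blocks until a job arrives
//   // $job[0] is the list name, $job[1] the JSON payload
//   process(json_decode($job[1], true));      // process() is a hypothetical handler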
Can NOT compile or run programs from PHP-CLI (on shared hosting)
set_time_limit
Most of the time set_time_limit is not available (because of safe mode or the max_execution_time directive), so you can't set it to 0, at least on shared hosting. Shared hosting providers also really don't like users holding up PHP processes for a long time; most of the time the default limit is set to 30 seconds.
Cron
Use cron to write data to disk using Cache_lite. Some Stack Overflow topics already explain this:
- crontab with wget - why is it running twice?
- Bash commands not executed when through cron job - PHP
- How can I debug a PHP CRON script that does not appear to be running?
This is also rather easy, but still hacky. I think you should upgrade (to a VPS or better) when you have to do such hacking.
Asynchronous request
As a last resort you could do an asynchronous request, caching the data with Cache_lite for example. Be aware that shared hosting providers do not like you holding up a lot of long-running PHP processes. I would use only one background process, which calls another one when it gets near the max_execution_time directive: note the time when the script starts, measure the time spent between a couple of cache calls, and when it approaches the limit fire another asynchronous request. I would use locking to make sure only one process is running. This way I would not annoy the provider, and it can be done. On the other hand, I don't think I would write any of this, because it is kind of hacky if you ask me. When I get to that scale I would upgrade to a VPS.
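One plain-PHP way to fire such an asynchronous request is to open a socket, send the request and close it without reading the response; the host and path here are placeholders:
<?php
// Fire-and-forget request to a (hypothetical) cache-warming script.
$host = 'www.example.com';                         // placeholder
$path = '/warm-cache.php';                         // placeholder

$fp = fsockopen($host, 80, $errno, $errstr, 1);    // 1-second connect timeout
if ($fp) {
    fwrite($fp, "GET $path HTTP/1.1\r\n" .
                "Host: $host\r\n" .
                "Connection: Close\r\n\r\n");
    fclose($fp);                                   // don't wait for the response
}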
Answered by Septagram
As far as I know, unless you're running FastCGI, you can't drop the connection and continue execution (unless you got Endophage's answer to work, which I failed to do). So you can:
- Use cron or anything like that to schedule this kind of task
- Use a child process to finish the job
But it gets worse. Even if you spawn a child process with proc_open(), PHP will wait for it to finish before closing the connection, even after calling exit(), die(), or some_undefined_function_causing_fatal_error(). The only workaround I found is to spawn a child process that itself spawns a child process, like this:
function doInBackground ($_variables, $_code)
{
    // Run "php -r" as a child; inside it, pcntl_fork() lets the grandchild keep
    // running after the direct child exits, so this request is not held up.
    // The inner code string must be single-quoted and concatenated so that
    // $argv is evaluated in the child process, not interpolated here.
    proc_open (
        'php -r ' .
        escapeshellarg ('if (pcntl_fork() === 0) { extract (unserialize ($argv [1])); ' . $_code . ' }') .
        ' ' . escapeshellarg (serialize ($_variables)),
        array(), $pipes
    );
}
$message = 'Hello world!';
$filename = tempnam (sys_get_temp_dir(), 'php_test_workaround');
$delay = 10;
doInBackground (compact ('message', 'filename', 'delay'), <<< 'THE_NOWDOC_STRING'
// Your actual code goes here:
sleep ($delay);
file_put_contents ($filename, $message);
THE_NOWDOC_STRING
);
Answered by Justin Ethier
Answered by Lightness Races in Orbit
No. As far as the webserver is concerned, the request from the browser is handled by the PHP engine, and that's that. The request lasts as long as the PHP does.
You might be able to fork(), though.
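A sketch of that idea, assuming the pcntl extension is available (usually only under the CLI; forking inside a web SAPI such as mod_php or PHP-FPM is generally discouraged), with regenerate_cache() as a hypothetical stand-in for the real work:
<?php
// Sketch only: the parent goes back to finishing the response,
// the child carries on with the slow work.
$pid = pcntl_fork();

if ($pid === -1) {
    regenerate_cache();        // fork failed: hypothetical fallback, do it inline
} elseif ($pid === 0) {
    regenerate_cache();        // child: hypothetical slow work
    exit(0);
} else {
    // parent: nothing left to do here; the response has already been sent
}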