apache php/timeout/connection to server reset?

Note: this page is a translation of a popular StackOverflow question and its answers, provided under the CC BY-SA 4.0 license. If you reuse or share it, you must do so under the same license and attribute the original authors (not the translator). Original question: http://stackoverflow.com/questions/1610420/

Date: 2020-09-13 18:18:46  Source: igfitidea

php/timeout/connection to server reset?

Tags: php, mysql, apache, reset

Asked by Mickey

I have a php script that needs to run for quite some time.

What the script does:

  • connects to mysql
  • initiates anywhere from 100 to 100,000 cURL requests
  • each cURL request returns compact-decoded data for 1 to 2,000 real estate listings - I use preg_match_all to pull out all the data and do one MySQL insert per listing; no single query exceeds 1 MB of data. (A rough sketch of this loop follows right after this list.)
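
For context, a minimal sketch of the kind of loop being described; the feed URL, the regex, the table name, and the use of mysqli (instead of the old mysql_* functions) are illustrative assumptions, not the actual script:

<?php
// Hypothetical sketch of the fetch/parse/insert loop described above.
$mysqli = new mysqli('localhost', 'user', 'pass', 'listings_db');

$listingUrls = ['http://example.com/feed?page=1']; // really 100 to 100,000 URLs

foreach ($listingUrls as $url) {
    $ch = curl_init($url);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    $body = curl_exec($ch);
    curl_close($ch);

    // Pull every listing out of the response in one pass.
    preg_match_all('/<listing>(.*?)<\/listing>/s', $body, $matches);

    $stmt = $mysqli->prepare('INSERT INTO listings (raw_data) VALUES (?)');
    foreach ($matches[1] as $listing) {
        $stmt->bind_param('s', $listing);
        $stmt->execute(); // one insert per listing
    }
    $stmt->close();
}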

So there are a lot of loops, MySQL inserts, and cURL requests going on. PHP safe mode is off, and I am able to successfully ini_set max_execution_time to something ridiculously high to allow my script to run all the way through.

Well, my problem is that the script, or Apache, or something, has a stroke in the middle of the run and the browser goes to the "connection to the server has been reset" screen.

Any ideas?

Accepted answer by Nathan Kleyn

Well, disregarding the fact that attempting 100,000 cURL requests is absolutely insane, you're probably hitting the memory limit.

Try setting the memory limit to something more reasonable:

ini_set('memory_limit', '256M');

And as a side tip, don't set the execution time to something ludicrous; chances are you'll eventually find a way to hit it with a script like this. ;]

Instead, just set it to 0; it's functionally equivalent to turning the execution limit off completely:

ini_set('max_execution_time', 0);

Answered by timdev

Lots of ideas:

1) Don't do it inside an HTTP request. Write a command-line PHP script to drive it. You can use a web-bound script to kick it off, if necessary (see the first sketch after this list).

2) You should be able to set max_execution_time to zero (or call set_time_limit(0)) to ensure you don't get shut down for exceeding a time limit.

3) It sounds like you really want to refactor this into something more sane. Think about setting up a little job-queueing system, and having a PHP script that forks several children to chew through all the work (see the second sketch after this list).

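A minimal sketch of suggestion 1), assuming a hypothetical worker script at /path/to/import.php; the web-facing script only launches it in the background and returns immediately:

<?php
// kickoff.php - hypothetical web-facing script that only starts the job.
exec('nohup php /path/to/import.php > /tmp/import.log 2>&1 &');
echo 'Import started; watch /tmp/import.log for progress.';

And a rough sketch of suggestion 3), assuming the pcntl extension is available (it is CLI-only) and that a urls.txt file stands in for the real job queue:

<?php
// worker.php - split the URL list among a few forked children (CLI only).
$jobs    = file('urls.txt', FILE_IGNORE_NEW_LINES);
$workers = 4;
$chunks  = array_chunk($jobs, max(1, (int) ceil(count($jobs) / $workers)));

foreach ($chunks as $chunk) {
    $pid = pcntl_fork();
    if ($pid === -1) {
        die("fork failed\n");
    }
    if ($pid === 0) {
        // Child: fetch, parse, and insert its share of URLs, then exit.
        foreach ($chunk as $url) {
            // ... cURL + preg_match_all + insert for one URL ...
        }
        exit(0);
    }
}

// Parent: wait for every child to finish.
while (pcntl_wait($status) > 0);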

As Josh says, look at your error_log and see why you're being shut down right now. Try to figure out how much memory you're using -- that could be a problem. Try setting the max_execution_time to zero. Maybe that will get you where you need to be quickly.

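If the error_log itself isn't accessible, one quick (illustrative) way to see how close the script gets to the limit is to log peak usage as it runs:

<?php
// Log peak memory against the configured limit at points during the run.
error_log(sprintf(
    'peak memory: %.1f MB (memory_limit: %s)',
    memory_get_peak_usage(true) / 1048576,
    ini_get('memory_limit')
));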

But in the long run, it sounds like you've got way too much work to do inside of one http request. Take it out of http, and divide and conquer!

Answered by ChronoFish

You can make the timeout indefinite by modifying your php.ini and setting the script execution time variable (max_execution_time).

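For reference, this is the php.ini equivalent of the ini_set calls shown above:

; php.ini
max_execution_time = 0    ; 0 means no execution time limit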

But you may also want to consider a slight architecture change. First, consider a "launch and forget" approach to issuing those 100,000 requests. Second, consider using "wget" instead of cURL.

You can issue a simple "wget URL -O UniqueFileName &". This will retrieve a web page, save it to a "unique" filename, and do it all in the background. (Note the capital -O; a lowercase -o would write wget's log to the file instead of the page.)

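A hedged sketch of firing those background downloads from PHP; the queue directory and urls.txt list are placeholders:

<?php
// Launch-and-forget: start one background wget per URL and return immediately.
$urls = file('urls.txt', FILE_IGNORE_NEW_LINES);
foreach ($urls as $i => $url) {
    $file = escapeshellarg("/tmp/queue/page_$i.html");
    shell_exec('wget -q -O ' . $file . ' ' . escapeshellarg($url) . ' > /dev/null 2>&1 &');
    // shell_exec returns at once; wget keeps running in the background.
}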

Then you can iterate over a directory of files, grepping (preg_match-ing) the data and making your DB calls. Move each file to an archive as you process it, and keep iterating until there are no more files.

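And a rough sketch of that consumer loop, assuming the same hypothetical /tmp/queue plus a /tmp/archive directory; the parsing and insert step is the same pattern as in the sketch near the top of the page:

<?php
// Treat the download directory as a queue: parse each file, then archive it.
foreach (glob('/tmp/queue/*.html') as $path) {
    $html = file_get_contents($path);

    // preg_match_all + one insert per listing, as sketched earlier.

    rename($path, '/tmp/archive/' . basename($path)); // move out of the queue
}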

Think of the directory as a "queue" and have one process that just processes the files. Have a second process that simply goes out and grabs the web-page data. You could add a third process as your "monitor", which works independently and simply reports snapshot statistics. The other two can just be "web services" with no interface.

This type of multi-threading is really powerful and greatly under-utilized IMHO. To me this is the true power of the web.

Answered by MoR

I had the same problem when getting data from MySQL via PHP that contained special characters like umlauts (ä, ö, ü), ampersands, etc. The connection was reset and I found no errors in either the Apache log or the PHP logs. First I made sure in PHP that I accessed the character set on the DB correctly with:

mysql_query("SET NAMES 'latin1' COLLATE 'latin1_german2_ci'");

mysql_query("SET CHARACTER SET 'latin1'");

Then, finally, I resolved the problem with this line in PHP:

mysql_query("SET character_set_connection='latin1'");

Answered by Josh

What's in the Apache error_log? Are you reaching the memory limit?

EDIT: Looks like you are reaching your memory limit. Do you have access to php.ini? If so, you can raise memory_limit there. If not, try running the curl or wget binaries using the exec or shell_exec functions; that way they run as separate processes, not using PHP's memory.

Answered by Byron Whitlock

100,000 cURL requests??? You are insane. Break that data up!
