Original question: http://stackoverflow.com/questions/1010402/
Warning: this content is provided under the CC BY-SA 4.0 license. You are free to use and share it, but you must attribute it to the original authors (not me): StackOverflow
How to go about fixing a memory leak in PHP
Asked by thomasrutter
My PHP app has an import script that can import records.
At the moment, it is importing from a CSV file. It is reading each line of the CSV file, one line at a time using fgetcsv, and for each line it is doing a lot of processing on that record, including database queries, and then moving on to the next line. It shouldn't need to keep accumulating more memory.
After around 2500 records imported, PHP dies, saying that it has run over its memory limit (132 MB or so).
The CSV file itself is only a couple of megs - the other processing that happens does a lot of string comparisons, diffs, etc. I have a huge amount of code operating on it and it would be difficult to come up with a 'smallest reproducing sample'.
What are some good ways to go about finding and fixing such a problem?
Cause of problem found
I have a debug class which logs all my database queries during runtime. So those strings of SQL, some 30KB long, were staying in memory. I realise this isn't suitable for scripts designed to run for a long time.
There may be other sources of memory leaks, but I am fairly sure this is the cause of my problem.
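For reference, one way to stop such a logger from holding every query in memory is to cap what it retains. A minimal sketch (the QueryLog class and its methods are hypothetical stand-ins, not the actual code from my app):

// Hypothetical sketch: a debug logger that bounds its own memory by
// keeping only the most recent queries instead of all of them.
class QueryLog
{
    private $queries = array();
    private $maxEntries;

    public function __construct($maxEntries = 100)
    {
        $this->maxEntries = $maxEntries;
    }

    public function log($sql)
    {
        $this->queries[] = $sql;
        // Drop the oldest entry so a long-running import can't grow without bound.
        if (count($this->queries) > $this->maxEntries) {
            array_shift($this->queries);
        }
    }
}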
Accepted answer by lpfavreau
It would help to have a look at the code, but if you want to debug it yourself, have a look at Xdebug; it'll help profile your application.
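If you can't install Xdebug, a cruder alternative is to sample memory_get_usage() inside the import loop and see where it climbs. A rough sketch (the filename and the processing step are placeholders):

$fp = fopen('import.csv', 'r'); // placeholder filename
$i = 0;
while (($row = fgetcsv($fp)) !== false) {
    // ... process $row ...
    if (++$i % 500 === 0) {
        // A steadily climbing number means something is accumulating.
        echo "Row $i: ", memory_get_usage(), " bytes\n";
    }
}
fclose($fp);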
Of course, depending on what you are doing, it is possible it's accumulating some memory, although 132 MB already seems high for 2500 records. Of course, you can tweak your memory_limit in php.ini if needed.
How big is the CSV file you are reading? And what objects and kind of processing are you doing to it?
Answered by too much php
If you do in fact suspect that there are just one or two memory leaks in your script which are causing it to crash, then you should take the following steps:
- Change memory_limit to something small, like 500KB.
- Comment out all but one of the processing steps which is applied to each row.
- Run the limited processing over the whole CSV file and see if it can complete.
- Gradually add more steps back in and watch to see if memory usage spikes.
Example:
ini_set('memory_limit', 1024 * 500);
$fp = fopen("test.csv", 'r');
while($row = fgetcsv($fp)) {
validate_row($row); // step 1: validate
// add these back in one by one and keep an eye on memory usage
//calculate_fizz($row); // step 2: fizz
//calculate_buzz($row); // step 3: buzz
//triangulate($row); // step 4: triangulate
}
echo "Memory used: ", memory_get_peak_usage(), "\n";
The worst case scenario is that all of your processing steps are moderately inefficient and you will need to optimize all of them.
Answered by stefs
You could try a local installation of PHP 5.3 and call gc_collect_cycles: http://www.php.net/manual/en/function.gc-collect-cycles.php
gc_collect_cycles - Forces collection of any existing garbage cycles
If the situation improves, you've at least verified (one of) the problem(s).
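A sketch of how gc_collect_cycles might be wired into the import loop (the collection interval is arbitrary; gc_enable() and gc_collect_cycles() require PHP 5.3+):

gc_enable(); // make sure the cycle collector is on (PHP 5.3+)
$fp = fopen('import.csv', 'r');
$i = 0;
while (($row = fgetcsv($fp)) !== false) {
    // ... process $row ...
    if (++$i % 1000 === 0) {
        // gc_collect_cycles() returns the number of cycles it freed.
        echo "Row $i: collected ", gc_collect_cycles(), " cycles\n";
    }
}
fclose($fp);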
Answered by Vinko Vrsalovic
It depends on how you are clearing the variables after being done with them.
It looks like you are done with the record but you are still storing the information somewhere. Use unset() to clear variables up if in doubt.
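For example, something along these lines at the end of each iteration (the helper functions are made up to stand in for your processing):

$fp = fopen('import.csv', 'r');
while (($row = fgetcsv($fp)) !== false) {
    $record = build_record($row);  // hypothetical processing helpers
    $diff   = compute_diff($record);
    save_record($record, $diff);

    // Release the per-row data before the next iteration.
    unset($record, $diff);
}
fclose($fp);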
Please provide a minimal reproducing code sample to see where all that memory is going if this doesn't help.
BTW, producing the smallest code sample that will reproduce the problem is a great debugging technique because it forces you to go through the code again, with care.
Answered by UnkwnTech
How are you reading the file? If you're using fread/file_get_contents or other such functions then you are going to consume the entire file size (or however much you load with fread) in memory, as the entire file is loaded at call time. However, if you use fgetcsv it will only read one line at a time; depending on the length of the line, this can be dramatically easier on your memory.
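To illustrate the difference (a sketch; 'import.csv' is a placeholder):

// Loads the whole file at once: peak memory grows with file size.
$everything = file_get_contents('import.csv');

// Streams one line at a time: peak memory stays roughly constant.
$fp = fopen('import.csv', 'r');
while (($row = fgetcsv($fp)) !== false) {
    // ... process $row ...
}
fclose($fp);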
Also make sure that you are reusing as many variables as possible on each loop. Check that there are no arrays with large amounts of data in them.
As a last note, also make sure that you are opening your file before your loop and closing it afterwards:
$fh = fopen(...);
while(true)
{
//...
}
fclose($fh);
You don't really want to be doing this:
while(true)
{
$fh = fopen(...);
//...
fclose($fh);
}
And like others have said, it'll be hard to tell without seeing some code.
Answered by alex
Are you able to change your memory_limit in your php.ini?
Also, could doing unset($var) on variables free up some memory? Could $var = null help too?
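A quick way to test either in your own script (a rough sketch; the actual savings depend on what the variable holds):

$var = str_repeat('x', 5 * 1024 * 1024); // roughly a 5 MB string
echo memory_get_usage(), "\n";

unset($var);        // removes the symbol and frees the value
// or: $var = null; // keeps the symbol but frees the old value
echo memory_get_usage(), "\n";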
See also this question: What's better at freeing memory with PHP: unset() or $var = null
Answered by Jani Hartikainen
It's difficult to say the cause without seeing any code. However, a typical issue is circular references, i.e. object A points to object B and the other way around, which may cause the GC to screw up.
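For example, two objects that reference each other form a cycle that plain reference counting never frees; the cycle collector in PHP 5.3+ can reclaim it (a minimal sketch):

class Node
{
    public $other;
}

$a = new Node();
$b = new Node();
$a->other = $b; // A points to B
$b->other = $a; // B points back to A: a reference cycle

unset($a, $b);  // refcounts never reach zero because of the cycle
echo gc_collect_cycles(), " cycles collected\n"; // PHP 5.3+ reclaims them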
I don't know how you're currently processing the file, but you could attempt to only read the file one row at a time. If you read the whole file at once it may consume more memory.
This is actually one of the reasons I often prefer Python for batch processing tasks.
Answered by captainclam
I was having the same problem, and it was also due to database profiling (Zend_Db_Profiler_Firebug). In my case it was leaking 1 MB per minute. This script was supposed to run for days, so it would crash within a few hours.
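If you're on Zend Framework, the profiler can be switched off for long-running scripts; a sketch, assuming $db is your existing Zend_Db adapter instance:

// Stops the profiler from retaining every query it sees.
$db->getProfiler()->setEnabled(false);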

