Increasing PHP memory_limit. At what point does it become insane?
Disclaimer: this page is a translation of a popular StackOverflow question, provided under the CC BY-SA 4.0 license. You are free to use and share it, but you must do so under the same license and attribute it to the original authors (not me): StackOverflow
Original source: http://stackoverflow.com/questions/1425138/
Asked by Brenton Alker
In a system I am currently working on, there is one process that loads a large amount of data into an array for sorting/aggregating/whatever. I know this process needs optimising for memory usage, but in the short term it just needs to work.
Given the amount of data loaded into the array, we keep hitting the memory limit. It has been increased several times, and I am wondering: is there a point where increasing it becomes generally a bad idea? Or is it only a matter of how much RAM the machine has?
The machine has 2GB of RAM and the memory_limit is currently set at 1.5GB. We can easily add more RAM to the machine (and will anyway).
Have others encountered this kind of issue? And what were the solutions?
Answered by Pascal MARTIN
The memory_limit configuration for PHP running as an Apache module serving web pages has to take into consideration how many Apache processes you can have at the same time on the machine -- see the MaxClients configuration option for Apache.
If MaxClients is 100 and you have 2,000 MB of RAM, a very quick calculation shows that you should not use more than 20 MB (because 20 MB * 100 clients = 2 GB of RAM, i.e. the total amount of memory your server has) for the memory_limit value.
And this is without considering that there are probably other things running on the same server, like MySQL, the system itself, ... and that Apache is probably already using some memory for itself.
Of course, this is also a "worst case scenario" that assumes each PHP page is using the maximum amount of memory it can.
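To make that worst-case arithmetic concrete, here is a minimal sketch; the headroom figure is an assumption for illustration, not something from the original answer:
<?php
// Worst-case memory budget for PHP running under Apache.
// All values are illustrative assumptions.
$totalRamMb = 2000; // total RAM on the machine, in MB
$maxClients = 100;  // Apache MaxClients
$headroomMb = 500;  // assumed reserve for MySQL, the OS and Apache itself

// Each Apache process may grow up to memory_limit, so divide what is
// left after the headroom by the number of simultaneous processes.
$safeLimitMb = ($totalRamMb - $headroomMb) / $maxClients;
echo $safeLimitMb . " MB\n"; // prints "15 MB" with these numbers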
In your case, if you need such a big amount of memory for only one job, I would not increase the memory_limit for PHP running as an Apache module.
Instead, I would launch that job from the command line (or via a cron job), and specify a higher memory_limit specifically in this one case only.
This can be done with the -d option of php, like:
$ php -d memory_limit=1G temp.php
string(2) "1G"
Considering, in this case, that temp.php only contains:
<?php
var_dump(ini_get('memory_limit'));
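If the job is triggered by cron, the same override can go straight into the crontab entry; this is a hedged sketch, with a hypothetical schedule and script path:
# Run the nightly job at 02:00 with its own memory_limit
# (the script path is a placeholder).
0 2 * * * php -d memory_limit=1G /path/to/job.php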
In my opinion, this is way safer than increasing the memory_limit for the PHP module for Apache -- and it's what I usually do when I have a large dataset, or some really heavy stuff I cannot optimize or paginate.
If you need to define several values for the PHP CLI execution, you can also tell it to use another configuration file, instead of the default php.ini, with the -c option:
$ php -c /etc/phpcli.ini temp.php
That way, you have:
- /etc/php.ini for Apache, with a low memory_limit, a low max_execution_time, ...
- /etc/phpcli.ini for batches run from the command line, with virtually no limits
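For instance, a minimal /etc/phpcli.ini might contain something like the following (the values are illustrative; -1 disables the memory limit, and 0 disables the time limit, which is already the CLI default):
; /etc/phpcli.ini -- settings for long-running command-line batches
memory_limit = -1        ; -1 means no memory limit
max_execution_time = 0   ; 0 means no time limit (already the CLI default)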
This ensures your batches will be able to run -- and you'll still have security for your website (memory_limit and max_execution_time being security measures).
Still, if you have the time to optimize your script, you should; for instance, in the kind of situation where you have to deal with lots of data, pagination is a must-have ;-)
Answered by arul
Have you tried splitting the dataset into smaller parts and processing only one part at a time?
If you fetch the data from a file on disk, you can use the fread() function to load smaller chunks, or some sort of unbuffered db query in the case of a database.
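As a rough sketch of the chunked-file approach (the file name and the per-chunk handler are hypothetical placeholders):
<?php
// Read a large input file in 8 KB chunks instead of loading it whole.
$handle = fopen('bigdata.csv', 'rb'); // hypothetical input file
if ($handle === false) {
    die("Cannot open input file\n");
}
while (!feof($handle)) {
    $chunk = fread($handle, 8192);  // at most 8 KB in memory per iteration
    process_chunk($chunk);          // hypothetical per-chunk handler
}
fclose($handle);
On the database side, mysqli can stream rows without buffering the whole result set by passing MYSQLI_USE_RESULT as the result mode to query().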
I haven't checked up on PHP since v3.something, but you could also use a form of cloud computing. A 1 GB dataset seems big enough to be processed on multiple machines.
Answered by zombat
Given that you know there are memory issues with your script that need fixing, and that you are only looking for short-term solutions, I won't address the ways to go about profiling and solving your memory issues. It sounds like you're going to get to that.
So, I would say the main things you have to keep in mind are:
- Total memory load on the system
- OS capabilities
PHP is only one small component of the system. If you allow it to eat up a vast quantity of your RAM, then the other processes will suffer, which could in turn affect the script itself. Notably, if you are pulling a lot of data out of a database, then your DBMS might require a lot of memory in order to create result sets for your queries. As a quick fix, you might want to identify any queries you are running and free the results as soon as possible, to give yourself more memory for a long job run.
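As a hedged sketch of that quick fix, assuming mysqli (the connection details, query, and aggregation helper are hypothetical placeholders):
<?php
// Free each result set as soon as its rows have been consumed,
// rather than keeping every result alive for the whole job.
$db = new mysqli('localhost', 'user', 'secret', 'mydb');  // placeholder credentials
$result = $db->query('SELECT id, amount FROM orders');    // hypothetical query
while ($row = $result->fetch_assoc()) {
    aggregate($row);  // hypothetical per-row aggregation step
}
$result->free();      // release the buffered result set immediately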
In terms of OS capabilities, you should keep in mind that 32-bit systems, which you are likely running on, can only address up to 4GB of RAM without special handling. Often the limit can be much less depending on how it's used. Some Windows chipsets and configurations can actually have less than 3GB available to the system, even with 4GB or more physically installed. You should check to see how much your system can address.
You say that you've increased the memory limit several times, so obviously this job is growing larger and larger in scope. If you're up to 1.5 GB, then even installing 2 GB more RAM sounds like it will just be a short reprieve.
Have others encountered this kind of issue? And what were the solutions?
I think you probably already know that the only real solution is to break down and spend the time to optimize the script soon, or you'll end up with a job that will be too big to run.

