Allowed memory size of 134217728 bytes exhausted (tried to allocate 4294967296 bytes)
Disclaimer: this page is a translation of a popular StackOverflow question, provided under the CC BY-SA 4.0 license. You are free to use and share it, but you must do so under the same license, link to the original, and attribute it to the original authors (not me): StackOverflow
Original question: http://stackoverflow.com/questions/18121619/
Asked by user2637147
My project uses an open source PHP MySQL library https://github.com/ajillion/PHP-MySQLi-Database-Class
But the project reports this error: "Fatal error: Allowed memory size of 134217728 bytes exhausted (tried to allocate 4294967296 bytes) in /home1/flipalbu/public_html/kvisofttest/login-admin/Lib/class.MysqliDb.php on line 422".
My server is: Linux x86_64
PHP version: 5.4.17
MySQL version: 5.5.32
memory_limit = 128M
Line 422: call_user_func_array(array($stmt, 'bind_result'), $parameters);
Query part of the code:
$db = new MysqliDb('LocalHost', 'root', 'PASSWD', 'DB');
$wqdb = $db->query("SELECT * FROM db_table");
foreach ($wqdb as $row) {
    $con .= $row['ID'];
}
echo $con;
Is there any way to solve it?
/** Error Code **/
protected function _dynamicBindResults(mysqli_stmt $stmt)
{
    $parameters = array();
    $results = array();

    // Build one reference per result column so bind_result() can write
    // each fetched value directly into $row.
    $meta = $stmt->result_metadata();
    $row = array();
    while ($field = $meta->fetch_field()) {
        $row[$field->name] = null;
        $parameters[] = & $row[$field->name];
    }

    call_user_func_array(array($stmt, 'bind_result'), $parameters);

    // Copy each fetched row into the result set.
    while ($stmt->fetch()) {
        $x = array();
        foreach ($row as $key => $val) {
            $x[$key] = $val;
        }
        array_push($results, $x);
    }
    return $results;
}
Answered by x4rf41
I read this bug report here: https://bugs.php.net/bug.php?id=51386
Your problem seems to happen because there is a longblob or longtext column in the table.
longtext / longblob have a maximum length of 4294967295 bytes [4GB]; that's why mysqli tries to allocate that much memory for the buffer, to be sure nothing is lost. I would suggest that you use mediumtext (16777215 bytes [16MB] max length) instead, which is usually enough for everything.
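For example, the column type could be changed with a statement along these lines (the table and column names here are placeholder assumptions; back up the data first, since any value longer than 16MB would be truncated):
// Hypothetical example: convert a longtext column `content` of `db_table`
// to mediumtext; adjust the names to match your schema.
$mysqli = new mysqli('LocalHost', 'root', 'PASSWD', 'DB');
$mysqli->query('ALTER TABLE db_table MODIFY content MEDIUMTEXT');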
Update: Because this answer has seen some activity, I am adding this solution from Phil_1984 (see comments):
I use mysqli and, after reading that quote from the PHP dev, adding a $stmt->store_result(); between execute and bind_result seems to fix the issue for me.
=> If you use $stmt->store_result(), you can use mysqli with longblob / longtext without getting the error.
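In the library from the question, one place to add that call would be in _dynamicBindResults(), just before the bind. A rough sketch of the relevant lines (not an official patch of that library):
$stmt->store_result();  // added: buffer the result set so bind_result()
                        // sizes its buffers to the actual data, not 4GB
call_user_func_array(array($stmt, 'bind_result'), $parameters);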
-
Old answer: I suggest that you either change the column to another type (mediumtext) or use PDO (I think it doesn't have that problem). But if you want to keep the column as longtext, you have to switch your MySQL library.
Quote from PHP Dev:
This is a known limitation of ext/mysqli when using libmysql (always in 5.2 and previous) and when libmysql is enabled with 5.3 . The reason is that the server sends not too specific metadata about the column. This longtext has a max length of 4G and ext/mysqli tries to bind with the max length, to be sure no data loss occurs (data doesn't fit in the bind buffer on C level). However, that means 4G for a longtext/longblob column. ext/mysqli has been changed to have a way to work around that. You need to call mysqli_stmt_store_result() which will store the data locally, which means, of course a higher memory usage for PHP. However, because you use libmysql this won't hit the PHP's memory limit, for sure. During store_result the max_length of every column will be calculated and then when bind_result is executed only a buffer with size of max_length will be allocated, which will be definitely lower than 4G. In short, prepare execute store_result bind_result fetch...fetch...fetch
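With plain mysqli, the sequence the PHP dev describes looks roughly like this (the connection details, query, and column names are placeholders, not taken from the question):
$mysqli = new mysqli('LocalHost', 'root', 'PASSWD', 'DB');
$stmt = $mysqli->prepare('SELECT ID, some_longtext_col FROM db_table');
$stmt->execute();
$stmt->store_result();            // buffer rows locally; max_length is now known
$stmt->bind_result($id, $text);   // buffers sized to the real data, not 4GB
while ($stmt->fetch()) {
    echo $id, "\n";
}
$stmt->close();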
Answered by GordonM
If you're trying to read in an entire table in one go, and that table has a lot of rows and columns, then running out of memory is inevitable. You can put it off by upping the memory limit in php.ini, but the problem will only recur when you add a few thousand more rows.
You need to rewrite your script to be more sensible about what it fetches. If you only need specific records, then pulling down the entire table and looking for the row(s) you want in the result set is horribly inefficient. Use a WHERE clause to specify what you really want to get. The rule of thumb with PHP/SQL applications is "Use SQL wherever possible to specify what you want, then do what you need to do with it in PHP".
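A minimal sketch with the library from the question, selecting only what is needed (the single ID column and the hypothetical created_at filter are assumptions, not from the original post):
$db = new MysqliDb('LocalHost', 'root', 'PASSWD', 'DB');
$rows = $db->query("SELECT ID FROM db_table WHERE created_at >= '2013-01-01'");
foreach ($rows as $row) {
    echo $row['ID'];
}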
Of course, it may be that there's an entirely legitimate reason why you need to process an entire table in PHP. In that case, you should fetch the data in chunks (say 100 rows at a time) with LIMIT and OFFSET, process those rows, get the next chunk, process those, and so on until you've gone through the entire table. That will use far less memory than trying to load the entire table at once.
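A rough sketch of that chunked approach, again using only the query() call shown in the question (the chunk size and column name are illustrative assumptions):
$db = new MysqliDb('LocalHost', 'root', 'PASSWD', 'DB');
$offset = 0;
$chunk = 100;
do {
    $rows = $db->query("SELECT ID FROM db_table LIMIT $chunk OFFSET $offset");
    foreach ($rows as $row) {
        // ... process each row here ...
    }
    $offset += $chunk;
} while (count($rows) === $chunk);  // stop once a partial (or empty) chunk comes back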
Answered by Patrick Brösamle
Doesn't seem like a huge table! Seems like an endless loop! It tries to allocate about 4 GB; I don't think you have such a big table...
Check that you don't create a loop here:
call_user_func_array(array($stmt, 'bind_result'), $parameters);
Maybe you should post the code that is around this line.
Answered by vicentazo
You are exceeding the maximum available memory. You have two options:
Increase the maximum memory allowed for each PHP script, either by configuration (the memory_limit directive in php.ini) or at execution time by using ini_set('memory_limit', '200M').
Improve the code to handle only the required information.
改进代码以仅处理所需的信息。