php - Why does PHPExcel not allow writing more than 5000 rows

Disclaimer: this page is based on a popular StackOverflow question and is provided under the CC BY-SA 4.0 license. If you use or share it, you must follow the same CC BY-SA license, cite the original source, and attribute it to the original authors (not me). Original: http://stackoverflow.com/questions/4895230/


Why PHPExcel does not allow to write more than 5000 rows

php, phpexcel

Asked by Prabhu M

Can anyone please tell me why PHPExcel does not allow more than 5000 rows? I am using the open-source PHPExcel library for report generation in my projects, and I cannot write more than 5000 rows of data from a MySQL database. My result set fetches 7230 records when the query is executed. How do I fix it?


Answered by Mark Baker

Almost certainly this is a timeout or a memory issue. The only PHPExcel limits on worksheet size are 65,536 rows and 256 (IV) columns (when using the Excel5 Writer), or 1,048,576 rows and 16,384 (XFD) columns (when using the Excel2007 Writer).


Ensure that your error logging is always enabled... use try/catch blocks to trap for any PHPExcel Exceptions. And read the PHPExcel site discussion threads on memory and performance.

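As a minimal sketch of that advice (the require path, worksheet contents, and output file name below are assumptions for illustration, not part of the original answer):

error_reporting(E_ALL);
ini_set('display_errors', '1');
ini_set('log_errors', '1');

require_once 'PHPExcel.php'; // adjust to your PHPExcel install path

try {
    $objPHPExcel = new PHPExcel();
    $sheet = $objPHPExcel->getActiveSheet();
    // ... populate $sheet from your MySQL result set here ...

    $objWriter = PHPExcel_IOFactory::createWriter($objPHPExcel, 'Excel5');
    $objWriter->save('report.xls');
} catch (PHPExcel_Exception $e) {
    // PHPExcel_Exception is thrown by recent PHPExcel versions
    error_log('PHPExcel error: ' . $e->getMessage());
} catch (Exception $e) {
    error_log('General error: ' . $e->getMessage());
}

With logging enabled, a job that silently stops around row 5000 will at least leave a timeout or out-of-memory message in the log instead of producing a truncated file.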

Answered by Vlad

I had the same problem. You will need to allocate a large enough execution time and memory limit.


I have tested my solution on 3 different servers; here is the result:


About 5000 records (12 columns)


Reading file:
09:48:22 Peak memory usage: 1.25 MB


Reading data:
09:48:31 Peak memory usage: 54.5 MB


After indexing data into an array:
09:48:35 Peak memory usage: 68.5 MB


Records: 4504


I increased the memory and time limits to read 22,000 records; after indexing, peak memory usage went up to 370.00 MB.

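For reference, measurements in that form can be produced with a small helper along these lines (the helper name and labels are just illustrative, not from the original answer):

function logPeakMemory($label) {
    // prints something like "Reading data:" / "09:48:31 Peak memory usage: 54.5 MB"
    echo $label . "\n" . date('H:i:s') . ' Peak memory usage: '
        . round(memory_get_peak_usage(true) / 1024 / 1024, 2) . " MB\n";
}

logPeakMemory('Reading file:'); // call once after each stage you want to profile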

Here is the solution (given that everything else in the code sequence is correct), at the point where you call PHPExcel in your program/function:


ini_set("max_execution_time", 'time_limit'); //see manual

Do all initialization here so that all objects are ready, then allocate memory for reading the file and indexing the data into the program's internal structure:


ini_set('memory_limit', '???'); // your memory limit as a string, e.g. '512M'
$excel = $excelReader->load($filePath);
echo "Memory usage: " . (memory_get_peak_usage(true) / 1024 / 1024) . " MB";
// do the rest of the structure!

A good idea is to manage all this by processing the data in categories, so that you don't run into 400 MB of memory usage - PRONE TO ERRORS!


Answered by Shrikant Jadhav

Almost certainly this is a timeout or a memory issue. The only PHPExcel limits on worksheet size are 65,536 rows and 256 (IV) columns (when using the Excel5 Writer), or 1,048,576 rows and 16,384 (XFD) columns (when using the Excel2007 Writer).


You can change this line


$objWriter = PHPExcel_IOFactory::createWriter($objPHPExcel, 'Excel5');

to:


$objWriter = PHPExcel_IOFactory::createWriter($objPHPExcel, 'Excel2007');

Then it allows you to write more than 65,536 rows.

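Put together, a minimal write with the Excel2007 writer could look like this (the cell contents and output file name are assumptions for illustration):

$objPHPExcel = new PHPExcel();
$objPHPExcel->getActiveSheet()->setCellValue('A1', 'Hello');
// ... fill in the remaining rows from your result set ...

// The Excel2007 (.xlsx) writer raises the limit to 1,048,576 rows
$objWriter = PHPExcel_IOFactory::createWriter($objPHPExcel, 'Excel2007');
$objWriter->save('report.xlsx');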

Answered by Damien Pirsy

Without having your code or the class's code it is quite difficult, I believe... Do you mean you can't write more than 5k rows in an XLS file, or inside a worksheet? Otherwise, an ugly workaround could be writing 5K rows in the first sheet and the rest in the second (so 5K rows per sheet, if the DB gets bigger). I don't think XLS has a 5k-row limitation, so there must be something wrong or misconfigured in your script. Have you tried several times? Does it always print 5k rows? Or could it be due to timeouts (of your script or connection)?

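A rough sketch of that split-across-worksheets workaround, assuming $result is your MySQL result set (all other names are illustrative, not from the original answer):

$rowsPerSheet = 5000;
$objPHPExcel  = new PHPExcel();
$sheet        = $objPHPExcel->getActiveSheet();
$sheet->setTitle('Sheet 1');

$rowInSheet = 1;
$sheetIndex = 1;
while ($row = $result->fetch_assoc()) {
    if ($rowInSheet > $rowsPerSheet) {            // current sheet is full, start a new one
        $sheet = $objPHPExcel->createSheet();
        $sheet->setTitle('Sheet ' . ++$sheetIndex);
        $rowInSheet = 1;
    }
    $col = 0;
    foreach ($row as $value) {
        // PHPExcel's column index is 0-based here
        $sheet->setCellValueByColumnAndRow($col++, $rowInSheet, $value);
    }
    $rowInSheet++;
}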

Answered by DoctorLouie

This type of issue is more than likely a server memory issue. What type of server are you on, and are you sure it has enough memory and resources available to process large data files? If you can't tell either way, the best workaround is to read in a few thousand records at a time, process them, and then move on to the next chunk. I myself would prefer to break the large data file into manageable pieces (files), then process each of those pieces to get my desired outcome. Once all pieces are processed, they can be merged together to make a new large data file.

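As an illustration of that chunked approach, here is a hedged sketch that assumes a mysqli connection ($mysqli) and an id-ordered table named report_data (both names are assumptions):

$chunkSize   = 2000;
$offset      = 0;
$objPHPExcel = new PHPExcel();
$sheet       = $objPHPExcel->getActiveSheet();
$excelRow    = 1;

do {
    // fetch one manageable chunk at a time instead of the whole result set
    $result = $mysqli->query(
        "SELECT * FROM report_data ORDER BY id LIMIT $offset, $chunkSize"
    );
    $fetched = 0;
    while ($row = $result->fetch_assoc()) {
        $col = 0;
        foreach ($row as $value) {
            $sheet->setCellValueByColumnAndRow($col++, $excelRow, $value);
        }
        $excelRow++;
        $fetched++;
    }
    $result->free();
    $offset += $chunkSize;
} while ($fetched === $chunkSize); // stop when the last chunk comes back short

$objWriter = PHPExcel_IOFactory::createWriter($objPHPExcel, 'Excel2007');
$objWriter->save('large_report.xlsx');

Note that the workbook itself still lives in memory, so this mainly keeps the database fetch manageable; for very large exports you would still need the memory-limit and cell-caching advice from the other answers.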