Disclaimer: this page is a Chinese-English translation of a popular StackOverflow question, provided under the CC BY-SA 4.0 license. If you use or share it, you must do so under the same CC BY-SA terms, cite the original address, and attribute it to the original authors (not me): StackOverflow
Original question: http://stackoverflow.com/questions/14396998/
How to clear memory to prevent "out of memory error" in excel vba?
Asked by Marc L
I am running VBA code on a large spreadsheet. How do I clear the memory between procedures/calls to prevent an "out of memory" issue occurring?
Thanks
Answered by assylias
The best way to help memory to be freed is to nullify large objects:
Sub Whatever()
Dim someLargeObject As SomeObject
'expensive computation
Set someLargeObject = Nothing
End Sub
Also note that global variables remain allocated from one call to another, so if you don't need persistence you should either not use global variables or nullify them when you don't need them any longer.
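As a minimal sketch of that advice (the module-level variable name is made up for illustration), a global that is no longer needed can be released the same way:

```vba
' Module-level object reference: stays allocated between calls
Private gBigObject As Object   ' hypothetical name

Sub ReleaseGlobals()
    ' Release the reference once persistence is no longer needed,
    ' so the object behind it can be reclaimed
    Set gBigObject = Nothing
End Sub
```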
However this won't help if:
- you need the object after the procedure (obviously)
- your object does not fit in memory
Another possibility is to switch to a 64-bit version of Excel, which should be able to use more RAM before crashing (32-bit versions are typically limited to around 1.3 GB).
Answered by Arne Larsson
I've found a workaround. At first it seemed it would take up more time, but it actually makes everything work smoother and faster due to less swapping and more memory available. This is not a scientific approach and it needs some testing before it works.
In the code, make Excel save the workbook every now and then. I had to loop through a sheet with 360 000 rows and it choked badly. After every 10 000 rows I made the code save the workbook, and now it works like a charm even on 32-bit Excel.
If you start Task Manager at the same time you can see the memory utilization go down drastically after each save.
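A rough sketch of the idea (the 10 000-row interval, the sheet index, and the per-row work are placeholders):

```vba
Sub ProcessWithPeriodicSave()
    Dim r As Long, lastRow As Long
    With ThisWorkbook.Worksheets(1)
        lastRow = .Cells(.Rows.Count, 1).End(xlUp).Row
        For r = 1 To lastRow
            ' ... expensive per-row work on .Rows(r) here ...
            If r Mod 10000 = 0 Then
                ThisWorkbook.Save   ' periodic save; memory use drops after each save
            End If
        Next r
    End With
End Sub
```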
Answered by Dreamwalker
The answer is that you can't do it explicitly, but you should be freeing memory in your routines.
Some tips, though, to help with memory:
- Make sure you set objects to Nothing before exiting your routine.
- Ensure you call Close on objects if they require it.
- Don't use global variables unless absolutely necessary
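Put together, those tips look roughly like this (the ADODB recordset is only an illustrative object that has a Close method; any similar resource applies):

```vba
Sub TidyRoutine()
    Dim rs As Object
    Set rs = CreateObject("ADODB.Recordset")
    ' ... use the recordset ...
    If rs.State = 1 Then rs.Close   ' 1 = adStateOpen: close before releasing
    Set rs = Nothing                ' release the reference before exiting
End Sub
```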
I would recommend checking the memory usage after performing the routine repeatedly; you may have a memory leak.
Answered by Ryszard Jędraszyk
If you operate on a large dataset, it is very likely that arrays will be used. For me, creating a few arrays from a worksheet with 500 000 rows and 30 columns caused this error. I solved it simply by using the line below to get rid of an array I no longer needed, before creating another one:
Erase vArray
Also, if only 2 columns out of 30 are used, it is a good idea to create two 1-column arrays instead of one with 30 columns. It doesn't affect speed, but there will be a difference in memory usage.
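A sketch of both points together (the row count and the column positions are assumptions for illustration):

```vba
Sub UseNarrowArrays()
    Dim vColA As Variant, vColB As Variant
    With ActiveSheet
        ' Read only the two columns actually needed, not all 30
        vColA = .Range(.Cells(1, 1), .Cells(500000, 1)).Value
        vColB = .Range(.Cells(1, 5), .Cells(500000, 5)).Value
    End With
    ' ... work with vColA ...
    Erase vColA   ' free the array before allocating the next one
    ' ... work with vColB ...
    Erase vColB
End Sub
```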
Answered by Stevetb77
Found this thread looking for a solution to my problem. Mine required a different solution, which I figured out and which might be of use to others. My macro was deleting rows, shifting up, and copying rows to another worksheet. Memory usage was exploding to several gigabytes and causing "out of memory" after processing only around 4000 records. What solved it for me?
Application.ScreenUpdating = False
I added that at the beginning of my code (be sure to set it back to True at the end). I knew that would make it run faster, which it did, but I had no idea about the memory effect.
After making this small change, the memory usage didn't exceed 135 MB. Why did that work? No idea, really. But it's worth a shot and might apply to you.
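The pattern, with the flag restored even if the macro errors out partway through (a sketch; the body is a placeholder):

```vba
Sub FastMacro()
    On Error GoTo Cleanup
    Application.ScreenUpdating = False
    ' ... row deletions and copies here ...
Cleanup:
    Application.ScreenUpdating = True   ' always restore at the end
End Sub
```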
Answered by TekiusFanatikus
I was able to fix this error by simply initializing a variable that was being used later in my program. At the time, I wasn't using Option Explicit in my class/module.
Answered by Mike
I had a similar problem that I resolved myself... I think it was partially my code hogging too much memory while too many "big things" were open in my application: the workbook goes out and grabs another department's "daily report", and I extract all the information our team needs (to minimize mistakes and data entry).
I pull in their sheets directly... but I hate the fact that they use merged cells, which I get rid of (i.e. unmerge, then find the resulting blank cells and fill them with the values from above).
I made my problem go away by:
a) unmerging only the "used cells", rather than merely attempting to do the entire column; i.e. finding the last used row in the column and unmerging only that range (there are literally thousands of rows on each of the sheets I grab)
b) knowing that Undo only keeps roughly the last 16 events, between each "unmerge" I put 15 events that clear out what is stored in the Undo stack, to minimize the amount of memory held up (i.e. go to some cell with data in it and copy / paste-special value). I was GUESSING that the accumulated sum of 30 sheets, each with 3 columns' worth of data, might be taxing the memory set aside for undoing.
Yes, it doesn't allow any chance of an Undo, but the entire purpose is to purge the old information and pull in the new time-sensitive data for analysis, so it wasn't an issue.
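Step a) might look something like this (using column A as the anchor column is an assumption about the sheet layout):

```vba
Sub UnmergeUsedCells()
    Dim lastRow As Long
    With ActiveSheet
        ' Find the last used row, then unmerge only the used range
        ' instead of the whole column
        lastRow = .Cells(.Rows.Count, "A").End(xlUp).Row
        .Range(.Cells(1, 1), .Cells(lastRow, 1)).UnMerge
    End With
End Sub
```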
Sounds corny, but my problem went away.