Out of Memory allocLargeObjectOrArray from ResultSet in Java

Disclaimer: this page is a Chinese-English translation of a popular StackOverflow question, provided under the CC BY-SA 4.0 license. If you use or share it, you must do so under the same license and attribute it to the original authors (not me). Original question: http://stackoverflow.com/questions/940800/

Date: 2020-10-29 14:29:13  Source: igfitidea

Tags: java, jdbc, bea

Asked by Nick

I'm using JDBC to get a large amount of data. The call completes successfully, but when resultSet.next() is called, I get the following error:

java.lang.OutOfMemoryError: allocLargeObjectOrArray - Object size: 15414016, Num elements: 7706998

I've attempted to increase the JVM memory size, but this does not fix the problem. I'm not sure this problem can even be addressed as I'm not using JDBC to access a database, rather, the system is accessing a BEA AquaLogic service through JDBC.

Has anyone run into this error?

Accepted answer by Kosi2801

Beware that the results may not actually be read from the database, or may still sit in some caching structure, until the first resultSet.next() call.

You should try to limit your SELECT to return a sane number of results, and if you need all the data, repeat the call until no results remain.

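A minimal sketch of that batch-and-repeat pattern. An in-memory list stands in for the database table here so the sketch is self-contained; in real code `fetchBatch` would execute a `PreparedStatement` with LIMIT/OFFSET (or keyset pagination), and the class and method names are hypothetical:

```java
import java.util.ArrayList;
import java.util.List;

public class BatchedFetch {
    // Simulates "SELECT ... LIMIT batchSize OFFSET offset" against an
    // in-memory table; a real version would run a PreparedStatement.
    static List<Integer> fetchBatch(List<Integer> table, int offset, int batchSize) {
        if (offset >= table.size()) return new ArrayList<>();
        int end = Math.min(offset + batchSize, table.size());
        return new ArrayList<>(table.subList(offset, end));
    }

    // Repeat the call until no more rows come back, processing each batch
    // as it arrives instead of materialising the whole result set in memory.
    static int processAll(List<Integer> table, int batchSize) {
        int processed = 0;
        int offset = 0;
        while (true) {
            List<Integer> batch = fetchBatch(table, offset, batchSize);
            if (batch.isEmpty()) break;
            processed += batch.size();   // stand-in for per-row work
            offset += batch.size();
        }
        return processed;
    }

    public static void main(String[] args) {
        List<Integer> table = new ArrayList<>();
        for (int i = 0; i < 10_000; i++) table.add(i);
        System.out.println(processAll(table, 1_000)); // prints 10000
    }
}
```

Only one batch is ever held in memory at a time, so the peak footprint is bounded by the batch size rather than by the total row count.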
Increasing the JVM memory size won't help unless you can be sure that there is an absolute limit on the amount of data which will be returned by your JDBC call.

Furthermore, accessing any service through JDBC essentially boils down to using JDBC :)

Another (unlikely) possibility is a bug in the JDBC driver you're using. Try a different implementation if possible and check whether the problem persists.

Answered by ikelly

First, figure out whether you really need that much data in memory at once. RDBMSs are good at aggregating/sorting/etc. over large data sets, and you should try to take advantage of that where possible.

If not (and you really, really do need that much data in working memory for some reason), and bumping up the JVM's memory args doesn't raise the bar enough, look into an in-memory distributed caching solution like Coherence (COTS) or Terracotta (open source).

Answered by rafn

How many rows are you returning from the database? Like kosi2801, I would suggest fetching only a subset of the data: start with a reasonable number, then increase it to find the threshold.

Answered by Charlie

You can try calling the setFetchSize(int rows) method on your statement.
But setFetchSize is only a hint, which means the driver may not honor it.

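To illustrate the call, here is a sketch. The dynamic-proxy Statement below is a stand-in that merely records the hint, so the example runs without a database or driver; a real program would obtain its Statement from Connection.createStatement(), and the class and method names here are hypothetical:

```java
import java.lang.reflect.Proxy;
import java.sql.SQLException;
import java.sql.Statement;

public class FetchSizeDemo {
    // Apply a row-prefetch hint before executing a large query.
    // Per the JDBC spec this is only a hint; a driver may ignore it.
    static void applyFetchHint(Statement stmt, int rows) throws SQLException {
        stmt.setFetchSize(rows);
    }

    // Stand-in Statement (a dynamic proxy) that just records the hint,
    // so this sketch runs without a real connection.
    static Statement recordingStatement() {
        final int[] recorded = {0};
        return (Statement) Proxy.newProxyInstance(
                Statement.class.getClassLoader(),
                new Class<?>[]{Statement.class},
                (proxy, method, args) -> {
                    switch (method.getName()) {
                        case "setFetchSize": recorded[0] = (Integer) args[0]; return null;
                        case "getFetchSize": return recorded[0];
                        default: return null; // other Statement methods unused here
                    }
                });
    }

    public static void main(String[] args) throws Exception {
        Statement stmt = recordingStatement(); // real code: connection.createStatement()
        applyFetchHint(stmt, 500);
        System.out.println("fetch size hint = " + stmt.getFetchSize()); // prints 500
    }
}
```

Note that some drivers need extra configuration before they actually stream rows; for example, MySQL's Connector/J historically required fetchSize to be set to Integer.MIN_VALUE to enable row-by-row streaming.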
Answered by Peter Lawrey

Try increasing the memory size to 1.2 GB, e.g. -Xmx1200m, or to something just under the physical memory of your machine. You may find it is reading more data at once than you think.
