Exception in thread "main" java.lang.OutOfMemoryError: GC overhead limit exceeded

Disclaimer: this page is a Chinese-English translation of a popular StackOverflow question, provided under the CC BY-SA 4.0 license. If you use or share it, you must do so under the same license and attribute it to the original authors (not me): StackOverflow. Original question: http://stackoverflow.com/questions/6709742/

Date: 2020-10-30 16:58:18  Source: igfitidea

Exception in thread "main" java.lang.OutOfMemoryError: GC overhead limit exceeded

Tags: java, out-of-memory

Asked by Oscar

I can't run my process. It gives the following exception: "Exception in thread "main" java.lang.OutOfMemoryError: Java heap space"


java -Xms32m -Xmx516m FilteringSNP_genus
Exception in thread "main" java.lang.OutOfMemoryError: Java heap space
        at java.util.Arrays.copyOf(Arrays.java:2882)
        at java.lang.AbstractStringBuilder.expandCapacity(AbstractStringBuilder.java:100)
        at java.lang.AbstractStringBuilder.append(AbstractStringBuilder.java:515)
        at java.lang.StringBuffer.append(StringBuffer.java:306)
        at java.io.BufferedReader.readLine(BufferedReader.java:345)
        at java.io.BufferedReader.readLine(BufferedReader.java:362)
        at FilteringSNP_genus.main(FilteringSNP_genus.java:65)


I have tried different memory usage configurations like:


java -Xms32m -Xmx1024m FilteringSNP_genus


but it has not worked, and increasing the -Xmx value has given a "GC overhead limit exceeded" exception:


Exception in thread "main" java.lang.OutOfMemoryError: GC overhead limit exceeded
        at java.lang.String.substring(String.java:1940)
        at java.util.StringTokenizer.nextToken(StringTokenizer.java:335)
        at FilteringSNP_genus.main(FilteringSNP_genus.java:77)


Could anyone provide some clue to fix this?


Thanks


Answered by Ryan Stewart

I'd hazard a guess that you're reading from a file or a socket and using readLine() for convenience without considering the implications. Try reading into a char[] buffer instead.

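A minimal sketch of that approach, assuming the input is a plain text file (the question does not show the real file name, so "input.txt" below is a placeholder):

    import java.io.FileReader;
    import java.io.IOException;
    import java.io.Reader;

    public class ChunkedRead {
        public static void main(String[] args) throws IOException {
            // Read the file in fixed-size chunks: memory use is bounded by the
            // buffer size, no matter how long any single "line" in the file is.
            char[] buffer = new char[8192];
            Reader in = new FileReader("input.txt");   // placeholder file name
            try {
                int n;
                while ((n = in.read(buffer)) != -1) {
                    // process buffer[0..n) here instead of accumulating it
                }
            } finally {
                in.close();
            }
        }
    }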

Alternately, you're reading lines in and storing hard references to them in memory, so you obviously will run out of room once you read enough.

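In that case the fix is to process each line as it is read and let it become garbage, rather than collecting every line in a list. A sketch of that pattern (the startsWith filter is only an illustration, not the actual logic of FilteringSNP_genus):

    import java.io.BufferedReader;
    import java.io.FileReader;
    import java.io.IOException;

    public class StreamLines {
        public static void main(String[] args) throws IOException {
            long kept = 0;
            BufferedReader in = new BufferedReader(new FileReader("input.txt")); // placeholder file name
            try {
                String line;
                while ((line = in.readLine()) != null) {
                    // Derive whatever you need from the line (a count, an output record, ...)
                    // and do NOT store the String itself in a List or Map.
                    if (line.startsWith("#")) {
                        kept++;
                    }
                }
            } finally {
                in.close();
            }
            System.out.println("matching lines: " + kept);
        }
    }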

As for the GC overhead error, according to an Oracle JVM article: "The parallel collector will throw an OutOfMemoryError if too much time is being spent in garbage collection: if more than 98% of the total time is spent in garbage collection and less than 2% of the heap is recovered, an OutOfMemoryError will be thrown."

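For completeness, that check can be switched off with the HotSpot flag -XX:-UseGCOverheadLimit, for example:

java -Xms32m -Xmx1024m -XX:-UseGCOverheadLimit FilteringSNP_genus

but this only trades the "GC overhead limit exceeded" message for a plain "Java heap space" error later; the program still has to use less memory or be given more.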

Answered by RalphChapin

I have found the GC will give up if it takes too long to find memory, even if the memory is actually there. The simplest problem is when the memory is mostly virtual, which is vastly slower than real memory. Also if memory is too fragmented, the GC can take time to find the space it needs. And if you are allocating big chunks of memory, that can make the situation much worse. A problem like this can be intermittent, working fine when the GC has had time to clean house and keep everything organized and failing when it is overloaded. My guess is that in your case you either have a paging problem or you are using too much of available memory in chunks that are too big.

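One quick way to see which of those it is, is to have the program print what the JVM itself reports about the heap; a minimal sketch using only the standard Runtime API:

    public class HeapStats {
        public static void main(String[] args) {
            // Call something like this periodically from the real program to see
            // whether the heap is genuinely filling up or the GC is just thrashing.
            Runtime rt = Runtime.getRuntime();
            long usedMb  = (rt.totalMemory() - rt.freeMemory()) / (1024 * 1024);
            long totalMb = rt.totalMemory() / (1024 * 1024);
            long maxMb   = rt.maxMemory() / (1024 * 1024);
            System.out.println("heap: " + usedMb + " MB used of " + totalMb
                    + " MB allocated (max " + maxMb + " MB)");
        }
    }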

Solutions: get more real memory (if paging is the problem). Use less memory. Use smaller pieces of memory. Arrays are the fastest way to crunch numbers, but data structures with pointers make life easier for the GC. If you can figure a way to use smaller arrays (or no arrays), do that.

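As a sketch of the "smaller pieces of memory" idea (hypothetical, not taken from the program in the question): instead of one ever-growing array, grow in fixed-size blocks so the GC never has to find a single huge contiguous free region:

    import java.util.ArrayList;
    import java.util.List;

    public class ChunkedIntList {
        private static final int BLOCK = 1 << 18;            // 262,144 ints (~1 MB) per block
        private final List<int[]> blocks = new ArrayList<int[]>();
        private int size = 0;

        public void add(int value) {
            if (size % BLOCK == 0) {
                blocks.add(new int[BLOCK]);                   // allocate one modest block at a time
            }
            blocks.get(size / BLOCK)[size % BLOCK] = value;
            size++;
        }

        public int get(int index) {
            return blocks.get(index / BLOCK)[index % BLOCK];
        }

        public int size() {
            return size;
        }
    }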

It ought to be possible to get a proper 64-bit system (computer and JVM) with 8GB or more of memory so you can ignore and forget this problem, but I've yet to hear of anybody doing that. (And memory use expands to fill the memory available...)
