原文地址: http://stackoverflow.com/questions/1835743/
Warning: this content is provided under the CC BY-SA 4.0 license. You are free to use and share it, but you must attribute it to the original authors (not me): StackOverFlow
Loading a large hprof into jhat
Asked by liam
I have a 6.5GB Hprof file that was dumped by a 64-bit JVM using the -XX:-HeapDumpOnOutOfMemoryError option. I have it sitting on a 16GB 64-bit machine, and am trying to get it into jhat, but it keeps running out of memory. I have tried passing in JVM args for minimum settings, but it rejects any minimum, and seems to run out of memory before hitting the maximum.
It seems kind of silly that a jvm running out of memory dumps a heap so large that it can't be loaded on a box with twice as much ram. Are there any ways of getting this running, or possibly amortizing the analysis?
Accepted answer by broschb
I would take a look at the eclipse memory analyzer. This tool is great, and I have looked at several Gig heaps w/ this tool. The nice thing about the tool is it creates indexes on the dump so it is not all in memory at once.
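The Memory Analyzer can also build those on-disk indexes without opening the GUI at all. A rough sketch of a headless run, assuming a standard MAT install (ParseHeapDump.sh ships with the tool and wraps its org.eclipse.mat.api.parse application; myheap.hprof is a placeholder file name):

```shell
# Parse the dump and generate a leak-suspects report from the command line.
# The index files are written next to the dump, so a later interactive
# session does not need the whole heap in memory at once.
./ParseHeapDump.sh myheap.hprof org.eclipse.mat.api:suspects

# If the parser itself runs out of memory, raise the -Xmx value in
# MemoryAnalyzer.ini before rerunning.
```

This lets a large dump be indexed on a big headless box and then browsed from a smaller machine.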
Answered by Kevin
What flags are you passing to jhat? Make sure that you're in 64-bit mode and you're setting the heap size large enough.
Answered by Joel Hoff
Use the equivalent of jhat -J-d64 -J-mx16g myheap.hprof as a command to launch jhat, i.e., this will start jhat in 64-bit mode with a maximum heap size of 16 gigabytes.
If the JVM on your platform defaults to 64-bit-mode operation, then the -J-d64 option should be unnecessary.
Answered by vimil
I had to load an 11 GB hprof file and couldn't with Eclipse Memory Analyzer. What I ended up doing was writing a program to reduce the size of the hprof file by randomly removing instance information. Once I got the size of the hprof file down to 1GB, I could open it with Eclipse Memory Analyzer and get a clue about what was causing the memory leak.
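The streaming idea behind this answer can be sketched in Java. The class below is a hypothetical illustration, not vimil's actual program: it only walks the top-level records of an hprof file (each record is a 1-byte tag, a 4-byte time delta, a 4-byte body length, then the body) and tallies how many bytes each record type occupies. That is the first step toward shrinking a dump; actually dropping instances would additionally require parsing the sub-records inside HEAP_DUMP/HEAP_DUMP_SEGMENT bodies and tolerating the dangling references that removal creates.

```java
import java.io.BufferedInputStream;
import java.io.DataInputStream;
import java.io.EOFException;
import java.io.IOException;
import java.io.InputStream;
import java.util.Map;
import java.util.TreeMap;

public class HprofScan {

    /** Streams top-level hprof records and returns total body bytes per record tag. */
    public static Map<Integer, Long> scan(InputStream raw) throws IOException {
        DataInputStream in = new DataInputStream(new BufferedInputStream(raw));

        // Header: null-terminated version string (e.g. "JAVA PROFILE 1.0.2"),
        // then identifier size (u4) and a millisecond timestamp (u8).
        int b;
        while ((b = in.read()) > 0) {
            // consume version string bytes
        }
        if (b < 0) throw new EOFException("truncated header");
        in.readInt();   // identifier size (4 or 8), unused here
        in.readLong();  // dump timestamp, unused here

        Map<Integer, Long> bytesPerTag = new TreeMap<>();
        int tag;
        while ((tag = in.read()) >= 0) {        // -1 means clean end of file
            in.readInt();                        // microseconds since header, unused
            long len = in.readInt() & 0xFFFFFFFFL; // body length is an unsigned u4
            bytesPerTag.merge(tag, len, Long::sum);
            long skipped = 0;
            while (skipped < len) {              // skip() may skip fewer bytes than asked
                long n = in.skip(len - skipped);
                if (n <= 0) throw new EOFException("truncated record body");
                skipped += n;
            }
        }
        return bytesPerTag;
    }
}
```

On a real dump, the per-tag totals immediately show how much of the file is heap-dump data (tags 0x0C and 0x1C) versus strings and stack traces, which helps decide what a size-reduction pass could safely strip. Because the file is streamed and record bodies are skipped, memory use stays constant regardless of dump size.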