Can an OOM be caused by not finding enough contiguous memory?
Posted by raticulin on Stack Overflow, 2011-11-29
I start some Java code with -Xmx1024m, and at some point I get an hprof dump due to an OOM. The hprof shows just 320 MB in use, and gives me a stack trace:
at java.util.Arrays.copyOfRange([CII)[C (Arrays.java:3209)
at java.lang.String.<init>([CII)V (String.java:215)
at java.lang.StringBuilder.toString()Ljava/lang/String; (StringBuilder.java:430)
...
This comes from a large string I am copying.
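For context, a minimal sketch (class name and sizes are hypothetical, not my actual code) that exercises the same allocation path as the trace above: StringBuilder.toString() invokes the String(char[], int, int) constructor, which calls Arrays.copyOfRange() to copy the builder's buffer into a fresh char array, so the old and new buffers are both live at that moment:

    // Sketch with hypothetical sizes: toString() on a very large builder
    // forces copyOfRange() to allocate a second char[] the size of the
    // builder's buffer while the original buffer is still reachable.
    public class BigStringOom {
        public static void main(String[] args) {
            StringBuilder sb = new StringBuilder();
            for (int i = 0; i < 200; i++) {
                sb.append(new char[1024 * 1024]); // 1M chars (~2 MB) per pass
            }
            // String.<init> copies the builder's internal buffer via
            // Arrays.copyOfRange(), which is where the OOM in the trace is thrown.
            String s = sb.toString();
            System.out.println(s.length());
        }
    }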
I remember reading somewhere (I cannot find where) that what happens in these cases is:
- the process has not yet consumed 1 GB of memory; it is well below that
- even though the heap is still below 1 GB, the allocation made for copyOfRange() needs a contiguous block of memory, so even though the process is not over the limit yet, it cannot find a large enough free block and fails with an OOM (see the sketch after this list)
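A way to probe this from inside the process, if it helps: log the heap figures right before the big copy. The class name and the 400 MB request below are illustrative, not from the original program. The point is that a Java array is always a single contiguous object, so one large allocation can fail even when (max - used) is bigger than the request, for example when a non-compacting collector such as CMS has left the old generation fragmented:

    // Sketch: print heap usage, then attempt one large contiguous allocation.
    public class HeapProbe {
        public static void main(String[] args) {
            Runtime rt = Runtime.getRuntime();
            System.out.printf("used=%dMB total=%dMB max=%dMB%n",
                    (rt.totalMemory() - rt.freeMemory()) / (1024 * 1024),
                    rt.totalMemory() / (1024 * 1024),
                    rt.maxMemory() / (1024 * 1024));
            // A single ~400 MB char[] must fit in one contiguous block of the
            // Java heap; with a fragmented, non-compacting old generation this
            // can fail well below -Xmx.
            char[] big = new char[200 * 1024 * 1024];
            System.out.println(big.length);
        }
    }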
I have tried to find documentation on this (that copyOfRange() needs a contiguous block of memory), but could not find any.
The other possible culprit would be running out of PermGen space.
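On the PermGen theory: a PermGen exhaustion is reported as "java.lang.OutOfMemoryError: PermGen space" rather than a plain heap OOM inside copyOfRange(), so the trace above seems to argue against it. For completeness, a sketch (class name hypothetical; this assumes a pre-Java-7 HotSpot JVM, where interned strings live in PermGen; run with something like -XX:MaxPermSize=32m) that exhausts PermGen rather than the heap:

    import java.util.ArrayList;
    import java.util.List;

    // Sketch: interning an unbounded number of distinct strings fills
    // PermGen on pre-Java-7 HotSpot, producing "OutOfMemoryError:
    // PermGen space" instead of a heap OOM.
    public class PermGenExhaust {
        public static void main(String[] args) {
            List<String> keep = new ArrayList<String>();
            for (long i = 0; ; i++) {
                keep.add(String.valueOf(i).intern()); // keep a reference so nothing is collected
            }
        }
    }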
Can someone confirm or refute the contiguous-memory hypothesis? Any pointer to relevant documentation would help too.