java.lang.OutOfMemoryError: unable to create new native thread
I consistently get this exception when trying to run my JUnit tests on my Mac:
java.lang.OutOfMemoryError: unable to create new native thread
at java.lang.Thread.start0(Native Method)
at java.lang.Thread.start(Thread.java:658)
at java.util.concurrent.ThreadPoolExecutor.addIfUnderMaximumPoolSize(ThreadPoolExecutor.java:727)
at java.util.concurrent.ThreadPoolExecutor.execute(ThreadPoolExecutor.java:657)
at java.util.concurrent.AbstractExecutorService.submit(AbstractExecutorService.java:92)
at com.google.appengine.tools.development.ApiProxyLocalImpl$PrivilegedApiAction.run(ApiProxyLocalImpl.java:197)
at com.google.appengine.tools.development.ApiProxyLocalImpl$PrivilegedApiAction.run(ApiProxyLocalImpl.java:184)
at java.security.AccessController.doPrivileged(Native Method)
at com.google.appengine.tools.development.ApiProxyLocalImpl.doAsyncCall(ApiProxyLocalImpl.java:172)
at com.google.appengine.tools.development.ApiProxyLocalImpl.makeAsyncCall(ApiProxyLocalImpl.java:138)
The same set of unit tests passes perfectly fine on Ubuntu and Windows.
Some information about my system resources on the mac:
$ ulimit -a
core file size (blocks, -c) 0
data seg size (kbytes, -d) unlimited
file size (blocks, -f) unlimited
max locked memory (kbytes, -l) unlimited
max memory size (kbytes, -m) unlimited
open files (-n) 1024
pipe size (512 bytes, -p) 1
stack size (kbytes, -s) 8192
cpu time (seconds, -t) unlimited
max user processes (-u) 266
virtual memory (kbytes, -v) unlimited
$ java -version
java version "1.6.0_24"
Java(TM) SE Runtime Environment (build 1.6.0_24-b07-334-10M3326)
Java HotSpot(TM) 64-Bit Server VM (build 19.1-b02-334, mixed mode)
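One quick way I can compare the machines is a throwaway probe like the one below (just a sketch of my own, the class name is arbitrary): it keeps starting idle threads until the same OutOfMemoryError shows up and prints how many it managed to create, which should show whether the Mac really allows far fewer native threads than the other boxes.

import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.CountDownLatch;

// Throwaway probe (sketch only, class name is arbitrary): start idle threads
// until thread creation fails, then report how many were created.
public class NativeThreadLimitProbe {
    public static void main(String[] args) {
        final CountDownLatch done = new CountDownLatch(1);
        List<Thread> threads = new ArrayList<Thread>();
        try {
            while (true) {
                Thread t = new Thread(new Runnable() {
                    public void run() {
                        try {
                            done.await(); // park so the thread stays alive
                        } catch (InterruptedException ignored) {
                        }
                    }
                });
                t.start();
                threads.add(t);
            }
        } catch (OutOfMemoryError e) {
            System.out.println("Created " + threads.size() + " threads before: " + e.getMessage());
        } finally {
            done.countDown(); // let all parked threads finish
        }
    }
}

Running it with the same -Xss values I used for the tests would show whether the ceiling moves with the stack size or stays fixed.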
I don't think this is an application issue, because the same tests pass in other environments. I have tried setting the heap to 1024m and 512m and the stack to 64k and 128k (and each combination of these) with no luck. My open files limit was originally 256 and I have bumped it to 1024.
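Since -Xss only changes the default stack size, I can also force a small stack on an individual thread through the standard Thread constructor, which should tell me whether per-thread stack size is the limiting factor at all. A minimal sketch (class name is mine; note that the Thread javadoc says the stackSize argument is only a hint and may be ignored on some platforms):

// Sketch (class name is arbitrary): start a thread with an explicitly small
// stack (256 KB here) instead of relying on -Xss. Per the Thread javadoc the
// stackSize argument is only a hint and may be ignored on some platforms.
public class SmallStackThread {
    public static void main(String[] args) {
        Runnable task = new Runnable() {
            public void run() {
                System.out.println("running on a reduced stack");
            }
        };
        Thread t = new Thread(null, task, "small-stack-thread", 256 * 1024L);
        t.start();
    }
}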
I have been googling around for a bit, and most posts say to decrease the heap size and increase the stack size, but that doesn't seem to help. Does anyone have any more ideas?
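Another thing I could check is how many threads the test run actually creates, since the stack trace shows ApiProxyLocalImpl handing async calls to a ThreadPoolExecutor. A small helper along these lines (my own sketch; class and method names are illustrative) dumps live and peak thread counts via the standard ThreadMXBean, and could be called from a JUnit teardown or a shutdown hook:

import java.lang.management.ManagementFactory;
import java.lang.management.ThreadMXBean;

// Sketch (class/method names are illustrative): report live, peak and total
// started thread counts so the different environments can be compared.
public class ThreadCountReport {
    public static void report(String label) {
        ThreadMXBean threads = ManagementFactory.getThreadMXBean();
        System.out.println(label
                + ": live=" + threads.getThreadCount()
                + ", peak=" + threads.getPeakThreadCount()
                + ", started=" + threads.getTotalStartedThreadCount());
    }

    public static void main(String[] args) {
        // A shutdown hook prints the numbers when the test JVM exits.
        Runtime.getRuntime().addShutdownHook(new Thread(new Runnable() {
            public void run() {
                report("at JVM exit");
            }
        }));
        report("at startup");
    }
}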
EDIT: Here is some environment information from my Ubuntu box:
$ ulimit -a
core file size (blocks, -c) 0
data seg size (kbytes, -d) unlimited
scheduling priority (-e) 20
file size (blocks, -f) unlimited
pending signals (-i) 16382
max locked memory (kbytes, -l) 64
max memory size (kbytes, -m) unlimited
open files (-n) 1024
pipe size (512 bytes, -p) 8
POSIX message queues (bytes, -q) 819200
real-time priority (-r) 0
stack size (kbytes, -s) 8192
cpu time (seconds, -t) unlimited
max user processes (-u) unlimited
virtual memory (kbytes, -v) unlimited
file locks (-x) unlimited
$ java -version
java version "1.6.0_24"
Java(TM) SE Runtime Environment (build 1.6.0_24-b07)
Java HotSpot(TM) 64-Bit Server VM (build 19.1-b02, mixed mode)