Recently I received the error "java.lang.OutOfMemoryError: Unable to create new native thread", and when we debugged the issue on the Linux machine the root cause turned out not to be memory at all. In Java, when the OS refuses to create another native thread because the limit on the number of processes has been reached, the failure is reported as java.lang.OutOfMemoryError, since Java has no dedicated exception for thread creation being denied by a process-count limit.
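The behaviour is easy to reproduce. The following minimal sketch (an assumption on my part: run it only on a test machine whose limits you can safely exhaust, not on shared hardware) keeps starting idle threads until the OS refuses, at which point Thread.start() throws the same error:

public class NativeThreadLimitDemo {
    public static void main(String[] args) {
        int count = 0;
        try {
            while (true) {
                // Each Java thread is backed by a native OS thread,
                // so every one of these counts against the nproc limit.
                Thread t = new Thread(() -> {
                    try {
                        Thread.sleep(Long.MAX_VALUE); // keep the thread alive
                    } catch (InterruptedException ignored) {
                    }
                });
                t.start();
                count++;
            }
        } catch (OutOfMemoryError e) {
            // Typically: "unable to create new native thread"
            System.err.println("Failed after " + count + " threads: " + e);
        }
    }
}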
If you want to check the limits on a Linux machine, run the following command:
$ ulimit -a
core file size (blocks, -c) 0
data seg size (kbytes, -d) unlimited
scheduling priority (-e) 0
file size (blocks, -f) unlimited
pending signals (-i) 62837
max locked memory (kbytes, -l) 64
max memory size (kbytes, -m) unlimited
open files (-n) 16384
pipe size (512 bytes, -p) 8
POSIX message queues (bytes, -q) 819200
real-time priority (-r) 0
stack size (kbytes, -s) 10240
cpu time (seconds, -t) unlimited
max user processes (-u) 1024
virtual memory (kbytes, -v) unlimited
file locks (-x) unlimited
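There is no standard Java API for reading ulimit values, so if you want your application to log the effective limit at startup, one workable sketch (assuming bash is available on the machine) is to shell out for it:

import java.io.BufferedReader;
import java.io.InputStreamReader;

public class UlimitCheck {
    public static void main(String[] args) throws Exception {
        // -S = soft limit, -u = max user processes
        Process p = new ProcessBuilder("bash", "-c", "ulimit -Su").start();
        try (BufferedReader r = new BufferedReader(
                new InputStreamReader(p.getInputStream()))) {
            System.out.println("max user processes (soft): " + r.readLine());
        }
        p.waitFor();
    }
}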
"max user process" define maximum number of child processes/threads a root level unix process can open. There are soft limits and hard limits, soft limits can be set on a process and it applicable to that process and child processes and hard limit is applicable to all processes of that user.
In our case the soft limit was configured to 1024 and our application was trying to create more threads than that.
The limit is defined in the file "/etc/security/limits.d/90-nproc.conf", whose default contents are shown below; we raised the soft nproc value from 1024 to 2048 and our application started working.
# Default limit for number of user's processes to prevent
# accidental fork bombs.
# See rhbz #432903 for reasoning.
* soft nproc 1024
root soft nproc unlimited
In Java, the denial of a native resource by the OS surfaces as java.lang.OutOfMemoryError, so the real reason might not be related to memory at all.