Too many open files

Issue

Jenkins shows a stack trace containing Too many open files, such as:

Caused by: java.io.IOException: Too many open files
	at java.io.UnixFileSystem.createFileExclusively(Native Method)
	at java.io.File.createNewFile(File.java:1006)
	at java.io.File.createTempFile(File.java:1989)

or

java.net.SocketException: Too many open files
	at java.net.PlainSocketImpl.socketAccept(Native Method)
	at java.net.AbstractPlainSocketImpl.accept(AbstractPlainSocketImpl.java:398)

Environment

  • CloudBees Jenkins Enterprise
  • CloudBees Jenkins Operations Center

Resolution

First, check the number of open files allowed at the user level. To see the current limits of your system, run ulimit -a on the command line as the user running Jenkins (usually jenkins, or jenkins-oc if you're running CJOC). You should see something like this:

core file size          (blocks, -c) 0
data seg size           (kbytes, -d) unlimited
scheduling priority             (-e) 30
file size               (blocks, -f) unlimited
pending signals                 (-i) 30654
max locked memory       (kbytes, -l) unlimited
max memory size         (kbytes, -m) unlimited
open files                      (-n) 1024
pipe size            (512 bytes, -p) 8
POSIX message queues     (bytes, -q) 819200
real-time priority              (-r) 99
stack size              (kbytes, -s) 8192
cpu time               (seconds, -t) unlimited
max user processes              (-u) 1024
virtual memory          (kbytes, -v) unlimited
file locks                      (-x) unlimited
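If you administer the box from another account, you don't have to log in as the Jenkins user to perform this check. A quick sketch (the user name jenkins is the one this article assumes; use jenkins-oc for CJOC):

```shell
# Soft limit on open files for the current shell/user:
ulimit -n

# From another account, run the same check as the jenkins user
# (requires sudo; the user name `jenkins` is an assumption):
#    sudo -i -u jenkins sh -c 'ulimit -n'
```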

To increase limits, add these lines to /etc/security/limits.conf:

jenkins      soft   nofile  4096
jenkins      hard   nofile  8192
jenkins      soft   nproc   30654
jenkins      hard   nproc   30654

Note that this assumes jenkins is the Unix user running the Jenkins process. If you're running CJOC, the user is probably jenkins-oc.

You can then log out, log back in, and check that the limits were modified correctly with ulimit -a.

Limits are applied when the Unix user logs in, so you must restart Jenkins for the new limits to take effect.
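Because limits are bound to a process when it starts, the authoritative check after a restart is the /proc entry of the running Jenkins process, not a fresh shell. A sketch, assuming the process can be located by matching jenkins.war with pgrep (adjust the pattern to your installation):

```shell
# Find the Jenkins PID (the jenkins.war pattern is an assumption),
# falling back to the current shell if no match is found:
JENKINS_PID=$(pgrep -f jenkins.war | head -n 1)

# Show the soft and hard "Max open files" limits the process runs with:
grep 'open files' "/proc/${JENKINS_PID:-self}/limits"
```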

If you still encounter open file descriptor issues after raising the limits, a file handle leak may be exhausting any fixed limit over time. To track it down, install the File Leak Detector plugin and consult its documentation for more information.
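To get a quick sense of whether descriptors are leaking before reaching for the plugin, you can watch the descriptor count of the process over time. A sketch, with the same pgrep assumption as above:

```shell
# Count the file descriptors currently held by the Jenkins process;
# a count that climbs steadily toward the soft limit suggests a leak:
JENKINS_PID=$(pgrep -f jenkins.war | head -n 1)
ls "/proc/${JENKINS_PID:-self}/fd" | wc -l
```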

Docker

If Jenkins or a Jenkins slave is running inside a container, you need to increase these limits inside the container. Before Docker 1.6, all containers inherited the ulimits of the Docker daemon. Since Docker 1.6, user limits can be configured per container.

You can change the daemon's default limit to apply it to all containers:

docker -d --default-ulimit nofile=4096:8192

(On newer Docker versions the daemon binary is dockerd; the --default-ulimit flag is the same.)

You can also override default values on a specific container:

docker run --name my-jenkins-container --ulimit nofile=4096:8192 -p 8080:8080 my-jenkins-image ...
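To confirm the limits actually in effect inside a running container, you can execute ulimit within it. A sketch; the container name matches the docker run example above and requires a running Docker daemon:

```shell
# Print the soft and hard open-file limits as seen inside the container:
docker exec my-jenkins-container sh -c 'ulimit -Sn; ulimit -Hn'
```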

Note: by default, Linux sets the nproc limit to the maximum value. It is possible to set an nproc limit for a container, but be aware that nproc is a per-user value, not a "per container" value.

More information about Docker ulimits can be found here: https://docs.docker.com/engine/reference/commandline/run/
More information about the Docker daemon configuration can be found here: https://docs.docker.com/engine/reference/commandline/daemon/

2 Comments

  • Jason Azze

    "Note that this assumes jenkins is the Unix user running the Jenkins process."
    Indeed. And if you're running JOC, the user is probably jenkins-oc. It only took me a couple of hours to figure out I was changing the limits for the wrong user.

  • Steven Christenson

    Based on our results, it may NOT be sufficient to increase the Jenkins limits if the system limit is lower.

    Here is one guide that may help

    In summary:

    1. Check the system limit:   cat /proc/sys/fs/file-max
    2. Increase the system limit:   sudo sysctl -w fs.file-max=100000
    3. Make the increase permanent:  sudo vi /etc/sysctl.conf
         change or add the line "fs.file-max = 100000"
    4. If you are spawning processes that run as other users, you may need to add entries to /etc/security/limits.conf for those users as well.
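    The steps above can be sketched as follows; only the read is executed directly, and the privileged changes are shown as comments (the value 100000 is the one from the steps):

```shell
# 1. Check the current system-wide file-handle ceiling (safe to read):
cat /proc/sys/fs/file-max
# 2. Raise it until the next reboot (needs root):
#      sudo sysctl -w fs.file-max=100000
# 3. Persist across reboots by adding this line to /etc/sysctl.conf:
#      fs.file-max = 100000
```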

    Jenkins may need to be restarted to make all changes take effect.

    If you have the Metrics plugin installed, you can monitor the file-handle ratio (exposed as the metric vm.file.descriptor.ratio).

    There is a known file handle leak in 2.60.x Jenkins that manifests as a constant leak of log files. We've watched Jenkins accumulate 1000 additional file handles per day. 
