Linux IOException: Too many open files

Disclaimer: this page is a translation of a popular StackOverflow question and its answers, provided under the CC BY-SA 4.0 license. If you use or share it, you must do so under the same license and attribute it to the original authors (not me). Original question: http://stackoverflow.com/questions/2044672/

IOException: Too many open files

Tags: java, linux, jetty, ioexception, file-descriptor

Asked by Mirko N.

I'm trying to debug a file descriptor leak in a Java webapp running in Jetty 7.0.1 on Linux.

The app had been happily running for a month or so when requests started to fail due to too many open files, and Jetty had to be restarted.

java.io.IOException: Cannot run program [external program]: java.io.IOException: error=24, Too many open files
    at java.lang.ProcessBuilder.start(ProcessBuilder.java:459)
    at java.lang.Runtime.exec(Runtime.java:593)
    at org.apache.commons.exec.launcher.Java13CommandLauncher.exec(Java13CommandLauncher.java:58)
    at org.apache.commons.exec.DefaultExecutor.launch(DefaultExecutor.java:246)

At first I thought the issue was with the code that launches the external program, but it's using commons-exec and I don't see anything wrong with it:

CommandLine command = new CommandLine("/path/to/command")
    .addArgument("...");
ByteArrayOutputStream errorBuffer = new ByteArrayOutputStream();
Executor executor = new DefaultExecutor();
executor.setWatchdog(new ExecuteWatchdog(PROCESS_TIMEOUT));
executor.setStreamHandler(new PumpStreamHandler(null, errorBuffer));
try {
    executor.execute(command);
} catch (ExecuteException executeException) {
    if (executeException.getExitValue() == EXIT_CODE_TIMEOUT) {
        throw new MyCommandException("timeout");
    } else {
        throw new MyCommandException(errorBuffer.toString("UTF-8"));
    }
}

Listing the open files on the server, I can see a high number of FIFOs:

# lsof -u jetty
...
java    524 jetty  218w  FIFO        0,6      0t0 19404236 pipe
java    524 jetty  219r  FIFO        0,6      0t0 19404008 pipe
java    524 jetty  220r  FIFO        0,6      0t0 19404237 pipe
java    524 jetty  222r  FIFO        0,6      0t0 19404238 pipe

When Jetty starts there are just 10 FIFOs; after a few days there are hundreds of them.

I know it's a bit vague at this stage, but do you have any suggestions on where to look next, or how to get more detailed info about those file descriptors?

Answered by Greg Smith

As you are running on Linux, I suspect you are running out of file descriptors. Check out ulimit. Here is an article that describes the problem: http://www.cyberciti.biz/faq/linux-increase-the-maximum-number-of-open-files/

Answered by alasdairg

I don't know the nature of your app, but I have seen this error manifest multiple times because of a connection-pool leak, so that would be worth checking out. On Linux, socket connections consume file descriptors as well as file-system files. Just a thought.

Answered by allenjsomb

You can handle the fds yourself. exec in Java returns a Process object. Intermittently check whether the process is still running. Once it has completed, close the process's STDERR, STDIN, and STDOUT streams (e.g. proc.getErrorStream().close()). That will mitigate the leaks.

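A minimal sketch of this approach (my own illustration, not code from the answer: `sleep 1` is a placeholder command, and catching IllegalThreadStateException from exitValue() is the pre-Java 8 idiom for checking whether the child is still running):

```java
import java.io.IOException;

public class ProcessStreamCleanup {

    /** Runs a command, polls until it exits, then closes its three pipe streams. */
    public static int runAndClose(String... cmd) throws IOException, InterruptedException {
        Process proc = Runtime.getRuntime().exec(cmd);
        while (true) {
            try {
                proc.exitValue();              // throws while the child is still alive
                break;
            } catch (IllegalThreadStateException stillRunning) {
                Thread.sleep(100);             // not done yet; poll again shortly
            }
        }
        // Close the stdin, stdout and stderr pipes so their file descriptors
        // are released immediately rather than lingering until finalization
        proc.getOutputStream().close();
        proc.getInputStream().close();
        proc.getErrorStream().close();
        return proc.exitValue();
    }

    public static void main(String[] args) throws Exception {
        System.out.println("exit=" + runAndClose("sleep", "1"));
    }
}
```

Note that if the child writes much output, something must still drain its stdout/stderr while it runs; otherwise it can block on a full pipe and never exit, as the next answer explains.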

Answered by ofavre

The problem comes from your Java application (or a library you are using).

First, you should read the output and error streams in their entirety (Google for StreamGobbler), and pronto!

The Javadoc says:

The parent process uses these streams to feed input to and get output from the subprocess. Because some native platforms only provide limited buffer size for standard input and output streams, failure to promptly write the input stream or read the output stream of the subprocess may cause the subprocess to block, and even deadlock.

Second, waitFor() your process to terminate. You should then close the input, output and error streams.

Finally, destroy() your Process.

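Putting those steps together, a sketch of the whole pattern (my own illustration; the gobbler threads, class name, and the `echo` command are assumptions, not code from the answer):

```java
import java.io.BufferedReader;
import java.io.IOException;
import java.io.InputStream;
import java.io.InputStreamReader;

public class GobblerExample {

    /** Drains an InputStream on its own thread so the child never blocks on a full pipe. */
    static Thread gobble(final InputStream in, final StringBuilder sink) {
        Thread t = new Thread(new Runnable() {
            public void run() {
                try {
                    BufferedReader r = new BufferedReader(new InputStreamReader(in));
                    String line;
                    while ((line = r.readLine()) != null) {
                        sink.append(line).append('\n');
                    }
                } catch (IOException ignored) {
                    // stream closed; nothing more to read
                }
            }
        });
        t.start();
        return t;
    }

    public static int run(String... cmd) throws IOException, InterruptedException {
        Process proc = new ProcessBuilder(cmd).start();
        StringBuilder out = new StringBuilder();
        StringBuilder err = new StringBuilder();
        Thread outGobbler = gobble(proc.getInputStream(), out);   // 1. read stdout promptly
        Thread errGobbler = gobble(proc.getErrorStream(), err);   //    ... and stderr

        int exit = proc.waitFor();                                // 2. wait for termination
        outGobbler.join();
        errGobbler.join();

        proc.getOutputStream().close();                           // 3. close all three streams
        proc.getInputStream().close();
        proc.getErrorStream().close();

        proc.destroy();                                           // 4. finally, destroy it
        return exit;
    }

    public static void main(String[] args) throws Exception {
        System.out.println("exit=" + run("echo", "hello"));
    }
}
```

The two gobbler threads cover the first point (read the entire output promptly), waitFor() is the second, the three close() calls follow it, and destroy() is the final cleanup.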

My sources:

Answered by Rami Jaamour

Aside from looking into root-cause issues such as file leaks, to legitimately increase the "open files" limit and have it persist across reboots, consider editing

/etc/security/limits.conf

by adding something like this

jetty soft nofile 2048
jetty hard nofile 4096

where "jetty" is the username in this case. For more details on limits.conf, see http://linux.die.net/man/5/limits.conf

Log off and then log in again, and run

ulimit -n

to verify that the change has taken place. New processes started by this user should now comply with this change. This link seems to describe how to apply the limit to already-running processes, but I have not tried it.

The default limit of 1024 can be too low for large Java applications.

Answered by Vpn_talent

This problem occurs when you are writing data to many files simultaneously and your operating system has a fixed limit on open files. In Linux, you can increase the limit of open files:

https://www.tecmint.com/increase-set-open-file-limits-in-linux/

How do I change the number of open files limit in Linux?
