PHP Warning: exec() unable to fork

Note: this Q&A is adapted from Stack Overflow and is provided under the CC BY-SA 4.0 license. You are free to use and share it, but you must follow CC BY-SA and attribute it to the original authors (not me). Original question: http://stackoverflow.com/questions/20648949/

Tags: php, centos

Asked by Dane Landan Harvey

So here is a little background info on my setup. I'm running CentOS with Apache and PHP 5.2.17. I have a website that lists products from many different retailers' websites, and I have crawler scripts that run to grab products from each one. Since every website is different, each crawler script had to be customized to crawl that particular retailer's website, so basically I have one crawler per retailer. At this time I have 21 crawlers constantly running to gather and refresh the products from these websites.

Each crawler is a PHP file. Once the script is done running, it checks to make sure it's the only instance of itself running, and at the very end of the script it uses exec to start itself all over again while the original instance closes. This helps protect against memory leaks, since each crawler restarts itself before it closes. However, recently I'll check the crawler scripts and notice that one of them isn't running anymore, and in the error log I find the following.

PHP Warning:  exec() [function.exec]: Unable to fork [nice -n 20 php -q /home/blahblah/crawler_script.php >/dev/null &]

This is what is supposed to start this particular crawler over again; however, since it was "unable to fork", it never restarted, and the original instance of the crawler ended like it normally does.

Obviously it's not a permission issue, because each of these 21 crawler scripts runs this exec command every 5 or 10 minutes at the end of its run, and most of the time it works as it should. This seems to happen maybe once or twice a day. It seems as though it's a limit of some sort, as I have only just recently started to see this happen, ever since I added my 21st crawler. And it's not always the same crawler that gets this error; it can be any one of them, at a random time, that is unable to fork its restart exec command.

Does anyone have an idea what could be causing PHP to be unable to fork, or maybe even a better way to handle these processes so as to get around the error altogether? Is there a process limit I should look into, or something of that nature? Thanks in advance for the help!

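For reference, a minimal sketch of the self-restart pattern described above; the lock-file path is hypothetical and the actual crawling is elided, so this is an illustration rather than the real crawler code:

<?php
// Make sure this is the only running instance of this crawler.
$lock = fopen('/tmp/crawler_example.lock', 'c');
if ($lock === false || !flock($lock, LOCK_EX | LOCK_NB)) {
    exit(0); // another instance is already running
}

// ... crawl this retailer's site and refresh its products here ...

// Release the lock, launch a fresh copy of this script, and let this
// instance end, so any memory accumulated during the run is discarded.
flock($lock, LOCK_UN);
fclose($lock);
exec('nice -n 20 php -q ' . __FILE__ . ' >/dev/null &');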

Answered by Jason Heo

Process limit

"Is there a process limit I should look into"

I suspect somebody (the system admin?) has set a limit on the maximum number of user processes. Could you try this?

$ ulimit -a
....
....
max user processes              (-u) 16384
....

Run the preceding command from within PHP. Something like:

echo system("ulimit -a");

I searched php.ini and httpd.conf for such a limit, but I couldn't find one.

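Since the cap doesn't come from php.ini or httpd.conf, one hedged way to see how close the web user is to it is to compare its current process count with the ulimit -u value from within PHP (these are standard CentOS commands, but output details may vary):

<?php
// Rough sketch: count the current user's processes and show the limit.
// The ps output includes a header line, so the count is off by one.
$limit = trim(shell_exec('ulimit -u'));
$count = trim(shell_exec('ps -u "$(id -un)" | wc -l'));
echo "user processes: about $count (limit: $limit)\n";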

Error Handling

"even a better way to handle these processes as to get around the error all together?"

The third parameter of exec() returns the exit code of $cmd: 0 for success, non-zero for an error code. Refer to http://php.net/function.exec.

// $output and $ret_val are passed by reference by exec() itself,
// so no "&" is needed (or allowed) at the call site.
exec($cmd, $output, $ret_val);

if ($ret_val != 0)
{
    // the command failed; log it, retry, alert, etc.
}
else
{
    echo "success\n";
}
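
Applied to the crawler restart from the question, a hedged extension of that check could retry the exec() a few times before giving up; the retry count and delay below are arbitrary, and $cmd is the command from the question's error log:

<?php
$cmd = 'nice -n 20 php -q /home/blahblah/crawler_script.php >/dev/null &';

for ($attempt = 1; $attempt <= 3; $attempt++) {
    $ret_val = -1;                 // in case exec() fails before setting it
    exec($cmd, $output, $ret_val);
    if ($ret_val === 0) {
        break;                     // the restart was launched
    }
    error_log("crawler restart failed (code $ret_val), attempt $attempt");
    sleep(30);                     // give the system a moment to free resources
}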

Answered by Derek Illchuk

In my case (a large PHPUnit test suite) it would say "unable to fork" once the process hit 57% memory usage. So, one more thing to watch for: it may not be a process limit but rather memory.

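If you suspect the same thing, a quick hedged check is to log the PHP process's own memory use at intervals while the suite (or crawler) runs; note this covers only the PHP process itself, not total system memory:

<?php
// Rough sketch: record current and peak memory so an eventual
// "unable to fork" can be correlated with memory growth.
error_log(sprintf(
    'memory: current %.1f MB, peak %.1f MB',
    memory_get_usage(true) / 1048576,
    memory_get_peak_usage(true) / 1048576
));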

Answered by Faisal Ameer

I ran into the same problem, and I tried this and it worked for me:

ulimit -n 4096

Answered by John Foley

The problem is often caused by the system or the process running out of available memory. Make sure that you have enough by running free -m. You will get a result like the following:

             total       used       free     shared    buffers     cached
Mem:          7985       7722        262         19        189        803
-/+ buffers/cache:       6729       1255
Swap:            0          0          0

The buffers/cache line is what you want to look at. Notice that free memory is 1255 MB on this machine. While running your program, keep running free -m and check the free memory to see if it falls into the low hundreds. If it does, you will need to find a way to run your program while consuming less memory.

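In the crawler scenario from the question, one hedged way to apply this is to capture free -m right before the restart exec(), so any fork failure can be matched against low free memory; the column layout of free -m varies between versions, so the raw output is simply logged rather than parsed:

<?php
// Rough sketch: record system memory just before attempting the restart.
error_log("free -m before restart:\n" . shell_exec('free -m'));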

Answered by Ligemer

For anyone else who comes across this issue, it could be one of several problems, as outlined in this question's answers.

However, my problem was that my nginx user did not have a proper shell to execute the commands I wanted. Adding .bashrc to the nginx user's home directory fixed this.

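A hedged way to confirm which account PHP's shell commands actually run as, and which login shell that account has ("nginx" above is just this answer's example user):

<?php
// Rough sketch: a login shell of /sbin/nologin or /bin/false would explain
// commands behaving differently here than in an interactive test.
echo shell_exec('id');
echo shell_exec('getent passwd "$(id -un)"');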