My long-running Laravel 4 command keeps being killed
Disclaimer: this page reproduces a popular StackOverflow question and its answers under the CC BY-SA 4.0 license. You are free to use and share it, but you must attribute it to the original authors (not me) on StackOverflow and retain a link to the original.
Original question: http://stackoverflow.com/questions/27950102/
Asked by DEzra
I have a Laravel 4 web project that implements a Laravel command.
When running in the development Homestead VM, it runs to completion (about 40 seconds total time).
However, when running it on the production server, it quits with a 'Killed' output on the command line.
At first I thought it was the max_execution_time in the CLI php.ini, so I set it to 0 (for unlimited time).
How can I find out what is killing my command?
I run it in an SSH terminal using the standard artisan invocation:
php artisan commandarea:commandname
Does Laravel 4 have a command time limit somewhere?
The VPS is an Ubuntu 4.10 machine with MySQL, nginx and PHP-FPM.
Answered by DEzra
So, firstly, thank you to everyone who has pointed me in the right direction regarding PHP and Laravel memory usage tracking.
I have answered my own question hoping that it will benefit Laravel devs in the future, as my solution was hard to find.
After typing 'dmesg' to show system messages, I found that the PHP script was being killed by Linux.
So, I added memory-logging calls before and after each of the key areas of my script:
Log::info('Memory now at: ' . memory_get_peak_usage());
Then I ran the script while watching the log output and also the output of the 'top' command.
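For reference, a minimal sketch of that kind of checkpoint logging (the checkpoint labels and the $feedPath variable are purely illustrative, not from the original script) might look like:

    // Illustrative checkpoints around the heavy sections of the script.
    Log::info('Before XML parse: ' . round(memory_get_peak_usage() / 1048576, 2) . ' MB');
    $feed = simplexml_load_file($feedPath);   // $feedPath is assumed to be set elsewhere
    Log::info('After XML parse: ' . round(memory_get_peak_usage() / 1048576, 2) . ' MB');
    // ... insert rows here ...
    Log::info('After inserts: ' . round(memory_get_peak_usage() / 1048576, 2) . ' MB');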
I found that even though my methods were ending and the variables were going out of scope, the memory was not being freed.
Things that I tried that DIDN'T make any difference in my case:
- unset($varname) on variables after I had finished with them - hoping to get GC to kick in
- adding gc_enable() at the beginning of the script and then adding gc_collect_cycles() calls after a significant number of vars are unset (a rough sketch follows this list)
- disabling MySQL transactions - thinking maybe that was memory intensive - it wasn't
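For completeness, the garbage-collection attempt looked roughly like the following sketch (the model and mapping helper are hypothetical); it ran fine but made no measurable difference here:

    gc_enable();                               // make sure the cycle collector is enabled

    foreach ($feedItems as $item) {            // $feedItems: the parsed feed entries (assumed)
        $row = ExampleModel::create($this->mapItem($item));  // hypothetical model and mapper
        unset($row, $item);                    // drop references as soon as each row is saved
    }

    gc_collect_cycles();                       // force a collection pass over unset variables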
Now, the odd thing was that none of the above made any difference. My script was still using 150 MB of RAM by the time it was killed!
The solution that actually worked:
Now, this is definitely a Laravel-specific solution, but my script's purpose is basically to parse a large XML feed and then insert thousands of rows into MySQL using the Eloquent ORM.
It turns out that Laravel keeps query-log information and objects in memory to help you see query performance.
By turning this off with the following 'magic' call, I got my script's memory usage down from 150 MB to around 20 MB!
This is the 'magic' call:
DB::connection()->disableQueryLog();
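To show where it sits, here is a minimal sketch of what such a command's fire() method could look like; the feed URL, the Product model and the field mapping are hypothetical, and only the disableQueryLog() call is the actual fix described above:

    public function fire()
    {
        // Stop Laravel 4 from accumulating every executed query in memory.
        DB::connection()->disableQueryLog();

        // Hypothetical feed and Eloquent model, purely for illustration.
        $xml = simplexml_load_file('http://example.com/feed.xml');

        foreach ($xml->item as $item) {
            Product::create(array(
                'title' => (string) $item->title,
                'price' => (float) $item->price,
            ));
        }

        $this->info('Done. Peak memory: ' . memory_get_peak_usage() . ' bytes');
    }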
I can tell you by the time I found this call, I was grasping at straws ;-(
Answered by George Cummins
A process may be killed for several reasons:
Out of Memory
There are two ways to trigger this error: exceed the amount of memory allocated to a PHP script in php.ini (the memory_limit setting), or exceed the available system memory. Check the PHP error log and php.ini file to rule out the first possibility, and use the dmesg output to check for the second.
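As a quick way to check the first possibility from inside the script itself, a small sketch like this logs the configured limit alongside the actual peak usage:

    // Compare the configured per-script limit with what the script actually used.
    $limit  = ini_get('memory_limit');                        // e.g. "128M", or "-1" for no limit
    $peakMb = round(memory_get_peak_usage(true) / 1048576, 2);
    error_log("memory_limit={$limit}, peak={$peakMb} MB");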
Exceeded the execution time-out limit
In your post you indicate that you disabled the timeout via the max_execution_time setting, but I have included it here for completeness. Be sure that the setting in php.ini is correct and (for those using a web server instead of a CLI script) restart the web server to ensure that the new configuration is active.
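For a long-running CLI command, the limit can also be lifted or verified at runtime; a small sketch (the CLI usually defaults to no limit already, but being explicit rules it out as a cause):

    // Remove the execution time limit for this run.
    set_time_limit(0);

    // Or inspect the value actually in effect for the current SAPI:
    error_log('max_execution_time = ' . ini_get('max_execution_time'));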
An error in the stack
If your script is error-free and is not encountering either of the above errors, ensure that your system is running as expected. When using a web server, restart the web server software. Check the error logs for unexpected output, and stop or upgrade related daemons as needed.
Answered by Fujisan
Had this issue on a Laravel/Spark project. Just wanted to share in case others have this issue.
If you are running Vagrant or Ubuntu, try a refresh/restart of your dev server before more aggressive approaches.
I accidentally ran an install of dependency packages on a Vagrant server. I also removed and replaced a mirrored folder repeatedly during install errors. My error was on Laravel/Spark 4.~. I was able to run migrations on other projects, but on one particular project I was getting 'Killed' very quickly (in about a 300 ms timeframe) for nearly all commands. Reading other users' reports, I was dreading trying to track down the issue or corruption. In my case, a quick Vagrant reload did the trick, and the 'Killed' issue was resolved.