Git push - suboptimal pack - out of memory

Note: this page reproduces a popular StackOverflow question and its answers under the CC BY-SA 4.0 license. If you reuse it, you must follow the same license and attribute it to the original authors (not me). Original question: http://stackoverflow.com/questions/9561835/

Tags: git, memory, git-push

Asked by Skittles

I could really use some help here.

I just created a new bare repo to act as a production target for dev pushes. I also have the working web directory on the server as a git repo. The server is running git 1.7.4.1 on CentOS 5.5.

After creating the new repo in the web directory, I performed a git add . This tallied up some 2,300-odd files and over 230k insertions.

I did a commit of the newly added file base; it went nice and clean. When I did a git push origin master, though, it keeps giving me this (please note: I have 8 CPUs, hence the 8 threads; the docs say this is normal):

# git push --mirror
Counting objects: 2000, done.
Delta compression using up to 8 threads.
warning: suboptimal pack - out of memory
fatal: inflateInit: out of memory (no message)
error: failed to push some refs to '/home/ggadmin/gg-prod.git'

I have tried the following things to resolve this, but all yield the same results:

git repack -adf --window-memory=100m
                                ^ tried running this up to 1024m. Same result.

Even tried a force push, but got the same thing, only with a malloc error:

# git push -f origin master
Counting objects: 2000, done.
Delta compression using up to 8 threads.
warning: suboptimal pack - out of memory
fatal: Out of memory, malloc failed (tried to allocate 2340 bytes)
error: failed to push some refs to '/home/ggadmin/gg-prod.git'

I've been working on this for 2 days now and have tried just about everything I can find on Google and here on SO.

I have reached my wits' end trying to get this fixed. Please tell me that someone out there knows what can be done to make this work.

Accepted answer by Vi.

  1. Maybe git is a suboptimal tool for handling large numbers of big blobs.
  2. You can disable multi-threaded compression to save memory: git config pack.threads 1 (in addition to other memory-limiting options, like core.bigFileThreshold in newer Git).
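
These knobs can be combined; here is a hedged sketch of the memory-limiting settings in one place. The sizes are illustrative starting points to tune for your hardware, not recommendations, and core.bigFileThreshold is only available in newer Git (roughly 1.7.6 onward):

```shell
# Run inside the repository; add --global to apply everywhere.
git config pack.threads 1               # single-threaded delta compression
git config pack.windowMemory 100m       # cap memory used for the delta search window
git config pack.deltaCacheSize 64m      # cap the cache of computed deltas
git config core.bigFileThreshold 50m    # store files above this size undeltified (newer Git only)
```

Lower values trade pack size for memory: the push succeeds, but the resulting pack is larger.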

Answered by keremispirli

The following command fixed the issue for me:

git config --global pack.windowMemory 256m

This affects the effectiveness of delta compression, so you might want to try a bigger size first, something like 1g, depending on your hardware and bandwidth.

More details here: https://www.kernel.org/pub/software/scm/git/docs/git-pack-objects.html
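
The tuning loop this answer suggests might look like the following sketch (the sizes are examples only; repacking locally lets you test a setting before pushing):

```shell
git config --global pack.windowMemory 1g   # try a generous per-thread cap first
git repack -adf                            # rebuild packs locally to test the setting
# If this still dies with "out of memory", halve the value and repack again:
git config --global pack.windowMemory 512m
git repack -adf
```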

Answered by miguelbemartin

git config --global pack.threads 1

Answered by Ashitakalax

I had the same issue with a git clone; the repo was 25 GB. I used an alternative command, which for me required root control of the source:

rsync -avz -e ssh --progress user@computerName:repo/Directory destination/folder

After this I was able to commit and pull just like any other repository.

Answered by MrJedi2U

In my case I had previously reduced my server's virtual memory to nothing in order to remove the paging file, so that I could free up the partition and increase the size of my main partition. This had the effect of reducing my working memory, and the result was that git was unable to process large files. After increasing my virtual memory again, all was sorted.
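
On a Linux server, putting swap back can be sketched as follows; the path and size are placeholders, and the commands require root:

```shell
# Create and enable a 2 GB swap file; /swapfile and 2G are example values.
sudo fallocate -l 2G /swapfile   # or: sudo dd if=/dev/zero of=/swapfile bs=1M count=2048
sudo chmod 600 /swapfile         # swap files must not be world-readable
sudo mkswap /swapfile
sudo swapon /swapfile
swapon -s                        # confirm the new swap is active
```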

Answered by Vitaly Zdanevich

None of these answers helped me. My problem was that my little server has 1 GB of RAM and no swap. I ran sudo service apache2 stop and sudo service mysql stop, and killed one unused process from htop (after all of that I had ~100 MB of RAM free), and then git push worked.

Answered by cucu8

I realise this is a bit late in the game, but since some of the above helped me (thanks @Ashitakalax), here are my two cents. Same problem as above (inflateInit: out of memory): when moving changes from a WordPress dev instance upstream to test, git aborts with out of memory, and this is regularly due to changes in the ../uploads/ directory holding image files. All of this is on a shared host with no access to the global git config, so we do:

0- in repo: git commit -m "some relevant details"

to record the changes

1- rsync -av --progress repo/wp-content/uploads/ test/wp-content/uploads

to move the bulk of the image fixes/changes

2- in test: git add -A

to track the new stuff on the test side of things

3- in test: git fetch origin

now fetch the rest from the repo

4- in test: git merge origin/master

and finally merge...

The rsync bit lightens git load and all's well.

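
The five steps above can be put together as one sketch; repo/ and test/ below are stand-ins for the actual dev and test working trees:

```shell
#!/bin/sh
set -e  # stop on the first failing step

cd repo
git commit -m "some relevant details"    # 0- record the changes

# 1- move the bulk of the image fixes/changes outside of git
rsync -av --progress wp-content/uploads/ ../test/wp-content/uploads

cd ../test
git add -A                 # 2- track the new stuff on the test side
git fetch origin           # 3- fetch the rest from the repo
git merge origin/master    # 4- and finally merge
```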