Git push - suboptimal pack - out of memory
Solution 1
- Maybe Git is a suboptimal tool for handling large numbers of big blobs.
- You can disable multi-threaded compression to save memory:
git config pack.threads 1
(in addition to other memory-limiting options, like core.bigFileThreshold
in newer Git)
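The settings above can be combined. A minimal sketch of the memory-limiting configuration, run inside the repository (the values are illustrative, not recommendations, and core.bigFileThreshold requires git >= 1.7.6):

```shell
# Limit Git's memory use during packing. Tune the sizes to your hardware.
git config pack.threads 1             # single-threaded delta compression
git config pack.windowMemory 256m     # cap memory used per delta window
git config pack.packSizeLimit 256m    # cap the size of individual pack files
git config core.bigFileThreshold 50m  # store larger blobs without delta compression
```

All of these reduce peak memory at the cost of larger packs and slower or weaker compression.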
Solution 2
The following command fixed the issue for me:
git config --global pack.windowMemory 256m
This limits the effectiveness of delta compression, so you might want to try a bigger size first, something like 1g, depending on your hardware and bandwidth.
More details here: https://www.kernel.org/pub/software/scm/git/docs/git-pack-objects.html
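To experiment with a larger cap as suggested, a quick sketch (the 1g figure is just an example from the answer above):

```shell
# Raise the per-window memory cap, then confirm the value Git will use.
git config --global pack.windowMemory 1g
git config --get pack.windowMemory   # prints the active value

# To remove the cap again later:
# git config --global --unset pack.windowMemory
```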
Solution 3
git config --global pack.threads 1
Solution 4
I had the same issue with a git clone; the repo was 25 GB. I used an alternative command, which in my case required root access to the source machine:
rsync -avz -e ssh --progress user@computerName:repo/Directory destination/folder
After this I was able to commit and pull just like in any other repository.
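A sketch of that rsync workflow with placeholder host and paths, plus a sanity check on the copy (the git fsck step is a general integrity check added here, not part of the original answer):

```shell
# Copy a large repository with rsync over ssh instead of git clone.
# user@host and both paths are placeholders for your own setup.
rsync -avz -e ssh --progress user@host:/srv/repos/big.git /local/big.git

# Verify the copied repository's object store before working in it:
git -C /local/big.git fsck --no-progress
```

Because rsync transfers the .git object files directly, it sidesteps the pack-building step that runs out of memory, but it requires filesystem-level access to the source repository.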
Solution 5
In my case, I had previously reduced my server's virtual memory to nothing in order to remove the paging file, so that I could free up that partition and increase the size of my main partition. This reduced my available working memory, with the result that Git was unable to process large files. After increasing my virtual memory again, all was sorted.
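A sketch of how to check whether swap is actually available on a Linux server before re-creating it (the 2G size is only an example, and the re-creation commands require root):

```shell
# Check current memory and swap; an empty "Swap:" row means no paging file.
free -h
swapon --show   # lists active swap devices/files (no output = none active)

# Re-creating a swap file (run as root; size is illustrative):
# fallocate -l 2G /swapfile
# chmod 600 /swapfile
# mkswap /swapfile
# swapon /swapfile
```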
Comments
- Skittles (almost 4 years ago):
I could really use some help here.
I just created a new bare repo to act as a production target for dev pushes. I also have the working web directory on the server as a git repo. The server is running git 1.7.4.1 on CentOS 5.5.
After creating the new repo in the web directory, I performed a git add . It tallied up something like 2,300-odd files and over 230k insertions.
I did a commit of the newly added file base. Went nice and clean. When I did a git push origin master, though, it kept giving me this (please note, I have 8 CPUs, hence the 8 threads; the docs say this is normal):
# git push --mirror
Counting objects: 2000, done.
Delta compression using up to 8 threads.
warning: suboptimal pack - out of memory
fatal: inflateInit: out of memory (no message)
error: failed to push some refs to '/home/ggadmin/gg-prod.git'
I have tried the following things to resolve this, but they all yield the same result:
git repack -adf --window-memory=100m
I tried window-memory values up to 1024m. Same result.
I even tried a force push, but got the same thing, only with a malloc error:
# git push -f origin master
Counting objects: 2000, done.
Delta compression using up to 8 threads.
warning: suboptimal pack - out of memory
fatal: Out of memory, malloc failed (tried to allocate 2340 bytes)
error: failed to push some refs to '/home/ggadmin/gg-prod.git'
I've been working on this for 2 days now and have tried just about everything I could find on Google and here on SO.
I'm at my wits' end trying to get this fixed. Please tell me that someone out there knows what can be done to make this work?
- VonC (about 12 years ago): Just to be sure, this has nothing to do with the postBuffer? stackoverflow.com/questions/6842687/…
- Skittles (about 12 years ago): Please explain what you mean, VonC, as that is a new term for me with respect to Git.
- VonC (about 12 years ago): I was wondering if git config --global http.postBuffer 524288000 wouldn't be able to make your push work.
- Skittles (about 12 years ago): I can certainly try that. I'm currently at my office, so I'll have to wait until I get home to see if that works. Thanks, VonC! :)
- Skittles (about 12 years ago): Well Vi... git is running slower than molasses running down a pipe in an arctic summer, but it worked. Thank you!
- Vi. (about 12 years ago): You can consider externalizing big things out of the git repo (while still versioning them), or using some other approach for the task. Git is probably trying to find similar blocks in all your data. Try adjusting the core.bigFileThreshold option (git >= v1.7.6).
- Skittles (about 12 years ago): Thanks again, Vi! Unfortunately, I'm using v1.7.4.1. But I'll keep that at the top of my Git knowledge items.
- Admin (almost 9 years ago): This is the only thing that worked for me. Urgh, I hate still having to deal with old shared servers.