Git push - suboptimal pack - out of memory


Solution 1

  1. Git may be a suboptimal tool for handling large numbers of big blobs.
  2. You can disable multi-threaded compression to save memory: git config pack.threads 1 (in addition to other memory-limiting options, such as core.bigFileThreshold in newer Git).
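As a sketch, the two settings above can be combined; the throwaway repository and the 50m threshold below are illustrative, and core.bigFileThreshold requires git >= 1.7.6:

```shell
# Throwaway repository for demonstration (path chosen by mktemp)
repo=$(mktemp -d)
git init -q "$repo"
cd "$repo"

# Single-threaded delta compression: slower, but far lower peak memory
git config pack.threads 1

# Files above this size are stored without delta compression
# (core.bigFileThreshold needs git >= 1.7.6; 50m is an illustrative value)
git config core.bigFileThreshold 50m
```

Both settings are written to the repository's local .git/config, so they only affect this one repo.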

Solution 2

The following command fixed the issue for me:

git config --global pack.windowMemory 256m

This reduces the effectiveness of delta compression, so you might want to try a larger size first, such as 1g, depending on your hardware and bandwidth.

More details here: https://www.kernel.org/pub/software/scm/git/docs/git-pack-objects.html
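Note that --global writes the setting to your user-level ~/.gitconfig, so it applies to every repository. You can read it back to confirm it took effect (the 256m value matches the command above):

```shell
# Cap the per-thread pack window memory, then read the setting back
git config --global pack.windowMemory 256m
git config --global pack.windowMemory   # prints: 256m
```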

Solution 3

git config --global pack.threads 1

Solution 4

I had the same issue with a git clone. The repo was 25 GB. I used an alternative command; in my case it required root access to the source:

rsync -avz -e ssh --progress user@computerName:repo/Directory destination/folder
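Spelled out, the workaround might look like the sketch below; user, host, and both paths are placeholders from the command above. Because rsync copies the .git directory along with the working tree, the result is a complete repository, not just a snapshot:

```shell
# -a preserves permissions and timestamps, -z compresses in transit,
# --progress reports per-file status; all names below are placeholders
rsync -avz -e ssh --progress user@computerName:repo/Directory destination/folder

# The copy behaves as a normal repository afterwards
cd destination/folder/Directory && git status
```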

After this I was able to commit and pull just like in any other repository.

Solution 5

In my case, I had previously reduced my server's virtual memory to nothing in order to remove the paging file, so that I could free up that partition and enlarge my main partition. This reduced the available working memory, with the result that git was unable to process large files. After increasing my virtual memory again, everything worked.
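If you suspect the same cause, a quick check and fix on a typical Linux box might look like the sketch below (re-creating a swap file requires root, and the 2 GB size and /swapfile path are illustrative):

```shell
# Show configured swap; a Swap total of 0 means no paging file is active
free -m

# Re-create and enable a 2 GB swap file (run as root; size/path illustrative)
dd if=/dev/zero of=/swapfile bs=1M count=2048
chmod 600 /swapfile
mkswap /swapfile
swapon /swapfile
```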

Author by Skittles

Updated on July 29, 2020

Comments

  • Skittles
    Skittles almost 4 years

    I could really use some help here.

    I just created a new bare repo to act as a production target for dev pushes. I also have the working web directory on the server as a git repo. The server is running git 1.7.4.1 on CentOS 5.5.

    After creating the new repo in the web directory, I performed a git add . It tallied up some 2,300 files and over 230k insertions.

    I did a commit of the newly added file base. It went nice and clean. When I did a git push origin master, though, it kept giving me this (please note: I have 8 CPUs, hence the 8 threads; the docs say this is normal):

    # git push --mirror
    Counting objects: 2000, done.
    Delta compression using up to 8 threads.
    warning: suboptimal pack - out of memory
    fatal: inflateInit: out of memory (no message)
    error: failed to push some refs to '/home/ggadmin/gg-prod.git'
    

    I have tried the following things to resolve this, but all yield the same results:

    git repack -adf --window-memory=100m
    # tried window-memory values up to 1024m; same result
    

    I even tried a force push, but got the same thing, only with a malloc error:

    # git push -f origin master
    Counting objects: 2000, done.
    Delta compression using up to 8 threads.
    warning: suboptimal pack - out of memory
    fatal: Out of memory, malloc failed (tried to allocate 2340 bytes)
    error: failed to push some refs to '/home/ggadmin/gg-prod.git'
    

    I've been working on this for 2 days now and have tried just about everything I can find on Google and here on SO.

    I have reached my wits end with trying to get this fixed. Please tell me that someone out there knows what can be done to make this work?

    • VonC
      VonC about 12 years
      Just to be sure, this has nothing to do with the postBuffer? stackoverflow.com/questions/6842687/…
    • Skittles
      Skittles about 12 years
      Please explain what you mean, VonC, as that is a new term for me with respect to Git.
    • VonC
      VonC about 12 years
      I was wondering if git config --global http.postBuffer 524288000 wouldn't be able to make your push work.
    • Skittles
      Skittles about 12 years
      I can certainly try that. I'm currently at my office, so I'll have to wait until I get home to see if that works. Thanks, VonC! :)
  • Skittles
    Skittles about 12 years
    Well Vi...git is running slower than molasses running down a pipe in an arctic summer, but it worked. Thank you!
  • Vi.
    Vi. about 12 years
    You could consider externalizing big files out of the git repo (while still versioning them), or using some other approach for the task. Git is probably trying to find similar blocks in all your data. Try adjusting the core.bigFileThreshold option (git >= v1.7.6).
  • Skittles
    Skittles about 12 years
    Thanks again, Vi! Unfortunately, I'm using v1.7.4.1. But I'll keep that at the top of my Git knowledge items.
  • Admin
    Admin almost 9 years
    This is the only thing that worked for me. Urgh, I hate still having to deal with old shared servers.