branching and merging - How can I recover from "fatal: Out of memory? mmap failed: Cannot allocate memory" in Git?


Let me start with some context:

I had to upgrade a crucial Magento webshop to a new version. To make sure the existing code would still work after the upgrade, and to make the post-upgrade changes, I made a Git repository of the entire Magento installation (excluding the obvious content: 4.5 GB of images, the ./var directory, etc.), pushed it to origin, and cloned it on a dev server. There I made a new branch, performed the upgrades, made code changes, committed to the dev branch, and pushed to origin.

Now the time has come to upgrade the 'real' shop, meaning I have to merge the dev branch into the master branch on the production server. And that's where everything goes wrong:

git fetch - works.

git branch says: * master

git merge origin/dev goes horribly wrong (this is the only output, after a long wait):

fatal: Out of memory? mmap failed: Cannot allocate memory

The same goes for git checkout dev, git rebase master origin/dev, etc.

I did some research here on Stack Overflow among the existing questions and spent an evening trying suggestions, including (but not limited to):

git gc

Counting objects: 48154, done.
Delta compression using 2 threads.
Compressing objects: 100% (37152/37152), done.
fatal: Out of memory, malloc failed (tried to allocate 527338875 bytes)
error: failed to run repack

and:

git repack -a -d --window-memory 10m --max-pack-size 20m

Counting objects: 48154, done.
Delta compression using 2 threads.
Compressing objects: 100% (37152/37152), done.
fatal: Out of memory, malloc failed (tried to allocate 527338875 bytes)
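(A side note for anyone trying the same thing: the limits passed to git repack on the command line can also be persisted in the repository's config, so that a plain git gc respects them too. A sketch with illustrative values matching the flags above; the thread count is an extra knob I added, since fewer delta-compression threads means lower peak memory:)

```shell
# Demo: a throwaway repository with the same limits stored in its config.
cd "$(mktemp -d)" && git init -q .
git config pack.windowMemory 10m
git config pack.packSizeLimit 20m
git config pack.threads 1        # fewer threads => lower peak memory
git config --get pack.windowMemory   # prints "10m"
```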

In addition to the previous command, I tried this (which is pretty similar). That link makes mention of a possible issue on 32-bit systems, so perhaps it's wise to mention the specs of the 3 systems involved:

  • 'dev' server: x86_64 gentoo 2.6.38-hardened-r6 // 4 cores & 8gb ram
  • 'origin' server: x86_64 gentoo 2.6.38-hardened-r6 // 2 cores & 4gb ram
  • 'live' server: x86_64 debian 4.3.2-1.1 2.6.35.5-pv1amd64 // (vps) 2 cores & 3gb ram

Does anyone know how I can recover from this? Would repacking on origin work? If it does, how can I convince the production server to fetch a fresh copy of the repository? Any help is appreciated!

The error you're getting comes from large files in the repository. Git is trying to put the entire contents of a file in memory, which makes it croak.

Try upgrading Git

Git 1.7.6 was released last month and has this lovely bit in its release notes:

Adding a file larger than core.bigfilethreshold (defaults to 1/2 Gig) using "git add" will send the contents straight to a packfile without having to hold it and its compressed representation both at the same time in memory.

Upgrading to 1.7.6 might enable you to run git gc and maybe even git merge, but I can't verify this because it's hard to reproduce your repository state (the conditions must be just right).
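(After upgrading, you can also lower that threshold below its 512 MB default, so that smaller-but-still-large blobs are likewise stored without delta compression. A sketch; the 100m value is an arbitrary choice of mine, not something from the release notes:)

```shell
# Demo: lower core.bigFileThreshold so blobs above it skip deltification,
# capping Git's peak memory use during add/gc.
cd "$(mktemp -d)" && git init -q .
git config core.bigFileThreshold 100m   # illustrative value; default is 512m
git config --get core.bigFileThreshold  # prints "100m"
```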

Try removing the offending files

If upgrading Git doesn't help, you can try removing the large files from the repository using git filter-branch. Before that, try backing up the large files using git cat-file -p <commit_sha1>:path/to/large/file >/path/to/backup/of/large/file.
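A sketch of both steps on a throwaway demo repository (the file names here are stand-ins; substitute your real offender, and run it on a clean work tree since filter-branch refuses otherwise):

```shell
# Demo setup: a fresh repo tracking one "large" file (tiny here).
cd "$(mktemp -d)" && git init -q .
git config user.email you@example.com && git config user.name you
echo "code" > app.php
echo "pretend this is huge" > huge.bin
git add . && git commit -qm "initial import"

# 1. Back up the offending file before rewriting history.
git cat-file -p HEAD:huge.bin > /tmp/huge.bin.backup

# 2. Rewrite all branches, dropping the file from every commit's index.
#    --index-filter avoids checking out each tree, so it is fast.
export FILTER_BRANCH_SQUELCH_WARNING=1   # silences the warning on newer Git
git filter-branch --index-filter \
  'git rm --cached --ignore-unmatch huge.bin' \
  --prune-empty --tag-name-filter cat -- --all

# 3. Expire the backup refs and old objects so the space is reclaimed.
rm -rf .git/refs/original
git reflog expire --expire=now --all
git gc --prune=now --quiet
```

After this rewrite every commit hash changes, which is why the other machines must re-clone rather than pull.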

You'll want to do these operations on a beefy machine (lots of memory).

If that works, try re-cloning on the other machines (or rsync the .git directory over).

