Git – repository corruption error

Sometimes when you have a really big project you can encounter a “corruption” error. This is not always a ‘true’ corruption error where you are losing data; it can be a timeout-like error where git is unable to complete a task in the given time because of the size of the repository. This is usually caused by values that are too large for the ‘pack.windowMemory’ and ‘pack.packSizeLimit’ parameters.

This error can look like this:

Cloning into 'test_repo'...
    fatal: git upload-pack: aborting due to possible repository corruption on the remote side.
    fatal: early EOF:  41%
    remote: aborting due to possible repository corruption on the remote side.
    fatal: index-pack failed

To ensure this is not a ‘true’ corruption error, you can fsck the git repo:

# git fsck
Checking object directories: 100% (256/256), done.
Checking objects: 100% (2218/2218), done.
dangling commit 5ae478cea3aa6f42cc8fe865c9fc26b35ea9e15d
dangling commit b657b57b65f6fc4ffea1c25c77ff62c94471d41a
dangling commit 1c9ef0ff781f812f506ca1d18ef4af4a90a4938d
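The dangling commits listed above are normal leftovers (from rebases, amended commits and the like) and are not a sign of corruption. To get a quick sense of how large the repository actually is, you can also check the object counts; a minimal sketch, run inside the repository:

```shell
# Show loose/packed object counts and on-disk size in human-readable units
git count-objects -v -H
```

If the pack size reported here is large, that is a hint the pack limits below are worth setting.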

To overcome this we can set those variables:

git config --global pack.windowMemory "100m"
git config --global pack.packSizeLimit "100m"
git config --global pack.threads "1"
git config --global pack.window "0"
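To confirm the limits actually took effect, you can read each one back from the global config (the expected values are the ones set above):

```shell
# Read back each pack setting from the global config
git config --global --get pack.windowMemory   # 100m
git config --global --get pack.packSizeLimit  # 100m
git config --global --get pack.threads        # 1
git config --global --get pack.window         # 0
```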


GitLab solution:

This problem first hit me on GitLab with a big private repo, and the above commands won't help there because GitLab uses .rb files to rewrite its configs on each reconfigure.

# vim /etc/gitlab/gitlab.rb
omnibus_gitconfig['system'] = { "pack" => ["windowMemory = 100m", "packSizeLimit = 100m", "threads = 1", "window = 0"]}

And then we need to run: gitlab-ctl reconfigure
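After the reconfigure, you can check that the settings were rendered into the system gitconfig used by GitLab's embedded git. The path below is where a typical Omnibus install keeps that file; this is an assumption, so adjust it if your layout differs:

```shell
# Read a pack setting from the rendered system gitconfig
# (/opt/gitlab/embedded/etc/gitconfig is assumed from a typical
# Omnibus install; adjust the path if yours differs)
git config --file /opt/gitlab/embedded/etc/gitconfig --get pack.windowMemory
```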
