Unicorn times out when many large binaries are checked in
ZD: https://gitlab.zendesk.com/agent/tickets/91658
Problem: A customer checked in a commit containing a 20 GB file and hundreds of 500 MB files, all generated with Microsoft's fsutil. Visiting that commit in the GitLab UI times out after 5 minutes (the unicorn timeout was raised) and consumes more than 6 GB of RAM.
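To experiment locally, something like the following could stand in for the customer's repository. This is a minimal sketch, not the customer's actual setup: `fsutil file createnew` is Windows-only, so `truncate` is used here as a cross-platform analog for creating a sparse file, and the size is scaled down so the repo is quick to build.

```shell
# On Windows the customer used something like:
#   fsutil file createnew big.bin 524288000
# On Linux/macOS, truncate creates an equivalent sparse file (all zeros):
repo=$(mktemp -d)
cd "$repo"
git init -q .
truncate -s 50M big.bin    # apparent size 50 MB; scaled down for illustration
git add big.bin
git -c user.name=test -c user.email=test@example.com commit -qm "Add large file"

# Report the blob's size without loading its content into memory:
git cat-file -s "$(git rev-parse HEAD:big.bin)"
```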
We should reproduce this and see whether we can skip large blobs instead of loading them into memory. This is also a case for LFS.
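One possible shape for "skip large blobs" is to inspect object sizes before reading any content. This is a sketch of the general technique using plain git plumbing, not what GitLab's code actually does; the 10 MB threshold is an arbitrary value for illustration.

```shell
# List every blob in a commit with its size, without reading blob content,
# then flag anything over a threshold as a candidate to skip in the UI.
THRESHOLD=$((10 * 1024 * 1024))   # 10 MB, arbitrary cutoff for illustration

git ls-tree -r HEAD |
  awk '$2 == "blob" { print $3 }' |
  git cat-file --batch-check='%(objectname) %(objectsize)' |
  awk -v max="$THRESHOLD" '$2 > max { print "skip large blob " $1 }'
```

`git cat-file --batch-check` only consults object headers, so it stays cheap even when the blobs themselves are multiple gigabytes.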
From the strace output, I also see some slowness in loading many files due to https://github.com/libgit2/libgit2/issues/4460, which has been fixed upstream but not yet released.
/cc: @dblessing, @jwoods06, @smcgivern, @victorwu
Edited by Stan Hu