GitLab CE Git LFS failing to push large repo due to unsupported Content-Type header(s)
Summary
Cannot push a large repository (~3800 files, ~1.6 GB in total) with Git LFS.
Steps to reproduce
- Start a new GitLab CE Docker container (with Git LFS enabled) via:

```shell
docker run --detach \
  --hostname localhost \
  --env GITLAB_OMNIBUS_CONFIG="external_url 'http://localhost:8888'; gitlab_rails['gitlab_shell_ssh_port'] = 2222; gitlab_rails['lfs_enabled'] = true; nginx['client_max_body_size'] = '512m';" \
  --publish 4443:443 --publish 8888:8888 --publish 2222:2222 \
  --name gitlab \
  --restart always \
  --volume /Users/Seb/workspace/docker/gitlab/config:/etc/gitlab \
  --volume /Users/Seb/workspace/docker/gitlab/logs:/var/log/gitlab \
  --volume /Users/Seb/workspace/docker/gitlab/data:/var/opt/gitlab \
  gitlab/gitlab-ce:latest
```
- After it starts up, access http://localhost:8888 and create a new Git repo.
- Create a new Git repo on your machine and add a folder called `larger-folder` containing around 3800 files with no file extension, totalling around 1.6 GB.
- Create a `.gitattributes` file with the following content:

```
larger-folder/* filter=lfs diff=lfs merge=lfs -text
```
- Commit, set up the remote, and push.
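The local part of the steps above can be scripted as follows. This is a scaled-down sketch: the file count and sizes are reduced for brevity (the report uses ~3800 files totalling ~1.6 GB), and the directory name `lfs-repro` is made up for illustration.

```shell
# Scaled-down sketch of the repro layout (~3800 files / ~1.6 GB in the report).
mkdir -p lfs-repro/larger-folder
cd lfs-repro
# Generate a handful of files with no file extension, as in the report.
for i in $(seq 1 5); do
  head -c 1024 /dev/urandom > "larger-folder/file$i"
done
# Route everything in larger-folder through Git LFS.
printf 'larger-folder/* filter=lfs diff=lfs merge=lfs -text\n' > .gitattributes
# Then, with git-lfs installed:
#   git init && git lfs install --local
#   git add . && git commit -m "Add LFS-tracked files"
#   git remote add origin http://localhost:8888/<user>/<repo>.git
#   git push -u origin master
```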
What is the current bug behaviour?
An error is shown: "Uploading failed due to unsupported Content-Type header(s)."

```
$ git lfs push --all origin master
info: Uploading failed due to unsupported Content-Type header(s).
info: Consider disabling Content-Type detection with:
info:
info: $ git config lfs.contenttype false
Uploading LFS objects: 90% (3375/3747), 1.6 GB | 1.7 KB/s, done
```
Running:

```
git config lfs.contenttype false
```

and then pushing again results in the same error message.
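The error message points at the Content-Type header that the client attaches to each object upload. As a rough way to see what MIME types extension-less files sniff as, `file --mime-type` can be used; note that git-lfs performs its own content detection internally, so this is only an approximation, and the `sample` file below is a hypothetical stand-in for one of the uploaded files.

```shell
# Write a stand-in for one of the extension-less files and sniff its MIME type.
# git-lfs does its own detection, so this only approximates what the client
# sends, but it can reveal files that sniff as types a server might reject.
printf 'hello world' > sample
file --brief --mime-type sample
```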
What is the expected correct behaviour?
Git LFS files should be pushed successfully to the remote GitLab instance running in Docker.