Imports larger than 100MB fail with `entity is too large`

Summary

Importing any archive larger than 100MB fails with:

```
handleFileUploads: extract files from multipart: persisting multipart file: PUT request "https://storage.googleapis.com/gitlab-gprd-uploads/tmp/uploads/1607371065-18626-0016-0230-bc232d5f7cc53853d4db2af240c88140": Put "https://storage.googleapis.com/gitlab-gprd-uploads/tmp/uploads/1607371065-18626-0016-0230-bc232d5f7cc53853d4db2af240c88140[..]": entity is too large
```

The above is an actual 500 returned by Workhorse. It will be handled by gitlab-workhorse#328 (closed) and return a 413 instead, but the underlying problem remains that the import is rejected as too large.
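
As a quick way to verify which status code an instance currently returns, something like the following curl call against the import API should work, assuming the API import path goes through the same Workhorse upload handling as the UI (host, token, and archive name below are placeholders):

```shell
# Placeholders: point at your instance, a token with API scope,
# and any export archive over 100MB.
curl --silent --output /dev/null --write-out "%{http_code}\n" \
  --request POST --header "PRIVATE-TOKEN: $TOKEN" \
  --form "path=import-check" --form "file=@export.tar.gz" \
  "https://gitlab.example.com/api/v4/projects/import"
# Prints 500 today; should print 413 once gitlab-workhorse#328 is in place.
```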

In the docs we mention that the maximum import size is 5GB, but anything over 100MB fails here.

Steps to reproduce

  1. Create an empty project
  2. Create a 101MB file and push it to the project: `dd if=/dev/random of=101mb.iso bs=1M count=101`
  3. Export the project (final archive should be ~101MB or larger)
  4. Attempt to import the project and notice it fails with a 500 (per Kibana, the `entity is too large` error shown above); a scripted version of these steps follows this list
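
The steps above can also be scripted against the REST API. A rough sketch, assuming a personal access token with API scope and the numeric ID of the project to export (all values below are placeholders):

```shell
# Placeholders: adjust for your instance, token, and project.
GITLAB=https://gitlab.example.com
TOKEN=<personal-access-token>
PROJECT_ID=12345

# Step 2: create a 101MB file in a clone of the project and push it.
dd if=/dev/random of=101mb.iso bs=1M count=101
git add 101mb.iso && git commit -m "Add 101MB file" && git push

# Step 3: trigger the export, then download the archive once it is ready.
curl --request POST --header "PRIVATE-TOKEN: $TOKEN" \
  "$GITLAB/api/v4/projects/$PROJECT_ID/export"
curl --header "PRIVATE-TOKEN: $TOKEN" \
  "$GITLAB/api/v4/projects/$PROJECT_ID/export/download" --output export.tar.gz

# Step 4: attempt the import; with the current behaviour this fails with a 500.
curl --request POST --header "PRIVATE-TOKEN: $TOKEN" \
  --form "path=import-test" --form "file=@export.tar.gz" \
  "$GITLAB/api/v4/projects/import"
```

Note that the export is asynchronous, so the download step may need to be retried until the archive is ready.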

References