Project import from S3 remote object storage does not work

Overview

This issue was found while testing the project import from remote object storage feature; AWS S3 was chosen as the remote object storage provider.

When trying to import a project export archive, uploaded to an AWS S3 bucket, into gitlab.com, the API returns the following response:

400 Bad Request
cloudflare

This happens both when calling the API from Insomnia and from curl.

  • The feature flag is enabled on gitlab.com by default.
  • This has also been reproduced on an on-prem GitLab instance with the feature flag enabled (the error page there is served by nginx instead of cloudflare):
irb(main):007:0> Feature.all.map {|f| [f.name, f.state]}
=> [["multiple_merge_request_assignees", :on], [:import_project_from_remote_file, :on]]

Which remote object storage provider has this feature been tested on?

Details

  • The pre-signed URL is generated locally on macOS, using the AWS CLI installed via Homebrew (see also the expiry note after this list):
$ aws s3 presign s3://import-from-remote-object-storage/2021-06-29_17-32-723_pmm-demo_test-project_export.tar.gz
...
<presigned_url_output>
  • The following JSON object is POSTed to the API endpoint https://gitlab.com/api/v4/projects/remote-import (a full curl reproduction is sketched after this list):
{
  "url": "<presigned_url_output>",
  "path": "pprokic-test-remote-project",
  "name": "pprokic Test Remote Project",
  "namespace": "pprokic"
}
  • With the headers:
{
  Content-Type: application/json,
  Private-Token: <token>,
  Content-Length: application/gzip
}
  • Following the doc example, even after adding the two missing headers ContentType and ContentLength, you receive:
{"error":"url is missing, path is missing"}
  • After correcting the two header names to Content-Type and Content-Length, you get:
<html>
<head><title>400 Bad Request</title></head>
<body>
<center><h1>400 Bad Request</h1></center>
<hr><center>cloudflare</center>
</body>
</html>

or nginx in the case of a self-managed instance.
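
For completeness, pre-signed URLs generated this way expire after 3600 seconds by default. To rule out expiry as a factor, a longer-lived URL can be requested with the AWS CLI's --expires-in option (values up to 604800 seconds, i.e. 7 days, are accepted):

$ aws s3 presign s3://import-from-remote-object-storage/2021-06-29_17-32-723_pmm-demo_test-project_export.tar.gz \
    --expires-in 604800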
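
The full request as a single curl call, for reproduction (a sketch of the request described above; <token> and <presigned_url_output> are placeholders):

$ curl --request POST "https://gitlab.com/api/v4/projects/remote-import" \
    --header "Private-Token: <token>" \
    --header "Content-Type: application/json" \
    --data '{
        "url": "<presigned_url_output>",
        "path": "pprokic-test-remote-project",
        "name": "pprokic Test Remote Project",
        "namespace": "pprokic"
    }'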

Proposed Solution
