Import data from the existing container registry into the new one in pre-prod
Related to gitlab-org&5392 (closed).
As we approach the deployment of the new registry with a metadata database and online garbage collection to pre-production, we'll need to import the metadata that exists in the current registry bucket into the new metadata database and copy the non-dangling blobs from the old bucket to the new one (https://gitlab.com/gitlab-com/gl-infra/infrastructure/-/issues/12763).
The container registry CLI was extended with an `import` command, which should be used to perform this operation for both the metadata and the blobs in a single invocation (documentation). This is a non-destructive procedure, so the existing bucket remains intact.
Unlike production, the pre-production registry is tiny (under 1 GB, I believe), which means we can perform the import in a single operation; it should only take a couple of minutes. The registry should be set to read-only mode for the duration of this procedure.
The CLI `import` command can be invoked from one of the registry pods, from a separate ephemeral container (overriding the default `serve` container `CMD`), or from another machine using the registry binary.
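As a rough sketch, the invocation from a pod could look like the following. The bucket name, binary path, and config path are assumptions for illustration (the config path matches the typical registry chart layout, but verify it against the actual deployment):

```shell
# Hypothetical placeholders: replace with the real bucket name and paths.
NEW_BUCKET="gitlab-pre-registry-new"
CONFIG="/etc/docker/registry/config.yml"

# Build the dry-run invocation first; drop --dry-run for the real import.
IMPORT_CMD="registry database import --blob-transfer-destination ${NEW_BUCKET} --dry-run ${CONFIG}"

# From a workstation this could be wrapped in kubectl, e.g.:
#   kubectl -n registry exec -it <registry pod> -- ${IMPORT_CMD}
echo "${IMPORT_CMD}"
```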
## Requirements
- Set the registry to read-only mode (or remove it from the load balancer);
- Run the import in dry-run mode with `./registry database import --blob-transfer-destination <name of the new GCS bucket> --dry-run path/to/config.yml`. The configuration file must be the same one used for the registry pods;
- Rerun the import without `--dry-run`;
- Reconfigure the registry to:
  - Remove the read-only mode (`storage.maintenance.readonly.enabled: false`);
  - Enable the metadata database (`database.enabled: true`);
  - Disable filesystem metadata (`migration.disablemirrorfs: true`);
  - Use the new storage bucket instead of the old one (`storage.gcs.bucket: <name of the new GCS bucket>`);
- Test the import by pulling one of the images that existed in the old bucket.
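Taken together, the reconfiguration steps amount to a config fragment along these lines (a sketch only; key names are as listed above, the bucket name stays a placeholder, and the fragment must be merged into the full pre-production config rather than replacing it):

```yaml
database:
  enabled: true
migration:
  disablemirrorfs: true
storage:
  gcs:
    bucket: <name of the new GCS bucket>
  maintenance:
    readonly:
      enabled: false
```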