Files in `pages_deployments` are not deleted on disk when `deactivated_pages_deployments_delete_cron_worker` runs
## Summary

Files in `pages_deployments` are not deleted on disk when `deactivated_pages_deployments_delete_cron_worker` runs.
## Steps to reproduce

- Set up a project with GitLab Pages and a `pages` job. The artifact expiration can be set to e.g. 4h, but this does not seem to be relevant to the problem at hand.
- Run a pipeline which triggers the `pages` and subsequently the `pages:deploy` job. The `artifacts.zip` file is stored on the GitLab server in a subdirectory similar to `/var/opt/gitlab/gitlab-rails/shared/pages/@hashed/94/18/94184e731aff5bdc5541709117eeaa5f235b4f1ef214d8e18be1ca02904b6d92/pages_deployments/6500`. This also creates a record in the `pages_deployments` table in the GitLab database.
- Run a pipeline which triggers the same CI jobs. The previous `pages_deployments` record then gets a `deleted_at` timestamp 30 minutes after job completion.
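As I understand the lifecycle described above, the superseded deployment is only *marked* for deletion, and a later cron pass removes whatever is past due. A minimal plain-Ruby sketch of that timing (all names here are illustrative stand-ins, not GitLab's actual code):

```ruby
# Hypothetical stand-in for a pages_deployments row.
Deployment = Struct.new(:id, :deleted_at)

# Mirrors the 30-minute delay described above.
DESTRUCTION_DELAY = 30 * 60 # seconds

# When a new deployment goes live, the superseded one is only marked:
# its deleted_at is set 30 minutes into the future.
def supersede(old_deployment, now: Time.now)
  old_deployment.deleted_at = now + DESTRUCTION_DELAY
  old_deployment
end

# A later cron pass removes rows whose deleted_at has passed.
def due_for_deletion(deployments, now: Time.now)
  deployments.select { |d| d.deleted_at && d.deleted_at <= now }
end

now = Time.utc(2023, 11, 22, 7, 18, 31)
old = supersede(Deployment.new(6547, nil), now: now)

puts due_for_deletion([old], now: now).map(&:id).inspect           # => []
puts due_for_deletion([old], now: now + 31 * 60).map(&:id).inspect # => [6547]
```

This is why the deletion is only observed 30-40 minutes after the new deployment, depending on when the cron worker next runs.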
## Example Project

N/A; seeing this on an on-premise GitLab instance.
## What is the current bug behavior?

30-40 minutes after the new job has executed, the `deactivated_pages_deployments_delete_cron_worker` cron job deletes the `pages_deployments` record, but the associated `artifacts.zip` file remains on disk.
## What is the expected correct behavior?

30-40 minutes after the new job has executed, the `deactivated_pages_deployments_delete_cron_worker` cron job should delete both the `pages_deployments` record and the associated `artifacts.zip` file on disk.
## Relevant logs and/or screenshots

### Background

We have been seeing a large, linear increase in disk usage on our GitLab instance lately, which triggered alerts in our monitoring and prompted this investigation. Here is the `/var/opt/gitlab` volume usage over time (`/var/opt/gitlab/backups` is on a separate volume):

Interestingly enough, the beginning of this problem closely coincides with our upgrade to GitLab 16.5.1. Quoting `/var/log/apt/history.log`:
```
Start-Date: 2023-11-01 08:24:12
Commandline: apt-get dist-upgrade
Requested-By: slovdahl (1001)
Upgrade: gitlab-ee:amd64 (16.4.1-ee.0, 16.5.1-ee.0)
End-Date: 2023-11-01 08:31:10
```
I saw that the `/var/opt/gitlab/gitlab-rails/shared/pages` directory was using about 72GB on disk. By looking at historical data from our backups, we concluded that `pages.tar.gz` (which is the backup representation of this directory) was as small as 1.2GB a week or so before the upgrade. We hadn't made any GitLab Pages-related changes during this period that I was aware of.

We also saw that directories created before the upgrade (`6398` and `6399`) were very small. The `6400` directory was created a few hours before the upgrade, and seemed much bigger than the other ones. All directories from `6401` onwards are around 621MB in size.
```
root@git:/var/opt/gitlab/gitlab-rails/shared/pages/@hashed/94/18/94184e731aff5bdc5541709117eeaa5f235b4f1ef214d8e18be1ca02904b6d92/pages_deployments# du -csh 6398 6399 6400 6401
4.0K    6398
4.0K    6399
582M    6400
621M    6401
1.2G    total
```
(It turned out that directories like `6398` and `6399` are completely empty; they do not contain any `artifacts.zip` files. `6400` and `6401` contain `artifacts.zip` files, despite these Pages deployments having been superseded a long time ago.)
To make a long story short, I've debugged this thoroughly, and also gotten some feedback from a GitLab Support Engineer in this ZenDesk issue (GitLab internal link), and I believe I have finally found the problem. The ZenDesk issue contains more details about the initial steps of the debugging, but I'll try to summarize the most important parts of the final steps below.

TL;DR: I believe !132252 (merged) breaks the deletion of `artifacts.zip` files on disk. Here's how I came to that conclusion:
### Disabling the "new" cleanup worker

I wanted to disable the `deactivated_pages_deployments_delete_cron_worker` job, so I could debug this more freely in the Rails console.

(The first command disables `geo_sidekiq_cron_config_worker` to work around #37135; it would otherwise automatically re-enable the `deactivated_pages_deployments_delete_cron_worker` job every minute, which would interfere with the debugging.)
```
irb(main):012:0> Sidekiq::Cron::Job.all.select{ |j| j.name.match("geo_sidekiq_cron_config_worker") }.first.disable!
=> 130
irb(main):007:0> Sidekiq::Cron::Job.all.select{ |j| j.name.match("deactivated_pages_deployments_delete_cron_worker") }.first.disable!
=> 148
```
### Running the "old" cleanup code manually in the Rails Console

I then copied the code from `DestroyPagesDeploymentsWorker` and `DestroyDeploymentsService`, together with some reading of !132252 (diffs) to understand what `PagesDeployment` ID to pass to the `older_than()` method. I ran this manually in the Rails Console:
```
irb(main):018:0> project = Project.find_by_id(2)
=> #<Project id:2 hiboxsystems/hiboxcentre>>
irb(main):019:0> project.pages_deployments
=>
[#<PagesDeployment:0x00007f7cd2fb97e8
  id: 6548,
  created_at: Wed, 22 Nov 2023 07:18:29.410214000 UTC +00:00,
  updated_at: Wed, 22 Nov 2023 07:18:29.410214000 UTC +00:00,
  project_id: 2,
  ci_build_id: 1714179,
  file_store: 1,
  file: "artifacts.zip",
  file_count: 177560,
  file_sha256: "86ebf31f86b1ba7c9f90f221c13278d77c862a2341fea848b1343feb5e0f3365",
  size: 651188868,
  root_directory: nil,
  path_prefix: nil,
  build_ref: nil,
  deleted_at: nil,
  verification_checksum: nil>,
 #<PagesDeployment:0x00007f7cd30d30e8
  id: 6547,
  created_at: Wed, 22 Nov 2023 04:16:28.004843000 UTC +00:00,
  updated_at: Wed, 22 Nov 2023 07:18:31.876903000 UTC +00:00,
  project_id: 2,
  ci_build_id: 1714007,
  file_store: 1,
  file: "artifacts.zip",
  file_count: 177560,
  file_sha256: "86ebf31f86b1ba7c9f90f221c13278d77c862a2341fea848b1343feb5e0f3365",
  size: 651188868,
  root_directory: nil,
  path_prefix: nil,
  build_ref: nil,
  deleted_at: Wed, 22 Nov 2023 07:48:31.876841000 UTC +00:00,
  verification_checksum: nil>]
irb(main):022:0> deployments_to_destroy = project.pages_deployments.older_than(6548)
=>
[#<PagesDeployment:0x00007f7ccbd7f740
...
irb(main):023:0> deployments_to_destroy.count
=> 1
irb(main):024:0> deployments_to_destroy.first
=>
#<PagesDeployment:0x00007f7ccbd7f740
 id: 6547,
 created_at: Wed, 22 Nov 2023 04:16:28.004843000 UTC +00:00,
 updated_at: Wed, 22 Nov 2023 07:18:31.876903000 UTC +00:00,
 project_id: 2,
 ci_build_id: 1714007,
 file_store: 1,
 file: "artifacts.zip",
 file_count: 177560,
 file_sha256: "86ebf31f86b1ba7c9f90f221c13278d77c862a2341fea848b1343feb5e0f3365",
 size: 651188868,
 root_directory: nil,
 path_prefix: nil,
 build_ref: nil,
 deleted_at: Wed, 22 Nov 2023 07:48:31.876841000 UTC +00:00,
 verification_checksum: nil>
```
The files on disk right at this stage, before running the actual cleanup:
```
root@git:/var/opt/gitlab/gitlab-rails/shared/pages/@hashed/94/18/94184e731aff5bdc5541709117eeaa5f235b4f1ef214d8e18be1ca02904b6d92/pages_deployments# stat 6547
  File: 6547
  Size: 4096            Blocks: 8          IO Block: 4096   directory
Device: fd00h/64768d    Inode: 10273134    Links: 2
Access: (0755/drwxr-xr-x)  Uid: (  997/     git)   Gid: (  998/     git)
Access: 2023-11-22 09:02:15.800239184 +0200
Modify: 2023-11-22 06:16:28.011461014 +0200
Change: 2023-11-22 06:16:28.011461014 +0200
 Birth: 2023-11-22 06:16:28.011461014 +0200
root@git:/var/opt/gitlab/gitlab-rails/shared/pages/@hashed/94/18/94184e731aff5bdc5541709117eeaa5f235b4f1ef214d8e18be1ca02904b6d92/pages_deployments# ls -la 6547
total 636072
drwxr-xr-x    2 git git      4096 Nov 22 06:16 .
drwxr-xr-x 6538 git git    135168 Nov 22 09:18 ..
-rw-r--r--    1 git git 651188868 Nov 22 06:16 artifacts.zip
```
Then running the "old" cleanup code:
```
irb(main):025:0> deployments_to_destroy.find_each(&:destroy)
=> nil
```
This time, the `artifacts.zip` file for the `pages_deployment` with ID `6547` was indeed deleted on disk:
```
root@git:/var/opt/gitlab/gitlab-rails/shared/pages/@hashed/94/18/94184e731aff5bdc5541709117eeaa5f235b4f1ef214d8e18be1ca02904b6d92/pages_deployments# ls -la 6547
total 140
drwxr-xr-x    2 git git   4096 Nov 22 09:42 .
drwxr-xr-x 6538 git git 135168 Nov 22 09:18 ..
root@git:/var/opt/gitlab/gitlab-rails/shared/pages/@hashed/94/18/94184e731aff5bdc5541709117eeaa5f235b4f1ef214d8e18be1ca02904b6d92/pages_deployments# stat 6547
  File: 6547
  Size: 4096            Blocks: 8          IO Block: 4096   directory
Device: fd00h/64768d    Inode: 10273134    Links: 2
Access: (0755/drwxr-xr-x)  Uid: (  997/     git)   Gid: (  998/     git)
Access: 2023-11-22 09:02:15.800239184 +0200
Modify: 2023-11-22 09:42:08.551663172 +0200
Change: 2023-11-22 09:42:08.551663172 +0200
 Birth: 2023-11-22 06:16:28.011461014 +0200
```
...and re-querying the project in the Rails console indicated that the `pages_deployments` record with ID `6547` was now gone:
```
irb(main):027:0> project = Project.find_by_id(2)
=> #<Project id:2 hiboxsystems/hiboxcentre>>
irb(main):028:0> project.pages_deployments
=>
[#<PagesDeployment:0x00007f7cc34df5e8
  id: 6548,
  created_at: Wed, 22 Nov 2023 07:18:29.410214000 UTC +00:00,
  updated_at: Wed, 22 Nov 2023 07:18:29.410214000 UTC +00:00,
  project_id: 2,
  ci_build_id: 1714179,
  file_store: 1,
  file: "artifacts.zip",
  file_count: 177560,
  file_sha256: "86ebf31f86b1ba7c9f90f221c13278d77c862a2341fea848b1343feb5e0f3365",
  size: 651188868,
  root_directory: nil,
  path_prefix: nil,
  build_ref: nil,
  deleted_at: nil,
  verification_checksum: nil>]
```
### Running the "new" cleanup code manually in the Rails Console

Then, I recreated a new Pages deployment in CI (to have a new `PagesDeployment`/`pages_deployments` record in the DB), and ran the logic from the "new" cleanup job (`deactivated_pages_deployments_delete_cron_worker`, `Pages::DeactivatedDeploymentsDeleteCronWorker` - !132252 (diffs)). Here are the files on disk before running the cleanup:
```
root@git:/var/opt/gitlab/gitlab-rails/shared/pages/@hashed/94/18/94184e731aff5bdc5541709117eeaa5f235b4f1ef214d8e18be1ca02904b6d92/pages_deployments# ls -la 6548
total 636072
drwxr-xr-x    2 git git      4096 Nov 22 09:18 .
drwxr-xr-x 6539 git git    135168 Nov 22 09:56 ..
-rw-r--r--    1 git git 651188868 Nov 22 09:18 artifacts.zip
root@git:/var/opt/gitlab/gitlab-rails/shared/pages/@hashed/94/18/94184e731aff5bdc5541709117eeaa5f235b4f1ef214d8e18be1ca02904b6d92/pages_deployments# stat 6548
  File: 6548
  Size: 4096            Blocks: 8          IO Block: 4096   directory
Device: fd00h/64768d    Inode: 10273163    Links: 2
Access: (0755/drwxr-xr-x)  Uid: (  997/     git)   Gid: (  998/     git)
Access: 2023-11-22 09:18:41.255625840 +0200
Modify: 2023-11-22 09:18:29.827493177 +0200
Change: 2023-11-22 09:18:29.827493177 +0200
 Birth: 2023-11-22 09:18:29.827493177 +0200
```
And here's what the `PagesDeployment` records looked like:
```
irb(main):029:0> project = Project.find_by_id(2)
=> #<Project id:2 hiboxsystems/hiboxcentre>>
irb(main):030:0> project.pages_deployments
=>
[#<PagesDeployment:0x00007f7cb4b44320
  id: 6549,
  created_at: Wed, 22 Nov 2023 07:56:43.915757000 UTC +00:00,
  updated_at: Wed, 22 Nov 2023 07:56:43.915757000 UTC +00:00,
  project_id: 2,
  ci_build_id: 1714440,
  file_store: 1,
  file: "artifacts.zip",
  file_count: 177560,
  file_sha256: "86ebf31f86b1ba7c9f90f221c13278d77c862a2341fea848b1343feb5e0f3365",
  size: 651188868,
  root_directory: nil,
  path_prefix: nil,
  build_ref: nil,
  deleted_at: nil,
  verification_checksum: nil>,
 #<PagesDeployment:0x00007f7cb4b44140
  id: 6548,
  created_at: Wed, 22 Nov 2023 07:18:29.410214000 UTC +00:00,
  updated_at: Wed, 22 Nov 2023 07:56:46.283773000 UTC +00:00,
  project_id: 2,
  ci_build_id: 1714179,
  file_store: 1,
  file: "artifacts.zip",
  file_count: 177560,
  file_sha256: "86ebf31f86b1ba7c9f90f221c13278d77c862a2341fea848b1343feb5e0f3365",
  size: 651188868,
  root_directory: nil,
  path_prefix: nil,
  build_ref: nil,
  deleted_at: Wed, 22 Nov 2023 08:26:46.283723000 UTC +00:00,
  verification_checksum: nil>]
```
The `PagesDeployment.deactivated` data before the cleanup:
```
irb(main):031:0> PagesDeployment.deactivated
=>
[#<PagesDeployment:0x00007f7cb4ebf180
  id: 6548,
  created_at: Wed, 22 Nov 2023 07:18:29.410214000 UTC +00:00,
  updated_at: Wed, 22 Nov 2023 07:56:46.283773000 UTC +00:00,
  project_id: 2,
  ci_build_id: 1714179,
  file_store: 1,
  file: "artifacts.zip",
  file_count: 177560,
  file_sha256: "86ebf31f86b1ba7c9f90f221c13278d77c862a2341fea848b1343feb5e0f3365",
  size: 651188868,
  root_directory: nil,
  path_prefix: nil,
  build_ref: nil,
  deleted_at: Wed, 22 Nov 2023 08:26:46.283723000 UTC +00:00,
  verification_checksum: nil>]
```
Then running the "new" cleanup code, and looking at the `PagesDeployment.deactivated` data afterwards:
```
irb(main):032:1* PagesDeployment.deactivated.each_batch do |deployments|
irb(main):033:1*   deployments.delete_all
irb(main):034:0> end
=> nil
irb(main):035:0> PagesDeployment.deactivated
=> []
```
The file on disk is still retained, however. This is the incorrect behavior; the file is expected to be deleted at this point.
```
root@git:/var/opt/gitlab/gitlab-rails/shared/pages/@hashed/94/18/94184e731aff5bdc5541709117eeaa5f235b4f1ef214d8e18be1ca02904b6d92/pages_deployments# ls -la 6548
total 636072
drwxr-xr-x    2 git git      4096 Nov 22 09:18 .
drwxr-xr-x 6539 git git    135168 Nov 22 09:56 ..
-rw-r--r--    1 git git 651188868 Nov 22 09:18 artifacts.zip
root@git:/var/opt/gitlab/gitlab-rails/shared/pages/@hashed/94/18/94184e731aff5bdc5541709117eeaa5f235b4f1ef214d8e18be1ca02904b6d92/pages_deployments# stat 6548
  File: 6548
  Size: 4096            Blocks: 8          IO Block: 4096   directory
Device: fd00h/64768d    Inode: 10273163    Links: 2
Access: (0755/drwxr-xr-x)  Uid: (  997/     git)   Gid: (  998/     git)
Access: 2023-11-22 09:18:41.255625840 +0200
Modify: 2023-11-22 09:18:29.827493177 +0200
Change: 2023-11-22 09:18:29.827493177 +0200
 Birth: 2023-11-22 09:18:29.827493177 +0200
```
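My reading of why the file survives here: `delete_all` issues a single bulk SQL `DELETE` and skips ActiveRecord's per-record callbacks, and (as far as I can tell) it is those callbacks that trigger CarrierWave's removal of the uploaded file. A minimal plain-Ruby sketch of that difference, with no Rails involved (`FakeDeployment` and its cleanup hook are made up for illustration):

```ruby
require "fileutils"
require "tmpdir"

# Plain-Ruby stand-in for a record whose destroy logic also removes an
# attached file -- analogous to the file cleanup on PagesDeployment#destroy.
class FakeDeployment
  attr_reader :path

  def initialize(path)
    @path = path
    FileUtils.touch(path)
  end

  # destroy-style removal: the per-record cleanup hook runs, so the file goes too.
  def destroy
    File.delete(@path) if File.exist?(@path)
  end
end

# delete_all-style removal: rows vanish in one bulk operation and no
# per-record hook ever runs, so files are left behind on disk.
def delete_all(records)
  records.clear
end

results = []
Dir.mktmpdir do |dir|
  a = FakeDeployment.new(File.join(dir, "a_artifacts.zip"))
  b = FakeDeployment.new(File.join(dir, "b_artifacts.zip"))

  [a].each(&:destroy)           # the "old" path: destroy each record
  results << File.exist?(a.path)

  rows = [b]
  delete_all(rows)              # the "new" path: bulk delete, no hooks
  results << File.exist?(b.path)
end

p results # => [false, true]: destroy removed the file, delete_all did not
```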
### Modifying the "new" cleanup code to use the `find_each(&:destroy)` approach

I then tried a new approach: modifying the "new" cleanup code to use the `find_each(&:destroy)` approach instead of `delete_all`. But first I had to generate a new `artifacts.zip` file again, re-running the `pages` CI job once more. This time, `PagesDeployment.deactivated` was empty, to my surprise:
```
irb(main):007:0> PagesDeployment.deactivated
=> []
```
...because the `deleted_at` timestamp for ID `6549` had not yet passed (it's deleted 30 minutes after the new CI job is created, per the `Projects::UpdatePagesService::OLD_DEPLOYMENTS_DESTRUCTION_DELAY` setting - the `deleted_at` value below is 09:12:09). Local time at the time of writing was 11:06 +02:00, so 09:06 UTC.
```
irb(main):009:0> project = Project.find_by_id(2)
=> #<Project id:2 hiboxsystems/hiboxcentre>>
irb(main):010:0> project.pages_deployments
=>
[#<PagesDeployment:0x00007f10ca4d7ba0
  id: 6550,
  created_at: Wed, 22 Nov 2023 08:42:06.234620000 UTC +00:00,
  updated_at: Wed, 22 Nov 2023 08:42:06.234620000 UTC +00:00,
  project_id: 2,
  ci_build_id: 1714515,
  file_store: 1,
  file: "artifacts.zip",
  file_count: 177560,
  file_sha256: "86ebf31f86b1ba7c9f90f221c13278d77c862a2341fea848b1343feb5e0f3365",
  size: 651188868,
  root_directory: nil,
  path_prefix: nil,
  build_ref: nil,
  deleted_at: nil,
  verification_checksum: nil>,
 #<PagesDeployment:0x00007f10ca4d79e8
  id: 6549,
  created_at: Wed, 22 Nov 2023 07:56:43.915757000 UTC +00:00,
  updated_at: Wed, 22 Nov 2023 08:42:09.484018000 UTC +00:00,
  project_id: 2,
  ci_build_id: 1714440,
  file_store: 1,
  file: "artifacts.zip",
  file_count: 177560,
  file_sha256: "86ebf31f86b1ba7c9f90f221c13278d77c862a2341fea848b1343feb5e0f3365",
  size: 651188868,
  root_directory: nil,
  path_prefix: nil,
  build_ref: nil,
  deleted_at: Wed, 22 Nov 2023 09:12:09.483964000 UTC +00:00,
  verification_checksum: nil>]
```
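For reference, my reading of the `PagesDeployment.deactivated` scope is essentially "`deleted_at` is set and already in the past". A small self-contained sketch of that predicate (illustrative names, not GitLab's implementation):

```ruby
# Illustrative stand-in for a pages_deployments row.
Row = Struct.new(:id, :deleted_at)

# Sketch of the deactivated scope: deleted_at is set and in the past.
def deactivated(rows, now: Time.now)
  rows.select { |r| r.deleted_at && r.deleted_at < now }
end

# Local time at the time of writing was 11:06 +02:00, i.e. 09:06 UTC:
now = Time.utc(2023, 11, 22, 9, 6, 0)
rows = [
  Row.new(6550, nil),                              # live deployment
  Row.new(6549, Time.utc(2023, 11, 22, 9, 12, 9))  # deleted_at still in the future
]

p deactivated(rows, now: now).map(&:id) # => [] -- matches the empty result above

# Once deleted_at is backdated into the past, the row qualifies:
rows[1].deleted_at = now - 60
p deactivated(rows, now: now).map(&:id) # => [6549]
```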
I updated the `deleted_at` for the `PagesDeployment` record manually, so that it would be included in the `PagesDeployment.deactivated` data:
```
irb(main):013:0> project.pages_deployments[1].deleted_at = Time.now
=> 2023-11-22 11:07:40.910047545 +0200
irb(main):018:0> project.pages_deployments[1].save!
=> true
irb(main):019:0> PagesDeployment.deactivated
=>
[#<PagesDeployment:0x00007f10bd397e50
  id: 6549,
  created_at: Wed, 22 Nov 2023 07:56:43.915757000 UTC +00:00,
  updated_at: Wed, 22 Nov 2023 09:08:15.429843000 UTC +00:00,
  project_id: 2,
  ci_build_id: 1714440,
  file_store: 1,
  file: "artifacts.zip",
  file_count: 177560,
  file_sha256: "86ebf31f86b1ba7c9f90f221c13278d77c862a2341fea848b1343feb5e0f3365",
  size: 651188868,
  root_directory: nil,
  path_prefix: nil,
  build_ref: nil,
  deleted_at: Wed, 22 Nov 2023 09:08:10.775842000 UTC +00:00,
  verification_checksum: nil>]
```
Files on disk before the cleanup (note: not captured immediately before, since I forgot to run it then; this output is from slightly earlier today):
```
root@git:/var/opt/gitlab/gitlab-rails/shared/pages/@hashed/94/18/94184e731aff5bdc5541709117eeaa5f235b4f1ef214d8e18be1ca02904b6d92/pages_deployments# ls -la 6549
total 636072
drwxr-xr-x    2 git git      4096 Nov 22 09:56 .
drwxr-xr-x 6539 git git    135168 Nov 22 09:56 ..
-rw-r--r--    1 git git 651188868 Nov 22 09:56 artifacts.zip
root@git:/var/opt/gitlab/gitlab-rails/shared/pages/@hashed/94/18/94184e731aff5bdc5541709117eeaa5f235b4f1ef214d8e18be1ca02904b6d92/pages_deployments# stat 6549
  File: 6549
  Size: 4096            Blocks: 8          IO Block: 4096   directory
Device: fd00h/64768d    Inode: 10273214    Links: 2
Access: (0755/drwxr-xr-x)  Uid: (  997/     git)   Gid: (  998/     git)
Access: 2023-11-22 10:26:09.237415508 +0200
Modify: 2023-11-22 09:56:43.929516352 +0200
Change: 2023-11-22 09:56:43.929516352 +0200
 Birth: 2023-11-22 09:56:43.925516307 +0200
```
Running the "modified new cleanup" code:
```
irb(main):020:1* PagesDeployment.deactivated.each_batch do |deployments|
irb(main):021:1*   deployments.find_each(&:destroy)
irb(main):022:0> end
=> nil
irb(main):023:0> PagesDeployment.deactivated
=> []
```
Looking at the filesystem after the cleanup indicates that this does indeed seem to fix the problem. The file on disk is now properly deleted, as it ought to be:
```
root@git:/var/opt/gitlab/gitlab-rails/shared/pages/@hashed/94/18/94184e731aff5bdc5541709117eeaa5f235b4f1ef214d8e18be1ca02904b6d92/pages_deployments# ls -la 6549
total 140
drwxr-xr-x    2 git git   4096 Nov 22 11:10 .
drwxr-xr-x 6540 git git 135168 Nov 22 10:42 ..
root@git:/var/opt/gitlab/gitlab-rails/shared/pages/@hashed/94/18/94184e731aff5bdc5541709117eeaa5f235b4f1ef214d8e18be1ca02904b6d92/pages_deployments# stat 6549
  File: 6549
  Size: 4096            Blocks: 8          IO Block: 4096   directory
Device: fd00h/64768d    Inode: 10273214    Links: 2
Access: (0755/drwxr-xr-x)  Uid: (  997/     git)   Gid: (  998/     git)
Access: 2023-11-22 11:12:41.373442327 +0200
Modify: 2023-11-22 11:10:13.027740419 +0200
Change: 2023-11-22 11:10:13.027740419 +0200
 Birth: 2023-11-22 09:56:43.925516307 +0200
```
Based on this, I have created an MR proposing this fix for GitLab, open for further discussion of whether this is indeed the right approach: !137645 (merged)
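The shape of the proposed fix, batching retained but with per-record `destroy` so each record's file cleanup still runs, can be sketched in plain Ruby (`StubDeployment` and this `each_batch` are illustrative stand-ins; the real worker uses ActiveRecord's `each_batch` with `find_each(&:destroy)`):

```ruby
require "fileutils"
require "tmpdir"

# Plain-Ruby stand-in for a record whose destroy also removes its file.
class StubDeployment
  attr_reader :path

  def initialize(path)
    @path = path
    FileUtils.touch(path)
  end

  # Stand-in for PagesDeployment#destroy running its file-cleanup callback.
  def destroy
    File.delete(@path)
  end
end

# Stand-in for each_batch: yields fixed-size slices of the relation.
def each_batch(records, of: 1000, &block)
  records.each_slice(of, &block)
end

leftover = nil
Dir.mktmpdir do |dir|
  deployments = 5.times.map { |i| StubDeployment.new(File.join(dir, "#{i}.zip")) }

  # The fixed worker shape: per-record destroy inside each batch,
  # instead of a single bulk delete_all.
  each_batch(deployments, of: 2) { |batch| batch.each(&:destroy) }

  leftover = deployments.count { |d| File.exist?(d.path) }
end

puts leftover # => 0: every artifacts.zip stand-in was removed
```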
## Output of checks

### Results of GitLab environment info

```
System information
System:          Ubuntu 22.04
Proxy:           no
Current User:    git
Using RVM:       no
Ruby Version:    3.0.6p216
Gem Version:     3.4.19
Bundler Version: 2.4.20
Rake Version:    13.0.6
Redis Version:   7.0.13
Sidekiq Version: 6.5.7
Go Version:      unknown

GitLab information
Version:         16.5.1-ee
Revision:        55da9ccb652
Directory:       /opt/gitlab/embedded/service/gitlab-rails
DB Adapter:      PostgreSQL
DB Version:      13.11
URL:             https://git.example.com
HTTP Clone URL:  https://git.example.com/some-group/some-project.git
SSH Clone URL:   git@git.example.com:some-group/some-project.git
Elasticsearch:   no
Geo:             no
Using LDAP:      yes
Using Omniauth:  yes
Omniauth Providers:

GitLab Shell
Version:         14.29.0
Repository storages:
- default:       unix:/var/opt/gitlab/gitaly/gitaly.socket
GitLab Shell path:       /opt/gitlab/embedded/service/gitlab-shell

Gitaly
- default Address:       unix:/var/opt/gitlab/gitaly/gitaly.socket
- default Version:       16.5.1
- default Git Version:   2.42.0
```
### Results of GitLab application Check
Checking GitLab subtasks ...
Checking GitLab Shell ...
GitLab Shell: ... GitLab Shell version >= 14.29.0 ? ... OK (14.29.0)
Running /opt/gitlab/embedded/service/gitlab-shell/bin/check
Internal API available: OK
Redis available via internal API: OK
gitlab-shell self-check successful
Checking GitLab Shell ... Finished
Checking Gitaly ...
Gitaly: ... default ... OK
Checking Gitaly ... Finished
Checking Sidekiq ...
Sidekiq: ... Running? ... yes
Number of Sidekiq processes (cluster/worker) ... 1/1
Checking Sidekiq ... Finished
Checking Incoming Email ...
Incoming Email: ... Reply by email is disabled in config/gitlab.yml
Checking Incoming Email ... Finished
Checking LDAP ...
LDAP: ... Server: ldapmain
LDAP authentication... Anonymous. No bind_dn or password configured
LDAP users with access to your GitLab server (only showing the first 100 results)
User output sanitized. Found 63 users of 100 limit.
Checking LDAP ... Finished
Checking GitLab App ...
Database config exists? ... yes Tables are truncated? ... skipped All migrations up? ... yes Database contains orphaned GroupMembers? ... no GitLab config exists? ... yes GitLab config up to date? ... yes Cable config exists? ... yes Resque config exists? ... yes Log directory writable? ... yes Tmp directory writable? ... yes Uploads directory exists? ... yes Uploads directory has correct permissions? ... yes Uploads directory tmp has correct permissions? ... yes Systemd unit files or init script exist? ... skipped (omnibus-gitlab has neither init script nor systemd units) Systemd unit files or init script up-to-date? ... skipped (omnibus-gitlab has neither init script nor systemd units) Projects have namespace: ... 4/1 ... yes 4/2 ... yes 5/3 ... yes 5/4 ... yes 4/5 ... yes 6/6 ... yes 4/7 ... yes 6/8 ... yes 6/9 ... yes 4/10 ... yes 4/11 ... yes 4/12 ... yes 6/13 ... yes 4/14 ... yes 5/15 ... yes 4/16 ... yes 6/17 ... yes 4/18 ... yes 4/19 ... yes 4/20 ... yes 4/21 ... yes 66/22 ... yes 4/23 ... yes 6/24 ... yes 4/25 ... yes 4/26 ... yes 8/27 ... yes 10/28 ... yes 16/29 ... yes 18/30 ... yes 22/31 ... yes 23/32 ... yes 24/33 ... yes 29/34 ... yes 30/35 ... yes 34/36 ... yes 35/37 ... yes 37/38 ... yes 40/39 ... yes 41/40 ... yes 43/41 ... yes 45/42 ... yes 50/43 ... yes 45/44 ... yes 7/45 ... yes 4/46 ... yes 4/47 ... yes 45/51 ... yes 4/52 ... yes 45/53 ... yes 39/54 ... yes 45/55 ... yes 45/56 ... yes 41/57 ... yes 4/58 ... yes 54/59 ... yes 15/60 ... yes 24/61 ... yes 23/62 ... yes 56/63 ... yes 4/64 ... yes 4/65 ... yes 4/66 ... yes 4/67 ... yes 4/68 ... yes 51/71 ... yes 65/72 ... yes 61/73 ... yes 59/74 ... yes 4/75 ... yes 62/76 ... yes 4/77 ... yes 24/78 ... yes 4/79 ... yes 45/80 ... yes 56/81 ... yes 4/82 ... yes 66/83 ... yes 18/84 ... yes 41/85 ... yes 45/87 ... yes 4/88 ... yes 4/89 ... yes 4/90 ... yes 35/91 ... yes 18/92 ... yes 50/93 ... yes 41/94 ... yes 45/95 ... yes 5/96 ... yes 45/97 ... yes 68/98 ... yes 37/99 ... yes 4/100 ... yes 41/101 ... 
yes 16/102 ... yes 70/103 ... yes 45/104 ... yes 71/105 ... yes 69/106 ... yes 35/107 ... yes 66/108 ... yes 18/109 ... yes 8/110 ... yes 41/111 ... yes 62/112 ... yes 69/113 ... yes 62/114 ... yes 70/115 ... yes 45/116 ... yes 70/117 ... yes 41/119 ... yes 45/120 ... yes 74/121 ... yes 4/122 ... yes 4/123 ... yes 34/124 ... yes 4/125 ... yes 41/126 ... yes 69/127 ... yes 41/128 ... yes 68/129 ... yes 65/130 ... yes 75/131 ... yes 18/132 ... yes 69/133 ... yes 50/134 ... yes 69/135 ... yes 27/136 ... yes 69/137 ... yes 50/139 ... yes 4/140 ... yes 4/141 ... yes 18/142 ... yes 8/143 ... yes 45/144 ... yes 30/145 ... yes 77/146 ... yes 50/147 ... yes 61/148 ... yes 50/149 ... yes 4/150 ... yes 45/151 ... yes 45/152 ... yes 50/153 ... yes 4/154 ... yes 41/155 ... yes 18/156 ... yes 4/157 ... yes 41/158 ... yes 66/159 ... yes 45/160 ... yes 45/162 ... yes 45/163 ... yes 41/164 ... yes 79/165 ... yes 80/166 ... yes 68/167 ... yes 80/168 ... yes 61/169 ... yes 4/170 ... yes 8/171 ... yes 79/172 ... yes 68/173 ... yes 30/174 ... yes 27/175 ... yes 4/176 ... yes 18/178 ... yes 41/180 ... yes 51/181 ... yes 69/182 ... yes 37/183 ... yes 56/184 ... yes 30/185 ... yes 4/186 ... yes 45/187 ... yes 51/188 ... yes 74/189 ... yes 45/190 ... yes 85/191 ... yes 4/192 ... yes 4/193 ... yes 4/194 ... yes 89/195 ... yes 4/196 ... yes 43/197 ... yes 45/198 ... yes 41/199 ... yes 41/200 ... yes 4/201 ... yes 50/202 ... yes 41/203 ... yes 80/205 ... yes 4/206 ... yes 45/207 ... yes 5/208 ... yes 4/209 ... yes 45/210 ... yes 4/211 ... yes 45/212 ... yes 41/214 ... yes 297/215 ... yes 85/216 ... yes 4/217 ... yes 29/218 ... yes 41/219 ... yes 30/220 ... yes 70/221 ... yes 70/222 ... yes 41/223 ... yes 56/224 ... yes 41/225 ... yes 4/226 ... yes 4/227 ... yes 45/228 ... yes 24/229 ... yes 302/230 ... yes 23/231 ... yes 23/232 ... yes 18/233 ... yes 18/234 ... yes 4/235 ... yes 51/236 ... yes 329/237 ... yes 51/238 ... yes 54/239 ... yes 68/240 ... yes 41/241 ... yes 41/242 ... yes 4/243 ... 
yes 18/244 ... yes 18/245 ... yes 4/246 ... yes 41/247 ... yes 29/248 ... yes 4/249 ... yes 68/250 ... yes 24/251 ... yes 34/252 ... yes 68/253 ... yes 24/254 ... yes 68/255 ... yes 6/256 ... yes 6/257 ... yes 6/258 ... yes 4/259 ... yes 24/260 ... yes 68/261 ... yes 50/262 ... yes 34/263 ... yes 45/264 ... yes 360/265 ... yes 56/266 ... yes Redis version >= 6.0.0? ... yes Ruby version >= 3.0.6 ? ... yes (3.0.6) Git user has default SSH configuration? ... yes Active users: ... 47 Is authorized keys file accessible? ... yes GitLab configured to store new projects in hashed storage? ... yes All projects are in hashed storage? ... yes Elasticsearch version 7.x-8.x or OpenSearch version 1.x ... skipped (Advanced Search is disabled) All migrations must be finished before doing a major upgrade ... skipped (Advanced Search is disabled)
Checking GitLab App ... Finished
Checking GitLab subtasks ... Finished