GitLab.org / omnibus-gitlab · Issue #5241

Closed · Created Apr 10, 2020 by Michael Kozono (@mkozono)

`replicate-geo-database` incorrectly tries to back up repos after `revert-pg-upgrade` on a Geo secondary’s read-replica PG node

This occurred during an attempt to downgrade PostgreSQL from 11 to 10 in a Geo installation running GitLab 12.9.

`replicate-geo-database` tried to back up repositories as part of `gitlab-rake gitlab:backup:create`. But when the Geo secondary’s read-replica PostgreSQL database runs on its own node, Rails is not fully configured there (Gitaly is unreachable), so the backup fails with the errors below.
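Note that the database dump itself succeeds in the log below; only the repository dump fails, so a backup step limited to the database would presumably work on this node. A minimal sketch, assuming the standard GitLab backup Rake tasks (`gitlab:backup:db:create` is assumed to exist as the sibling of the `gitlab:backup:repo:create` task named in the trace, and `SKIP` is the documented way to exclude components from a full backup):

# Assumption: dump only the database, the one component this node can reach
sudo gitlab-rake gitlab:backup:db:create

# Alternative sketch: run the full backup task but skip everything except the database
sudo gitlab-rake gitlab:backup:create SKIP=repositories,uploads,builds,artifacts,lfs,registry,pages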

  • Perhaps we need to run a task that backs up only the DB (as sketched above)?
  • The error’s troubleshooting tips are not relevant to this case.
  • `--skip-backup` works around this (see the example after this list).
  • Side note: I also needed `--force` since there was data in the DB. There will always be data in the Geo secondary’s DB when using `revert-pg-upgrade` or `pg-upgrade` on an instance that has been used.
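The workaround, combining the two flags noted above (the slot name and primary host are the values from this reproduction; substitute your own):

# Skip the problematic repository backup entirely and proceed despite existing data
sudo gitlab-ctl replicate-geo-database --slot-name=singleslot --host=35.203.143.171 --skip-backup --force

Full output of the failing run: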
mkozono-ansible@mkozono-ha-omnibus4838-secondary-geo:~$ sudo gitlab-ctl replicate-geo-database --slot-name=singleslot --host=35.203.143.171
Found data inside the gitlabhq_production database! If you are sure you are in the secondary server, override with --force
mkozono-ansible@mkozono-ha-omnibus4838-secondary-geo:~$ sudo gitlab-ctl replicate-geo-database --slot-name=singleslot --host=35.203.143.171 --force
Found data inside the gitlabhq_production database! Proceeding because --force was supplied

---------------------------------------------------------------
WARNING: Make sure this script is run from the secondary server
---------------------------------------------------------------

*** You are about to delete your local PostgreSQL database, and replicate the primary database. ***
*** The primary geo node is `35.203.143.171` ***

*** Are you sure you want to continue (replicate/no)? ***
Confirmation: replicate
* Executing GitLab backup task to prevent accidental data loss
2020-04-09 23:52:03 +0000 -- Dumping database ...
Dumping PostgreSQL database gitlabhq_production ... [DONE]
2020-04-09 23:52:04 +0000 -- done
2020-04-09 23:52:04 +0000 -- Dumping repositories ...
 * root/asdf (@hashed/6b/86/6b86b273ff34fce19d6b804eff5a3f5747ada4eaa22f1d49c01e52ddb7875b4b) ...
rake aborted!
GRPC::Unavailable: 14:failed to connect to all addresses
/opt/gitlab/embedded/service/gitlab-rails/lib/gitlab/gitaly_client.rb:192:in `execute'
/opt/gitlab/embedded/service/gitlab-rails/lib/gitlab/gitaly_client.rb:170:in `block in call'
/opt/gitlab/embedded/service/gitlab-rails/lib/gitlab/gitaly_client.rb:198:in `measure_timings'
/opt/gitlab/embedded/service/gitlab-rails/lib/gitlab/gitaly_client.rb:169:in `call'
/opt/gitlab/embedded/service/gitlab-rails/lib/gitlab/gitaly_client/repository_service.rb:19:in `exists?'
/opt/gitlab/embedded/service/gitlab-rails/lib/gitlab/git/repository.rb:98:in `exists?'
/opt/gitlab/embedded/service/gitlab-rails/app/models/repository.rb:517:in `exists?'
/opt/gitlab/embedded/service/gitlab-rails/lib/gitlab/repository_cache_adapter.rb:84:in `block (2 levels) in cache_method_asymmetrically'
/opt/gitlab/embedded/service/gitlab-rails/lib/gitlab/repository_cache.rb:44:in `fetch_without_caching_false'
/opt/gitlab/embedded/service/gitlab-rails/lib/gitlab/repository_cache_adapter.rb:179:in `block (2 levels) in cache_method_output_asymmetrically'
/opt/gitlab/embedded/service/gitlab-rails/lib/gitlab/null_request_store.rb:34:in `fetch'
/opt/gitlab/embedded/service/gitlab-rails/lib/gitlab/safe_request_store.rb:12:in `fetch'
/opt/gitlab/embedded/service/gitlab-rails/lib/gitlab/repository_cache.rb:25:in `fetch'
/opt/gitlab/embedded/service/gitlab-rails/lib/gitlab/repository_cache_adapter.rb:178:in `block in cache_method_output_asymmetrically'
/opt/gitlab/embedded/service/gitlab-rails/lib/gitlab/utils/strong_memoize.rb:30:in `strong_memoize'
/opt/gitlab/embedded/service/gitlab-rails/lib/gitlab/repository_cache_adapter.rb:192:in `block in memoize_method_output'
/opt/gitlab/embedded/service/gitlab-rails/lib/gitlab/repository_cache_adapter.rb:201:in `no_repository_fallback'
/opt/gitlab/embedded/service/gitlab-rails/lib/gitlab/repository_cache_adapter.rb:191:in `memoize_method_output'
/opt/gitlab/embedded/service/gitlab-rails/lib/gitlab/repository_cache_adapter.rb:177:in `cache_method_output_asymmetrically'
/opt/gitlab/embedded/service/gitlab-rails/lib/gitlab/repository_cache_adapter.rb:83:in `block in cache_method_asymmetrically'
/opt/gitlab/embedded/service/gitlab-rails/app/models/repository.rb:525:in `empty?'
/opt/gitlab/embedded/service/gitlab-rails/app/models/repository.rb:353:in `expire_emptiness_caches'
/opt/gitlab/embedded/service/gitlab-rails/lib/backup/repository.rb:155:in `empty_repo?'
/opt/gitlab/embedded/service/gitlab-rails/lib/backup/repository.rb:25:in `block in dump'
/opt/gitlab/embedded/service/gitlab-rails/lib/backup/repository.rb:16:in `dump'
/opt/gitlab/embedded/service/gitlab-rails/lib/tasks/gitlab/backup.rake:99:in `block (4 levels) in <top (required)>'
/opt/gitlab/embedded/service/gitlab-rails/lib/tasks/gitlab/backup.rake:11:in `block (3 levels) in <top (required)>'
/opt/gitlab/embedded/bin/bundle:23:in `load'
/opt/gitlab/embedded/bin/bundle:23:in `<main>'
Tasks: TOP => gitlab:backup:repo:create
(See full trace by running task with --trace)
*** Initial replication failed! ***

Replication tool returned with a non zero exit status!

Troubleshooting tips:
  - replication should be run by root user
  - check if `roles ['geo_primary_role']` or `geo_primary_role['enable'] = true` exists in `gitlab.rb` on the primary node
  - check your trust settings `md5_auth_cidr_addresses` in `gitlab.rb` on the primary node

Failed to execute: gitlab-rake gitlab:backup:create