500 gitlab-rails GRPC::Unavailable failed to connect to all addresses / Failed to pick subchannel
Hi,
We are running a Docker GitLab instance that was hit by a critical security vulnerability (12.9.2, CVE-2021-22205), which resulted in the instance being used to mine cryptocurrencies.
In order to stop it from being attacked, I upgraded the instance using the following path: 12.9.2 -> 12.10.14 -> 13.0.14 -> 13.1.11 -> 13.8.8
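For context, each step in that path was roughly a tag bump plus a container recreation, something like the sketch below (the gitlab/gitlab-ce image, the -ce.0 tag suffix and the "gitlab" compose service name are assumptions, not necessarily our exact commands):

# Sketch of a single upgrade step with the Omnibus Docker image.
# 1. Edit docker-compose.yml and set e.g. image: gitlab/gitlab-ce:13.0.14-ce.0
# 2. Pull the new image and recreate the container
docker-compose pull gitlab
docker-compose up -d gitlab
# 3. Check that all database migrations ran before moving to the next version
docker exec -t gitlab gitlab-rake db:migrate:status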
Now most of the interface loads, but accessing a project results in a 500 with the log below. This is of course an emergency, because our whole service is currently down due to this issue.
==> /var/log/gitlab/gitlab-rails/production.log <==
GRPC::Unavailable (14:failed to connect to all addresses. debug_error_string:{"created":"@1656960642.226945055","description":"Failed to pick subchannel","file":"src/core/ext/filters/client_channel/client_channel.cc","file_line":3952,"referenced_errors":[{"created":"@1656960537.399036740","description":"failed to connect to all addresses","file":"src/core/ext/filters/client_channel/lb_policy/pick_first/pick_first.cc","file_line":394,"grpc_status":14}]}):
lib/gitlab/gitaly_client.rb:177:in `execute'
lib/gitlab/gitaly_client/call.rb:18:in `block in call'
lib/gitlab/gitaly_client/call.rb:55:in `recording_request'
lib/gitlab/gitaly_client/call.rb:17:in `call'
lib/gitlab/gitaly_client.rb:167:in `call'
lib/gitlab/gitaly_client/repository_service.rb:19:in `exists?'
lib/gitlab/git/repository.rb:98:in `exists?'
app/models/repository.rb:539:in `exists?'
lib/gitlab/metrics/instrumentation.rb:160:in `block in _uncached_exists?'
lib/gitlab/metrics/method_call.rb:27:in `measure'
lib/gitlab/metrics/instrumentation.rb:160:in `_uncached_exists?'
lib/gitlab/repository_cache_adapter.rb:92:in `block (2 levels) in cache_method_asymmetrically'
lib/gitlab/repository_cache.rb:44:in `fetch_without_caching_false'
lib/gitlab/repository_cache_adapter.rb:187:in `block (2 levels) in cache_method_output_asymmetrically'
lib/gitlab/safe_request_store.rb:12:in `fetch'
lib/gitlab/repository_cache.rb:25:in `fetch'
lib/gitlab/repository_cache_adapter.rb:186:in `block in cache_method_output_asymmetrically'
lib/gitlab/utils/strong_memoize.rb:30:in `strong_memoize'
lib/gitlab/repository_cache_adapter.rb:200:in `block in memoize_method_output'
lib/gitlab/repository_cache_adapter.rb:209:in `no_repository_fallback'
lib/gitlab/repository_cache_adapter.rb:199:in `memoize_method_output'
lib/gitlab/repository_cache_adapter.rb:185:in `cache_method_output_asymmetrically'
lib/gitlab/repository_cache_adapter.rb:91:in `block in cache_method_asymmetrically'
lib/gitlab/metrics/instrumentation.rb:160:in `block in exists?'
lib/gitlab/metrics/method_call.rb:27:in `measure'
lib/gitlab/metrics/instrumentation.rb:160:in `exists?'
app/models/repository.rb:547:in `empty?'
lib/gitlab/metrics/instrumentation.rb:160:in `block in empty?'
lib/gitlab/metrics/method_call.rb:27:in `measure'
lib/gitlab/metrics/instrumentation.rb:160:in `empty?'
app/models/concerns/has_repository.rb:79:in `empty_repo?'
app/controllers/projects/application_controller.rb:72:in `require_non_empty_project'
app/controllers/application_controller.rb:482:in `set_current_admin'
lib/gitlab/session.rb:11:in `with_session'
app/controllers/application_controller.rb:473:in `set_session_storage'
lib/gitlab/i18n.rb:73:in `with_locale'
lib/gitlab/i18n.rb:79:in `with_user_locale'
app/controllers/application_controller.rb:467:in `set_locale'
lib/gitlab/error_tracking.rb:52:in `with_context'
app/controllers/application_controller.rb:532:in `sentry_context'
app/controllers/application_controller.rb:460:in `block in set_current_context'
lib/gitlab/application_context.rb:56:in `block in use'
lib/gitlab/application_context.rb:56:in `use'
lib/gitlab/application_context.rb:22:in `with_context'
app/controllers/application_controller.rb:451:in `set_current_context'
lib/gitlab/metrics/elasticsearch_rack_middleware.rb:16:in `call'
lib/gitlab/middleware/rails_queue_duration.rb:33:in `call'
lib/gitlab/metrics/rack_middleware.rb:16:in `block in call'
lib/gitlab/metrics/transaction.rb:56:in `run'
lib/gitlab/metrics/rack_middleware.rb:16:in `call'
lib/gitlab/request_profiler/middleware.rb:17:in `call'
lib/gitlab/jira/middleware.rb:19:in `call'
lib/gitlab/middleware/go.rb:20:in `call'
lib/gitlab/etag_caching/middleware.rb:21:in `call'
lib/gitlab/middleware/multipart.rb:172:in `call'
lib/gitlab/middleware/read_only/controller.rb:50:in `call'
lib/gitlab/middleware/read_only.rb:18:in `call'
lib/gitlab/middleware/same_site_cookies.rb:27:in `call'
lib/gitlab/middleware/handle_malformed_strings.rb:21:in `call'
lib/gitlab/middleware/basic_health_check.rb:25:in `call'
lib/gitlab/middleware/handle_ip_spoof_attack_error.rb:25:in `call'
lib/gitlab/middleware/request_context.rb:23:in `call'
config/initializers/fix_local_cache_middleware.rb:9:in `call'
lib/gitlab/metrics/requests_rack_middleware.rb:76:in `call'
lib/gitlab/middleware/release_env.rb:12:in `call'
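Since the trace points at the Rails -> Gitaly connection (lib/gitlab/gitaly_client.rb), these are the kinds of checks I can run from inside the container, assuming the default all-in-one Omnibus layout with Gitaly on its local socket (the container name "gitlab" is a placeholder):

# Is Gitaly up, and what is it logging?
docker exec -t gitlab gitlab-ctl status gitaly
docker exec -t gitlab gitlab-ctl tail gitaly

# Built-in connectivity check from Rails to Gitaly
docker exec -t gitlab gitlab-rake gitlab:gitaly:check

# Default Omnibus socket location
docker exec -t gitlab ls -l /var/opt/gitlab/gitaly/gitaly.socket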
I found a few threads reporting similar issues, but none that matched exactly.
Our config is basically the default: I've checked that everything in gitlab.rb is commented out, and we're using the following configuration in the docker-compose environment:
environment:
  GITLAB_OMNIBUS_CONFIG: |
    external_url 'https://<redacted>'
    nginx['listen_port'] = 80
    nginx['listen_https'] = false
    nginx['http2_enabled'] = false
    nginx['proxy_set_headers'] = {
      "Host" => "$$http_host",
      "X-Real-IP" => "$$remote_addr",
      "X-Forwarded-For" => "$$proxy_add_x_forwarded_for",
      "X-Forwarded-Proto" => "https",
      "X-Forwarded-Ssl" => "on"
    }
    gitlab_rails['smtp_enable'] = true
    gitlab_rails['smtp_address'] = "<redacted>"
    gitlab_rails['smtp_port'] = 587
    gitlab_rails['smtp_authentication'] = "plain"
    gitlab_rails['smtp_enable_starttls_auto'] = true
    gitlab_rails['smtp_user_name'] = "<redacted>"
    gitlab_rails['smtp_password'] = "<redacted>"
    gitlab_rails['smtp_domain'] = "<redacted>"
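If useful, I can also share how Omnibus rendered the Gitaly settings; something along these lines, again assuming the default paths inside the container and a container named "gitlab":

# Re-render the Omnibus configuration
docker exec -t gitlab gitlab-ctl reconfigure

# Address Rails uses to reach Gitaly (default storage)
docker exec -t gitlab grep "gitaly_address" /var/opt/gitlab/gitlab-rails/etc/gitlab.yml

# What Gitaly itself is listening on
docker exec -t gitlab grep -E "socket_path|listen_addr" /var/opt/gitlab/gitaly/config.toml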
Happy to give more info; we are in a really bad situation with this right now.
