Intermittent SSL_connect SYSCALL on GitLab.com
Summary
In a GitLab.com project (see internal comment, as it is a customer project), CI validation intermittently fails with a misleading error message. The message is misleading because the CI configuration is not actually invalid; rather, an error was encountered during CI validation:
This GitLab CI configuration is invalid:
SSL_connect SYSCALL returned=5 errno=0 state=SSLv3/TLS write client hello
https://log.gprd.gitlab.net/goto/1d718730-5ac5-11ed-8d37-e9a2f393ea2a
json.correlation_id
f9d084686973b1345b8701c29f3d5603
json.exception.backtrace
lib/gitlab/net_http_adapter.rb:21:in `connect',
lib/gitlab/http.rb:54:in `perform_request',
lib/gitlab/ci/config/external/file/remote.rb:44:in `block in fetch_remote_content',
lib/gitlab/ci/pipeline/logger.rb:33:in `instrument',
lib/gitlab/ci/config/external/file/remote.rb:43:in `fetch_remote_content',
lib/gitlab/ci/config/external/file/remote.rb:18:in `block in content',
lib/gitlab/utils/strong_memoize.rb:44:in `strong_memoize',
lib/gitlab/ci/config/external/file/remote.rb:18:in `content',
lib/gitlab/ci/config/external/file/base.rb:104:in `validate_content!',
lib/gitlab/ci/config/external/file/base.rb:53:in `block in validate!',
lib/gitlab/ci/pipeline/logger.rb:33:in `instrument',
lib/gitlab/ci/config/external/file/base.rb:50:in `validate!',
lib/gitlab/ci/config/external/mapper.rb:130:in `verify!',
lib/gitlab/ci/config/external/mapper.rb:50:in `each',
lib/gitlab/ci/config/external/mapper.rb:50:in `process_without_instrumentation',
lib/gitlab/ci/config/external/mapper.rb:31:in `block in process',
lib/gitlab/ci/pipeline/logger.rb:33:in `instrument',
lib/gitlab/ci/config/external/mapper.rb:30:in `process',
lib/gitlab/ci/config/external/processor.rb:14:in `initialize',
lib/gitlab/ci/config.rb:131:in `new',
lib/gitlab/ci/config.rb:131:in `block in build_config',
lib/gitlab/ci/pipeline/logger.rb:33:in `instrument',
lib/gitlab/ci/config.rb:130:in `build_config',
ee/lib/ee/gitlab/ci/config_ee.rb:18:in `build_config',
lib/gitlab/ci/config.rb:114:in `expand_config',
lib/gitlab/ci/config.rb:39:in `block in initialize',
lib/gitlab/ci/pipeline/logger.rb:33:in `instrument',
lib/gitlab/ci/config.rb:38:in `initialize',
lib/gitlab/ci/yaml_processor.rb:31:in `new',
lib/gitlab/ci/yaml_processor.rb:31:in `parse_config',
lib/gitlab/ci/yaml_processor.rb:20:in `block in execute',
lib/gitlab/ci/yaml_processor/feature_flags.rb:24:in `with_actor',
lib/gitlab/ci/yaml_processor.rb:19:in `execute',
lib/gitlab/ci/lint.rb:80:in `block in yaml_processor_result',
lib/gitlab/ci/pipeline/logger.rb:33:in `instrument',
lib/gitlab/ci/lint.rb:76:in `yaml_processor_result',
lib/gitlab/ci/lint.rb:62:in `static_validation',
lib/gitlab/ci/lint.rb:38:in `validate',
app/graphql/resolvers/ci/config_resolver.rb:38:in `resolve',
lib/gitlab/graphql/present/field_extension.rb:18:in `resolve',
lib/gitlab/graphql/tracers/timer_tracer.rb:20:in `trace',
lib/gitlab/graphql/generic_tracing.rb:48:in `with_labkit_tracing',
lib/gitlab/graphql/generic_tracing.rb:38:in `platform_trace',
lib/gitlab/graphql/tracers/logger_tracer.rb:14:in `trace',
lib/gitlab/graphql/tracers/metrics_tracer.rb:13:in `trace',
lib/gitlab/graphql/tracers/application_context_tracer.rb:23:in `trace',
lib/gitlab/graphql/tracers/timer_tracer.rb:20:in `trace',
lib/gitlab/graphql/generic_tracing.rb:48:in `with_labkit_tracing',
lib/gitlab/graphql/generic_tracing.rb:38:in `platform_trace',
lib/gitlab/graphql/tracers/logger_tracer.rb:14:in `trace',
lib/gitlab/graphql/tracers/metrics_tracer.rb:13:in `trace',
lib/gitlab/graphql/tracers/application_context_tracer.rb:20:in `block in trace',
lib/gitlab/application_context.rb:113:in `block in use',
lib/gitlab/application_context.rb:113:in `use',
lib/gitlab/application_context.rb:54:in `with_context',
lib/gitlab/graphql/tracers/application_context_tracer.rb:19:in `trace',
lib/gitlab/graphql/tracers/timer_tracer.rb:20:in `trace',
lib/gitlab/graphql/generic_tracing.rb:48:in `with_labkit_tracing',
lib/gitlab/graphql/generic_tracing.rb:38:in `platform_trace',
lib/gitlab/graphql/tracers/logger_tracer.rb:14:in `trace',
lib/gitlab/graphql/tracers/metrics_tracer.rb:13:in `trace',
lib/gitlab/graphql/tracers/application_context_tracer.rb:23:in `trace',
app/graphql/gitlab_schema.rb:51:in `multiplex',
app/controllers/graphql_controller.rb:167:in `execute_query',
app/controllers/graphql_controller.rb:57:in `execute',
ee/lib/gitlab/ip_address_state.rb:10:in `with',
ee/app/controllers/ee/application_controller.rb:45:in `set_current_ip_address',
app/controllers/application_controller.rb:530:in `set_current_admin',
lib/gitlab/session.rb:11:in `with_session',
app/controllers/application_controller.rb:521:in `set_session_storage',
lib/gitlab/i18n.rb:107:in `with_locale',
lib/gitlab/i18n.rb:113:in `with_user_locale',
app/controllers/application_controller.rb:515:in `set_locale',
app/controllers/application_controller.rb:509:in `set_current_context',
ee/lib/omni_auth/strategies/group_saml.rb:41:in `other_phase',
lib/gitlab/metrics/elasticsearch_rack_middleware.rb:16:in `call',
lib/gitlab/middleware/memory_report.rb:13:in `call',
lib/gitlab/middleware/speedscope.rb:13:in `call',
lib/gitlab/database/load_balancing/rack_middleware.rb:23:in `call',
lib/gitlab/middleware/rails_queue_duration.rb:33:in `call',
lib/gitlab/metrics/rack_middleware.rb:16:in `block in call',
lib/gitlab/metrics/web_transaction.rb:46:in `run',
lib/gitlab/metrics/rack_middleware.rb:16:in `call',
lib/gitlab/jira/middleware.rb:19:in `call',
lib/gitlab/middleware/go.rb:20:in `call',
lib/gitlab/etag_caching/middleware.rb:21:in `call',
lib/gitlab/middleware/query_analyzer.rb:11:in `block in call',
lib/gitlab/database/query_analyzer.rb:37:in `within',
lib/gitlab/middleware/query_analyzer.rb:11:in `call',
lib/gitlab/middleware/multipart.rb:173:in `call',
lib/gitlab/middleware/read_only/controller.rb:50:in `call',
lib/gitlab/middleware/read_only.rb:18:in `call',
lib/gitlab/middleware/same_site_cookies.rb:27:in `call',
lib/gitlab/middleware/handle_malformed_strings.rb:21:in `call',
lib/gitlab/middleware/basic_health_check.rb:25:in `call',
lib/gitlab/middleware/handle_ip_spoof_attack_error.rb:25:in `call',
lib/gitlab/middleware/request_context.rb:21:in `call',
lib/gitlab/middleware/webhook_recursion_detection.rb:15:in `call',
config/initializers/fix_local_cache_middleware.rb:11:in `call',
lib/gitlab/middleware/compressed_json.rb:26:in `call',
lib/gitlab/middleware/rack_multipart_tempfile_factory.rb:19:in `call',
lib/gitlab/middleware/sidekiq_web_static.rb:20:in `call',
lib/gitlab/metrics/requests_rack_middleware.rb:77:in `call',
lib/gitlab/middleware/release_env.rb:13:in `call'
json.exception.class
OpenSSL::SSL::SSLError
json.exception.message
SSL_connect SYSCALL returned=5 errno=0 state=SSLv3/TLS write client hello
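The "SSL_connect SYSCALL returned=5 errno=0 state=SSLv3/TLS write client hello" message comes from Ruby's OpenSSL bindings and indicates the remote endpoint dropped the connection while the client was still in the TLS handshake, i.e. a transient network failure, not a configuration problem. As a hedged illustration (this simulates the failure mode locally; it is not a reproduction of whatever endpoint failed on GitLab.com), any server that closes the TCP connection mid-handshake makes Net::HTTP raise the same exception class:

```ruby
require "socket"
require "net/http"
require "openssl"

# Illustrative sketch only: a local TCP server that accepts the connection
# and immediately closes it, before the TLS handshake can complete.
server = TCPServer.new("127.0.0.1", 0)
port = server.addr[1]
acceptor = Thread.new { server.accept.close }

captured = nil
begin
  http = Net::HTTP.new("127.0.0.1", port)
  http.use_ssl = true
  http.open_timeout = 5
  http.read_timeout = 5
  http.get("/")
rescue OpenSSL::SSL::SSLError, SystemCallError, EOFError => e
  # On most platforms this is OpenSSL::SSL::SSLError with a
  # "SSL_connect SYSCALL ..." message, matching the logs above.
  captured = e
ensure
  acceptor.join
  server.close
end

puts captured.class
```

Because the failure happens inside `connect` (see `lib/gitlab/net_http_adapter.rb:21` in the backtrace), the exception propagates up through `fetch_remote_content` and surfaces to the user as a CI lint error.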
Steps to reproduce
On the customer's CI Lint page, refresh a few times to trigger the error message.
Example Project
What is the current bug behavior?
What is the expected correct behavior?
Relevant logs and/or screenshots
Output of checks
Results of GitLab environment info
Expand for output related to GitLab environment info
(For installations with omnibus-gitlab package run and paste the output of: `sudo gitlab-rake gitlab:env:info`) (For installations from source run and paste the output of: `sudo -u git -H bundle exec rake gitlab:env:info RAILS_ENV=production`)
Results of GitLab application Check
Expand for output related to the GitLab application check
(For installations with omnibus-gitlab package run and paste the output of:
sudo gitlab-rake gitlab:check SANITIZE=true
)(For installations from source run and paste the output of:
sudo -u git -H bundle exec rake gitlab:check RAILS_ENV=production SANITIZE=true
)(we will only investigate if the tests are passing)
Possible fixes
Add retry logic to the `include:remote` fetch so that transient TLS handshake failures do not fail CI validation outright.
see comment
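A minimal sketch of what such retry logic could look like, assuming it wraps the remote-include fetch (the module, method, and constant names below are illustrative, not GitLab's actual API):

```ruby
require "openssl"
require "net/http"

# Hypothetical retry wrapper for fetching include:remote content.
module RemoteIncludeRetry
  MAX_ATTEMPTS = 3
  # Transient network failures worth retrying; an invalid URL or an
  # HTTP error response should still fail immediately.
  RETRYABLE_ERRORS = [
    OpenSSL::SSL::SSLError,
    Net::OpenTimeout,
    Errno::ECONNRESET
  ].freeze

  def self.fetch_with_retry(max_attempts: MAX_ATTEMPTS)
    attempts = 0
    begin
      attempts += 1
      yield
    rescue *RETRYABLE_ERRORS
      raise if attempts >= max_attempts
      sleep(0.1 * attempts) # small linear backoff between attempts
      retry
    end
  end
end

# Usage: a fetch that fails twice with the handshake error, then succeeds.
calls = 0
content = RemoteIncludeRetry.fetch_with_retry do
  calls += 1
  raise OpenSSL::SSL::SSLError, "SSL_connect SYSCALL returned=5" if calls < 3
  "remote yaml"
end
```

The attempt cap and backoff keep a persistently unreachable endpoint from blocking validation for long, while absorbing one-off handshake failures like the one in this report.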