Commit bdfdc11c authored by Andy Hohenner

Shifting Cells Performance documentation to the SSOT

parent 0d493bd3
1 merge request: !10579 Shifting Cells Performance documentation to the SSOT
@@ -89,33 +89,7 @@ The [Pre-QA Cell](https://gitlab.com/groups/gitlab-com/gl-infra/-/epics/1293) is
## Performance Testing
Just like our feature testing, performance testing can be done at the different [testing levels](https://docs.gitlab.com/ee/development/testing_guide/testing_levels.html). An important thing to note is that performance results from one level do not map directly to another: a code change that makes a unit or integration test run 1 second faster will not translate into a 1 second improvement in production, because too many other variables affect performance across levels. Instead, results at each level serve as an indicator in a fast feedback loop: if a test runs 2x faster, the change should help the performance problem; if it runs 2x slower, it will probably hurt.
### Unit Testing
At the lowest level, GitLab includes several gems that can be used to test performance during development, giving feedback before the code is finalized (a short usage sketch follows the list):
- [derailed_benchmarks](https://github.com/zombocom/derailed_benchmarks)
- [benchmark-memory](https://github.com/michaelherold/benchmark-memory)
- [benchmark-ips](https://github.com/evanphx/benchmark-ips)
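As a minimal sketch of how [benchmark-ips](https://github.com/evanphx/benchmark-ips) might be used to compare two candidate implementations during development (the array and the two blocks being compared are made up for the example and are not from the GitLab codebase):

```ruby
# Compare two illustrative implementations of the same operation to see
# which yields more iterations per second.
require 'benchmark/ips'

ARRAY = (1..10_000).to_a

Benchmark.ips do |x|
  x.report('map.flatten') { ARRAY.map { |i| [i, i * 2] }.flatten }
  x.report('flat_map')    { ARRAY.flat_map { |i| [i, i * 2] } }

  # Print a relative comparison, e.g. "flat_map: 2.1x faster".
  x.compare!
end
```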
We also have [rspec-benchmark](https://github.com/piotrmurach/rspec-benchmark) so we can assert on performance results directly in RSpec.
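A minimal sketch of what such an assertion could look like, assuming rspec-benchmark's `perform_under` and `perform_at_least` matchers; `HypotheticalService` is a made-up class used only for the example:

```ruby
# Illustrative RSpec example using rspec-benchmark matchers.
require 'rspec-benchmark'

RSpec.describe HypotheticalService do
  include RSpec::Benchmark::Matchers

  it 'completes within 100 ms' do
    expect { described_class.new.call }.to perform_under(100).ms
  end

  it 'sustains at least 1000 iterations per second' do
    expect { described_class.new.call }.to perform_at_least(1000).ips
  end
end
```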
#### Database testing
Database testing focuses on analyzing slow queries and the number of queries generated by page views and actions.
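For a rough sense of what "number of queries" analysis can look like during development, the sketch below uses Rails' standard `sql.active_record` instrumentation to count the queries triggered by a block of code; `SomeModel` and the query inside the block are hypothetical stand-ins for whatever page view or action is being analyzed.

```ruby
# Count the SQL queries triggered by a block of code using Rails'
# instrumentation API.
require 'active_support/notifications'

query_count = 0

subscriber = ActiveSupport::Notifications.subscribe('sql.active_record') do |*_args|
  query_count += 1
end

# Hypothetical action under analysis, e.g. rendering a page or calling a
# service object. This one is a classic N+1 candidate.
SomeModel.where(active: true).map(&:parent)

ActiveSupport::Notifications.unsubscribe(subscriber)
puts "Queries executed: #{query_count}"
```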
#### Observability testing
Observability testing actively uses Observability tools to detect trends before they develop into performance issues. Observability can be enhanced with [synthetic testing and monitoring](https://gitlab.com/gitlab-com/gl-infra/scalability/-/issues/3637). We are currently investigating using our Grafana dashboards to detect performance issues.
#### System testing
This level is covered by [GPT](https://gitlab.com/gitlab-org/quality/performance) and [GBPT](https://gitlab.com/gitlab-org/quality/performance-sitespeed).
Once the application is deployed to live environments, traditional performance testing (load testing, stress testing, soak testing, ...) can begin, with Observability tools used to analyze the performance of various components of GitLab (slow SQL in Postgres, long-running jobs in Sidekiq, ...). Note that performance tests are not run against live environments, per the [GPT documentation](https://gitlab.com/gitlab-org/quality/performance/-/blob/main/docs/environment_prep.md#creating-an-admin-user); all performance tests are run on transitory environments.
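In practice these tests are driven by GPT and GBPT. Purely as a conceptual illustration of what a load test measures, the sketch below issues concurrent requests against a hypothetical transitory environment and reports rough latency percentiles; it is not how GPT itself works.

```ruby
# Conceptual load-test sketch (not GPT): fire concurrent requests at a
# hypothetical environment and report simple latency percentiles.
require 'net/http'
require 'uri'

TARGET  = URI('https://gitlab.example.test/explore') # hypothetical environment
THREADS = 10
REQUESTS_PER_THREAD = 20

latencies = Queue.new

workers = THREADS.times.map do
  Thread.new do
    REQUESTS_PER_THREAD.times do
      started = Process.clock_gettime(Process::CLOCK_MONOTONIC)
      Net::HTTP.get_response(TARGET)
      latencies << (Process.clock_gettime(Process::CLOCK_MONOTONIC) - started)
    end
  end
end
workers.each(&:join)

sorted = Array.new(latencies.size) { latencies.pop }.sort
p50 = sorted[(sorted.size * 0.50).floor]
p95 = sorted[[(sorted.size * 0.95).floor, sorted.size - 1].min]
puts format('p50: %.3fs, p95: %.3fs over %d requests', p50, p95, sorted.size)
```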
For more details on our System level performance testing strategy, please refer to the document covering why customers should trust our [Dedicated/Cells/FedRAMP performance testing approach](https://internal.gitlab.com/handbook/engineering/dedicated-performance-test-strategy/).
Our performance testing strategy is a [multi-layered approach](/handbook/engineering/infrastructure-platforms/developer-experience/performance-enablement/performance) focused on shifting performance both Left and Right: Shift Left moves performance testing earlier in the development process, while Shift Right makes data from live environments (production) more Observable so that it can feed back into Shift Left.
## Tracking Issues