# Prevent merge on code quality degradation
<!-- triage-serverless v3 PLEASE DO NOT REMOVE THIS SECTION -->

*This page may contain information related to upcoming products, features, and functionality. It is important to note that the information presented is for informational purposes only, so please do not rely on it for purchasing or planning purposes. As with all projects, the items mentioned on this page are subject to change or delay, and the development, release, and timing of any products, features, or functionality remain at the sole discretion of GitLab Inc.*

<!-- triage-serverless v3 PLEASE DO NOT REMOVE THIS SECTION -->

### Problem to solve

<!-- What problem do we solve? -->

Some teams need or want to fail a pipeline if there are any degradations in [Code Quality](https://docs.gitlab.com/ee/user/project/merge_requests/code_quality.html) between the source and target branches, so that those degradations require review and approval. This is similar to the existing pattern of [Security approvals](https://docs.gitlab.com/ee/user/application_security/index.html#security-approvals-in-merge-requests) already in place in merge requests in GitLab, although Code Quality checks/failures do not come with the regulatory requirements that security scans do.

<p>
<details>
<summary><h3>Original Issue Writeup</h3></summary>

## Overview

For GitLab Runner we use the [Code Quality](https://docs.gitlab.com/ee/user/project/merge_requests/code_quality.html) feature, but we ended up extending it so that it fails the pipeline if there are any reports/degradations. This is one improvement we can make for https://gitlab.com/gitlab-org/gitlab/issues/33747

## Proposal

If there are any reports, or any new degradations, in the Code Quality report, it should prevent the merge request from being merged. Since turning this on by default would be a breaking change for most users, it might be a good idea to make it configurable.
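As a rough illustration of the extension described above, the check can be as simple as reading the Code Quality job's report artifact and exiting non-zero if it contains any findings. The following is a minimal sketch, assuming the standard `gl-code-quality-report.json` artifact format (a JSON array of findings with `severity`, `description`, and `location` fields); the function name is illustrative, not an existing GitLab API:

```python
import json


def fail_on_degradations(report_path: str) -> int:
    """Return a non-zero exit code if the Code Quality report
    contains any findings, so the CI job (and pipeline) fails."""
    with open(report_path) as f:
        findings = json.load(f)
    for finding in findings:
        severity = finding.get("severity", "unknown")
        path = finding.get("location", {}).get("path", "?")
        print(f"{severity}: {finding.get('description', '')} ({path})")
    # Any finding at all fails the job; a severity threshold could
    # be applied here instead.
    return 1 if findings else 0
```

A script like this would run as an extra `script` step after the Code Quality job, with its return value passed to the shell as the job's exit code.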
</details>
</p>

### Intended users

#### Primary User

* [Sasha (Software Developer)](https://about.gitlab.com/handbook/marketing/product-marketing/roles-personas/#sasha-software-developer) - who can confirm that common checks are now completed, and resolve issues early in the development cycle that today surface later. This speeds up the time to resolution for these issues by ensuring the context is still fresh.

#### Users who get secondary benefits

* [Rachel (Release Manager)](https://about.gitlab.com/handbook/marketing/product-marketing/roles-personas/#rachel-release-manager) - who wants to ensure that common issues are checked for and resolved before approving and executing a release.
* [Delaney (Development Team Lead)](https://about.gitlab.com/handbook/marketing/product-marketing/roles-personas/#delaney-development-team-lead) - who can now ensure that coding standards across the team are a little more uniform, leading to more readable code.
* [Simone (Software Engineer in Test)](https://about.gitlab.com/handbook/marketing/product-marketing/roles-personas/#simone-software-engineer-in-test) - who can now trust that code quality didn't decrease, and focus testing on the edges of the new functionality.
* [Cameron (Compliance Manager)](https://about.gitlab.com/handbook/marketing/product-marketing/roles-personas/#cameron-compliance-manager) - who wants to enforce code quality standards across the organization as part of the pipeline and the day-to-day developer workflow.

### Further details

<!-- Include use cases, benefits, and/or goals (contributes to our vision?) -->

Code Quality findings [already have a severity](https://docs.gitlab.com/ee/ci/testing/code_quality.html#implement-a-custom-tool), so why don't we have a `CodeQuality-Check` rule (or `Code-Quality-Check`) similar to the `Vulnerability-Check` [rule](https://docs.gitlab.com/ee/user/application_security/#security-approvals-in-merge-requests)?
I propose that anything with a severity of `major`, `critical`, or `blocker` trigger the dynamic rule, when one is in place, to prevent the code from being merged.

#### Use Cases

* When Code Quality decreases, teams want the pipeline to fail, so that they can ensure the code quality standards the team agreed to are being met.
* When Code Quality standards change, a team wants to be able to update the threshold for failing a pipeline, so that the job failure is not ignored and code quality standards are kept up.
* We have heard that some users do not want this, or would not enable it, without the ability to allowlist/ignore some rules on a project-by-project basis. As such, this likely needs to follow or ship alongside https://gitlab.com/gitlab-org/gitlab/-/issues/221237

### Proposal

<!-- How are we going to solve the problem? Try to include the user journey! https://about.gitlab.com/handbook/journeys/#user-journey -->

1. Existing jobs and users consuming the existing template will continue to have jobs pass after completion, regardless of the outcome.
1. Users can customize the thresholds for jobs, or turn off jobs that are too noisy, through existing customization of the [codeclimate.yml file](https://docs.codeclimate.com/docs/advanced-configuration).
1. A logical follow-on feature set would be a code quality dashboard that enables functionality like enabling/disabling checks through a UI, logging specific issues in specific files to track progress against through GitLab Issues, and visualizing code quality over time.
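The proposed rule could be approximated by comparing the target-branch and source-branch reports and gating only on new findings at or above the severity threshold. A minimal sketch, assuming both reports are in the standard Code Quality JSON format with a `fingerprint` field per finding (the `BLOCKING_SEVERITIES` set and function names are illustrative, not an existing GitLab API):

```python
# Severities proposed above to trigger the dynamic rule.
BLOCKING_SEVERITIES = {"major", "critical", "blocker"}


def new_degradations(base_report: list, head_report: list) -> list:
    """Findings present on the source branch but not on the target
    branch, identified by the report's `fingerprint` field."""
    base_fingerprints = {f["fingerprint"] for f in base_report}
    return [f for f in head_report if f["fingerprint"] not in base_fingerprints]


def requires_approval(base_report: list, head_report: list) -> bool:
    """True if any new degradation meets the blocking severity
    threshold, i.e. the dynamic approval rule should activate."""
    return any(f.get("severity") in BLOCKING_SEVERITIES
               for f in new_degradations(base_report, head_report))
```

Gating on the fingerprint diff rather than the raw report means pre-existing findings on the target branch would not block a merge, which mirrors how `Vulnerability-Check` considers only newly introduced vulnerabilities.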
Locations where this would be implemented:

* [ ] [`ee/app/models/concerns/approval_rule_like.rb`](https://gitlab.com/gitlab-org/gitlab/-/blob/master/ee/app/models/concerns/approval_rule_like.rb)
* [ ] [`ee/app/assets/javascripts/approvals/components/rule_name.vue`](https://gitlab.com/gitlab-org/gitlab/-/blob/master/ee/app/assets/javascripts/approvals/components/rule_name.vue)
* [ ] [`ee/app/assets/javascripts/approvals/mount_project_settings.js`](https://gitlab.com/gitlab-org/gitlab/-/blob/master/ee/app/assets/javascripts/approvals/mount_project_settings.js)
* [ ] [`ee/spec/frontend/approvals/components/rule_name_spec.js`](https://gitlab.com/gitlab-org/gitlab/-/blob/master/ee/spec/frontend/approvals/components/rule_name_spec.js)
* [ ] [`ee/spec/frontend/approvals/components/security_configuration/unconfigured_security_rules_spec.js`](https://gitlab.com/gitlab-org/gitlab/-/blob/master/ee/spec/frontend/approvals/components/security_configuration/unconfigured_security_rules_spec.js)
* [ ] [`ee/app/assets/javascripts/approvals/components/security_configuration/unconfigured_security_rules.vue`](https://gitlab.com/gitlab-org/gitlab/-/blob/master/ee/app/assets/javascripts/approvals/components/security_configuration/unconfigured_security_rules.vue)
* [ ] [`ee/app/assets/javascripts/approvals/constants.js`](https://gitlab.com/gitlab-org/gitlab/-/blob/master/ee/app/assets/javascripts/approvals/constants.js)

### Permissions and Security

<!-- What permissions are required to perform the described actions? Are they consistent with the existing permissions as documented for users, groups, and projects as appropriate? Is the proposed behavior consistent between the UI, API, and other access methods (e.g.
email replies)? -->

### Documentation

<!-- See the Feature Change Documentation Workflow https://docs.gitlab.com/ee/development/documentation/feature-change-workflow.html

Add all known Documentation Requirements here, per https://docs.gitlab.com/ee/development/documentation/feature-change-workflow.html#documentation-requirements

If this feature requires changing permissions, this document https://docs.gitlab.com/ee/user/permissions.html must be updated accordingly. -->

* Update the existing [documentation]() about how to modify the job to fail on code quality degradation.

### Availability & Testing

<!-- This section needs to be retained and filled in during the workflow planning breakdown phase of this feature proposal, if not earlier.

What risks does this change pose to our availability? How might it affect the quality of the product? What additional test coverage or changes to tests will be needed? Will it require cross-browser testing?

Please list the test areas (unit, integration, and end-to-end) that need to be added or updated to ensure that this feature will work as intended. Please use the list below as guidance.

* Unit test changes
* Integration test changes
* End-to-end test changes

See the test engineering planning process and reach out to your counterpart Software Engineer in Test for assistance: https://about.gitlab.com/handbook/engineering/quality/test-engineering/#test-planning -->

### What does success look like, and how can we measure that?

<!-- Define both the success metrics and acceptance criteria. Note that success metrics indicate the desired business outcomes, while acceptance criteria indicate when the solution is working correctly. If there is no way to measure success, link to an issue that will implement a way to measure this. -->

* This feature is being implemented as part of a ~dogfooding effort, so success is that internal GitLab teams start using Code Quality in their own pipelines.
* Externally, we expect to see no decrease in the number of customers using the code quality job in their pipelines.

### What is the type of buyer?

<!-- Which leads to: in which enterprise tier should this feature go? See https://about.gitlab.com/handbook/product/pricing/#four-tiers -->

This feature becomes valuable when a team, or an entire engineering organization, is trying to enforce coding standards or has a regulatory requirement to demonstrate a level of testing through static checks, so the likely buyer is a manager or above, putting this in the ~"GitLab Starter" tier.

### Is this a cross-stage feature?

<!-- Communicate if this change will affect multiple Stage Groups or product areas. We recommend always starting with the assumption that a feature request will have an impact on another Group. Loop in the most relevant PM and Product Designer from that Group to provide strategic support to help align the Group's broader plan and vision, as well as to avoid UX and technical debt. https://about.gitlab.com/handbook/product/#cross-stage-features -->

No.

### Links / references
issue