Data Request: effectiveness of the 2-engineer verification requirement
Description
The ~"group::threat insights" and ~"group::security policies" groups have an explicit and rigid verification process.
It was added without data to justify it and without a metric to track its usefulness or effectiveness.
~"group::compliance" has a similar "2-engineer verification" process, though it is more flexible because it allows the use of the ~verified-by-author label.
Similar processes are now being rolled out to new teams without any way to track or measure their impact.
I would like to remove or loosen this process, or at least define a metric to track its usefulness.
I have a draft MR to iterate on this, collect data, etc.
As part of that effort I would like some help gathering analytics to compare:
- ~"group::threat insights", ~"group::security policies", and ~"group::compliance" before/after having this process
- ~"group::threat insights", ~"group::security policies", and ~"group::compliance" compared to other groups without this process
Data requested
The data I would like is historical data covering the periods before and after this process was introduced:
- issues that completed or missed their milestone
- bug counts
- average time to close an issue
- the same numbers for development teams that do not have a formal "a separate engineer must run validation" process (a sketch of how these could be computed follows below)
The process was first added in September 2020 (gitlab-com/www-gitlab-com@ee9e0e4c).
I am also open to any other analytics you think might help determine whether this process is useful.
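Not required, but to illustrate the kind of comparison I have in mind, here is a minimal sketch using the python-gitlab client to pull closed issues for one group label and compute average time-to-close and missed-milestone counts. The group path, label name, token variable, and date window are placeholders I picked for illustration; the data team likely has better sources (e.g. the data warehouse) than the REST API.

```python
# Sketch only: pulls closed issues for a single group label via the GitLab API
# and computes average time-to-close plus missed-milestone counts.
# Assumptions: python-gitlab is installed, GITLAB_TOKEN is set, and the group
# path / label below are placeholders.
import os
from datetime import datetime, date

import gitlab

gl = gitlab.Gitlab("https://gitlab.com", private_token=os.environ["GITLAB_TOKEN"])
group = gl.groups.get("gitlab-org")  # placeholder group path


def parse(ts: str) -> datetime:
    # GitLab timestamps look like "2020-09-01T12:34:56.789Z"
    return datetime.fromisoformat(ts.replace("Z", "+00:00"))


issues = group.issues.list(
    labels=["group::threat insights"],      # placeholder label
    state="closed",
    created_after="2020-09-01T00:00:00Z",   # process introduced September 2020
    all=True,
)

close_times, missed_milestones = [], 0
for issue in issues:
    if not issue.closed_at:
        continue
    opened, closed = parse(issue.created_at), parse(issue.closed_at)
    close_times.append((closed - opened).days)
    due = (issue.milestone or {}).get("due_date")
    if due and closed.date() > date.fromisoformat(due):
        missed_milestones += 1

if close_times:
    print(f"issues closed: {len(close_times)}")
    print(f"average days to close: {sum(close_times) / len(close_times):.1f}")
    print(f"closed after milestone due date: {missed_milestones}")
```

Running the same loop for a window before September 2020, and for a control group label that does not use the process, would give the before/after and cross-group comparisons described above.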
Acceptance Criteria / Functional Requirements
- Technical requirements
  - fill in your requirement here
- Interface requirements
  - fill in your requirement here
- Quality requirements
  - fill in your requirement here
- Performance requirements
  - fill in your requirement here
Implementation Details
High level task breakdown
- fill in task breakdown here