Code Review Analytics: Discovery

Problem to solve

In our research spike in 12.5, Code Review Analytics was voted a top priority by Engineering Directors at both GitLab and customer organisations.

Engineering Directors are looking for better ways to improve code review on their teams, and need additional data, insights, and recommendations to inform their decision-making.

We hypothesise that Directors have 3 key goals to accomplish with code review:

  • Process - Streamlining my team’s code review process
  • Product - Protecting my product quality with code review
  • People - Coaching my people around code review

Under each of these goals we have identified some of the most useful JTBD:

| # | Situation | Motivation | Outcomes | Strategic/Tactical |
|---|-----------|------------|----------|--------------------|
| 1 | When I am streamlining my team’s code review process | I want to get an overview of the current MRs in review | so that I can follow up on outlier MRs that are too slow or fast | Tactical |
| 2 | When I am streamlining my team’s code review process | I want to see a trend of the average time it takes for MRs to be reviewed | so that I can see whether we are improving the speed of our review process | Strategic |
| 3 | When I am protecting my product quality with code review | I want to see any unreviewed MRs that have already been merged | so that I can identify risk and minimise unreviewed MRs | Tactical |
| 4 | When I am protecting my product quality with code review | I want to correlate code review with changes in quality | so that I can understand how my process is impacting quality | Strategic |
| 5 | When I am coaching my people around code review | I want to see how responsive and effective a reviewer is | so that I can understand their impact on the team as a reviewer | Strategic |
| 6 | When I am coaching my people around code review | I want to see how satisfied an author is with an approver | so that I can understand how the reviewer is perceived and how they can improve their practice | Strategic |

Intended users

Proposal

MVC: see Mural. For the MVC, we propose tackling Job Story 1. This should get something relatively useful out to a broad audience quickly, and lay some foundations for deeper analytics along the way. (Screenshot: Screenshot_2019-12-10_at_11.43.09)
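To make Job Story 1 concrete, here is a minimal sketch of the kind of outlier detection the MVC would surface. The sample data is shaped loosely like the GitLab merge requests API response (`GET /projects/:id/merge_requests?state=opened`), but the field names, the `slow_days` threshold, and the helper functions are illustrative assumptions, not the proposed implementation.

```python
from datetime import datetime, timezone

def review_age_days(mr, now):
    """Days since the MR was opened (a rough proxy for time in review)."""
    opened = datetime.fromisoformat(mr["created_at"])
    return (now - opened).total_seconds() / 86400

def flag_slow_outliers(mrs, now, slow_days=7.0):
    """Return open MRs whose review age exceeds slow_days, oldest first.

    slow_days is an assumed threshold; in practice a team would tune it
    (or derive it from the team's historical review-time distribution).
    """
    slow = [mr for mr in mrs if review_age_days(mr, now) > slow_days]
    return sorted(slow, key=lambda mr: mr["created_at"])

# Hypothetical sample data for illustration only.
now = datetime(2019, 12, 10, tzinfo=timezone.utc)
mrs = [
    {"iid": 101, "title": "Fast MR", "created_at": "2019-12-09T00:00:00+00:00"},
    {"iid": 95, "title": "Stuck MR", "created_at": "2019-11-20T00:00:00+00:00"},
]
for mr in flag_slow_outliers(mrs, now):
    print(f"!{mr['iid']} open {review_age_days(mr, now):.0f} days: {mr['title']}")
```

The same shape of computation, with a second threshold, would cover the "too fast" outliers (MRs merged with suspiciously little review time) mentioned in the job story.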

Approach:

  1. Collaborate with Create:Source Code team
  2. Define Jobs to be done we wish to tackle
  3. Design MVC in 12.6
  4. Build MVC in 12.7
  5. Test & dog-food with GitLab directors & managers
  6. Gather customer feedback

Documentation

What does success look like, and how can we measure that?

  • MVC shipped within one milestone
  • MVC receives user & GitLabber feedback

What is the type of buyer?

Ultimately, we will consider analytics at all buyer levels, but for the MVC I propose project-level analytics for a Starter Buyer.

Links / references

Edited by Nick Post