Measure mean ramp time for new developers
Problem to solve
Reducing developer ramp time
Intended users
Engineering managers
Further details
Hypothesis:
- The more complex the toolchain, the longer it takes new developers to reach full productivity.
- The more complex the toolchain, the harder it is for team members to move between teams.
- A simpler toolchain reduces developer ramp time.
GitLab currently has no way to mark a user as a new joiner and measure their productivity over time against the group mean. Having this would help new users and cohorts in their adoption of GitLab, and would help their managers identify areas for improvement and best practice.
Proposal
- As an engineering manager, I want to be able to measure the ramp time of my new team members.
- To do this, I need a baseline for current team productivity.
- In GitLab, this could be the number of commits or MRs a team or project member makes over time (a rough sketch of pulling this data follows the list).
- I want to compare this group mean against my new joiners, to see how long it takes them to become fully productive.
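To make the baseline concrete, here is a minimal sketch of how this data could be assembled today with the GitLab REST API; the instance URL, token, and project ID are placeholders, and counting merged MRs per ISO week is just one possible productivity proxy:

```python
import requests
from collections import Counter
from datetime import datetime

GITLAB = "https://gitlab.example.com/api/v4"  # placeholder instance URL
TOKEN = "glpat-..."                           # placeholder access token
PROJECT_ID = 42                               # placeholder project ID

def weekly_mr_counts(author_id):
    """Count merged MRs per ISO week for one author (sketch, not a product design)."""
    counts, page = Counter(), 1
    while page:
        resp = requests.get(
            f"{GITLAB}/projects/{PROJECT_ID}/merge_requests",
            headers={"PRIVATE-TOKEN": TOKEN},
            params={"author_id": author_id, "state": "merged",
                    "per_page": 100, "page": page},
        )
        resp.raise_for_status()
        for mr in resp.json():
            merged_at = datetime.fromisoformat(mr["merged_at"].rstrip("Z"))
            counts[merged_at.isocalendar()[:2]] += 1  # key: (year, week)
        page = resp.headers.get("X-Next-Page")  # empty on the last page
    return counts
```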
This makes sense to display as a moving-average graph plotting several series (see the sketch below the list):
- The average for fully ramped team members;
- The average for the current cohort of new joiners;
- Individual results.
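As a minimal sketch of that comparison (placeholder names and toy numbers; cohort membership would in practice be set by the manager), a 4-week moving average per person can be averaged within each cohort and plotted as the three series above:

```python
import pandas as pd

# Toy weekly MR counts per member (placeholder names and numbers);
# in practice these would come from the API sketch above.
weekly_counts = pd.DataFrame(
    {"alice": [5, 4, 6, 5, 5, 6],   # fully ramped
     "bob":   [4, 5, 5, 4, 6, 5],   # fully ramped
     "carol": [0, 1, 1, 2, 3, 4]},  # new joiner
    index=pd.period_range("2024-01", periods=6, freq="W"),
)
ramped, joiners = ["alice", "bob"], ["carol"]

# 4-week moving average per person, then averaged within each cohort.
rolling = weekly_counts.rolling(window=4, min_periods=1).mean()
plot_df = pd.DataFrame({
    "fully ramped (avg)": rolling[ramped].mean(axis=1),
    "new joiners (avg)":  rolling[joiners].mean(axis=1),
    "carol (individual)": rolling["carol"],
})
plot_df.plot(title="Merged MRs per week, 4-week moving average")
```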
This will allow the engineering manager to review their team as a whole: see whether the productivity of the fully ramped group is changing over time, see the impact of efforts to reduce onboarding time, and, for an individual user, see how their onboarding is going.
There would be scope to iterate on this over time by showing the range across different users and quickly identifying outliers: who are our star performers? Who might need extra coaching or support?
Additionally, this will give customers data on how productivity with GitLab differs within their organisation. It will also let them compare one cohort of users to another, where, for example, one cohort uses GitLab just for SCM and another uses SCM + CI. This probably warrants a separate issue to discuss the most boring solution.
Permissions and Security
Likely limited to the Maintainer role and above.
Documentation
Testing
Unsure
What does success look like, and how can we measure that?
Success is that the feature is (a) used to measure onboarding successfully; (b) used regularly (both measured via Snowplow); and (c) receives positive feedback from users (measured via survey).
What is the type of buyer?
Unsure which tier this should go into at this stage. The persona is the engineering manager.

