
FY22 product direction and goals for Optimize

Overview

The Optimize team has recently built out a series of MVPs that provide Director-and-above leadership with insights into the adoption of GitLab and DevOps practices across their organization (Usage Trends and DevOps Reports), as well as analytics to understand the efficiency of their SDLC (Value Stream Analytics, or VSA).

The next challenge is to mature each of these features and connect Usage Trends, DevOps Reports, and VSA together so that leadership can:

  • Get the insights they need to understand how quickly and reliably value is being delivered to customers
  • Identify which teams are successful and why
  • Identify which teams have room for improvement and share strategies that have helped more successful teams
  • Quickly identify bottlenecks and take action to improve cycle time
  • Measure the impact of changes in DevOps and team practices to understand the ROI
  • Track software delivery performance against business goals (stretch goal)

The goal of this issue is to facilitate discussion, formulate a vision and plan for FY22, and create next steps for customer validation and implementation. The vision is subject to change because we want to remain flexible and adapt to customer feedback. The plan should be a set of MVCs that lead to product maturity over time.

Problems to solve

Target personas

Engineering and Product leadership (Engineering/Product Manager, Director, VP, CTO/CPO/CEO)

Key questions to answer

  1. Which are my top-performing teams in the organization?
  2. How can I easily see what bottlenecks my teams are facing by comparing their performance against other teams in my organization, with visual indicators of areas that need attention?
  3. For teams that are performing exceptionally well in a specific stage of the SDLC, is there a particular DevOps practice they all adopted that contributed to their success, so I can introduce that practice to teams with lower performance in that stage?
  4. After introducing the new practice, how can I visually see how their performance in that stage changes over time so I can measure the impact?
  5. Overall, how does DevOps adoption correlate with development performance?
  6. How can I see teams with high adoption and compare their performance against teams with low adoption so that I can evaluate the ROI in GitLab?
  7. How can I set and track goals to improve performance?
  8. How can I see which GitLab stage/feature is most likely to get me closer to my performance goals?
  9. Which value stream metrics shifted the most as a result of adopting feature x?

Jobs To Be Done

See this list of JTBD

Goals

  • Help GitLab users optimize their software development processes to deliver better software faster
  • Encourage stage adoption
  • Demonstrate value of tier upgrades
  • Expand the number of GitLab users within an organization by demonstrating the ROI. Contribute to the goal of 100M TMAU by 2023.
  • Provide more links from analytics features to other areas of the product where relevant data already exists, so users can drill down into more detail and take action
  • Work with other product teams to create better flow and collaborate on roadmap ideas

Plan

The list below is ambitious given the size of the Optimize team. We will refine and prioritize it as we go through the customer validation phase. Some of these features have already been validated by customers, and we are confident that we should move forward with implementing a solution. Other features still require validation.

  1. Make instance-level analytics available at the project and group level. Make VSA available at the instance level. Time this with the efforts to create an instance-level workspace and merge projects and groups.
  2. At the project, group, and instance level, have an Overview stage in VSA that shows all four DORA metrics as shown here
  3. In the VSA Overview stage, have a green up or red down arrow to indicate how the metric has changed (GitHub has this in their UI)
  4. Use DORA metrics as the key measure of performance. In the DevOps Adoption table, add a column with a single visual indicator of performance, where performance is an aggregation of all four DORA metrics and the color coding used to indicate health is based on a comparison with other teams or projects. Alternatively, display a badge for Elite, High, Medium, or Low as defined in gitlab-org&4358 (closed); see the first sketch after this list.
  5. Next to the visual indicator, show if performance has increased or decreased.
  6. A user can click on the visual indicator to go to the Overview stage in VSA and get a breakdown of the DORA metrics.
  7. Users can click on the DORA metric in the VSA overview stage and it links to a graph showing trend over time for that metric
  8. A user can sort related metrics to identify bottlenecks
  9. At the project, group, and instance level, add DevOps Stages and key stage features to the adoption table similar to this proposal. CS measures may also be a useful guide.
  10. Show a DevOps score column in the adoption table that reflects the percentage of DevOps practices that have been adopted (see the second sketch after this list)
  11. Support sorting the DevOps adoption table by the DevOps score or performance score to find the highest/lowest adopters and performers
  12. Support clicking into the DevOps adoption table so users are linked through to a more detailed view. For example, clicking on Issues in a row links to the issues list for that group or project. Alternatively, link to metrics that relate to that feature (e.g. MR analytics)
  13. Support being able to filter by a specific DORA metric so that a team with low performance for that metric can see what DevOps practices are being used by higher performing teams
  14. For the default VSA stages in the horizontal nav, show the DevOps stages.
  15. When clicking on a DevOps stage, show a set of metrics that reflect important performance data for that stage (where we currently have the Time and Recent Activity boxes)
  16. Show a green up or red down arrow next to each of the default performance metrics so a user can easily see if they are trending in the desired direction
  17. Support clicking on one of the metrics above and linking through to a page within the product that is relevant to that metric.
  18. Add additional start and stop events to support more use cases for VSA stages
  19. Make tooltips in VSA more contextual
  20. Extend Usage Trends to link with VSA and DevOps Reports and show a set of graphs that measure changes in performance against DevOps adoption (see the third sketch after this list). Perhaps include a histogram icon in the adoption table that links to a dashboard showing when a feature was adopted and how key metrics have changed since then
  21. Support adding performance goals and tracking against them. This could be limited to DORA metrics initially, then expanded to each of the default stages.
  22. Make the stage in the VSA horizontal flow a different color to indicate health (if a user has set goals)
  23. Encourage adoption of other GitLab features by providing a fact in the adoption table, such as "the top 10% of the highest-performing teams in your organization use Features x, y, and z", with links to each of the top features.
  24. Nice to Have - Provide tips on reducing cycle time, such as "Team x adopted Feature x in Month x and experienced a % increase in Performance Metric x"
  25. Nice to Have - Provide a comparison of VSA/DORA metrics with other instances (similar to DevOps Score) to benchmark performance
  26. Nice to Have - Add some broader benchmarking statistics and comparisons (from industry reports, etc)
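
To make item 4 concrete, here is a minimal sketch of how a single performance badge could be derived from the four DORA metrics, plus the trend indicator from item 5. The metric names, threshold bands, and aggregation rule below are illustrative assumptions, not the definitions in gitlab-org&4358; the real thresholds would come from that epic.

```typescript
// Sketch only: thresholds and aggregation are assumptions, not the
// definitions in gitlab-org&4358.

type DoraLevel = 'Elite' | 'High' | 'Medium' | 'Low';

interface DoraMetrics {
  deploysPerDay: number;      // deployment frequency
  leadTimeHours: number;      // lead time for changes
  timeToRestoreHours: number; // time to restore service
  changeFailureRate: number;  // fraction of deployments causing a failure (0..1)
}

// Score each metric from 0 (Low) to 3 (Elite) against assumed threshold bands.
function scoreMetrics(m: DoraMetrics): number[] {
  const bandHigh = (v: number, bands: [number, number, number]): number =>
    v >= bands[0] ? 3 : v >= bands[1] ? 2 : v >= bands[2] ? 1 : 0;
  const bandLow = (v: number, bands: [number, number, number]): number =>
    v <= bands[0] ? 3 : v <= bands[1] ? 2 : v <= bands[2] ? 1 : 0;
  return [
    bandHigh(m.deploysPerDay, [1, 1 / 7, 1 / 30]),   // daily / weekly / monthly
    bandLow(m.leadTimeHours, [24, 24 * 7, 24 * 30]), // day / week / month
    bandLow(m.timeToRestoreHours, [1, 24, 24 * 7]),  // hour / day / week
    bandLow(m.changeFailureRate, [0.15, 0.3, 0.45]),
  ];
}

// Aggregate the four per-metric scores into one badge (assumed: the mean).
function performanceBadge(m: DoraMetrics): DoraLevel {
  const mean = scoreMetrics(m).reduce((a, b) => a + b, 0) / 4;
  return mean >= 2.5 ? 'Elite' : mean >= 1.5 ? 'High' : mean >= 0.5 ? 'Medium' : 'Low';
}

// Item 5: green up / red down indicator from two snapshots of a metric.
function trendArrow(current: number, previous: number, higherIsBetter = true): 'up' | 'down' | 'flat' {
  if (current === previous) return 'flat';
  const improved = higherIsBetter ? current > previous : current < previous;
  return improved ? 'up' : 'down';
}
```

In the adoption table, performanceBadge would drive the badge color and trendArrow the arrow direction; note that lead time, time to restore, and change failure rate improve as they decrease, hence the higherIsBetter flag.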
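
For item 10, the DevOps score could be computed as the percentage of tracked practices a group or project has adopted, as in the sketch below. The practice list and scoring rule here are assumptions for illustration; the sort helper covers item 11.

```typescript
// Sketch only: the tracked practice list and scoring rule are assumptions.
const TRACKED_PRACTICES = ['issues', 'mergeRequests', 'pipelines', 'deploys', 'runners', 'scanning'] as const;

type Practice = (typeof TRACKED_PRACTICES)[number];
type AdoptionRow = { group: string; adopted: Record<Practice, boolean> };

// DevOps score: percentage of tracked practices the group has adopted.
function devopsScore(row: AdoptionRow): number {
  const count = TRACKED_PRACTICES.filter((p) => row.adopted[p]).length;
  return Math.round((count / TRACKED_PRACTICES.length) * 100);
}

// Item 11: sort the adoption table to surface the highest adopters first.
const sortByScore = (rows: AdoptionRow[]): AdoptionRow[] =>
  [...rows].sort((a, b) => devopsScore(b) - devopsScore(a));
```

For example, a group that has adopted four of the six tracked practices would score 67%.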
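
For item 20, one simple way to measure how a metric shifted after a feature was adopted is to compare its average value before and after the adoption date. The time-series shape and before/after comparison below are assumptions for illustration, not a committed design.

```typescript
// Sketch only: the time-series shape and before/after rule are assumptions.
interface MetricPoint {
  date: Date;
  value: number;
}

// How a metric shifted after a feature's adoption date: compare the mean
// value before adoption with the mean value after it.
function changeSinceAdoption(series: MetricPoint[], adoptedAt: Date): number {
  const mean = (xs: number[]): number =>
    xs.length ? xs.reduce((a, b) => a + b, 0) / xs.length : 0;
  const before = series.filter((p) => p.date < adoptedAt).map((p) => p.value);
  const after = series.filter((p) => p.date >= adoptedAt).map((p) => p.value);
  return mean(after) - mean(before);
}
```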

Phases

  • Add a set of MVC steps

Challenges we'll need to address

  • Limited accessibility of instance-level analytics under current architecture
  • Timing of solutions relative to the instance-level group and the proposal to merge projects and groups
  • Database performance concerns

Tier strategy

TBD in gitlab-org/gitlab#322660 (closed)

Measuring our success

Evangelism plan

Next steps

  • Internal review (see gitlab-org/manage/general-discussion#17315)
  • Customer validation
  • UX mockups
  • Internal review by ...
  • Create some epics and link to the appropriate maturity epic
  • Schedule out first 3 milestones with key feature work
  • Get continuous customer feedback

Why focus on this over other analytics features?

What we won't do in FY22

Vision beyond FY22 (subject to change)

  • Add more automated suggestions based on trends
  • Predictive analytics (e.g. based on past performance, you can deliver your feature % faster by doing x)
  • More support for adding goals/OKRs and tracking progress
  • Integrations