VSD UX: Design exploration for DORA metrics
About
This issue is a follow-up to gitlab#426140 (closed).
The purpose of this issue is to expand upon and validate the design work done roughly a year ago on visualizing DORA metrics at the group level within the Value Stream Dashboard.
DORA-specific questions that the designs should answer:
- Why are DORA metrics important?
- Which of my teams' scores are lower than expected? (So I can work with the teams that need attention.)
- Which of my teams' scores are higher? (So I can dig in and capture best practices.)
- How do scores trend over time? Am I improving or getting worse when dissecting results month over month and quarter over quarter?
- What trends are considered "good" or "bad"?
- Which teams are improving/declining? Drilling in team by team
- How can I improve my scores?
- How do my organization's current scores compare to industry standards?
Personas
- Dakota (Application Development Director)
- Erin (Application Development Executive)
- Delaney (Development Team Lead)
Designs
- See Design management for the latest mockups
- ❖ Figma project →
Research issue (WIP)
- 🔗 Research issue: https://gitlab.com/gitlab-org/ux-research/-/issues/2877
- 🔗 Items to validate
Resources
- https://cloud.google.com/blog/products/devops-sre/using-the-four-keys-to-measure-your-devops-performance
- https://codefresh.io/learn/software-deployment/dora-metrics-4-key-metrics-for-improving-devops-performance/
- https://docs.gitlab.com/ee/user/analytics/dora_metrics.html
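
For context on where the underlying data for these visualizations could come from, below is a minimal sketch (not part of the design work itself) that pulls one group-level DORA metric via the REST endpoint described in the DORA metrics docs linked above. The instance URL, group path, and token are placeholders; the endpoint and parameter names should be verified against the DORA metrics API documentation for the target GitLab version.

```python
import requests

GITLAB_URL = "https://gitlab.example.com"  # placeholder: your GitLab instance
GROUP_PATH = "my-group"                    # placeholder: group ID or URL-encoded full path
TOKEN = "glpat-..."                        # placeholder: token with read_api scope


def fetch_dora_metric(metric, start_date, end_date, interval="monthly"):
    """Fetch one group-level DORA metric as a list of {date, value} points."""
    response = requests.get(
        f"{GITLAB_URL}/api/v4/groups/{GROUP_PATH}/dora/metrics",
        headers={"PRIVATE-TOKEN": TOKEN},
        params={
            "metric": metric,        # e.g. deployment_frequency, lead_time_for_changes
            "start_date": start_date,
            "end_date": end_date,
            "interval": interval,    # daily | monthly | all
        },
        timeout=30,
    )
    response.raise_for_status()
    return response.json()


if __name__ == "__main__":
    # Example: monthly deployment frequency for the first half of a year
    points = fetch_dora_metric("deployment_frequency", "2023-01-01", "2023-06-30")
    for point in points:
        print(point["date"], point["value"])
```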
Questions a dashboard should answer:
- Are we ahead or behind our goals?
- By how much are we ahead or behind?
- Did that just happen, or has it been going on for a while?
- Which areas of the organization are doing well?
- Which are doing poorly?
- How are we doing this period vs. last period?
- How about this same time last year?
- How is our organization doing compared with other organizations?