CMS Scorecard - Release: FY23-Q4 - Environment Management Viable
- Research issue: ux-research#1935 (closed)
- Sample Data Issue: https://gitlab.com/gitlab-org/ux-research/-/issues/2213
- Dovetail project: Project
- Previous score and scorecard: None
- Walkthrough: Youtube Link
- Recommendations: #2193
- Research Document: Document
- UX Sandbox Link: Sandbox Group
- Final CMS Score: Scoresheet
Category Maturity Scorecard Checklist
Learn more about Category Maturity Scorecards
- Review the Category Maturity Scorecard handbook page and follow the process as described. Reach out to the UX Researcher for your stage if you have questions.
- Document the research data and insights in a Dovetail project using the Category Maturity Scorecard Dovetail project template.
- Link the Dovetail project in this issue.
- Document the results of each JTBD scenario using the Dovetail template.
- If the participant has not granted permission to share the recording publicly, ensure the sharing settings are set to GitLab-only.
- If needed, create a recommendation issue for these sessions.
- Once the research has concluded, update the Outcome section of the issue description with the maturity level. The outcome can be a downgrade, no change, or an increase in maturity. For example: "The CM Scorecard research has concluded and we have increased the maturity for Dependency Scanning to Complete."
Outcome
The CM Scorecard research has concluded and we have increased the maturity for Environment Management to Viable with a final score of 3.36.
Other notes
Scenario 1:
Imagine you’re an engineering manager for an enterprise company, where your team maintains over 10 projects that are deployed across multiple environments. Your team increasingly relies on integration between these projects. You’re looking to verify that all of your projects with production environments are deploying correctly. How would you go about finding the status of the production environments for all projects within this group?
- Average UMUX Lite score for capabilities - 4.00
- Average UMUX Lite score for ease of use - 2.60
- How many participants were successful at the task - 5/5
- How many participants failed the task - 0/5
- Total number of errors participants encountered while attempting to complete the task/scenario - 1
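Outside the UI, the same check could be scripted. Below is a minimal sketch of the filtering logic, assuming the environment shape returned by GitLab's Environments API (`GET /projects/:id/environments`); the project grouping, field values, and sample data are illustrative, not taken from the study:

```python
def production_status(projects):
    """Map each project name to the status of its latest production deployment.

    `projects` maps project name -> list of environment dicts; each environment
    dict may carry `tier` and `last_deployment` entries as in the GitLab API.
    """
    summary = {}
    for name, environments in projects.items():
        for env in environments:
            # Treat an environment as "production" by tier, falling back to name.
            if env.get("tier") == "production" or env.get("name") == "production":
                deployment = env.get("last_deployment") or {}
                summary[name] = deployment.get("status", "no deployments")
    return summary
```

Iterating over all projects in the group like this mirrors the cross-project status check the scenario asks for.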
Scenario 2:
Imagine you’re an engineering manager at a bank and your team owns multiple critical services for your online banking. Any new releases must be audited. There is a new debit service being deployed to production and the team mentioned it will require 5 approvals, one being yours. You have checked and the deployment seems good to go. How would you go about approving the deployment?
- Average UMUX Lite score for capabilities - 4.00
- Average UMUX Lite score for ease of use - 2.80
- How many participants were successful at the task - 5/5
- How many participants failed the task - 0/5
- Total number of errors participants encountered while attempting to complete the task/scenario - 3
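The approval itself can also go through the REST API rather than the UI. A hypothetical request builder, assuming the documented `POST /projects/:id/deployments/:deployment_id/approval` endpoint shape; the base URL, IDs, and comment here are placeholders:

```python
def build_approval_request(base_url, project_id, deployment_id, comment=""):
    """Return the (url, payload) pair for approving a blocked deployment.

    The caller would POST `payload` to `url` with a PRIVATE-TOKEN header.
    """
    url = (f"{base_url}/api/v4/projects/{project_id}"
           f"/deployments/{deployment_id}/approval")
    payload = {"status": "approved"}
    if comment:
        payload["comment"] = comment
    return url, payload
```

Once the fifth required approval is recorded, the blocked deployment job is allowed to proceed.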
Scenario 3:
Imagine you are a software engineer for an up-and-coming e-commerce store. Your team deploys new features throughout the day and sometimes at night when you’re working late. Someone on your team pushed a breaking change to one of the environments in your project. You would like to roll back that breaking change to a previous successful deployment. How would you go about rolling back the deployment?
- Average UMUX Lite score for capabilities - 4.00
- Average UMUX Lite score for ease of use - 3.40
- How many participants were successful at the task - 5/5
- How many participants failed the task - 0/5
- Total number of errors participants encountered while attempting to complete the task/scenario - 0
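Picking which deployment to roll back to is the core of this task. A sketch of that selection, assuming deployment records with `id`, `status`, and ISO-8601 `created_at` fields as in GitLab's Deployments API; the sample records are illustrative:

```python
def rollback_target(deployments):
    """Return the most recent successful deployment to roll back to, or None."""
    successful = [d for d in deployments if d.get("status") == "success"]
    if not successful:
        return None
    # ISO-8601 timestamps sort lexicographically, so max() picks the newest.
    return max(successful, key=lambda d: d["created_at"])
```

In GitLab, rolling back re-runs the job of the chosen earlier deployment against the same environment.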
JTBD
Source - https://about.gitlab.com/direction/release/environment_management/#jobs-to-be-done
- When releasing software, I want to manage application environments across multiple teams and projects, so I can lower risks and guarantee higher organizational performance.