GitLab.org / GitLab Design · Issue #2121 · Closed

Issue created Sep 22, 2021 by Kevin Chu (@kbychu)

CMS Scorecard - Release: FY23-Q4 - Environment Management Viable

  • Research issue: ux-research#1935 (closed)
  • Sample Data Issue: https://gitlab.com/gitlab-org/ux-research/-/issues/2213
  • Dovetail project: Project
  • Previous score and scorecard: None
  • Walkthrough: YouTube Link
  • Recommendations: #2193
  • Research Document: Document
  • UX Sandbox Link: Sandbox Group
  • Final CMS Score: Scoresheet

Category Maturity Scorecard Checklist

Learn more about Category Maturity Scorecards

  1. Review the Category Maturity Scorecard handbook page and follow the process as described. Reach out to the UX Researcher for your stage if you have questions.
  2. Document the research data and insights in a Dovetail project using the Category Maturity Scorecard Dovetail project template.
  3. Link the Dovetail project in this issue.
  4. Document the results of each JTBD scenario using the Dovetail template.
  5. If the participant has not granted permission to share the recording publicly, ensure the sharing settings are set to GitLab-only.
  6. If needed, create a recommendation issue for these sessions.
  7. Once the research has concluded, update the Outcome section of the issue description with the maturity level. The outcome can be a downgrade, no change, or an increase in maturity. For example, The CM Scorecard research has concluded and we have increased the maturity for Dependency Scanning to Complete.

Outcome

The CM Scorecard research has concluded and we have increased the maturity for Environment Management to Viable with a final score of 3.36.

Other notes

Scenario 1:

Imagine you’re an engineering manager for an enterprise company, where your team maintains over 10 projects that are deployed across multiple environments. Your team increasingly relies on integration between these projects. You’re looking to verify whether all of your projects with production environments are deploying correctly. How would you go about finding the status of the production environments for all projects within this group?

  • Average UMUX Lite score for capabilities - 4.00
  • Average UMUX Lite score for ease of use - 2.60
  • How many participants were successful at the task - 5/5
  • How many participants failed the task - 0/5
  • Total errors participants encountered while attempting to complete the task/scenario - 1
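Outside the moderated sessions, the group-wide status check this scenario describes can also be approximated with the GitLab REST API. A minimal sketch, assuming a personal access token; the group id below is a placeholder, and the `/groups/:id/projects` and `/projects/:id/environments` endpoints are the real v4 API routes:

```python
"""Sketch: list production environment states for every project in a group."""
import json
import urllib.request

GITLAB = "https://gitlab.com/api/v4"


def api_get(path: str, token: str):
    # Minimal authenticated GET against the GitLab v4 REST API.
    req = urllib.request.Request(f"{GITLAB}{path}",
                                 headers={"PRIVATE-TOKEN": token})
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)


def production_summary(environments: list) -> dict:
    # Keep only environments named like "production", mapped to their state
    # ("available" or "stopped") as returned by the Environments API.
    return {env["name"]: env["state"]
            for env in environments
            if "production" in env["name"]}


def group_production_status(group_id: int, token: str) -> dict:
    # One entry per project in the group: its production environments.
    projects = api_get(f"/groups/{group_id}/projects?per_page=100", token)
    return {p["path_with_namespace"]:
            production_summary(api_get(f"/projects/{p['id']}/environments",
                                       token))
            for p in projects}
```

Participants in the study worked through the Environments UI instead; this is only one plausible scripted equivalent.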

Scenario 2:

Imagine you’re an engineering manager at a bank and your team owns multiple critical services for your online banking. Any new release must be audited. A new debit service is being deployed to production, and the team mentioned it will require 5 approvals, one of which is yours. You have reviewed the deployment and it looks good to go. How would you go about approving the deployment?

  • Average UMUX Lite score for capabilities - 4.00
  • Average UMUX Lite score for ease of use - 2.80
  • How many participants were successful at the task - 5/5
  • How many participants failed the task - 0/5
  • Total errors participants encountered while attempting to complete the task/scenario - 3
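For reference, the approval step in this scenario also has an API surface: a sketch assuming GitLab's deployment approvals endpoint (`POST /projects/:id/deployments/:deployment_id/approval`, a Premium/Ultimate feature); all ids and the token are placeholders:

```python
"""Sketch: approve a pending production deployment via the REST API."""
import json
import urllib.request

GITLAB = "https://gitlab.com/api/v4"


def approve_deployment(project_id: int, deployment_id: int, token: str,
                       comment: str = "") -> None:
    # POST an "approved" status for the blocked deployment.
    body = json.dumps({"status": "approved", "comment": comment}).encode()
    req = urllib.request.Request(
        f"{GITLAB}/projects/{project_id}/deployments/{deployment_id}/approval",
        data=body,
        headers={"PRIVATE-TOKEN": token,
                 "Content-Type": "application/json"},
        method="POST",
    )
    urllib.request.urlopen(req)


def approvals_remaining(required: int, approvals: list) -> int:
    # How many of the required approvals (5 in the scenario) are still
    # missing, given the approvals list returned for the deployment.
    granted = sum(1 for a in approvals if a.get("status") == "approved")
    return max(required - granted, 0)
```

In the sessions themselves, participants approved through the environment's deployment view in the UI.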

Scenario 3:

Imagine you are a software engineer for an up-and-coming e-commerce store. Your team deploys new features throughout the day and sometimes at night when you’re working late. Someone on your team pushed a breaking change to one of the environments in your project. You would like to roll back that breaking change to a previous successful deployment. How would you go about rolling back the deployment?

  • Average UMUX Lite score for capabilities - 4.00
  • Average UMUX Lite score for ease of use - 3.40
  • How many participants were successful at the task - 5/5
  • How many participants failed the task - 0/5
  • Total errors participants encountered while attempting to complete the task/scenario - 0
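The rollback participants performed maps onto re-running the deploy job of an earlier successful deployment, which is what the UI's rollback action does. A sketch under that assumption, using the real `POST /projects/:id/jobs/:job_id/retry` endpoint; ids and the token are placeholders:

```python
"""Sketch: roll an environment back to its previous successful deployment."""
import urllib.request

GITLAB = "https://gitlab.com/api/v4"


def rollback_target(deployments: list):
    # `deployments` is assumed newest-first (as returned with
    # order_by=id&sort=desc); skip the current, broken deployment and
    # return the most recent earlier one that succeeded, or None.
    for dep in deployments[1:]:
        if dep.get("status") == "success":
            return dep
    return None


def retry_deploy_job(project_id: int, job_id: int, token: str) -> None:
    # Retrying the deploy job re-deploys that revision, i.e. the rollback.
    req = urllib.request.Request(
        f"{GITLAB}/projects/{project_id}/jobs/{job_id}/retry",
        data=b"",
        headers={"PRIVATE-TOKEN": token},
        method="POST",
    )
    urllib.request.urlopen(req)
```

Participants instead used the Rollback button on the environment page, which is the single-click version of the same operation.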

JTBD

Source - https://about.gitlab.com/direction/release/environment_management/#jobs-to-be-done

  • When releasing software, I want to manage application environments across multiple teams and projects, so I can lower risks and guarantee higher organizational performance.

Personas

  • Rachel (Release Manager)
  • Delaney (Development Team Lead)
  • Sasha (Software Developer)

Documentation

  • CMS Overview Doc
Edited Jan 19, 2023 by Emily Bauman