UX Scorecard: Package FY22-Q4 - Keep registries organized and manage storage costs
- Personas: Devon (DevOps engineer), Sam (security analyst)
- Previous score and scorecard: n/a
- Benchmark score: D
- Walkthrough video: https://www.youtube.com/watch?v=ryEpc3wSzhE
- Walkthrough deck: https://app.mural.co/t/gitlab2474/m/gitlab2474/1638412595938/8877ab57518a6b11a5f7a6ec7bf138dd131038a2?sender=aginsberg6970
- Recommendations: {{add link to your recommendation issue/s}}
JTBD
When my organization stores dependencies in a registry, I need tools to keep the registries organized and manage the storage costs, so I can balance a stable product development environment for my team with storage-related costs.
Context
Package and Container registries allow teams to store dependencies in GitLab. A GitLab registry works similarly to public registries such as npm (for software packages) or DockerHub (for images). Teams may choose to use GitLab's registries because:
- The code and dependencies are stored in one place.
- Storing dependencies on GitLab reduces reliance on third-party systems that the team doesn't control.
- GitLab registries can be private, which can be an important security/IP consideration (e.g., a company developing new technology, or highly regulated industries like banking).
- GitLab registries integrate easily with other parts of GitLab.
More information
- A very simple explanation of what an image and container registry are, from Red Hat
- A simple explanation of what a package and package registry are, from packagecloud
Scenario
You're responsible for optimizing processes at your organization. Your organization stores dependencies in both the GitLab Container Registry and the Package Registry. You want to investigate the registries to understand how much storage is being used, and to remove items that are unused or no longer needed in order to reduce storage costs. Removing old dependencies is also a security consideration, as outdated dependencies showing in the registry can cause confusion and bugs.
At your organization:
1. All packages published from `main` should always be kept.
2. The latest version of a package should always be kept.
3. All packages (except those protected by rules 1 and 2) older than 2 weeks should be removed.
4. All image tags named "latest" should always be kept.
5. All image tags (except those protected by rule 4) older than 2 weeks should be removed.
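As a sketch, the retention rules above can be expressed as simple decision logic. The following is a minimal, hypothetical Python model of the rules for illustration only; it is not GitLab's actual cleanup implementation (GitLab's built-in container registry cleanup policies can express similar rules through settings such as age thresholds and name patterns):

```python
from datetime import datetime, timedelta, timezone

TWO_WEEKS = timedelta(weeks=2)

def keep_image_tag(name: str, created_at: datetime, now: datetime) -> bool:
    """Rules 4 and 5: keep 'latest', keep anything under 2 weeks old."""
    if name == "latest":                      # rule 4
        return True
    return now - created_at <= TWO_WEEKS      # rule 5

def keep_package(branch: str, is_latest_version: bool,
                 created_at: datetime, now: datetime) -> bool:
    """Rules 1-3: keep main-branch packages and latest versions,
    remove everything else older than 2 weeks."""
    if branch == "main":                      # rule 1
        return True
    if is_latest_version:                     # rule 2
        return True
    return now - created_at <= TWO_WEEKS      # rule 3

# Example: an image tag and a package published six weeks ago.
now = datetime(2022, 1, 15, tzinfo=timezone.utc)
old = datetime(2021, 12, 1, tzinfo=timezone.utc)
print(keep_image_tag("latest", old, now))     # True: 'latest' is always kept
print(keep_image_tag("v1.0", old, now))       # False: older than 2 weeks
print(keep_package("main", False, old, now))  # True: published from main
```

Note how rules 1, 2, and 4 act as unconditional "keep" guards that are checked before the age-based removal rules.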
Tasks to complete scenario
- Try to determine the storage used by individual items in the Package and Container Registries, then the total storage used by each registry.
- Remove unused items from the registries based on your organization's rules.
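For the first task, per-file sizes can be aggregated into a per-registry total. The sketch below assumes data already fetched from the GitLab REST API (the Packages API reports a `size` in bytes for each package file; treat the exact endpoints and response fields as assumptions to verify against the current API documentation):

```python
# Sketch: summing package storage for a project, assuming file listings
# were fetched from endpoints like /projects/:id/packages and
# /projects/:id/packages/:pkg_id/package_files (hypothetical shape below).

def total_package_bytes(package_files_by_package: dict) -> int:
    """Sum the 'size' of every file across all packages, in bytes."""
    return sum(
        f.get("size", 0)
        for files in package_files_by_package.values()
        for f in files
    )

# Example input mimicking the shape of API responses (sizes are made up):
files = {
    "my-lib 1.0.0": [{"file_name": "my-lib-1.0.0.tgz", "size": 2048}],
    "my-lib 1.1.0": [{"file_name": "my-lib-1.1.0.tgz", "size": 4096}],
}
print(total_package_bytes(files))  # 6144
```

Running the same aggregation per registry gives the totals the scenario asks for; the point of the exercise is to note how much of this a user must assemble by hand.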
Project to complete scorecard
https://gitlab.com/gitlab-org/ci-cd/package-stage/feature-testing/ux-scorecard-metadata (anything in this project can be deleted)
Heuristic Buddy UX Scorecard Checklist
The Heuristic Buddy UX Scorecards are a twist on our UX Scorecard process. They are specifically designed to help identify areas for usability and learnability improvements, and are completed by a designer who does not work in the product area(s) where the job is completed. Learn more about UX Scorecards
The initial preparation is completed by the Group Product Designer. Once preparation is complete, they hand the issue over to the Heuristic Buddy, who conducts the evaluation and then hands it back to the Group Product Designer to add any recommendations. Read through the steps below for details.
Group Product Designer (Expert)
- Add this issue to the stage group epic for the corresponding UX Scorecards. Verify that the `UX scorecard` and `OKR` labels are applied, then apply your `section` and `group` labels as well.
- After working with your PM to identify a top job, write it using the Job to be Done (JTBD) format: "When [situation], I want to [motivation], so I can [expected outcome]". Review with your manager to ensure your JTBD is written at the appropriate level. Remember, a JTBD is not a user story: it should not directly reference a solution, and it should be tool agnostic.
- Create script scenario(s) based on your JTBD. The number of scenarios per job statement often depends on the complexity of the features tested.
  - Tip 1: You might find job statements too broad to serve as guidance for writing script scenarios. If that is the case, consider breaking the job statements down into user stories as an intermediary step, then go back and draft your script scenarios.
  - Tip 2: Keep in mind that your buddy may be missing the subject matter knowledge needed to understand the script scenario. If needed, offer a brief, high-level overview of the job to give them context. Avoid going into detail about how to perform tasks within GitLab.
- Make note of which personas might be performing the job, and link to them from this issue's description. Keeping personas in mind allows us to make the best decisions to address specific problems and pain points. Note: Do not include a persona in your JTBD, as multiple types of users may complete the same job.
- Describe the characteristics this persona would have as a new user for this job, and the GitLab environment they will be joining. Consider that it is most likely not an empty group/project, but could instead be an active team with multiple groups and repositories.
- If your JTBD spans more than one stage group, that's great! Review your JTBD with a designer from that stage group for accuracy. Note: That stage group's designer cannot be your Heuristic Buddy.
- Ping your Heuristic Buddy and let them know it's ready for them to conduct the evaluation.
- Work with your Heuristic Buddy to ensure they'll evaluate GitLab in an environment setup appropriate to a new user attempting to complete the JTBD you've selected. The environment should replicate the most realistic scenario for your persona in a "new user" state; this may not be a brand-new, empty project.
Heuristic Buddy (Evaluator)
- Review the current experience, noting where you expect a user's high and low points to be, based on our UX Heuristics. Using an experience map, such as the one found in this template, capture the screens and jot down observations.
  - During the evaluation, strive to wear the hat of the persona relevant to the JTBD, and try to see the UI from their perspective as if they were a new user.
  - As you progress through your evaluation, this will be easy to forget, so it's recommended to put a reminder somewhere in your view, such as a Post-it stuck on your monitor that says "You're a new user!"
- Use the Grading Rubric to provide an overall measurement that becomes the Benchmark score for the experience (one grade per JTBD), and add it to this issue's description. Document the score in the UX Scorecard Spreadsheet.
- Once you've completed your evaluation, create a walkthrough video that documents what you experienced when completing the job in GitLab. Begin the video with a contextual introduction including:
  - Your role and stage group.
  - How you conducted the heuristic evaluation.
  - A short introduction describing the JTBD and the purpose of the UX scorecard (i.e., that you're performing the evaluation in partnership with {stage group} and {product designer}).
  - This is not a "how-to" video; instead, it should help build empathy for users by clearly showing areas of potential frustration and confusion. (You can point out where the experience is positive, too.)
  - At the end of the video, make sure to include narration of the Benchmark Score. Examples here and here.
  - The walkthrough video shouldn't take long to create. Don't worry about it being polished or perfect; it's more important that it be informative.
- Post your video to the GitLab Unfiltered YouTube channel, and link to it from this issue's description.
- Link to your video in the Engineering Week in Review.
- Once the evaluation is complete, ping the Stage Group Product Designer in this issue to let them know it's ready for their review and recommendation creation.
Group Product Designer (Expert) - Recommendation Creation
- Collaborate with your Heuristic Buddy to create recommendation issues as needed.
- Add the `UX scorecard-rec` and `OKR` labels to every issue for traceability, then apply your `section` and `group` labels as well.
- Add Severity labels to every issue for prioritization.
- Link your recommendation issues to your main UX Scorecard issue.
  - Tip 1: Brainstorm opportunities to fix or improve areas of the experience.
    - Use the findings from the Emotional Grading scale to determine areas of immediate focus. For example, if parts of the experience received a "Negative" Emotional Grade, consider addressing those first.
  - Tip 2: Think iteratively, and create dependencies where appropriate, remembering that sometimes the order of what we release is just as important as what we release.
    - If you need to break recommendations into phases or over multiple milestones, create multiple epics and use the Category Maturity definitions in the title of each epic: Minimal, Viable, Complete, or Lovable.