UX Scorecard (Part 1) - Secure FY21-Q3 - Day-1 Experience for people using Security features
Summary of the results of this UX scorecard study
When I visit the security dashboard page for the first time, I want to know all of the dashboard's functions, so I can set up the dashboard in a way that is useful to me.
When I visit the empty security dashboard page, I want to set up the dashboard easily and quickly, so I can monitor my project using the dashboard.
As a result, I think the user faces a difficult moment when they encounter the “Not configured” status: there is no clear message explaining what is going on, and no clear instruction on what to do next. The group-level dashboard seems more difficult than the project level, since the project level has a configuration page. Either way, the user can set things up manually, or use the configuration page if they manage to find it. I therefore gave the experience a Neutral emotional grade and an overall score of C (Average experience, needs improvement).
Neutral: The user’s expectations were met. Each action provided the basic expected response from the UI so that the user could complete the task and move forward. Emotion(s): Indifferent
C (Average): The workflow needs improvement, but the user can still complete the task. It usually takes longer to complete the task than it should, and the user may abandon the process or try again later. Frustration: Medium. Task Completion: Successful, but with unnecessary steps. Steps to Complete Task: Average complexity.
Follow-up: recommendation issue and revisit issue:
- Recommendations have been summarized in this issue
- After a few months, we should revisit it to see whether the experience has improved; please see the issue
UX Scorecard Checklist
Link to the handbook page about UX Scorecards.
- Mention which personas might be performing the job. Keeping personas in mind allows us to use the correct language and make the best decisions to address their specific problems and pain points when writing recommendations.
- If your JTBD spans more than one stage group, that’s great! Review your JTBD with a designer from that stage group for accuracy.
- Review the current experience, noting where you expect a user's high and low points to be. Capture the screens and jot down observations.
- It's also advised that you ask another person (internal or external) who is relatively new to the workflow to accomplish the JTBD. Record this session and document their experience of the JTBD. Note that an additional user isn't currently required, but they can provide valuable insights you might not have thought of. Depending on how complex the JTBD is and how familiar the task is to you, you can invite additional participants to get a broad view of the JTBD. If you approach this as a usability study and follow a process approved by a UX Researcher, you may apply an appropriate research label.
Using what you learned in the previous steps, apply the following Emotional Grading Scale to document how a user likely feels at each step of the workflow. Add this documentation to each JTBD issue's description.
- Positive: The user’s experience included a pleasant surprise— something they were not expecting to see. The user enjoyed the experience on the screen and could complete the task, effortlessly moving forward without having to stop and reassess their workflow. Emotion(s): Happy, Motivated, Possibly Surprised
- Neutral: The user’s expectations were met. Each action provided the basic expected response from the UI so that the user could complete the task and move forward. Emotion(s): Indifferent
- Negative: The user did not receive the results they were expecting. There may be bugs, roadblocks, or confusion about what to click on that prevents the user from completing the task. Maybe they even needed to find an alternative method to achieve their goal. Emotion(s): Angry, Frustrated, Confused, Annoyed
- Use the Grading Rubric below to provide an overall measurement that becomes the Benchmark Score for the experience (one grade per JTBD) and add it to each JTBD issue's description. Document the score in the UX Scorecard Spreadsheet.
- Once you’re clear about the user’s path, create a clickthrough video that documents the existing experience. Begin the video with a contextual introduction including your role, stage group, a short introduction to your JTBD, and the purpose of the UX scorecard. This is not a "how-to" video; instead, it should help build empathy for users by clearly showing areas of potential frustration and confusion. (You can point out where the experience is positive, too.) The Emotional Grading Scale you documented earlier will help identify areas to call out. At the end of the video, make sure to include narration of the Benchmark Score. Examples here and here.
- Post your video to the GitLab Unfiltered YouTube channel and link to it from each JTBD issue's description.
- Link to your video in the [Engineering Week in Review](https://docs.google.com/document/d/1Oglq0-rLbPFRNbqCDfHT0-Y3NkVEiHj6UukfYijHyUs/edit#heading=h.wl5oryd6kv3u).
- Create an issue to revisit the same JTBD the following quarter to see if we have made improvements. We will use the grades to monitor progress toward improving the overall quality of our user experience. Add that issue as related to each JTBD issue.