
Follow-ups to improve feature discovery page in Pajamas

Context

This issue outlines a series of MRs to rework the existing feature discovery guidance.

Problems to solve

My pain points with the current guidelines:

  • Each time I add feature discovery, I don't know what UI element to choose or what's "allowed", e.g. "can I show a popover without user action?"
  • Designing feature discovery is a manual effort each time, placing a lot of individual decision-making on the designer
  • I don't know which pattern is most effective or appropriate for my use case
  • We are quite inconsistent in how we handle feature discovery
  • GitLab has a tendency to show many banners and loud notices, and the current guidance makes it difficult to argue against adding yet another banner
Goal
  • Guidelines offer ranked guidance, with strong emphasis on making sure the feature is in the right place in the user's workflow.
  • Guidance discourages disruptive feature discovery.
  • Guidelines are more actionable and concrete.
Explorations

I tried to explore this by reading external research, reviewing other work done at GitLab, collating patterns from other products, and even running some small-scale testing to investigate the impact of CTA position and the style of feature discovery mechanism. (I question the validity of the small-scale testing due to the small sample size.)

I also collected some feedback on a draft MR.

Related links

Similar efforts

Internal UXR

External guidance

Survey results

I ran two surveys with GitLab staff: one on general challenges with feature discovery, and one gathering feedback on the Pajamas page.

Note: the feedback about the Pajamas page (second survey) was collected after the page was rewritten (!4090 (merged)); the general feature discovery feedback (first survey) was collected before the rewrite.

General feedback summary

  • Full responses in Google sheet
  • Biggest challenges when introducing new features are:
    • Conflicting priorities or resource constraints
    • Technical difficulties and limitations
  • Multiple responses mention analytics challenges: deciding what to measure, how to measure it, and the technical difficulty of doing so
Quotes from survey

Instrumentation - while I treat it as a priority having some level of instrumentation, having good enough instrumentation is very difficult, especially in a domain with a huge amount of poorly instrumented features; even when instrumentation is delivered, adding it to dashboards requires manual steps, getting a tier-based breakdown on gitlab.com requires separate work (from Self-Managed). Btw, gitlab.com helps with measuring adoption quickly and allows for better instrumentation, but overall requires different instrumentation than Self-Managed.

Releasing a feature as part of GitLab. Even defining the metrics, but implementing them as I'd like it is typically a challenge, especially for Self-Managed.

No consistent measurement. I think having good requirements and acceptance criteria would be a good start so that everyone could at least know if we satisfied the goals we set out for.

I'd like to have more interactions with end users like Cursor and GitHub do with their forums. We tried adding a link to the GitLab Forum from our Duo Status Menu but so far there's been limited engagement.

A better pattern than persistent banners — maybe more like broadcast messages, but targeted to feature announcements — would be useful. IMO one of the considerations with feature promotion needs to be how we define and batch features. Historically we've prioritized releasing many small updates over big updates, which results in many changes to process and higher noise overall (not always feature additions, but users will see all as change). There are downsides to holding things to a batch, but a significant upside is it can mean one big and more impactful release that we can tie a lot more discovery and guidance to all at once and reduce the feeling of constant change that appears in our SUS/CSAT verbatims.

Pajamas page feedback summary

  • Full responses in Google sheet
  • More examples and more visuals
  • More concrete examples, e.g. is the "new" badge a recommended pattern? What is a recommended pattern?
  • Cross-links to other pages, such as progressive disclosure

Improvements

Move onboarding to its own page

The changes in !4090 (merged) shifted the tone and focus of this page. Previously, the onboarding section related to the rest of the page's content; now it differs noticeably, both in tone and conceptually. I suggest moving the onboarding section to its own page.

Rework onboarding content

Feature discovery and onboarding are related concepts, but the onboarding content does not currently match the feature discovery content. It could be modified to use similar language and to remove duplicated content: for example, the "Patterns for initial prompts" table could be removed and replaced with a link to the feature discovery page.

Improve feature discovery content

Based on the survey, users would like:

  • More examples and more visuals
  • More concrete examples, e.g. is the "new" badge a recommended pattern? What is a recommended pattern?
  • Cross-links to other pages, such as progressive disclosure
  • An example showing the same feature with all three types of feature discovery (implicit, contextual, disruptive). Full context in this comment

Feature discovery principles: validating implicit discovery as a first step

The guidelines recommend first exploring ways in which the feature can be discovered without active promotion. Challenge teams to articulate why they need feature discovery: Why are users not engaging with your feature? How do you know? Is the CTA label clear? Do users see the button? Do users understand what it will do? Could you test alternatives quickly, for example with usertesting.com?

Add guidelines for validating that feature discovery is necessary and for articulating why.

Feature discovery principles: defining a goal and measuring whether it was met

  • Articulate the goal of feature discovery (what you are trying to accomplish)
  • Choose the feature discovery mechanism best suited to that goal
  • Define how to measure whether the goal is met (what to measure, over what time period); see the sketch after this list
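
As a minimal sketch of what tying measurement to a stated goal could look like, assuming a generic analytics pipeline. All names here (`DiscoveryGoal`, `trackEvent`, `recordDiscoveryOutcome`, the example feature) are hypothetical, not an existing Pajamas or GitLab API:

```typescript
// Hypothetical sketch only: none of these names exist in Pajamas or GitLab.
interface DiscoveryGoal {
  feature: string;                // the feature being promoted
  mechanism: 'implicit' | 'contextual' | 'disruptive';
  targetAction: string;           // what counts as success, e.g. 'cta_click'
  measurementWindowDays: number;  // how long to measure before deciding
}

// Placeholder for a real analytics call (e.g. Snowplow at GitLab).
function trackEvent(category: string, action: string, label: string): void {
  console.log(`[track] ${category} / ${action} / ${label}`);
}

// Only the action named in the goal counts toward the success metric,
// which keeps measurement tied to the goal that was articulated up front.
function recordDiscoveryOutcome(goal: DiscoveryGoal, action: string): void {
  if (action === goal.targetAction) {
    trackEvent('feature_discovery', action, goal.feature);
  }
}

// Example: a contextual popover promoting a hypothetical feature,
// measured on CTA clicks over a 30-day window.
const goal: DiscoveryGoal = {
  feature: 'example_feature',
  mechanism: 'contextual',
  targetAction: 'cta_click',
  measurementWindowDays: 30,
};

recordDiscoveryOutcome(goal, 'cta_click');
```

The point of the structure is that the goal, the mechanism, and the success metric are declared together, so a team cannot ship a discovery mechanism without saying what it is for and how it will be judged.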

Defining criteria for removing notices

  • Define criteria based on the UI component, for example (see the sketch after this list):
    • Banners are not shown again after user dismissal; if the user does not dismiss the banner, remove it after n weeks
    • Hotspots are not shown again after user dismissal; if the user does not engage with the hotspot, remove it after n weeks
  • Self-managed considerations: how do you account for the removal of feature discovery notices when the user does not upgrade their instance?
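
A minimal sketch of that per-component logic, assuming the client stores when a notice was first shown and whether the user dismissed or engaged with it. `NoticeState`, `shouldShowNotice`, and the concrete week count (standing in for the unspecified n above) are illustrative, not an existing implementation:

```typescript
// Illustrative only: these names are not an existing Pajamas/GitLab API.
interface NoticeState {
  firstShownAt: Date;  // when the banner/hotspot was first displayed
  dismissed: boolean;  // the user explicitly dismissed it
  engaged: boolean;    // the user interacted with it (e.g. clicked the hotspot)
}

const WEEK_MS = 7 * 24 * 60 * 60 * 1000;
const MAX_AGE_WEEKS = 4; // placeholder for the unspecified "n weeks" above

function shouldShowNotice(state: NoticeState, now: Date = new Date()): boolean {
  // Never show the notice again once it was dismissed or engaged with.
  if (state.dismissed || state.engaged) {
    return false;
  }
  // Auto-expire notices the user never interacted with, so undismissed
  // banners and hotspots don't accumulate indefinitely.
  const ageWeeks = (now.getTime() - state.firstShownAt.getTime()) / WEEK_MS;
  return ageWeeks < MAX_AGE_WEEKS;
}
```

One upside of an age-based cutoff is that removing the notice does not depend on shipping a later release, which may help with the self-managed question above.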