
Feature tracking guidance for PMs

Merged Emma Fergen requested to merge efergen-feature-tracking-guidance into main
If needed, you may create an issue in the [Product Data Insights project](https://gitlab.com/gitlab-data/product-analytics/-/issues/new) and assign it to a [product data analyst](/handbook/product/groups/product-analysis/#team-members). You can read more about working with the PDI team [here](/handbook/product/groups/product-analysis/#working-with-us).
## Guidance for Instrumenting Feature Tracking
As a GitLab PM, you're responsible for defining and tracking metrics for your team's features. This guide will walk you through the process, tools, and resources available to help you succeed.
### Quick links for Instrumenting Feature Tracking
- [CLI generator to automatically create event and metric definition files](https://docs.gitlab.com/ee/development/internal_analytics/internal_event_instrumentation/quick_start.html#defining-event-and-metrics): An interactive CLI that gathers your requirements, automatically generates event and metric definition files, and produces ready-to-use instrumentation code for engineers to implement and test
- [Quick Start Guide for Internal Event Tracking](https://docs.gitlab.com/ee/development/internal_analytics/internal_event_instrumentation/quick_start.html): Comprehensive instructions on how to instrument event tracking and context around GitLab's internal tracking system
- [Getting Started Standard Context Fields](https://docs.gitlab.com/ee/development/internal_analytics/internal_event_instrumentation/standard_context_fields.html): Documentation on each standard context field included in Internal Event Tracking and descriptions of their intent
- [Usage Data Instrumentation Issue Template](https://gitlab.com/gitlab-org/gitlab/-/issues/new?issuable_template=Usage%20Data%20Instrumentation): Issue template for product managers or engineering teams looking to track usage of their features
- [Product Data Insights Performance Indicator Chart Issue Template](https://gitlab.com/gitlab-data/product-analytics/-/issues/new?issuable_template=PI%20Chart%20Help)
- [Product Data Insights Ad Hoc Analysis Issue Template](https://gitlab.com/gitlab-data/product-analytics/-/issues/new?issuable_template=Ad%20Hoc%20Request)
### Self-Service Feature Tracking Dashboards
If your analytics needs for your new or recently modified feature are met by these dashboards, you can skip creating a Product Data Insights (PDI) Issue:
- [PD: Centralized Product Usage Metrics](https://10az.online.tableau.com/#/site/gitlab/views/DRAFTCentralizedGMAUDashboard/MetricReporting)
- [PD: Product Usage Metrics (.com & Service Ping)](https://10az.online.tableau.com/#/site/gitlab/workbooks/2478263/views)
- [PD: Firmographic Product Metric Usage](https://10az.online.tableau.com/#/site/gitlab/workbooks/2137023/views)
- [PD: Subscription Feature Usage Trends](https://10az.online.tableau.com/t/gitlab/views/PDSubscriptionFeatureUsageTrends_17032798065680)
- [AI Gateway Reporting](https://10az.online.tableau.com/t/gitlab/views/AIGatewayReporting/Overview)
### Process for Instrumenting Feature Tracking
1. Plan Your Analytics Requirements
**Owner: Product Manager**
- Start by determining what you need to measure:
- What user behaviors indicate feature success?
- What metrics will help you make product decisions?
- What data points do you need for your team's KPIs?
- If existing dashboards don't meet all of your needs, create a [Product Data Insights (PDI) Issue](https://gitlab.com/gitlab-data/product-analytics/-/issues/new) to request additional analytics.
1. Create Instrumentation Issue
**Owner: Product Manager**
Option A: Use the CLI Generator to generate requirements for your Instrumentation Issue
- Use the [CLI generator tool](https://docs.gitlab.com/ee/development/internal_analytics/internal_event_instrumentation/quick_start.html#defining-event-and-metrics)
- Benefits:
- Automatically generates event and metric definition files
- Produces ready-to-use instrumentation code
- Reduces implementation time for engineers
- Ensures consistency with GitLab's tracking standards
Option B: Use Usage Data Instrumentation Issue Template to outline metric requirements
- Use the [Usage Data Instrumentation Issue Template](https://gitlab.com/gitlab-org/gitlab/-/issues/new?issuable_template=Usage%20Data%20Instrumentation)
- Tag your [assigned product analyst](/handbook/product/groups/product-analysis/#team-members) to review metric properties
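The CLI writes the event and metric definitions as YAML files in the GitLab repository. As a hedged sketch of what to expect (all field names, file paths, and values below are illustrative and may vary by GitLab version; rely on the generated files rather than copying this), a paired event and metric definition looks roughly like:

```yaml
# config/events/apply_code_suggestion.yml (illustrative example)
description: User applied a code suggestion
internal_events: true
action: apply_code_suggestion
identifiers:
  - user
  - project
product_group: code_review
tiers:
  - free
  - premium
  - ultimate

# config/metrics/counts_28d/count_distinct_user_id_from_apply_code_suggestion_28d.yml (illustrative example)
key_path: counts.users_applying_code_suggestion_monthly
description: Count of distinct users who applied a code suggestion in the last 28 days
data_source: internal_events
time_frame: 28d
events:
  - name: apply_code_suggestion
    unique: user.id
```

The event name, identifiers, and `unique` property are the "metric properties" your product analyst reviews in the instrumentation issue.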
1. Implement Tracking
**Owner: Engineer**
- Create an Internal Events Tracking Merge Request (MR)
- Implement new metrics according to specifications defined in Usage Data Instrumentation Issue
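In application code, the MR typically amounts to a one-line tracking call at the point of the user action. The snippet below is a self-contained, illustrative stand-in, not GitLab's actual implementation: the module, controller, event name, and context keys are all hypothetical, and in the real codebase the tracking helper is provided for you and routes events to Snowplow and Service Ping.

```ruby
# Illustrative stand-in for an internal events tracking helper.
# In GitLab the real helper is supplied by the codebase; this toy
# version just records events in memory so the pattern is runnable.
module InternalEventsTracking
  def self.tracked_events
    @tracked_events ||= []
  end

  def track_internal_event(name, **context)
    InternalEventsTracking.tracked_events << { name: name, context: context }
  end
end

class SuggestionsController
  include InternalEventsTracking

  def apply
    # ... feature logic ...
    # One call per user action; the event name must match the
    # definition file created in the instrumentation issue.
    track_internal_event('apply_code_suggestion', user_id: 42, project_id: 7)
  end
end

SuggestionsController.new.apply
puts InternalEventsTracking.tracked_events.first[:name] # prints "apply_code_suggestion"
```

The key property to verify in review is that every tracked event name matches a committed definition file, since undefined events are dropped.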
1. Test and Validate
**Owner: Engineer**
- Perform [local testing](https://docs.gitlab.com/ee/development/internal_analytics/internal_event_instrumentation/local_setup_and_debugging.html)
- Request review from Analytics Instrumentation team member
- Verify test events match properties defined in the Issue
1. Create Analysis
**Owner: Product Analyst**
- For analyses requiring user- and event-grain GitLab.com data (Snowplow), data collection will be sufficient for analysis 1-2 weeks after the MR merges
- For analyses requiring aggregated Self-Managed and Dedicated data (Service Ping), data collection will be sufficient for analysis 6-8 weeks after the MR merges, due to the minimum version adoption requirement for Service Ping metrics
- Complete requirements specified in PDI Issue (if applicable)
### Special Considerations for AI Gateway Features
> **Review thread (on lines +137 to +138), suggested change:** rename the heading "Special Considerations for AI Gateway Features" to "Special Considerations for Duo features", change the intro line to "When instrumenting Duo (AI) features follow these guidelines:", and add the sentence: "Duo features are automatically instrumented in the back-end when they are routed through the AI Gateway ([more details here](https://gitlab.com/groups/gitlab-org/-/epics/14792#implementation-overview)), given that the feature has been declared as a unit primitive."
>
> Rationale: suggestion to use "Duo" instead of "AI Gateway", which may be more broadly understood, plus additional context on how the automated instrumentation works.
When instrumenting features routed through the AI Gateway, follow these guidelines:
1. Represent new features routed through the AI Gateway as unit primitives
- New distinct features should be represented as a [unit primitive](https://gitlab.com/gitlab-org/cloud-connector/gitlab-cloud-connector/-/tree/main/config/unit_primitives)
- To add a new unit primitive, reach out to ~"group::cloud connector"
1. Set up tracking for the new unit primitive
- Contact ~"group::analytics instrumentation" with the necessary fields to be tracked
- For more information on AI Gateway instrumentation, see the [documentation](https://docs.gitlab.com/ee/development/internal_analytics/internal_event_instrumentation/quick_start.html#internal-events-on-other-systems)
- Once instrumented, AI Gateway events using the unit primitive framework:
- Cannot be blocked by users
- Are tracked at the event grain for all deployment types
1. For more granular reporting
- If you need more detail than a 'request' of the AI Gateway at the broad feature grain, use [Internal Events Tracking](https://docs.gitlab.com/ee/development/internal_analytics/internal_event_instrumentation/quick_start.html)
- Internal events can be connected to unit primitive events using a `correlation_id` for behavior funnel or more granular reporting use cases (GitLab.com only)
1. Viewing AI Gateway data
- [AI Gateway Reporting](https://10az.online.tableau.com/t/gitlab/views/AIGatewayReporting/Overview) automatically displays new unit primitive requests once they have been instrumented by ~"group::analytics instrumentation"
> **Review comment (edited by Sacha Guyon):** "new unit primitive requests" — suggesting to use something like "your new feature's usage events" for clarity.
>
> How can users test whether data is flowing? Do new events show up in real time as soon as a new unit primitive is created, or is there a delay?
>
> Maybe we could explain how to see the data in Snowflake, both for testing and for running more advanced analytical queries. For example, we could link to a shared query like this one where users should be able to instantly see their unit primitive and related events showing up, and use it as a base to edit and run other queries.
- Additional analytics can be requested by creating a [Product Data Insights (PDI) Issue](https://gitlab.com/gitlab-data/product-analytics/-/issues/new)
### Key Contacts and Resources
- For questions about the feature tracking process, reach out in the #g_monitor_analytics_instrumentation Slack channel.
- The Analytics Instrumentation team owns our internal product feature tracking system.
By following this process and understanding the roles involved, PMs can effectively instrument and track metrics for their features, enabling data-driven decision-making and product improvement.
## Key Data Sources for Product Managers at GitLab
We have three primary data sources for product usage data: