"AI Impact" analytics - FAQ
Table of Contents
- Where can I find a demo of AI Impact Analytics?
- On which deployment types can AI Impact Analytics be configured?
- What metrics are available in AI Impact Analytics?
- What APIs are available for retrieving AI Impact data?
- Which AI Impact API should I use?
- How can we export the AI metrics to a CSV file?
- What is the data flow for AI Impact Analytics?
- How frequently is the data on the AI Impact dashboards updated on GitLab.com?
- Are code suggestion events from the Web IDE currently being tracked?
- How do I create scheduled reports?
- What's next and why?
- Additional Resources
- Internal FAQ
Where can I find a demo of AI Impact Analytics?
You can view a click-through demo in the AI Impact Analytics product tour.
On which deployment types can AI Impact Analytics be configured?
AI Impact Analytics depends on a ClickHouse database, which results in different configuration options for each deployment type:
- GitLab.com: AI Impact Analytics is enabled by default.
- Self-Managed with ClickHouse: Organizations with an accessible ClickHouse instance can enable AI Impact Analytics by configuring ClickHouse for contribution analytics within their GitLab Self-Managed instance.
- Self-Managed without ClickHouse: If ClickHouse is not available, the AiUsageData API can be used to export raw event data for analysis in an external BI tool.
- Dedicated: Short-term, the AiUsageData API is available for data exports. Long-term, AI Impact Analytics will be supported; details here.
What metrics are available in AI Impact Analytics?
The AI Impact Analytics Metrics Matrix provides a detailed list of AI Impact metrics, including the supported dimensions and planned roadmap items.
What APIs are available for retrieving AI Impact data?
You can retrieve AI impact metrics using the following GraphQL APIs:
- AiMetrics (requires ClickHouse)
- AiUserMetrics (requires ClickHouse)
- AiUsageData (does not require ClickHouse)
Which AI Impact API should I use?
Here is the AI Impact API feature breakdown:
| Feature | AiUsageData | AiUserMetrics | AiMetrics |
| --- | --- | --- | --- |
| Data Storage | PostgreSQL | ClickHouse | ClickHouse |
| Purpose | Provides the raw event data; designed for customers without ClickHouse to gain insights into AI usage. | Query pre-aggregated per-user metrics for Code Suggestions and Duo Chat via the GraphQL API. | Query pre-aggregated group-level metrics for Code Suggestions and Duo Chat via the GraphQL API. |
| Use case | Import events into a BI tool, or write your own scripts to aggregate the data, acceptance rates, and per-user metrics for code suggestion events. | Quickly get a list of Duo users and their frequency of usage for Code Suggestions and Chat. | |
| Per-user metrics | Yes | Yes | No |
| Retention Period | 3 months | Based on ClickHouse configuration | Based on ClickHouse configuration |
| Event types | Code suggestions - suggestion size, language, user, and type (shown, accepted, rejected) | Metrics are limited to: | |
| Seat Requirement | Duo Enterprise seat | Duo Enterprise seat | |
| Minimum version | | 17.6 | GitLab Duo Enterprise - 17.4; GitLab Duo Pro - 17.6 |
API References & Sample Queries
You can explore the GraphQL API resources with the interactive GraphQL explorer.
AiUsageData Reference & Sample Query
- GraphQL Reference: https://docs.gitlab.com/ee/api/graphql/reference/index.html#aiusagedata
- Users running this API must have a Duo Enterprise seat assigned.
- Query Example for AiUsageData:
query {
group(fullPath: "gitlab-org") {
aiUsageData {
codeSuggestionEvents {
nodes {
event
timestamp
language
suggestionSize
user {
username
}
}
}
}
}
}
- Example Output for AiUsageData:
{
"data": {
"group": {
"aiUsageData": {
"codeSuggestionEvents": {
"nodes": [
{
"event": "CODE_SUGGESTION_SHOWN_IN_IDE",
"timestamp": "2024-12-22T18:17:25Z",
"language": null,
"suggestionSize": null,
"user": {
"username": "jasbourne"
}
},
{
"event": "CODE_SUGGESTION_REJECTED_IN_IDE",
"timestamp": "2024-12-22T18:13:45Z",
"language": null,
"suggestionSize": null,
"user": {
"username": "jasbourne"
}
},
{
"event": "CODE_SUGGESTION_ACCEPTED_IN_IDE",
"timestamp": "2024-12-22T18:13:44Z",
"language": null,
"suggestionSize": null,
"user": {
"username": "jasbourne"
}
}
]
}
}
}
}
}
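Because AiUsageData returns raw events rather than aggregates, any rates have to be computed client-side. Below is a minimal Python sketch, not an official script, that runs the query above and derives a per-user acceptance rate; the Bearer-token authentication, the connection pagination arguments, and the gitlab-org group path are assumptions to adapt for your environment.

```python
# Minimal sketch: compute per-user code suggestion acceptance rates from raw
# AiUsageData events. Assumptions (adapt as needed): GITLAB_TOKEN holds a
# personal access token for a user with a Duo Enterprise seat, Bearer auth is
# accepted, and codeSuggestionEvents supports standard connection pagination.
import os
from collections import defaultdict

import requests

GRAPHQL_URL = "https://gitlab.com/api/graphql"
GROUP_PATH = "gitlab-org"  # replace with your group's full path

QUERY = """
query($fullPath: ID!, $after: String) {
  group(fullPath: $fullPath) {
    aiUsageData {
      codeSuggestionEvents(after: $after) {
        pageInfo { hasNextPage endCursor }
        nodes { event user { username } }
      }
    }
  }
}
"""


def fetch_events():
    """Yield code suggestion event nodes, page by page."""
    headers = {"Authorization": f"Bearer {os.environ['GITLAB_TOKEN']}"}
    after = None
    while True:
        resp = requests.post(
            GRAPHQL_URL,
            json={"query": QUERY, "variables": {"fullPath": GROUP_PATH, "after": after}},
            headers=headers,
            timeout=30,
        )
        resp.raise_for_status()
        connection = resp.json()["data"]["group"]["aiUsageData"]["codeSuggestionEvents"]
        yield from connection["nodes"]
        if not connection["pageInfo"]["hasNextPage"]:
            break
        after = connection["pageInfo"]["endCursor"]


def acceptance_rates():
    """Return {username: accepted / shown} for users with at least one shown event."""
    shown, accepted = defaultdict(int), defaultdict(int)
    for node in fetch_events():
        username = node["user"]["username"]
        if node["event"] == "CODE_SUGGESTION_SHOWN_IN_IDE":
            shown[username] += 1
        elif node["event"] == "CODE_SUGGESTION_ACCEPTED_IN_IDE":
            accepted[username] += 1
    return {user: accepted[user] / shown[user] for user in shown}


if __name__ == "__main__":
    for user, rate in sorted(acceptance_rates().items()):
        print(f"{user}: {rate:.1%}")
```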
AiUserMetrics Reference & Sample Query
- GraphQL Reference: https://docs.gitlab.com/ee/api/graphql/reference/index.html#aiusermetrics
- Users running this API must have a Duo Enterprise seat assigned.
- aiUserMetrics sample query:
query {
group(fullPath:"gitlab-org") {
aiUserMetrics {
nodes {
codeSuggestionsAcceptedCount
duoChatInteractionsCount
user {
username
}
}
}
}
}
- Example Output for aiUserMetrics:
{
"data": {
"group": {
"aiUserMetrics": {
"nodes": [
{
"codeSuggestionsAcceptedCount": 10,
"duoChatInteractionsCount": 22,
"user": {
"username": "USER_1"
}
},
{
"codeSuggestionsAcceptedCount": 12,
"duoChatInteractionsCount": 30,
"user": {
"username": "USER_2"
}
}
]
}
}
}
}
AiMetrics Reference & Sample Query
- GraphQL Reference: https://docs.gitlab.com/ee/api/graphql/reference/index.html#aimetrics
- aiMetrics sample query:
query {
group(fullPath: "gitlab-org") {
id
aiMetrics(startDate: "2024-12-01", endDate: "2024-12-31") {
codeContributorsCount
codeSuggestionsContributorsCount
codeSuggestionsShownCount
codeSuggestionsAcceptedCount
duoChatContributorsCount
duoAssignedUsersCount
duoUsedCount
}
}
}
- Example Output for aiMetrics:
{
"data": {
"group": {
"id": "gid://gitlab/Group/9970",
"aiMetrics": {
"codeContributorsCount": 4000,
"codeSuggestionsContributorsCount": 5000,
"codeSuggestionsShownCount": 6123,
"codeSuggestionsAcceptedCount": 719,
"duoChatContributorsCount": 544,
"duoAssignedUsersCount": 2000,
"duoUsedCount": 746,
},
}
}
}
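For reference, from this sample output the overall code suggestion acceptance rate for the period works out to codeSuggestionsAcceptedCount / codeSuggestionsShownCount = 719 / 6123 ≈ 11.7%.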
How can we export the AI metrics to a CSV file?
AI metrics data can be exported to a CSV file by automating data retrieval through the AI Metrics APIs. This can be done using a script that fetches and formats the data. You can review a sample script and workflow here.
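As an illustration of one way to do this, here is a minimal Python sketch, not the linked sample, that runs the aiMetrics query shown above and writes each metric to a row of a CSV file; the GITLAB_TOKEN environment variable, group path, and date range are placeholders to adjust.

```python
# Minimal sketch: export group-level aiMetrics to a CSV file via the GraphQL API.
# Assumptions (adapt as needed): GITLAB_TOKEN holds a suitable personal access
# token, and the group path and date range match the aiMetrics example above.
import csv
import os

import requests

GRAPHQL_URL = "https://gitlab.com/api/graphql"

QUERY = """
query {
  group(fullPath: "gitlab-org") {
    aiMetrics(startDate: "2024-12-01", endDate: "2024-12-31") {
      codeContributorsCount
      codeSuggestionsContributorsCount
      codeSuggestionsShownCount
      codeSuggestionsAcceptedCount
      duoChatContributorsCount
      duoAssignedUsersCount
      duoUsedCount
    }
  }
}
"""


def export_ai_metrics(outfile="ai_metrics.csv"):
    """Run the aiMetrics query and write one metric per CSV row."""
    resp = requests.post(
        GRAPHQL_URL,
        json={"query": QUERY},
        headers={"Authorization": f"Bearer {os.environ['GITLAB_TOKEN']}"},
        timeout=30,
    )
    resp.raise_for_status()
    metrics = resp.json()["data"]["group"]["aiMetrics"]
    with open(outfile, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["metric", "value"])
        for name, value in metrics.items():
            writer.writerow([name, value])
    print(f"Wrote {len(metrics)} metrics to {outfile}")


if __name__ == "__main__":
    export_ai_metrics()
```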
What is the data flow for AI Impact Analytics?
You can find the data flow diagram on the Optimize group team page.
How frequently is the data on the AI Impact dashboards updated on GitLab.com?
Code suggestion usage should update every 5 minutes. Currently, we use logic that tracks only 'committed code within the last month', meaning we count code suggestion usage only if the user has pushed code to the project within the current month. This logic is designed to mitigate scaling risks.
Are code suggestion events from the Web IDE currently being tracked?
Currently, we only collect data from code editor extensions. Tracking events from the Web IDE is a work-in-progress. You can find more details here.
How do I create scheduled reports?
You can create a custom AI Impact Scheduled Report to compare teams using AI versus those that don’t. Use the AI Impact view of projects and groups to analyze trends and performance.
What's next and why?
https://about.gitlab.com/direction/plan/value_stream_management/#whats-next-and-why
Additional Resources
- Blog post about Measuring AI effectiveness beyond developer productivity metrics
- Blog post about Developing GitLab Duo: AI Impact analytics dashboard measures the ROI of AI