# Semantic Code Search and Code Embeddings - `GitLab.com` - Beta release and feedback
## Context
This is the announcement and feedback issue for the **Beta Release** of the **Semantic Code Search** feature.
This applies to the **[`semantic_code_search` MCP tool](https://docs.gitlab.com/user/gitlab_duo/model_context_protocol/mcp_server/#semantic_code_search)** used in **Agentic Chat**, though you may use other MCP clients (e.g. [Claude Code](https://docs.gitlab.com/user/gitlab_duo/model_context_protocol/mcp_server/#connect-claude-code-to-a-gitlab-mcp-server)) instead of Agentic Chat.
With the Beta Release, the `semantic_code_search` tool:
- will be available to **all users who have access to the [GitLab MCP Server](https://docs.gitlab.com/user/gitlab_duo/model_context_protocol/mcp_server)**
- can be invoked for **all projects with [Duo availability turned on](https://docs.gitlab.com/user/gitlab_duo/turn_on_off/#turn-gitlab-duo-on-or-off)**
- This requires a project's files to be indexed into Code Embeddings.
- If this tool is invoked for a project that does not have Code Embeddings yet, indexing is triggered ad hoc and the LLM agent is instructed to use a different tool. Semantic search becomes available when the tool is invoked again after a few minutes.
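Under the MCP protocol, a client invokes a tool with a standard JSON-RPC 2.0 `tools/call` request. A minimal sketch of what such a request could look like for this tool — note that the argument names (`search_term`, `project_id`) are illustrative assumptions, not the tool's documented schema; see the `semantic_code_search` docs linked above for the actual parameters:

```python
import json


def build_semantic_search_call(search_term: str, project_id: str, request_id: int = 1) -> str:
    """Build a JSON-RPC 2.0 `tools/call` request for the semantic_code_search tool.

    The argument names used here are illustrative placeholders; consult the
    GitLab MCP Server documentation for the tool's real input schema.
    """
    payload = {
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "tools/call",
        "params": {
            "name": "semantic_code_search",
            "arguments": {
                "search_term": search_term,
                "project_id": project_id,
            },
        },
    }
    return json.dumps(payload)


# Example: ask where authentication is handled in a given project.
request = build_semantic_search_call("where is user authentication handled", "gitlab-org/gitlab")
```

In practice an MCP client (such as Agentic Chat) constructs and sends this request for you; the sketch only shows the wire shape.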
Related issue: https://gitlab.com/gitlab-org/gitlab/-/issues/569746+
## Setup links
- [Agentic Chat](https://docs.gitlab.com/user/gitlab_duo_chat/agentic_chat/)
- [Instructions for connecting to any MCP Server from any MCP Client](https://docs.gitlab.com/user/gitlab_duo/model_context_protocol/mcp_clients/)
- note: Agentic Chat / Duo Agent Platform is considered an MCP client
- [Configuration instructions for connecting to the GitLab MCP Server](https://docs.gitlab.com/user/gitlab_duo/model_context_protocol/mcp_server/)
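As a rough illustration, an MCP client that reads a JSON configuration file could be pointed at the GitLab MCP Server with an entry like the one below. The endpoint URL and field names here are illustrative assumptions only — follow the configuration instructions linked above for your client's actual format:

```json
{
  "mcpServers": {
    "gitlab": {
      "type": "http",
      "url": "https://gitlab.com/api/v4/mcp"
    }
  }
}
```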
## Example Usage Scenarios on Agentic Chat
1. **semantic code search performed on the Workspace's active project**
<details>
<summary>Expand for screen recording</summary>
{width=819 height=600}
</details>
2. **semantic code search performed on a project specified in the user question**
<details>
<summary>Expand for screen recording</summary>
{width=819 height=600}
</details>
3. **semantic code search performed on a project without Code Embeddings yet**
1. The LLM Agent will try to use other tools to answer the user's question.
<details>
<summary>Expand for screen recording</summary>
{width=819 height=600}
</details>
2. Meanwhile, Code Embeddings indexing will be triggered ad hoc behind the scenes.
3. After 10 to 20 minutes, semantic code search should be available on the project.
<details>
<summary>Click to expand</summary>
{width=819 height=600}
</details>
4. **semantic code search performed on a project included through `/include repository`**
This is not available in the Beta Release. Please see below for further details.
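The fallback behaviour in scenario 3 can be approximated client-side as a simple poll-and-retry loop. A sketch under stated assumptions: `invoke_semantic_search` is a hypothetical callable standing in for an MCP `tools/call` round trip, assumed to return `None` while indexing is still in progress (the real server instead instructs the agent to use a different tool):

```python
import time


def search_with_retry(invoke_semantic_search, query, retries=5, delay_seconds=60):
    """Poll semantic search until the project's Code Embeddings are ready.

    `invoke_semantic_search` is a hypothetical stand-in for one MCP tool
    invocation; it is assumed to return None while ad hoc indexing is still
    running, and a list of results once the project is indexed.
    """
    for _ in range(retries):
        results = invoke_semantic_search(query)
        if results is not None:
            return results
        # Indexing was triggered ad hoc on first invocation; wait before retrying.
        time.sleep(delay_seconds)
    return None
```

With Agentic Chat, no such loop is needed — the agent simply falls back to other tools and semantic search works on a later invocation — but the sketch shows the timing behaviour a standalone MCP client should expect.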
## Excluding `/include repository` for the initial Beta Release
The `/include repository` command is used by both **Agentic Chat** and **Classic Chat**.
Because **[Classic Chat is already Generally Available](https://docs.gitlab.com/user/gitlab_duo_chat/)**, we cannot enable this command for semantic search without also making it generally available to all Classic Chat users. Additionally, the command has a pending improvement issue that is required before it can list projects that have _Duo availability turned on_ (see https://gitlab.com/gitlab-org/gitlab/-/issues/576635 & https://gitlab.com/gitlab-org/editor-extensions/gitlab-lsp/-/issues/1629).
For these reasons, the `/include repository` command is skipped in the Beta Release. Instead, it will be rolled out as a **General Availability release for Classic Chat** and a **follow-up to the Beta Release for Agentic Chat**.
### Example Usage Scenarios of the `/include repository` command
1. **AGENTIC CHAT:** `semantic_code_search` performed on a project specified through `/include repository`
<details>
<summary>Expand for screen recording</summary>
{width=819 height=600}
</details>
1. **CLASSIC CHAT:** semantic search performed on a project specified through `/include repository`
Selecting a project through `/include repository` is the only way to perform semantic search in Classic Chat. This differs from Agentic Chat, where the LLM decides whether to perform semantic search based on the user prompt.
<details>
<summary>Expand for screen recording</summary>
{width=819 height=600}
</details>