Prompt Migration: collaboration issue for Code Completions
Overview
Within Prompt Migration to AI Gateway (&14259), prompts are migrated from GitLab Rails to AI Gateway.
group::custom models decided to start with moving Code Suggestions prompts for custom models.
Related MRs:
- feat(agents): change prompt lookup for agent re... (gitlab-org/modelops/applied-ml/code-suggestions/ai-assist!1095 - merged)
- feat(agents): code completions for custom model... (gitlab-org/modelops/applied-ml/code-suggestions/ai-assist!1063 - merged)
- feat(agents): execute code generations via agent (gitlab-org/modelops/applied-ml/code-suggestions/ai-assist!1096 - merged)
- Draft: feat(agents): code-generations for anthr... (gitlab-org/modelops/applied-ml/code-suggestions/ai-assist!1099 - closed)
How to migrate a prompt for Code Completions
Current PoC
- feat(agents): change prompt lookup for agent re... (gitlab-org/modelops/applied-ml/code-suggestions/ai-assist!1095 - merged) is merged, which means that the prompt definition should be located in `ai_gateway/agents/definitions/<feature-section>/<feature>/<model>.yml` — for example, `ai_gateway/agents/definitions/code_suggestions/completions/codegemma.yml` for Code Completions powered by the Codegemma model.
- feat(agents): code completions for custom model... (gitlab-org/modelops/applied-ml/code-suggestions/ai-assist!1063 - merged) is merged, and `main` now contains the `codegemma.yml` prompt.
- Currently, GitLab Rails sends the prompt. In order to call the prompt defined in AI Gateway instead, we need to send an empty (`nil` or `""`) prompt. Add feature flag to send a nil prompt to the AI... (!160050 - merged) implements this, so to test the new behavior, the `ai_custom_models_prompts_migration` feature flag can be enabled.
- Verify that the LLM received the correct prompt.
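As a rough illustration, a prompt definition file could look like the sketch below. All field names and the template syntax here are assumptions for illustration only — consult an existing definition such as `codegemma.yml` on `main` for the actual schema.

```yaml
# Hypothetical sketch of a prompt definition under
# ai_gateway/agents/definitions/code_suggestions/completions/.
# Field names and template placeholders are illustrative assumptions,
# not the real AI Gateway schema.
name: code_completions_codegemma
model: codegemma
prompt_template: |
  <|fim_prefix|>{prefix}<|fim_suffix|>{suffix}<|fim_middle|>
stop_tokens:
  - "<|file_separator|>"
```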
Other models
- Define the prompt definition, for example `ai_gateway/agents/definitions/code_suggestions/completions/codestral.yml`.
- Enable the `ai_custom_models_prompts_migration` feature flag introduced in Add feature flag to send a nil prompt to the AI... (!160050 - merged).
- Verify that the LLM received the correct prompt.
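Assuming the standard GitLab feature-flag tooling, the flag can be toggled from the Rails console on a test instance (a sketch, not the only way — ChatOps can also be used on GitLab-managed environments):

```shell
# Open the Rails console on a self-managed instance
# (on GDK, use `gdk rails console` instead)
sudo gitlab-rails console

# Then, inside the console, enable the flag:
#   Feature.enable(:ai_custom_models_prompts_migration)
# and disable it again after testing:
#   Feature.disable(:ai_custom_models_prompts_migration)
```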
Merging the work
- Define the prompt in AI Gateway.
- Enable `ai_custom_models_prompts_migration` and verify that it works.
- Perform the evaluations according to the steps defined in issues like Migrate Code Generation Prompts: Mistral-7B-v0.1 (#470819 - closed).
- Merge the AI Gateway MR.
How to migrate a prompt for Code Generations
The steps are similar to the ones above, but the AI Gateway branch is different:
- feat(agents): execute code generations via agent (gitlab-org/modelops/applied-ml/code-suggestions/ai-assist!1096 - merged), branch `id-agent-for-code-generations`