feat(chat): add model metadata to chat endpoint
What does this merge request do and why?
Related issue: Add self-hosted models support for v2/agents (#513 - closed)
This MR adds support for custom models to the `v2/chat/agent` endpoint: a new `model_metadata` field is added to the `options` payload.
Example:
```json
{
  "prompt": "string",
  "options": {
    "chat_history": "string",
    "agent_scratchpad": {
      "agent_type": "react",
      "steps": [
        {
          "thought": "string",
          "tool": "string",
          "tool_input": "string",
          "observation": "string"
        }
      ]
    },
    "context": {
      "type": "issue",
      "content": "string"
    },
    "model_metadata": {
      "endpoint": "http://127.0.0.1:11434/v1",
      "name": "mistral",
      "provider": "openai",
      "api_key": "string"
    }
  }
}
```
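
For a quick manual check, a request along these lines exercises the new field end to end. This is a sketch only: the gateway host/port and the absence of auth headers are assumptions about a local setup, and `model_metadata.endpoint` is expected to point at an OpenAI-compatible server such as a local Ollama:

```ruby
require "net/http"
require "json"
require "uri"

# Assumed local AI Gateway address; adjust to your setup.
uri = URI("http://localhost:5052/v2/chat/agent")

payload = {
  prompt: "Summarize this issue",
  options: {
    chat_history: "",
    context: { type: "issue", content: "Issue description goes here" },
    model_metadata: {
      endpoint: "http://127.0.0.1:11434/v1", # OpenAI-compatible API, e.g. a local Ollama
      name: "mistral",
      provider: "openai",
      api_key: "dummy-key" # local servers often accept any value here
    }
  }
}

response = Net::HTTP.post(uri, payload.to_json, "Content-Type" => "application/json")
puts response.code
puts response.body
```

`provider: "openai"` matches the current behaviour for self-hosted models, which only speak the Messages API format (see the diff below).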
Testing
Please apply the following diff on top of this MR: Execute chat requests via new endpoint in AI Ga... (gitlab-org/gitlab!150529 - merged)
Diff
```diff
diff --git a/ee/lib/gitlab/llm/chain/requests/ai_gateway.rb b/ee/lib/gitlab/llm/chain/requests/ai_gateway.rb
index 3035087ffc2d..7c9065dbfdbe 100644
--- a/ee/lib/gitlab/llm/chain/requests/ai_gateway.rb
+++ b/ee/lib/gitlab/llm/chain/requests/ai_gateway.rb
@@ -127,6 +127,19 @@ def model_params(options)
             end
           end
 
+          def model_metadata_params(options)
+            return unless chat_feature_setting&.self_hosted?
+
+            self_hosted_model = chat_feature_setting.self_hosted_model
+
+            {
+              provider: :openai, # for self-hosted models we support Messages API format at the moment
+              name: self_hosted_model.model,
+              endpoint: self_hosted_model.endpoint,
+              api_key: self_hosted_model.api_token
+            }
+          end
+
           def request_body_chat_2(prompt:, options: {})
             {
               prompt: prompt,
@@ -139,7 +152,8 @@ def request_body_chat_2(prompt:, options: {})
               context: {
                 type: options[:current_resource_type],
                 content: options[:current_resource_content]
-              }
+              },
+              model_metadata: model_metadata_params(options)
             }
           }
         end
```
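
For completeness, here is a rough spec sketch for the new method. It is not part of this MR; the doubles and the `allocate`-based setup are shortcuts to keep the example self-contained, since the real class is constructed with more collaborators:

```ruby
# frozen_string_literal: true

RSpec.describe Gitlab::Llm::Chain::Requests::AiGateway do
  describe '#model_metadata_params' do
    let(:self_hosted_model) do
      double(model: 'mistral', endpoint: 'http://127.0.0.1:11434/v1', api_token: 'token')
    end

    let(:feature_setting) do
      double(self_hosted?: true, self_hosted_model: self_hosted_model)
    end

    # Shortcut: skip the real constructor so the sketch stays self-contained.
    subject(:request) { described_class.allocate }

    before do
      allow(request).to receive(:chat_feature_setting).and_return(feature_setting)
    end

    it 'returns the self-hosted model metadata' do
      expect(request.send(:model_metadata_params, {})).to eq(
        provider: :openai,
        name: 'mistral',
        endpoint: 'http://127.0.0.1:11434/v1',
        api_key: 'token'
      )
    end

    it 'returns nil when chat is not backed by a self-hosted model' do
      allow(request).to receive(:chat_feature_setting).and_return(nil)

      expect(request.send(:model_metadata_params, {})).to be_nil
    end
  end
end
```

The hash shape asserted here mirrors the `model_metadata` block from the example payload above.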