Self-Hosted MVP: Send configured API key to AI Gateway
## Overview
In Instance-Level Self-Hosted Model serving config... (#455590 - closed) we added the ability to configure a self-hosted model (specify its name, endpoint, and API key). With Self-hosted Mistral/Mixtral models for Code gen... (!153423 - merged) we started propagating the configured `model_endpoint`.
Now that the AI Gateway also accepts an API key (feat: allow passing LiteLLM model API key, gitlab-org/modelops/applied-ml/code-suggestions/ai-assist!821 - merged), we can start sending the configured API key as well.
## Proposal
Extend the request parameter list with the API key, something like:
```ruby
def request_params
  {
    model_provider: self.class::MODEL_PROVIDER,
    prompt_version: self.class::GATEWAY_PROMPT_VERSION,
    prompt: prompt,
    model_endpoint: params[:model_endpoint]
  }.tap do |opts|
    # Only include the optional fields when they are actually configured.
    opts[:model_name] = params[:model_name] if params[:model_name].present?
    opts[:model_api_key] = params[:model_api_key] if params[:model_api_key].present?
  end
end
```
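Using `tap` with `present?` guards keeps the optional keys out of the payload entirely when they are blank, so the AI Gateway only receives `model_api_key` when an API key has actually been configured.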
And extend `params` with something like:
```ruby
def params
  self_hosted_model = feature_setting.self_hosted_model

  # Merge the configured self-hosted model attributes into the base params.
  super.merge({
    model_name: self_hosted_model.model,
    model_endpoint: self_hosted_model.endpoint,
    model_api_key: self_hosted_model.api_key
  })
end
```
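With both changes in place, the params sent to the AI Gateway for a configured self-hosted model would look roughly like this. The concrete values below are illustrative only (the provider name, prompt version, and model values are made up for the example, not taken from the real constants):

```ruby
# Illustrative request_params result for a configured self-hosted model;
# all values here are placeholder examples.
{
  model_provider: 'litellm',
  prompt_version: 2,
  prompt: '<built prompt>',
  model_endpoint: 'http://localhost:4000',
  model_name: 'mistral',
  model_api_key: '<configured API key>'
}
```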