Add LiteLLM client to handle requests to self-managed OpenAI compatible models

Igor Drozdov requested to merge id-openai-compatible-client into main

What does this merge request do and why?

Mistral and Mixtral models are expected to be deployed behind an OpenAI-compatible API server (such as vLLM). Let's add a handler for this type of request; a rough sketch of the idea is below.
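
A minimal sketch of what such a handler boils down to, not the MR's actual implementation: LiteLLM routes a request to a self-managed OpenAI-compatible server when the model name carries the `openai/` provider prefix and `api_base` points at the server. The model name, URL, and key below are placeholders.

```python
import litellm

# Point LiteLLM at a self-hosted OpenAI-compatible server (e.g. vLLM).
# The "openai/" prefix selects LiteLLM's OpenAI-compatible provider.
response = litellm.completion(
    model="openai/mistral-7b-instruct",          # placeholder model name
    messages=[{"role": "user", "content": "Hello"}],
    api_base="http://localhost:8000/v1",         # placeholder server URL
    api_key="stub-key",  # vLLM ignores the key by default, but one is required
)

print(response.choices[0].message.content)
```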

Related issues:
