Draft: POC Use LiteLLM to abstract LLM providers

What does this merge request do and why?

As we expand the number of LLM providers, especially with the addition of fine-tuned models, creating a class to configure each of them will become cumbersome. This PoC demonstrates using LiteLLM as an alternative, allowing us to connect to any supported model with minimal effort.
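To illustrate the point, here is a minimal sketch of the pattern LiteLLM enables: one thin wrapper where the provider is selected by the model string alone, rather than one class per provider. The class and model strings below are illustrative, not from the actual ai_gateway codebase.

```python
from dataclasses import dataclass


@dataclass
class ProviderAgnosticModel:
    # LiteLLM routes on the model string alone (e.g. "anthropic/claude-3-haiku",
    # "vertex_ai/gemini-pro"), so no per-provider subclass is needed.
    model_name: str

    def completion_args(self, prompt: str) -> dict:
        # Build arguments in the shape litellm.completion() accepts.
        return {
            "model": self.model_name,
            "messages": [{"role": "user", "content": prompt}],
        }


# Swapping providers is just a different model string:
anthropic = ProviderAgnosticModel("anthropic/claude-3-haiku")
vertex = ProviderAgnosticModel("vertex_ai/gemini-pro")
```

This is the cost model the MR is after: adding a provider means adding a configuration entry, not a class.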

This PoC shows that replacing the Anthropic model/client is straightforward.

How to set up and validate locally

```python
from ai_gateway.models.litellm import LiteLLMChatModel, LiteLlmModel
from ai_gateway.models.base_chat import Message, Role
from ai_gateway.models.anthropic import KindAnthropicModel
import asyncio

# Text completion through the LiteLLM-backed model
model = LiteLlmModel(KindAnthropicModel.CLAUDE_3_HAIKU)
print(model.generate(prefix="Hello there!"))

# Chat completion through the LiteLLM-backed chat model
model = LiteLLMChatModel(KindAnthropicModel.CLAUDE_3_HAIKU)
messages = [Message(role=Role.USER, content="Hey! how's it going?")]
print(asyncio.run(model.generate(messages=messages)))
```

Closes #434 (closed)

Edited by Eduardo Bonet