LiteLLM wrapper for models

LiteLLM makes it easy to verify that the required API key is available and provides one universal LangChain chat wrapper for any LLM, whether API-based or local (e.g. vLLM, Ollama).

To use Ollama, the base URL has to be provided as an input.
