Support for Local LLMs

Feature Request

Summary

Please add support for local LLMs, especially Ollama, which is extremely easy to install and run.

Ollama exposes an OpenAI-compatible API, so existing OpenAI client code can target it by changing only the base URL.
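As a minimal sketch of what that compatibility means in practice: the request below targets Ollama's local `/v1/chat/completions` endpoint with an OpenAI-style payload. The model name `llama3` is just an example; any model pulled locally would work, and actually sending the request assumes an Ollama server running on its default port.

```python
import json
import urllib.request

# Ollama listens on localhost:11434 by default and serves an
# OpenAI-compatible /v1/chat/completions endpoint, so any OpenAI
# client can use it by swapping the base URL.
OLLAMA_URL = "http://localhost:11434/v1/chat/completions"

# Standard OpenAI-style chat payload; "llama3" is an example model
# name — use whatever model has been pulled locally.
payload = {
    "model": "llama3",
    "messages": [{"role": "user", "content": "Hello!"}],
}

request = urllib.request.Request(
    OLLAMA_URL,
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json"},
)

# With Ollama running, sending it returns an OpenAI-shaped response:
# with urllib.request.urlopen(request) as resp:
#     reply = json.load(resp)["choices"][0]["message"]["content"]
```

Because the request shape matches OpenAI's, supporting local LLMs would mostly mean making the API base URL configurable.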

Alternatives

You could support other hosted non-OpenAI endpoints as well, but those still charge per-token fees like OpenAI. Local LLMs have no per-use cost.