AI features should have an option for using local LLMs
Feature Request
Summary
Currently, the new AI prompt feature requires a paid OpenAI account. In addition to the cost overhead, it needs a working internet connection (so it won't work offline, e.g. on an airplane) and may also raise data privacy issues, as #11475 suggested. Meanwhile, it is perfectly possible to run a reasonably powerful LLM locally using free software such as Ollama, which could allow this feature to work entirely offline, saving money and preserving the user's privacy.
Alternatives
Either forgo AI features entirely, or accept both paying OpenAI and losing access to the feature while offline.
Concerns
Ollama uses a somewhat different API than OpenAI and requires slightly different configuration settings (essentially a server URL or hostname/port pair, which should default to http://localhost:11434 for a standard Ollama installation), so this may require a layer of abstraction around the completion API; other than that, the prompting process is basically the same. An open-source Swift library for interacting with it is available: OllamaKit
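To make the suggested abstraction concrete, here is a minimal sketch (in Python for brevity; the app itself would use Swift, e.g. via OllamaKit) of how the two backends could be hidden behind one request-builder interface. The function and field names are illustrative assumptions, not existing code; the endpoint paths and payload shapes follow the public OpenAI and Ollama HTTP APIs.

```python
from dataclasses import dataclass

@dataclass
class CompletionRequest:
    """Provider-agnostic prompt, translated per backend."""
    model: str
    prompt: str

def openai_request(req: CompletionRequest, api_key: str) -> dict:
    # OpenAI's hosted chat completions endpoint; requires a bearer token.
    return {
        "url": "https://api.openai.com/v1/chat/completions",
        "headers": {"Authorization": f"Bearer {api_key}"},
        "json": {
            "model": req.model,
            "messages": [{"role": "user", "content": req.prompt}],
        },
    }

def ollama_request(req: CompletionRequest,
                   base_url: str = "http://localhost:11434") -> dict:
    # Ollama's local chat endpoint; no auth is needed, and streaming is
    # on by default, so it is disabled here to get a single response.
    return {
        "url": f"{base_url}/api/chat",
        "headers": {},
        "json": {
            "model": req.model,
            "messages": [{"role": "user", "content": req.prompt}],
            "stream": False,
        },
    }
```

The message format is nearly identical between the two, so the main differences the abstraction has to absorb are the base URL, authentication, and the streaming default.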
The AI settings dialogue would need an additional dropdown for selecting the desired provider (and perhaps an option to turn AI off entirely), and would have to show different configuration options depending on the selection.
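The provider-dependent settings could work along these lines (again sketched in Python; the field names and the `visible_fields` helper are assumptions for illustration, not the app's actual preference keys):

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class AISettings:
    provider: str  # "off", "openai", or "ollama"
    openai_api_key: Optional[str] = None
    ollama_base_url: str = "http://localhost:11434"

def visible_fields(settings: AISettings) -> list[str]:
    # The settings dialogue shows only the fields relevant to the
    # selected provider, and nothing when AI is turned off.
    if settings.provider == "openai":
        return ["openai_api_key"]
    if settings.provider == "ollama":
        return ["ollama_base_url"]
    return []
```

Defaulting the Ollama URL to http://localhost:11434 means a standard local installation would work with no configuration beyond picking the provider.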