This add-on for AI Engine adds support for Ollama. So, what's Ollama? It's a tool for running LLMs locally, right on your own computer or server. It's fast, easy to install, and frees you from relying on remote servers or third-party services.
Think of it as the closest thing to Apple Intelligence, powered by your own machine. Whether that's a good thing or not… well, you decide! 😬
How to use it
Once installed, you'll find that a new Type is available in Environments for AI. Pick it, and set the Endpoint. If you installed Ollama locally (on the same machine as your WordPress), this endpoint should be http://127.0.0.1:11434 by default.
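If you're not sure the endpoint is correct, you can check it outside of AI Engine first. A minimal sketch, assuming a default local install (Ollama answers a plain GET on its root URL when it's running):

```python
# Sanity check that the Ollama endpoint is reachable.
# Adjust ENDPOINT if Ollama runs on another machine or port.
import urllib.request, urllib.error

ENDPOINT = "http://127.0.0.1:11434"

def ollama_is_up(endpoint=ENDPOINT):
    try:
        with urllib.request.urlopen(endpoint, timeout=5) as resp:
            return resp.status == 200
    except (urllib.error.URLError, OSError):
        # Connection refused or timed out: Ollama is not running there.
        return False

print("reachable" if ollama_is_up() else "not reachable")
```

If this prints "not reachable", double-check that the Ollama service is started and listening on the port you configured.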

When this is done, you should be able to Refresh Models. By default, Ollama should come with one model, llama3. Once the list is refreshed, you can use this new environment anywhere in AI Engine!
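Under the hood, refreshing models amounts to asking Ollama which models are installed locally, via its /api/tags endpoint. A small sketch of how such a response can be parsed (the JSON below is an illustrative sample, not a real server reply):

```python
# Parse an /api/tags-style response to list installed model names.
import json

# Sample payload in the shape Ollama's /api/tags endpoint returns.
sample = '''{"models": [
  {"name": "llama3:latest", "size": 4661224676},
  {"name": "mistral:latest", "size": 4109865159}
]}'''

names = [m["name"] for m in json.loads(sample)["models"]]
print(names)  # → ['llama3:latest', 'mistral:latest']
```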
Model Support
It works with all models, but it's recommended and well-tested with:
- Llama: Chat, Functions
- Llava: Chat, Image Vision
- Mistral: Chat
- Gemma: Chat
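Once one of these models is available, you can also query it directly, without going through AI Engine. A minimal sketch of a completion request against Ollama's /api/generate endpoint, assuming a default local install and the llama3 model (it returns None gracefully if Ollama isn't reachable):

```python
# Ask a local Ollama instance for a one-shot completion.
import json
import urllib.request, urllib.error

def generate(prompt, model="llama3", endpoint="http://127.0.0.1:11434"):
    payload = json.dumps({
        "model": model,
        "prompt": prompt,
        "stream": False,  # return one JSON object instead of a stream
    }).encode()
    req = urllib.request.Request(
        f"{endpoint}/api/generate",
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    try:
        with urllib.request.urlopen(req, timeout=60) as resp:
            return json.loads(resp.read())["response"]
    except (urllib.error.URLError, OSError):
        return None  # Ollama not reachable

print(generate("Say hello in one word."))
```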
For more information about Ollama, please visit their official website.