Ollama

This add-on for AI Engine adds support for Ollama. What’s Ollama? It’s a tool that lets you run LLMs directly on your local machine or server. It’s incredibly fast, easy to install, and makes you independent from any remote servers and services. It’s actually the closest you can get to… “Apple Intelligence”! 😬

How to use it

Once installed, you’ll find that a new Type is available in Environments for AI. Pick it, and set the Endpoint. If you installed Ollama locally (on the same machine as your WordPress install), the default endpoint is "http://127.0.0.1:11434".
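To confirm the endpoint is correct before wiring it into AI Engine, you can query Ollama's REST API directly. Here is a minimal sketch (outside of AI Engine, plain Python stdlib) that asks the server which models it has pulled, assuming the default local endpoint:

```python
import json
import urllib.request

DEFAULT_ENDPOINT = "http://127.0.0.1:11434"  # Ollama's default local address

def tags_url(endpoint=DEFAULT_ENDPOINT):
    # /api/tags is the Ollama route that lists locally available models
    return endpoint.rstrip("/") + "/api/tags"

def list_models(endpoint=DEFAULT_ENDPOINT):
    """Return the names of the models the Ollama server has pulled."""
    with urllib.request.urlopen(tags_url(endpoint)) as resp:
        data = json.loads(resp.read())
    return [m["name"] for m in data.get("models", [])]
```

If `list_models()` returns an empty list or the connection is refused, the endpoint (or the Ollama service itself) is not set up correctly yet.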

When this is done, you should be able to Refresh Models. Note that Ollama doesn’t bundle any model out of the box; you need to pull at least one first (for example, by running “ollama pull llama3” on the server). Once the models are refreshed, you can use this new environment anywhere in AI Engine!
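Behind the scenes, AI Engine talks to Ollama over its HTTP API. As a sanity check, you can send a one-off completion yourself. This is a hedged sketch using Ollama's /api/generate route with the stdlib only; the model name llama3 is just an example of one you may have pulled:

```python
import json
import urllib.request

DEFAULT_ENDPOINT = "http://127.0.0.1:11434"  # Ollama's default local address

def build_generate_request(model, prompt, endpoint=DEFAULT_ENDPOINT):
    """Build a POST request for Ollama's /api/generate route."""
    body = json.dumps({"model": model, "prompt": prompt, "stream": False}).encode()
    return urllib.request.Request(
        endpoint.rstrip("/") + "/api/generate",
        data=body,
        headers={"Content-Type": "application/json"},
    )

def generate(model, prompt, endpoint=DEFAULT_ENDPOINT):
    """Send a single non-streaming request and return the generated text."""
    with urllib.request.urlopen(build_generate_request(model, prompt, endpoint)) as resp:
        return json.loads(resp.read())["response"]
```

For example, `generate("llama3", "Say hello in five words.")` should return a short completion if the server is running and the model is pulled.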

Model Support

It should work with most models available through Ollama, but it’s recommended and well-tested with:

  • Llama: Chat, Functions
  • Llava: Chat, Image Vision
  • Mistral: Chat
  • Gemma: Chat
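The chat and vision capabilities above map onto Ollama's /api/chat route, which takes a list of role-tagged messages; vision models like Llava additionally accept base64-encoded images on a user message. A minimal sketch of how such a request body is assembled (the payload shape follows Ollama's chat API; the model names are just examples):

```python
import json

def build_chat_payload(model, user_message, system=None, images=None):
    """Assemble the JSON body for Ollama's /api/chat route.

    `images` is an optional list of base64-encoded images, used by
    vision models such as llava; text-only models simply ignore it.
    """
    messages = []
    if system:
        messages.append({"role": "system", "content": system})
    user = {"role": "user", "content": user_message}
    if images:
        user["images"] = images
    messages.append(user)
    return json.dumps({"model": model, "messages": messages, "stream": False})
```

For instance, `build_chat_payload("llama3", "Summarize this post.", system="Be brief.")` yields a body ready to POST to the /api/chat endpoint.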

For more information about Ollama, please visit their official website.