AOE Technology Radar

Ollama

aicoding
Trial

Running large language models locally?

Downloading Ollama and typing ollama run llama3 is all you need.

Ollama is great for running various open-source (open-weight) models locally and interacting with them, either via the command line or via the Ollama API.
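As a sketch of how the API side works, the snippet below talks to Ollama's REST endpoint (by default at http://localhost:11434). The model name llama3 comes from the example above; the helper names are our own, and the actual HTTP call assumes a local Ollama server is running with that model pulled.

```python
import json
import urllib.request

# Ollama's default local endpoint for single-shot generation.
OLLAMA_URL = "http://localhost:11434/api/generate"

def build_generate_request(model, prompt):
    """Build the JSON payload for Ollama's /api/generate endpoint.

    stream=False asks for one complete JSON response instead of a
    stream of token chunks.
    """
    return {"model": model, "prompt": prompt, "stream": False}

def generate(model, prompt, url=OLLAMA_URL):
    """Send a prompt to a locally running Ollama server and return the text."""
    payload = json.dumps(build_generate_request(model, prompt)).encode("utf-8")
    req = urllib.request.Request(
        url, data=payload, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]

if __name__ == "__main__":
    # Requires `ollama run llama3` (or `ollama pull llama3`) to have been run first.
    print(generate("llama3", "Why is the sky blue?"))
```

The same request can of course be made with curl or any HTTP client; nothing here is specific to Python.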

Ollama takes care of downloading and running models, and it lets you define your own model packages in a "Modelfile".
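A Modelfile uses a Dockerfile-like syntax to package a base model together with parameters and a system prompt. The sketch below is a minimal example; the base model llama3 and the values chosen are illustrative, not a recommendation.

```
# Base the package on an existing model.
FROM llama3

# Sampling parameter: lower temperature gives more deterministic output.
PARAMETER temperature 0.7

# System prompt baked into the packaged model.
SYSTEM "You are a concise assistant that answers in one short paragraph."
```

The package is then built and run with `ollama create <name> -f Modelfile` followed by `ollama run <name>`, where `<name>` is whatever you choose to call it.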

At AOE, we use it for local development and testing.