Ollama
Ollama is a tool for downloading and running large language models locally on your own machine.
Download
https://ollama.com/download
List Of Models
https://ollama.com/library
Commands
- List installed models: `ollama list`
- Install (pull) a model: `ollama pull llama3.1:latest`
- Run a model interactively: `ollama run llama3.1:latest`
- Exit an interactive conversation: `/bye` (or Ctrl+D)
- Stop Ollama: `systemctl stop ollama`
- Start Ollama: `systemctl start ollama`
- Restart Ollama: `systemctl restart ollama`
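Besides the CLI, the Ollama service exposes a local REST API (on port 11434 by default). A minimal sketch of querying it with curl, assuming `llama3.1:latest` has already been pulled:

```bash
# Ask a locally running model a question via Ollama's REST API.
# Assumes the default port (11434) and that llama3.1:latest is installed.
curl http://localhost:11434/api/generate -d '{
  "model": "llama3.1:latest",
  "prompt": "Why is the sky blue?",
  "stream": false
}'
```

With `"stream": false` the response comes back as a single JSON object instead of a stream of tokens.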
Web GUI
Open WebUI provides a self-hosted browser interface for Ollama:
https://github.com/open-webui/open-webui
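A minimal sketch of running Open WebUI via Docker next to a local Ollama instance; the image name and flags follow the project's README quick start, so check the repo for current values:

```bash
# Run Open WebUI in Docker and let the container reach Ollama on the host.
# Flags follow the Open WebUI README quick start; verify against the repo.
docker run -d \
  -p 3000:8080 \
  --add-host=host.docker.internal:host-gateway \
  -v open-webui:/app/backend/data \
  --name open-webui \
  --restart always \
  ghcr.io/open-webui/open-webui:main
```

The interface is then reachable at http://localhost:3000.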