Latest revision as of 04:34, 22 January 2025
Ollama is a tool for running large language models locally on your own machine.
Download
List Of Models
Commands
- List installed models: ollama list
- Install model: ollama pull llama3.1:latest
- Run model: ollama run llama3.1:latest
- Exit conversation: /bye (or Ctrl+D)
- Stop Ollama: systemctl stop ollama
- Start Ollama: systemctl start ollama
- Restart Ollama: systemctl restart ollama
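The commands above combine into a simple day-to-day workflow. A minimal sketch, assuming Ollama is installed as a systemd service and listening on its default API port 11434 (the HTTP endpoint is documented in the Ollama README):

```shell
#!/bin/sh
# Make sure the Ollama service is running (systemd-based distros).
sudo systemctl start ollama

# Download a model, then confirm it is installed.
ollama pull llama3.1:latest
ollama list

# One-shot prompt without entering the interactive conversation.
ollama run llama3.1:latest "Summarize what Ollama does in one sentence."

# The same model can also be queried over the local HTTP API.
curl http://localhost:11434/api/generate \
  -d '{"model": "llama3.1:latest", "prompt": "Hello", "stream": false}'
```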
Web GUI
https://github.com/open-webui/open-webui
More Information
- https://github.com/ollama/ollama
- https://www.restack.io/p/ollama-answer-exit-cat-ai