Latest revision as of 04:34, 22 January 2025
Ollama is a tool for running large language models locally on your own machine.
Download
https://ollama.com/download
List Of Models
https://ollama.com/search
Commands
- List installed models: ollama list
- Install model: ollama pull llama3.1:latest
- Run model: ollama run llama3.1:latest
- Exit conversation: /bye (or press Ctrl+D)
- Stop Ollama: systemctl stop ollama
- Start Ollama: systemctl start ollama
- Restart Ollama: systemctl restart ollama
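The run command above can also be used non-interactively by passing a prompt as an argument instead of entering the chat REPL. A minimal sketch, assuming the Ollama service is running and llama3.1:latest has already been pulled (the guard and the prompt text are illustrative):

```shell
# Non-interactive use: print a single completion and exit, rather than
# opening the interactive chat. Guarded so the script degrades
# gracefully on machines where ollama is not installed.
if command -v ollama >/dev/null 2>&1; then
    ollama run llama3.1:latest "Explain what Ollama does in one sentence."
else
    echo "ollama is not installed"
fi
```

This form is handy for piping model output into other shell tools or cron jobs.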
Web GUI
https://github.com/open-webui/open-webui
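One common way to run Open WebUI against a local Ollama instance is via Docker; a sketch based on the upstream project's README, assuming Docker is installed (the port, volume name, and container name follow the upstream example and can be changed):

```shell
# Run Open WebUI in Docker, reachable at http://localhost:3000.
# --add-host lets the container reach the Ollama server on the host.
docker run -d -p 3000:8080 \
  --add-host=host.docker.internal:host-gateway \
  -v open-webui:/app/backend/data \
  --name open-webui --restart always \
  ghcr.io/open-webui/open-webui:main
```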