Revision as of 04:32, 22 January 2025
Ollama is a tool for downloading and running large language models locally.
Download

https://ollama.com/download
List Of Models

https://ollama.com/search
Commands
- List installed models: ollama list
- Install model: ollama pull llama3.1:latest
- Run model: ollama run llama3.1:latest
- Exit conversation: /bye (or press Ctrl+D)
- Stop Ollama: systemctl stop ollama
- Start Ollama: systemctl start ollama
- Restart Ollama: systemctl restart ollama

More Information

- https://github.com/ollama/ollama
- https://www.restack.io/p/ollama-answer-exit-cat-ai
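The commands above can be sketched as a single session. This is a minimal illustration, assuming the ollama CLI is installed and on PATH; llama3.1:latest is just the example model used on this page.

```shell
#!/bin/sh
# Sketch of the workflow above: list, pull, then run a model.
# Assumes the `ollama` CLI from https://ollama.com/download is installed.
if command -v ollama >/dev/null 2>&1; then
    ollama list                                      # list installed models
    ollama pull llama3.1:latest                      # download the example model
    ollama run llama3.1:latest "Say hello in one word."  # one-shot prompt, then exit
else
    echo "ollama is not installed; see https://ollama.com/download"
fi
```

Passing a quoted prompt to ollama run returns a single response and exits; running it without a prompt opens the interactive conversation described above.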