mirror of
https://github.com/runyanjake/olomana.git
synced 2025-10-04 21:27:29 -07:00
# Chat
# Implementations
## Deepseek Stack using Ollama & OpenWebUI
### Ollama (Manually)

Start a server by first running:

```
ollama serve
```

(Default port is 11434).
Then run your model (running just this command is enough to test a model):

```
ollama run deepseek-coder
```

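Once the server is running, you can also query the model over Ollama's HTTP API instead of the interactive CLI. Below is a minimal sketch using only the Python standard library; it assumes the default port 11434 and that `deepseek-coder` has already been pulled. The `build_generate_request`/`generate` helper names are illustrative, not part of any library.

```python
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434"  # default `ollama serve` port


def build_generate_request(model: str, prompt: str) -> urllib.request.Request:
    """Build a POST request for Ollama's /api/generate endpoint.

    "stream": False requests a single JSON response instead of a
    stream of newline-delimited chunks.
    """
    payload = json.dumps({
        "model": model,
        "prompt": prompt,
        "stream": False,
    }).encode("utf-8")
    return urllib.request.Request(
        f"{OLLAMA_URL}/api/generate",
        data=payload,
        headers={"Content-Type": "application/json"},
        method="POST",
    )


def generate(model: str, prompt: str) -> str:
    """Send the request and return the model's response text."""
    req = build_generate_request(model, prompt)
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]


# Example (requires a running server):
# print(generate("deepseek-coder", "Write hello world in Python."))
```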
Stop the server by stopping the ollama service:

```
systemctl stop ollama
```

### Or just run everything with Docker

```
docker compose down && docker system prune -af && docker compose build && docker compose up -d && docker logs -f openwebui
```

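The compose file itself isn't shown here, but a two-service stack for Ollama plus Open WebUI might look roughly like the sketch below. The image tags, host ports, and volume name are assumptions; adjust them to match your actual setup.

```yaml
services:
  ollama:
    image: ollama/ollama
    volumes:
      - ollama:/root/.ollama        # persist pulled models across rebuilds
    ports:
      - "11434:11434"               # default Ollama API port

  openwebui:
    image: ghcr.io/open-webui/open-webui:main
    container_name: openwebui       # matches `docker logs -f openwebui` above
    environment:
      - OLLAMA_BASE_URL=http://ollama:11434
    ports:
      - "3000:8080"                 # UI is served on container port 8080
    depends_on:
      - ollama

volumes:
  ollama:
```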
#### Notes on first-time setup

1. Create an admin account.

2. Wait a moment for the UI to load; it can sometimes take a long time (legitimately 10 minutes) with my card.