Overall Status: 🟢 STABLE
System:
| Property | Value |
|---|---|
| Hostname | judy |
| OS | Linux (systemd-based) |
| CPU | Ryzen 5 5500 |
| RAM | 64 GB |
| GPU | RTX 3080 Ti 12 GB |
| Role | Local AI + WebUI Server |
Network:
| Property | Value |
|---|---|
| LAN IP | 192.168.5.81 |
| Intended Exposure | LAN only |
| Public Exposure | Disabled |
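A quick way to confirm the exposure claims is to look at what is actually listening; a minimal check (assumes `ss` from iproute2, standard on systemd distros):

```bash
# List listening TCP sockets with their owning processes;
# anything bound to 0.0.0.0 or a public address is reachable beyond loopback
sudo ss -tlnp
```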
SSH access:
```bash
ssh jen@192.168.5.81
```
In PuTTY: Host Name = 192.168.5.81, Port = 22, Connection type = SSH.
Ollama runtime:
| Property | Value |
|---|---|
| Installed | Yes |
| Service Manager | systemd |
| Status | Running |
Useful commands:
```bash
# Check running models
ollama ps
# List installed models
ollama list
```
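The same information is exposed over Ollama's local REST API (port 11434 by default), which is handier for scripting; a sketch, assuming the stock bind address:

```bash
# List installed models (equivalent to `ollama list`)
curl -s http://localhost:11434/api/tags
# Show models currently loaded in memory (equivalent to `ollama ps`)
curl -s http://localhost:11434/api/ps
```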
Model storage paths (default):
- ~/.ollama/models (per-user install)
- /usr/share/ollama/.ollama/models (systemd service install)

| Model | ID | Size | Last Modified |
|---|---|---|---|
| x/z-image-turbo:fp8 | 1053737ea587 | 12 GB | Recent |
| qwen2.5-coder:latest | dae161e27b0e | 4.7 GB | 11 hours ago |
| nemotron-3-nano:latest | b725f1117407 | 24 GB | 7 weeks ago |
Last observed running: llama3:latest (orphaned process; model file already deleted; self-clears on timeout).
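If an orphan like this needs clearing before the timeout, two options (a sketch; `ollama stop` requires a reasonably recent Ollama release):

```bash
# Ask the runtime to unload the model immediately (newer Ollama versions)
ollama stop llama3:latest
# Or restart the whole service to drop anything stuck in VRAM
sudo systemctl restart ollama.service
```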
OpenWebUI service:
| Property | Value |
|---|---|
| Service Name | openwebui.service |
| Status | Running |
| Port | 8080 |
| Bind Address | 127.0.0.1 (change to 0.0.0.0 for LAN access; see the drop-in sketch below) |
Installation paths:
| Component | Path |
|---|---|
| Root | /opt/openwebui2/ |
| Virtualenv | /opt/openwebui2/venv |
| Python | /opt/openwebui2/venv/bin/python |
| Binary | /opt/openwebui2/venv/bin/open-webui |
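Upgrades presumably go through the virtualenv's pip; a sketch, assuming the install came from the `open-webui` PyPI package:

```bash
# Stop the service, upgrade in place inside the venv, start it again
sudo systemctl stop openwebui.service
/opt/openwebui2/venv/bin/pip install --upgrade open-webui
sudo systemctl start openwebui.service
```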
Launch command:
```bash
/opt/openwebui2/venv/bin/python /opt/openwebui2/venv/bin/open-webui serve --host 127.0.0.1 --port 8080
```
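To open the UI to the LAN (the 0.0.0.0 bind noted above), the unit's ExecStart has to change; a minimal sketch using a systemd drop-in, with paths taken from the tables above:

```bash
# Create a drop-in that rebinds OpenWebUI to all interfaces
sudo mkdir -p /etc/systemd/system/openwebui.service.d
sudo tee /etc/systemd/system/openwebui.service.d/override.conf >/dev/null <<'EOF'
[Service]
ExecStart=
ExecStart=/opt/openwebui2/venv/bin/python /opt/openwebui2/venv/bin/open-webui serve --host 0.0.0.0 --port 8080
EOF
sudo systemctl daemon-reload
sudo systemctl restart openwebui.service
```

The empty `ExecStart=` line is required so systemd replaces the original command rather than appending a second one.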
Personas are configured via the OpenWebUI system prompt; trigger each one by name in chat.
| Persona | Trigger | Role |
|---|---|---|
| Deadpool | Hey Deadpool | Builder |
| Professor | Hey Professor | Teacher |
| Shuri | Hey Shuri | Reviewer |
| Vision | Hey Vision | Architect |
| Maria | Hey Maria | Executor |
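For reference, the routing lives in a single system prompt; an abbreviated, purely illustrative sketch (not the deployed wording):

```text
You have five personas. Adopt the one the user addresses by name:
"Hey Deadpool"  -> Builder: writes and ships code.
"Hey Professor" -> Teacher: explains step by step.
"Hey Shuri"     -> Reviewer: critiques code and designs.
"Hey Vision"    -> Architect: plans systems and tradeoffs.
"Hey Maria"     -> Executor: runs tasks and reports results.
Stay in the active persona until another is named.
```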
Request flow:
```
Client Browser
      ↓
OpenWebUI :8080
      ↓
Ollama Runtime
      ↓
Local Models
      ↓
GPU (RTX 3080 Ti)
```
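The chain can be verified end to end from a shell on the box; a sketch (the model name must match an entry from `ollama list`, and `/health` assumes a current Open WebUI build):

```bash
# Is OpenWebUI up?
curl -s http://127.0.0.1:8080/health
# Is Ollama answering with a real completion?
curl -s http://localhost:11434/api/generate \
  -d '{"model": "qwen2.5-coder:latest", "prompt": "Say hi", "stream": false}'
```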
Security posture:
| Area | Status |
|---|---|
| LAN Access | Planned |
| Firewall | Unknown |
| Public Internet | Disabled |
Check firewall status:
```bash
sudo ufw status
```
Check active services:
```bash
systemctl list-units --type=service | grep -E 'ollama|webui'
```
Expected: openwebui.service and ollama.service.
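When LAN access goes live, the firewall should admit port 8080 from the local subnet only; a sketch, assuming ufw and a 192.168.5.0/24 subnet (inferred from the host's IP, not confirmed):

```bash
# Allow only the LAN to reach OpenWebUI, then review the rules
sudo ufw allow from 192.168.5.0/24 to any port 8080 proto tcp
sudo ufw status numbered
```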