- svc-ai-ollama:
  - Add preload_models (llama3, mistral, nomic-embed-text)
  - Pre-pull task: loop_var=model, async-safe changed_when/failed_when
- sys-svc-proxy (OpenResty):
  - Forward Authorization header
  - Ensure proxy_pass_request_headers on
- web-app-openwebui:
  - ADMIN_EMAIL from users.administrator.email
  - Request RBAC group scope in OAUTH_SCOPES

Ref: ChatGPT support (2025-09-23): https://chatgpt.com/share/68d20588-2584-800f-aed4-26ce710c69c4
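The Ollama pre-pull item above could look roughly like the following. This is a minimal sketch, assuming Ollama runs in a Docker container named `ollama` and models are pulled via `docker exec`; the output marker in `changed_when` is also an assumption, and the `default('')` filters are what keeps the conditions safe when the task runs asynchronously and output fields are absent.

```yaml
# Sketch of the pre-pull task; container name, command, and output markers
# are illustrative assumptions, not the role's actual implementation.
- name: Pre-pull configured Ollama models
  ansible.builtin.command:
    cmd: "docker exec ollama ollama pull {{ model }}"
  loop: "{{ preload_models | default([]) }}"
  loop_control:
    loop_var: model                 # loop_var=model, as noted in the commit
  async: 600
  poll: 10
  register: pull_result
  # default('') keeps both conditions valid even when an async result carries
  # no stdout/rc; the 'pulling' marker is an assumption about ollama's output.
  changed_when: "'pulling' in (pull_result.stdout | default(''))"
  failed_when: "(pull_result.rc | default(0)) != 0"
```

For the OpenResty change, the two directives could be managed with a task along these lines; the config path below is hypothetical, while the directives themselves are standard nginx/OpenResty syntax for passing the client's Authorization header upstream.

```yaml
# Sketch only: the vhost path is an assumed location, not the role's template.
- name: Forward Authorization header through OpenResty
  ansible.builtin.blockinfile:
    path: /etc/openresty/conf.d/openwebui.conf
    marker: "# {mark} ANSIBLE MANAGED: auth forwarding"
    block: |
      proxy_pass_request_headers on;
      proxy_set_header Authorization $http_authorization;
```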
Below are user-focused README.md drafts for the four roles. They follow the existing template structure and describe the role itself (what the installed software does for users), not the contents of the role folder.
# Open WebUI
## Description
Open WebUI provides a clean, fast chat interface for working with local AI models (e.g., via Ollama). It delivers a ChatGPT-like experience on your own infrastructure to keep prompts and data private.
## Overview
End users access a web page, pick a model, and start chatting. Conversations remain on your servers. Admins can enable strict offline behavior so no external network calls occur. The UI can also point at OpenAI-compatible endpoints if needed.
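The behaviors described above map onto Open WebUI's environment configuration. A minimal sketch, assuming a Docker Compose deployment; the service layout, image tag, URLs, and example values are placeholders rather than this role's actual defaults:

```yaml
services:
  open-webui:
    image: ghcr.io/open-webui/open-webui:main
    environment:
      OLLAMA_BASE_URL: "http://ollama:11434"            # local Ollama backend
      OFFLINE_MODE: "true"                              # strict offline behavior, no external calls
      # Optionally point at an OpenAI-compatible endpoint as well or instead:
      OPENAI_API_BASE_URL: "https://llm.example.org/v1"
      ADMIN_EMAIL: "admin@example.org"                  # e.g. taken from users.administrator.email
      OAUTH_SCOPES: "openid email profile groups"       # includes the RBAC group scope
    ports:
      - "3000:8080"                                     # Open WebUI listens on 8080 in the container
```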
## Features
- Familiar multi-chat interface with quick model switching
- Supports local backends (Ollama) and OpenAI-compatible APIs
- Optional offline mode for air-gapped environments
- File upload and pasted-text input for summarization and extraction (model-dependent)
- Suitable for teams: predictable, private, reproducible
## Further Resources
- Open WebUI — https://openwebui.com
- Ollama — https://ollama.com