Open WebUI OIDC & proxy fixes + Ollama preload + async-safe pull

- svc-ai-ollama:
  - Add preload_models (llama3, mistral, nomic-embed-text)
  - Pre-pull task: loop_var=model, async-safe changed_when/failed_when
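A minimal sketch of the role defaults the pre-pull task consumes — the variable names (`OLLAMA_PRELOAD_MODELS`, `ASYNC_ENABLED`, `ASYNC_TIME`, `ASYNC_POLL`) come from the task in the diff; the file path and concrete values here are illustrative assumptions:

```yaml
# roles/svc-ai-ollama/defaults/main.yml  (path and values are assumptions)
OLLAMA_PRELOAD_MODELS:
  - llama3
  - mistral
  - nomic-embed-text

# Async knobs consumed by the pre-pull task
ASYNC_ENABLED: true
ASYNC_TIME: 3600   # seconds a background pull may run
ASYNC_POLL: 0      # 0 = fire-and-forget; result is not polled inline
```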

- sys-svc-proxy (OpenResty):
  - Forward Authorization header
  - Ensure proxy_pass_request_headers on
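The two proxy bullets can be sketched as an OpenResty/nginx location block; the upstream name is a placeholder, not taken from the repo:

```nginx
location / {
    # Pass the client's request headers through to the upstream
    proxy_pass_request_headers on;
    # Explicitly forward the Authorization header (OIDC bearer tokens)
    proxy_set_header Authorization $http_authorization;
    proxy_pass http://openwebui_upstream;  # upstream name is a placeholder
}
```

`proxy_set_header` is needed because headers set elsewhere in the config can otherwise mask inherited ones; forwarding `$http_authorization` makes the pass-through explicit.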

- web-app-openwebui:
  - ADMIN_EMAIL from users.administrator.email
  - Request RBAC group scope in OAUTH_SCOPES
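A hedged sketch of the resulting container environment for web-app-openwebui — the Jinja expression mirrors the bullet above, while the exact scope string depends on the IdP and is an assumption here:

```yaml
# compose/env template fragment (sketch; scope value is an assumption)
environment:
  ADMIN_EMAIL: "{{ users.administrator.email }}"
  # Standard OIDC scopes plus the RBAC group claim
  OAUTH_SCOPES: "openid email profile groups"
```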

Ref: ChatGPT support (2025-09-23) — https://chatgpt.com/share/68d20588-2584-800f-aed4-26ce710c69c4
2025-09-23 04:27:46 +02:00
parent 1b91ddeac2
commit f4cf55b3c8
5 changed files with 30 additions and 3 deletions


@@ -16,4 +16,23 @@
    vars:
      docker_compose_flush_handlers: true

- name: Pre-pull Ollama models
  vars:
    _cmd: "docker exec -i {{ OLLAMA_CONTAINER }} ollama pull {{ model }}"
  shell: "{{ _cmd }}"
  register: pull_result
  loop: "{{ OLLAMA_PRELOAD_MODELS }}"
  loop_control:
    loop_var: model
  async: "{{ ASYNC_TIME if ASYNC_ENABLED | bool else omit }}"
  poll: "{{ ASYNC_POLL if ASYNC_ENABLED | bool else omit }}"
  changed_when: >
    (not (ASYNC_ENABLED | bool)) and (
    'downloaded' in (pull_result.stdout | default('')) or
    'pulling manifest' in (pull_result.stdout | default(''))
    )
  failed_when: >
    (pull_result.rc | default(0)) != 0 and
    ('up to date' not in (pull_result.stdout | default('')))

- include_tasks: utils/run_once.yml