mirror of
https://github.com/kevinveenbirkenbach/computer-playbook.git
synced 2025-09-24 11:06:24 +02:00
feat(ai): introduce dedicated AI roles and wiring; clean up legacy AI stack
• Add svc-ai category under roles and load it in constructor stage
• Create new 'svc-ai-ollama' role (vars, tasks, compose, meta, README) and a dedicated network
• Refactor the former AI stack into separate app roles: web-app-flowise and web-app-openwebui
• Add web-app-minio role; adjust config (no central DB), meta (fa-database, run_after), compose networks include, volume key
• Provide user-focused READMEs for Flowise, OpenWebUI, MinIO, and Ollama
• Networks: add subnets for web-app-openwebui, web-app-flowise, web-app-minio; rename web-app-ai → svc-ai-ollama
• Ports: rename ai_* keys to web-app-openwebui / web-app-flowise; keep minio_api / minio_console
• Add group_vars/all/17_ai.yml (OLLAMA_BASE_LOCAL_URL, OLLAMA_LOCAL_ENABLED)
• Replace hardcoded include paths with path_join in multiple roles (svc-db-postgres, sys-service, sys-stk-front-proxy, sys-stk-full-stateful, sys-svc-webserver, web-svc-cdn, web-app-keycloak)
• Remove obsolete web-app-ai templates/vars/env; split Flowise into its own role
• Minor config cleanups (CSP flags to {}, central_database=false)

https://chatgpt.com/share/68d15cb8-cf18-800f-b853-78962f751f81
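The path_join cleanup named above replaces string-concatenated include paths with Ansible's built-in `path_join` filter. A minimal sketch of the pattern, with illustrative task and file names (not taken from the affected roles):

```yaml
# Before: include path assembled by string concatenation
- name: Include webserver tasks
  ansible.builtin.include_tasks: "{{ playbook_dir }}/roles/sys-svc-webserver/tasks/main.yml"

# After: path components joined with the path_join filter
- name: Include webserver tasks
  ansible.builtin.include_tasks: >-
    {{ [playbook_dir, 'roles', 'sys-svc-webserver', 'tasks', 'main.yml'] | path_join }}
```

`path_join` normalizes separators and avoids doubled or missing slashes when a component variable already ends in `/`.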
@@ -86,4 +86,4 @@ _applications_nextcloud_oidc_flavor: >-
   RBAC:
     GROUP:
       NAME: "/roles" # Name of the group which holds the RBAC roles
-      CLAIM: "groups" # Name of the claim containing the RBAC groups
+      CLAIM: "groups" # Name of the claim containing the RBAC groups
@@ -104,6 +104,12 @@ defaults_networks:
       subnet: 192.168.103.224/28
     web-app-xwiki:
       subnet: 192.168.103.240/28
+    web-app-openwebui:
+      subnet: 192.168.104.0/28
+    web-app-flowise:
+      subnet: 192.168.104.16/28
+    web-app-minio:
+      subnet: 192.168.104.32/28

     # /24 Networks / 254 Usable Clients
     web-app-bigbluebutton:
@@ -116,5 +122,5 @@ defaults_networks:
       subnet: 192.168.201.0/24
     svc-db-openldap:
       subnet: 192.168.202.0/24
-    web-app-ai:
+    svc-ai-ollama:
       subnet: 192.168.203.0/24 # Big network to bridge applications into ai
@@ -76,8 +76,8 @@ ports:
     web-app-magento: 8052
     web-app-bridgy-fed: 8053
     web-app-xwiki: 8054
-    web-app-ai_openwebui: 8055
-    web-app-ai_flowise: 8056
+    web-app-openwebui: 8055
+    web-app-flowise: 8056
     web-app-minio_api: 8057
     web-app-minio_console: 8058
     web-app-bigbluebutton: 48087 # This port is predefined by bbb. @todo Try to change this to a 8XXX port
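After the rename, downstream templates resolve a service's port directly from the `ports` map under its new key. A hypothetical lookup (the exact nesting of the `ports` variable is assumed, not verified against the repo):

```yaml
# Hypothetical consumer of the renamed port keys
environment:
  OPENWEBUI_PORT: "{{ ports['web-app-openwebui'] }}"  # 8055 after this commit
  FLOWISE_PORT: "{{ ports['web-app-flowise'] }}"      # 8056 after this commit
```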
group_vars/all/17_ai.yml (new file, 3 lines)
@@ -0,0 +1,3 @@
+# URL of Local Ollama Container
+OLLAMA_BASE_LOCAL_URL: "http://{{ applications | get_app_conf('svc-ai-ollama', 'docker.services.ollama.name') }}:{{ applications | get_app_conf(application_id, 'docker.services.ollama.port') }}"
+OLLAMA_LOCAL_ENABLED: "{{ applications | get_app_conf(application_id, 'server.domains.canonical.flowise') }}"
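Because these globals live in group_vars/all, any role template can inject them into a container environment. A hypothetical fragment of a compose template consuming the new variable (service layout and env name are illustrative):

```yaml
# Hypothetical service environment consuming the new global
environment:
  OLLAMA_BASE_URL: "{{ OLLAMA_BASE_LOCAL_URL }}"
```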