computer-playbook/roles/web-app-flowise/vars/main.yml
Commit 5d1210d651 by Kevin Veen-Birkenbach
feat(ai): introduce dedicated AI roles and wiring; clean up legacy AI stack
• Add svc-ai category under roles and load it in constructor stage

• Create new 'svc-ai-ollama' role (vars, tasks, compose, meta, README) and dedicated network

• Refactor former AI stack into separate app roles: web-app-flowise and web-app-openwebui

• Add web-app-minio role; adjust config (no central DB), meta (fa-database, run_after), compose networks include, volume key

• Provide user-focused READMEs for Flowise, OpenWebUI, MinIO, Ollama

• Networks: add subnets for web-app-openwebui, web-app-flowise, web-app-minio; rename web-app-ai → svc-ai-ollama

• Ports: rename the ai_* keys to web-app-openwebui / web-app-flowise; keep minio_api/minio_console (port sketch below)

• Add group_vars/all/17_ai.yml (OLLAMA_BASE_LOCAL_URL, OLLAMA_LOCAL_ENABLED); a sketch follows below

• Replace hardcoded include paths with path_join in multiple roles (svc-db-postgres, sys-service, sys-stk-front-proxy, sys-stk-full-stateful, sys-svc-webserver, web-svc-cdn, web-app-keycloak); see the before/after sketch below

• Remove obsolete web-app-ai templates/vars/env; split Flowise into its own role

• Minor config cleanups (CSP flags to {}, central_database=false)

https://chatgpt.com/share/68d15cb8-cf18-800f-b853-78962f751f81
2025-09-22 18:40:20 +02:00
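For the ports rename, only the new key names come from the commit message; a minimal sketch of how the renamed entries could look under the ports.localhost.http lookup used in the vars file below, with placeholder port numbers that are assumptions:

# Sketch only: key names from the commit message; port numbers are placeholders
ports:
  localhost:
    http:
      web-app-openwebui: 8021   # renamed from an ai_* key
      web-app-flowise: 8022     # renamed from an ai_* key
      # minio_api and minio_console keep their existing keys and values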
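For group_vars/all/17_ai.yml, only the two variable names are given; a hedged sketch of such a file, where the values (including Ollama's default port 11434) are assumptions:

# group_vars/all/17_ai.yml (sketch; values are assumptions)
OLLAMA_LOCAL_ENABLED: true
OLLAMA_BASE_LOCAL_URL: "http://127.0.0.1:11434"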
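The path_join cleanup swaps string-concatenated include paths for Ansible's path_join filter, the same pattern used for FLOWISE_LITELLM_CONFIG_PATH_HOST below; an illustrative before/after in which the concrete task and path are hypothetical:

# Before: path built by string concatenation (hypothetical example)
- ansible.builtin.include_tasks: "{{ playbook_dir }}/roles/sys-svc-webserver/tasks/main.yml"

# After: path composed with the path_join filter
- ansible.builtin.include_tasks: "{{ [ playbook_dir, 'roles', 'sys-svc-webserver', 'tasks', 'main.yml' ] | path_join }}"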

vars/main.yml (YAML, 34 lines, 2.1 KiB):

# General
application_id: "web-app-flowise"
# Flowise
# https://flowiseai.com/
FLOWISE_VERSION: "{{ applications | get_app_conf(application_id, 'docker.services.flowise.version') }}"
FLOWISE_IMAGE: "{{ applications | get_app_conf(application_id, 'docker.services.flowise.image') }}"
FLOWISE_CONTAINER: "{{ applications | get_app_conf(application_id, 'docker.services.flowise.name') }}"
FLOWISE_VOLUME: "{{ applications | get_app_conf(application_id, 'docker.volumes.flowise') }}"
FLOWISE_PORT_PUBLIC: "{{ ports.localhost.http[application_id] }}"
FLOWISE_PORT_INTERNAL: 3000
# Dependencies
## LiteLLM
# https://www.litellm.ai/
FLOWISE_LITELLM_VERSION: "{{ applications | get_app_conf(application_id, 'docker.services.litellm.version') }}"
FLOWISE_LITELLM_IMAGE: "{{ applications | get_app_conf(application_id, 'docker.services.litellm.image') }}"
FLOWISE_LITELLM_CONTAINER: "{{ applications | get_app_conf(application_id, 'docker.services.litellm.name') }}"
FLOWISE_LITELLM_PORT: 4000
FLOWISE_LITELLM_INTERNAL_URL: "http://litellm:{{ FLOWISE_LITELLM_PORT }}"
FLOWISE_LITELLM_CONFIG_PATH_HOST: "{{ [ docker_compose.directories.config, 'litellm.config.yaml' ] | path_join }}"
FLOWISE_LITELLM_CONFIG_PATH_DOCKER: "/etc/litellm/config.yaml"
## Qdrant
# https://qdrant.tech/
FLOWISE_QDRANT_VERSION: "{{ applications | get_app_conf(application_id, 'docker.services.qdrant.version') }}"
FLOWISE_QDRANT_IMAGE: "{{ applications | get_app_conf(application_id, 'docker.services.qdrant.image') }}"
FLOWISE_QDRANT_CONTAINER: "{{ applications | get_app_conf(application_id, 'docker.services.qdrant.name') }}"
FLOWISE_QDRANT_VOLUME: "{{ applications | get_app_conf(application_id, 'docker.volumes.qdrant') }}"
FLOWISE_QDRANT_HTTP_PORT: 6333
FLOWISE_QDRANT_GRPC_PORT: 6334
FLOWISE_QDRANT_INTERNAL_URL: "http://qdrant:{{ FLOWISE_QDRANT_HTTP_PORT }}"
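# Example (sketch; the service wiring below is an assumption, not taken from the
# role's actual template): how the LiteLLM vars above would typically be consumed
# in the role's docker-compose template:
#
#   litellm:
#     image: "{{ FLOWISE_LITELLM_IMAGE }}:{{ FLOWISE_LITELLM_VERSION }}"
#     container_name: "{{ FLOWISE_LITELLM_CONTAINER }}"
#     volumes:
#       - "{{ FLOWISE_LITELLM_CONFIG_PATH_HOST }}:{{ FLOWISE_LITELLM_CONFIG_PATH_DOCKER }}:ro"
#     expose:
#       - "{{ FLOWISE_LITELLM_PORT }}"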