feat(ai): introduce dedicated AI roles and wiring; clean up legacy AI stack (Kevin Veen-Birkenbach, commit 5d1210d651)
• Add svc-ai category under roles and load it in the constructor stage

• Create new 'svc-ai-ollama' role (vars, tasks, compose, meta, README) and dedicated network

• Refactor former AI stack into separate app roles: web-app-flowise and web-app-openwebui

• Add web-app-minio role; adjust config (no central DB), meta (fa-database, run_after), compose networks include, volume key

• Provide user-focused READMEs for Flowise, OpenWebUI, MinIO, Ollama

• Networks: add subnets for web-app-openwebui, web-app-flowise, web-app-minio; rename web-app-ai → svc-ai-ollama

• Ports: rename ai_* keys to web-app-openwebui / web-app-flowise; keep minio_api/minio_console

• Add group_vars/all/17_ai.yml (OLLAMA_BASE_LOCAL_URL, OLLAMA_LOCAL_ENABLED)
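The new group_vars file centralizes the Ollama connection settings. A minimal sketch of what `group_vars/all/17_ai.yml` might look like; the values below are illustrative, not the actual file contents (11434 is Ollama's default API port):

```yaml
# group_vars/all/17_ai.yml -- illustrative sketch, values are assumptions
OLLAMA_LOCAL_ENABLED: true
OLLAMA_BASE_LOCAL_URL: "http://127.0.0.1:11434"
```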

• Replace hardcoded include paths with path_join in multiple roles (svc-db-postgres, sys-service, sys-stk-front-proxy, sys-stk-full-stateful, sys-svc-webserver, web-svc-cdn, web-app-keycloak)
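The `path_join` refactor swaps string-concatenated include paths for Ansible's built-in `path_join` filter, which normalizes separators regardless of trailing slashes. A hedged before/after sketch (the task file and variable names are illustrative, not taken from these roles):

```yaml
# Before: hardcoded concatenation (breaks if role_path ends in a slash)
- include_tasks: "{{ role_path }}/tasks/setup.yml"

# After: path_join builds the path from components
- include_tasks: "{{ [role_path, 'tasks', 'setup.yml'] | path_join }}"
```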

• Remove obsolete web-app-ai templates/vars/env; split Flowise into its own role

• Minor config cleanups (CSP flags to {}, central_database=false)

https://chatgpt.com/share/68d15cb8-cf18-800f-b853-78962f751f81
2025-09-22 18:40:20 +02:00

Flowise

Description

Flowise is a visual builder for AI workflows. Create, test, and publish chains that combine LLMs, your documents, tools, and vector search—without writing glue code.

Overview

Users design flows on a drag-and-drop canvas (LLM, RAG, tools, webhooks), test them interactively, and publish endpoints that applications or bots can call. Flowise works well with local backends such as Ollama (directly or via LiteLLM) and Qdrant for retrieval.

Features

  • No/low-code canvas to build assistants and pipelines
  • Publish flows as HTTP endpoints for easy integration
  • Retrieval-augmented generation (RAG) with vector DBs (e.g., Qdrant)
  • Pluggable model backends via an OpenAI-compatible API or a direct Ollama connection
  • Keep data and prompts on your own infrastructure
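
A published flow can be called like any HTTP API. A minimal sketch of building such a request in Python; the base URL and flow ID are placeholders, and the `/api/v1/prediction/{flowId}` path follows Flowise's documented prediction-endpoint pattern:

```python
import json

# Placeholders -- substitute your own deployment URL and flow ID.
base_url = "https://flowise.example.com"
flow_id = "1fd7bbc8-0000-0000-0000-000000000000"

# Each published flow is reachable at: POST {base}/api/v1/prediction/{flowId}
url = f"{base_url}/api/v1/prediction/{flow_id}"
payload = json.dumps({"question": "Summarize the uploaded handbook."})

print(url)
print(payload)
```

The payload can then be sent with any HTTP client, e.g. `requests.post(url, data=payload, headers={"Content-Type": "application/json"})`.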

Further Resources