# Flowise
## Description
**Flowise** is a visual builder for AI workflows. Create, test, and publish chains that combine LLMs, your documents, tools, and vector search—without writing glue code.
## Overview
Users design flows on a drag-and-drop canvas (LLM, RAG, tools, webhooks), test them interactively, and publish endpoints that applications or bots can call. Flowise works well with local backends such as **Ollama** (directly or via **LiteLLM**) and **Qdrant** for retrieval.
## Features
* No/low-code canvas to build assistants and pipelines
* Publish flows as HTTP endpoints for easy integration
* Retrieval-augmented generation (RAG) with vector DBs (e.g., Qdrant)
* Pluggable model backends via OpenAI-compatible API or direct Ollama
* Keep data and prompts on your own infrastructure
## Further Resources
* Flowise — [https://flowiseai.com](https://flowiseai.com)
* Qdrant — [https://qdrant.tech](https://qdrant.tech)
* LiteLLM — [https://www.litellm.ai](https://www.litellm.ai)
* Ollama — [https://ollama.com](https://ollama.com)