# Flowise

## Description
Flowise is a visual builder for AI workflows. Create, test, and publish chains that combine LLMs, your documents, tools, and vector search—without writing glue code.
## Overview
Users design flows on a drag-and-drop canvas (LLM, RAG, tools, webhooks), test them interactively, and publish them as endpoints that applications or bots can call. Flowise works well with local model backends such as Ollama (directly or via LiteLLM), and with Qdrant for retrieval.
## Features
- No/low-code canvas to build assistants and pipelines
- Publish flows as HTTP endpoints for easy integration
- Retrieval-augmented generation (RAG) with vector DBs (e.g., Qdrant)
- Pluggable model backends via OpenAI-compatible API or direct Ollama
- Keep data and prompts on your own infrastructure
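For keeping data on your own infrastructure, the stack described above (Flowise, Qdrant for retrieval, Ollama as a local model backend) can be sketched as a minimal Docker Compose file. Image names and default ports are the publicly documented ones, but this layout, the volume name, and the absence of auth or resource limits are illustrative assumptions, not configuration taken from this document.

```yaml
# Minimal self-hosting sketch, not a production configuration
services:
  flowise:
    image: flowiseai/flowise        # Flowise UI and published HTTP endpoints
    ports:
      - "3000:3000"
    volumes:
      - flowise_data:/root/.flowise
  qdrant:
    image: qdrant/qdrant            # vector DB for RAG
    ports:
      - "6333:6333"
  ollama:
    image: ollama/ollama            # local model backend
    ports:
      - "11434:11434"
volumes:
  flowise_data:
```

Inside this network, Flowise can reach Qdrant at `http://qdrant:6333` and Ollama at `http://ollama:11434` when you configure nodes on the canvas.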
## Further Resources
- Flowise — https://flowiseai.com
- Qdrant — https://qdrant.tech
- LiteLLM — https://www.litellm.ai
- Ollama — https://ollama.com