Mirror of https://github.com/kevinveenbirkenbach/computer-playbook.git (synced 2025-11-04 04:08:15 +00:00)
• Add svc-ai category under roles and load it in the constructor stage
• Create new 'svc-ai-ollama' role (vars, tasks, compose, meta, README) with a dedicated network
• Refactor the former AI stack into separate app roles: web-app-flowise and web-app-openwebui
• Add web-app-minio role; adjust config (no central DB), meta (fa-database, run_after), compose networks include, and volume key
• Provide user-focused READMEs for Flowise, OpenWebUI, MinIO, and Ollama
• Networks: add subnets for web-app-openwebui, web-app-flowise, and web-app-minio; rename web-app-ai → svc-ai-ollama
• Ports: rename ai_* keys to web-app-openwebui / web-app-flowise; keep minio_api/minio_console
• Add group_vars/all/17_ai.yml (OLLAMA_BASE_LOCAL_URL, OLLAMA_LOCAL_ENABLED)
• Replace hardcoded include paths with path_join in multiple roles (svc-db-postgres, sys-service, sys-stk-front-proxy, sys-stk-full-stateful, sys-svc-webserver, web-svc-cdn, web-app-keycloak)
• Remove obsolete web-app-ai templates/vars/env; split Flowise into its own role
• Minor config cleanups (CSP flags set to {}, central_database=false)
https://chatgpt.com/share/68d15cb8-cf18-800f-b853-78962f751f81
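The path_join refactor listed above can be sketched as follows; the task name and file layout here are illustrative assumptions, not the actual tasks from the repository:

```yaml
# Before (hypothetical): include path built by hardcoded string concatenation
- name: Include webserver core tasks
  include_tasks: "{{ playbook_dir }}/roles/sys-svc-webserver/tasks/01_core.yml"

# After (hypothetical): the same path assembled with Ansible's path_join filter
- name: Include webserver core tasks
  include_tasks: "{{ [ playbook_dir, 'roles', 'sys-svc-webserver', 'tasks', '01_core.yml' ] | path_join }}"
```

Building the path from a list with path_join avoids doubled or missing separators when fragments come from variables.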
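The new group_vars/all/17_ai.yml might look roughly like this; only the two key names come from the commit message, the values are assumptions (11434 is Ollama's default API port):

```yaml
# group_vars/all/17_ai.yml — sketch; values are assumed defaults
OLLAMA_LOCAL_ENABLED: true
OLLAMA_BASE_LOCAL_URL: "http://127.0.0.1:11434"
```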
26 lines | 695 B | YAML
---
galaxy_info:
  author: "Kevin Veen-Birkenbach"
  description: "Installs Ollama — a local model server for running open LLMs with a simple HTTP API."
  license: "Infinito.Nexus NonCommercial License"
  license_url: "https://s.infinito.nexus/license"
  company: |
    Kevin Veen-Birkenbach
    Consulting & Coaching Solutions
    https://www.veen.world
  galaxy_tags:
    - ai
    - llm
    - inference
    - offline
    - privacy
    - self-hosted
    - ollama
  repository: "https://s.infinito.nexus/code"
  issue_tracker_url: "https://s.infinito.nexus/issues"
  documentation: "https://s.infinito.nexus/code/"
  logo:
    class: "fa-solid fa-microchip"
  run_after: []
dependencies: []