This commit updates multiple roles to ensure compatibility with Ansible 2.20. Several include paths and task-loading mechanisms required adjustments, because Ansible 2.20 applies stricter evaluation rules to complex Jinja expressions and no longer resolves certain relative include paths the way Ansible 2.18 did.

Key changes:
- Replaced the legacy once_finalize.yml and once_flag.yml with the new structure under tasks/utils/once/finalize.yml and tasks/utils/once/flag.yml.
- Updated all include_tasks statements to use 'path_join' with playbook_dir, ensuring deterministic, absolute file resolution across roles.
- Fixed all network helper includes by converting direct relative paths such as 'roles/docker-compose/tasks/utils/network.yml' into properly Jinja-evaluated paths.
- Normalized MATOMO_* variable names for consistency with the updated variable scope behavior in Ansible 2.20.
- Removed deprecated patterns that were implicitly supported in Ansible 2.18 but break under the stricter variable and path resolution model in 2.20.

These changes are part of the full migration required to keep the infinito-nexus roles stable, deterministic, and forward-compatible with Ansible 2.20. Details of the discussion and reasoning can be found in this conversation: https://chatgpt.com/share/69300a8d-24d4-800f-bec0-e895a695618a
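For illustration, a minimal sketch of the include pattern described above, built around the 'network.yml' helper named in the commit; the task name is illustrative and not taken from the repository:

```yaml
# Before (Ansible 2.18): relative path, resolved implicitly
# - ansible.builtin.include_tasks: roles/docker-compose/tasks/utils/network.yml

# After (Ansible 2.20): absolute path built with path_join from playbook_dir,
# so the file is resolved deterministically regardless of the calling role
- name: Include the docker-compose network helper
  ansible.builtin.include_tasks: >-
    {{ [playbook_dir, 'roles', 'docker-compose', 'tasks', 'utils', 'network.yml'] | path_join }}
```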
Ollama
Description
Ollama is a local model server that runs open LLMs on your hardware and exposes a simple HTTP API. It’s the backbone for privacy-first AI: prompts and data stay on your machines.
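As a rough sketch of that HTTP API, the tasks below query a local Ollama instance using Ansible's built-in uri module; the default port 11434 and the model name 'llama3' are assumptions for illustration, not settings from this role:

```yaml
# Minimal sketch: ask a local Ollama server for a completion.
# Assumes Ollama listens on its default port 11434 and that the
# model 'llama3' has already been pulled (both are assumptions).
- name: Request a completion from the local Ollama server
  ansible.builtin.uri:
    url: http://localhost:11434/api/generate
    method: POST
    body_format: json
    body:
      model: llama3
      prompt: "Why is the sky blue?"
      stream: false          # return one JSON object instead of a stream
  register: ollama_response

- name: Show the generated text
  ansible.builtin.debug:
    msg: "{{ ollama_response.json.response }}"
```

With stream set to false, the server returns a single JSON object whose response field holds the full completion, which keeps the example simple to register and inspect.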
Overview
After the first model pull, Ollama serves models to clients like Open WebUI (for chat) and Flowise (for workflows). Models are cached locally for quick reuse and can run fully offline when required.
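As a sketch of that first pull, assuming the ollama CLI is on the PATH and using an illustrative model name:

```yaml
# Hypothetical sketch: pre-pull a model so later requests are served
# from the local cache. The model name is illustrative, not a default
# of this role.
- name: Pull a model into the local Ollama cache
  ansible.builtin.command: ollama pull llama3
  changed_when: false  # re-running against a cached model only verifies it
```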
Features
- Run popular open models (chat, code, embeddings) locally
- Simple, predictable HTTP API for developers
- Local caching to avoid repeated downloads
- Works seamlessly with Open WebUI and Flowise
- Offline-capable for air-gapped deployments
Further Resources
- Ollama — https://ollama.com
- Ollama Model Library — https://ollama.com/library