15 Commits

Author SHA1 Message Date
0e89d89b45 Make sound support optional and guard against missing audio dependencies
- Move simpleaudio to optional dependency (audio extra)
- Add DummySound fallback when optional audio libs are unavailable
- Import simpleaudio/numpy lazily with ImportError handling
- Remove Docker-specific sound disabling logic
- Improve typing and robustness of sound utilities

https://chatgpt.com/share/693dec1d-60bc-800f-8ffe-3886a9c265bd
2025-12-13 23:43:36 +01:00
d0882433c8 Refactor setup workflow and make install robust via virtualenv
- Introduce a dedicated Python virtualenv (deps target) and run all setup scripts through it
- Fix missing PyYAML errors in clean, CI, and Nix environments
- Refactor build defaults into cli/setup for clearer semantics
- Make setup deterministic and independent from system Python
- Replace early Makefile shell expansion with runtime evaluation
- Rename messy-test to test-messy and update deploy logic and tests accordingly
- Keep setup and test targets consistent across Makefile, CLI, and unit tests

https://chatgpt.com/share/693de226-00ac-800f-8cbd-06552b2f283c
2025-12-13 23:00:13 +01:00
600d7a1fe8 Ignored python package build files 2025-12-13 22:13:53 +01:00
0580839705 Makefile: unify Python interpreter via PYTHON variable
Avoids mixed system/Nix/venv Python usage and fixes missing PyYAML errors.

https://chatgpt.com/share/693dd6b2-14f0-800f-9b95-368d58b68f49
2025-12-13 22:12:12 +01:00
7070100363 Added missing PyYAML 2025-12-13 21:41:29 +01:00
ad813df0c5 Switch to pyproject.toml for Python dependencies
Introduce pyproject.toml as the single source of truth for Python dependencies.
Remove legacy requirements.txt and simplify requirements.yml to Ansible collections only.
Drop pytest in favor of the built-in unittest framework.

https://chatgpt.com/share/693dbe8c-8b64-800f-a6e5-41b7d21ae7e0
2025-12-13 20:29:09 +01:00
f8e2aa2b93 Added mirrors 2025-12-13 09:27:04 +01:00
d0a2c3fada Release version 0.2.1 2025-12-10 21:14:47 +01:00
75eaecce5b **Remove obsolete installation/administration docs, fix pgAdmin server mode condition, normalize git repository vars, and ensure correct application_id for web-app-sphinx**
* Remove outdated `Installation.md` and `Administration.md` documentation from Akaunting and Peertube roles
* Fix `server_mode` conditional in `web-app-pgadmin` to avoid unintended defaults
* Normalize formatting of git repository variables in `web-app-roulette-wheel`
* Explicitly set `application_id` when loading `sys-stk-full-stateless` in `web-app-sphinx` to prevent scoping issues

https://chatgpt.com/share/6939d42e-483c-800f-b0fc-be61caab615d
2025-12-10 21:12:15 +01:00
57ec936d30 Release version 0.2.0 2025-12-10 19:30:54 +01:00
f143ce258c dev-nix: migrate to official installer with dynamic SHA256 verification,
split non-Arch logic, add template-based nix.conf, and integrate into pkgmgr

- Replace local installer mechanism with official upstream URLs:
  https://releases.nixos.org/nix/nix-<version>/install
  and dynamically fetch associated SHA256 checksum
- Add version-based URL construction via new defaults variables
- Implement clean OS-branching:
  * Arch-based systems: install Nix via pacman
  * Non-Arch systems: download installer + verify SHA256 + run in daemon mode
- Extract non-Arch installation logic into dedicated task file
  (02_non_arch_installer.yml)
- Introduce template-based /etc/nix/nix.conf with build-users-group
  and optional experimental-features block
- Remove obsolete install.yml
- Update pkgmgr dev stack to include dev-nix and adjust update command
- Add TODO.md for future security improvements

https://chatgpt.com/share/6939bbfe-5cb0-800f-8ea8-95628dc911f5
https://chatgpt.com/share/6939bbd9-4840-800f-b9d2-b2510ea0f105
2025-12-10 19:29:04 +01:00
060ae45c7d Removed test_no_module_redirections_in_log to allow multi-distribution support 2025-12-10 19:21:48 +01:00
a3ba40edb6 Release version 0.1.1 2025-12-10 17:55:50 +01:00
f9825ac4fc Fixed pkgmgr ssh pull bug - https://chatgpt.com/share/6939a5ae-b7a0-800f-95ff-2f4107973ca7 2025-12-10 17:55:01 +01:00
9d66910120 Solved tag bug https://chatgpt.com/share/6939a40e-42fc-800f-89a5-6b50113f8056 2025-12-10 17:47:22 +01:00
34 changed files with 468 additions and 404 deletions

.gitignore

@@ -10,3 +10,5 @@ venv
*tree.json
roles/list.json
*.pyc
+*.egg-info
+build


@@ -1,3 +1,18 @@
+## [0.2.1] - 2025-12-10
+* restored full deployability of the Sphinx app by fixing the application_id scoping bug.
+## [0.2.0] - 2025-12-10
+* Added full Nix installer integration with dynamic upstream SHA256 verification, OS-specific installation paths, template-driven configuration, and updated pkgmgr integration.
+## [0.1.1] - 2025-12-10
+* PKGMGR will now be pulled again
## [0.1.0] - 2025-12-09
* Added Nix support role

MIRRORS (new file)

@@ -0,0 +1,3 @@
git@github.com:infinito-nexus/core.git
ssh://git@code.infinito.nexus:2201/infinito/nexus.git
git@github.com:kevinveenbirkenbach/infinito-nexus.git
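These are alternative remotes for the same repository. A minimal sketch of how a mirror might be wired up locally (the remote name `origin` and the chosen mirror are illustrative, not part of the commit):

```bash
# Push to the primary remote and a mirror in one go (assumes "origin" already exists)
git remote set-url --add --push origin git@github.com:kevinveenbirkenbach/infinito-nexus.git

# Or clone from one of the mirrors directly
git clone git@github.com:infinito-nexus/core.git
```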


@@ -1,12 +1,14 @@
+SHELL  := /usr/bin/env bash
+VENV   ?= .venv
+PYTHON := $(VENV)/bin/python
+PIP    := $(PYTHON) -m pip
ROLES_DIR := ./roles
APPLICATIONS_OUT := ./group_vars/all/04_applications.yml
-APPLICATIONS_SCRIPT := ./cli/build/defaults/applications.py
+APPLICATIONS_SCRIPT := ./cli/setup/applications.py
+USERS_SCRIPT := ./cli/setup/users.py
USERS_OUT := ./group_vars/all/03_users.yml
-USERS_SCRIPT := ./cli/build/defaults/users.py
INCLUDES_SCRIPT := ./cli/build/role_include.py
-INCLUDE_GROUPS := $(shell python3 main.py meta categories invokable -s "-" --no-signal | tr '\n' ' ')
# Directory where these include-files will be written
INCLUDES_OUT_DIR := ./tasks/groups
@@ -19,7 +21,7 @@ RESERVED_USERNAMES := $(shell \
	| paste -sd, - \
)
-.PHONY: build install test
+.PHONY: deps setup setup-clean test-messy test install
clean-keep-logs:
	@echo "🧹 Cleaning ignored files but keeping logs/…"
@@ -31,11 +33,11 @@ clean:
list:
	@echo Generating the roles list
-	python3 main.py build roles_list
+	$(PYTHON) main.py build roles_list
tree:
	@echo Generating Tree
-	python3 main.py build tree -D 2 --no-signal
+	$(PYTHON) main.py build tree -D 2 --no-signal
mig: list tree
	@echo Creating meta data for meta infinity graph
@@ -45,41 +47,51 @@ dockerignore:
	cat .gitignore > .dockerignore
	echo ".git" >> .dockerignore
-messy-build: dockerignore
+setup: deps dockerignore
	@echo "🔧 Generating users defaults → $(USERS_OUT)"
-	python3 $(USERS_SCRIPT) \
+	$(PYTHON) $(USERS_SCRIPT) \
		--roles-dir $(ROLES_DIR) \
		--output $(USERS_OUT) \
		--reserved-usernames "$(RESERVED_USERNAMES)"
	@echo "✅ Users defaults written to $(USERS_OUT)\n"
	@echo "🔧 Generating applications defaults → $(APPLICATIONS_OUT)"
-	python3 $(APPLICATIONS_SCRIPT) \
+	$(PYTHON) $(APPLICATIONS_SCRIPT) \
		--roles-dir $(ROLES_DIR) \
		--output-file $(APPLICATIONS_OUT)
	@echo "✅ Applications defaults written to $(APPLICATIONS_OUT)\n"
	@echo "🔧 Generating role-include files for each group…"
	@mkdir -p $(INCLUDES_OUT_DIR)
-	@$(foreach grp,$(INCLUDE_GROUPS), \
-		out=$(INCLUDES_OUT_DIR)/$(grp)roles.yml; \
-		echo "→ Building $$out (pattern: '$(grp)')…"; \
-		python3 $(INCLUDES_SCRIPT) $(ROLES_DIR) \
-			-p $(grp) -o $$out; \
-		echo "$$out"; \
-	)
+	@INCLUDE_GROUPS="$$( $(PYTHON) main.py meta categories invokable -s "-" --no-signal | tr '\n' ' ' )"; \
+	for grp in $$INCLUDE_GROUPS; do \
+		out="$(INCLUDES_OUT_DIR)/$${grp}roles.yml"; \
+		echo "→ Building $$out (pattern: '$$grp')…"; \
+		$(PYTHON) $(INCLUDES_SCRIPT) $(ROLES_DIR) -p $$grp -o $$out; \
+		echo "$$out"; \
+	done
-messy-test:
+setup-clean: clean setup
+	@echo "Full build with cleanup before was executed."
+test-messy:
	@echo "🧪 Running Python tests…"
-	PYTHONPATH=. python -m unittest discover -s tests
+	PYTHONPATH=. $(PYTHON) -m unittest discover -s tests
	@echo "📑 Checking Ansible syntax…"
	ansible-playbook -i localhost, -c local $(foreach f,$(wildcard group_vars/all/*.yml),-e @$(f)) playbook.yml --syntax-check
-install: build
-	@echo "⚙️ Install complete."
-build: clean messy-build
-	@echo "Full build with cleanup before was executed."
-test: build messy-test
-	@echo "Full test with build before was executed."
+test: setup-clean test-messy
+	@echo "Full test with setup-clean before was executed."
+deps:
+	@if [ ! -x "$(PYTHON)" ]; then \
+		echo "🐍 Creating virtualenv $(VENV)"; \
+		python3 -m venv $(VENV); \
+	fi
+	@echo "📦 Installing Python dependencies"
+	@$(PIP) install --upgrade pip setuptools wheel
+	@$(PIP) install -e .
+install: deps
+	@echo "✅ Python environment installed (editable)."


@@ -95,8 +95,8 @@ def run_ansible_playbook(
    # 4) Test Phase
    # ---------------------------------------------------------
    if not skip_tests:
-        print("\n🧪 Running tests (make messy-test)...\n")
-        subprocess.run(["make", "messy-test"], check=True)
+        print("\n🧪 Running tests (make test-messy)...\n")
+        subprocess.run(["make", "test-messy"], check=True)
    else:
        print("\n🧪 Tests skipped (--skip-tests)\n")


@@ -209,7 +209,7 @@ def print_global_help(available, cli_dir):
        Fore.CYAN
    ))
    print(color_text(
-        " corresponds to `cli/build/defaults/users.py`.",
+        " corresponds to `cli/setup/users.py`.",
        Fore.CYAN
    ))
    print()


@@ -1,186 +1,227 @@
(Whole-file rewrite; the resulting module is shown.)

import os
import warnings


class DummySound:
    @staticmethod
    def play_start_sound() -> None:
        pass

    @staticmethod
    def play_infinito_intro_sound() -> None:
        pass

    @staticmethod
    def play_finished_successfully_sound() -> None:
        pass

    @staticmethod
    def play_finished_failed_sound() -> None:
        pass

    @staticmethod
    def play_warning_sound() -> None:
        pass


try:
    import numpy as np
    import simpleaudio as sa
    import shutil
    import subprocess
    import tempfile
    import wave as wavmod

    class Sound:
        """
        Sound effects for the application.
        """

        fs = 44100
        complexity_factor = 10
        max_length = 2.0

        @staticmethod
        def _generate_complex_wave(
            frequency: float,
            duration: float,
            harmonics: int | None = None,
        ) -> np.ndarray:
            if harmonics is None:
                harmonics = Sound.complexity_factor

            t = np.linspace(0, duration, int(Sound.fs * duration), False)
            wave = np.zeros_like(t)

            for n in range(1, harmonics + 1):
                wave += (1 / n) * np.sin(2 * np.pi * frequency * n * t)

            # ADSR envelope
            attack = int(0.02 * Sound.fs)
            release = int(0.05 * Sound.fs)
            env = np.ones_like(wave)
            env[:attack] = np.linspace(0, 1, attack)
            env[-release:] = np.linspace(1, 0, release)
            wave *= env
            wave /= np.max(np.abs(wave))
            return (wave * (2**15 - 1)).astype(np.int16)

        @staticmethod
        def _crossfade(w1: np.ndarray, w2: np.ndarray, fade_len: int) -> np.ndarray:
            fade_len = min(fade_len, len(w1), len(w2))
            if fade_len <= 0:
                return np.concatenate([w1, w2])
            fade_out = np.linspace(1, 0, fade_len)
            fade_in = np.linspace(0, 1, fade_len)
            w1_end = w1[-fade_len:].astype(np.float32) * fade_out
            w2_start = w2[:fade_len].astype(np.float32) * fade_in
            middle = (w1_end + w2_start).astype(np.int16)
            return np.concatenate([w1[:-fade_len], middle, w2[fade_len:]])

        @staticmethod
        def _play_via_system(wave: np.ndarray) -> None:
            with tempfile.NamedTemporaryFile(delete=False, suffix=".wav") as f:
                fname = f.name
            try:
                with wavmod.open(fname, "wb") as w:
                    w.setnchannels(1)
                    w.setsampwidth(2)
                    w.setframerate(Sound.fs)
                    w.writeframes(wave.tobytes())

                def run(cmd: list[str]) -> bool:
                    return (
                        subprocess.run(
                            cmd,
                            stdout=subprocess.DEVNULL,
                            stderr=subprocess.DEVNULL,
                            check=False,
                        ).returncode
                        == 0
                    )

                if shutil.which("pw-play") and run(["pw-play", fname]):
                    return
                if shutil.which("paplay") and run(["paplay", fname]):
                    return
                if shutil.which("aplay") and run(["aplay", "-q", fname]):
                    return
                if shutil.which("ffplay") and run(["ffplay", "-autoexit", "-nodisp", fname]):
                    return
                play_obj = sa.play_buffer(wave, 1, 2, Sound.fs)
                play_obj.wait_done()
            finally:
                try:
                    os.unlink(fname)
                except Exception:
                    pass

        @staticmethod
        def _play(wave: np.ndarray) -> None:
            backend = os.getenv("INFINITO_AUDIO_BACKEND", "auto").lower()

            if backend == "system":
                Sound._play_via_system(wave)
                return

            if backend == "simpleaudio":
                play_obj = sa.play_buffer(wave, 1, 2, Sound.fs)
                play_obj.wait_done()
                return

            # auto
            try:
                play_obj = sa.play_buffer(wave, 1, 2, Sound.fs)
                play_obj.wait_done()
            except Exception:
                Sound._play_via_system(wave)

        @classmethod
        def play_infinito_intro_sound(cls) -> None:
            build_time = 10.0
            celebr_time = 12.0
            breakdown_time = 10.0
            overlap = 3.0

            bass_seg = 0.125
            melody_seg = 0.25
            bass_freq = 65.41
            melody_freqs = [261.63, 293.66, 329.63, 392.00, 440.00, 523.25]

            steps = int(build_time / (bass_seg + melody_seg))
            build_seq: list[np.ndarray] = []

            for i in range(steps):
                amp = (i + 1) / steps
                b = cls._generate_complex_wave(bass_freq, bass_seg).astype(np.float32) * amp
                m = cls._generate_complex_wave(
                    melody_freqs[i % len(melody_freqs)], melody_seg
                ).astype(np.float32) * amp
                build_seq.append(b.astype(np.int16))
                build_seq.append(m.astype(np.int16))

            build_wave = np.concatenate(build_seq)

            roots = [523.25, 349.23, 233.08, 155.56, 103.83, 69.30, 46.25]
            chord_time = celebr_time / len(roots)
            celebr_seq: list[np.ndarray] = []

            for root in roots:
                t = np.linspace(0, chord_time, int(cls.fs * chord_time), False)
                chord = sum(np.sin(2 * np.pi * f * t) for f in [root, root * 5 / 4, root * 3 / 2])
                chord /= np.max(np.abs(chord))
                celebr_seq.append((chord * (2**15 - 1)).astype(np.int16))

            celebr_wave = np.concatenate(celebr_seq)
            breakdown_wave = np.concatenate(list(reversed(build_seq)))

            fade_samples = int(overlap * cls.fs)
            bc = cls._crossfade(build_wave, celebr_wave, fade_samples)
            full = cls._crossfade(bc, breakdown_wave, fade_samples)

            cls._play(full)

        @classmethod
        def play_start_sound(cls) -> None:
            freqs = [523.25, 659.26, 783.99, 880.00, 1046.50, 1174.66]
            cls._prepare_and_play(freqs)

        @classmethod
        def play_finished_successfully_sound(cls) -> None:
            freqs = [523.25, 587.33, 659.26, 783.99, 880.00, 987.77]
            cls._prepare_and_play(freqs)

        @classmethod
        def play_finished_failed_sound(cls) -> None:
            freqs = [880.00, 830.61, 783.99, 659.26, 622.25, 523.25]
            durations = [0.4, 0.3, 0.25, 0.25, 0.25, 0.25]
            cls._prepare_and_play(freqs, durations)

        @classmethod
        def play_warning_sound(cls) -> None:
            freqs = [700.00, 550.00, 750.00, 500.00, 800.00, 450.00]
            cls._prepare_and_play(freqs)

        @classmethod
        def _prepare_and_play(cls, freqs: list[float], durations: list[float] | None = None) -> None:
            count = len(freqs)
            if durations is None:
                durations = [cls.max_length / count] * count
            else:
                total = sum(durations)
                durations = [d * cls.max_length / total for d in durations]
            waves = [cls._generate_complex_wave(f, d) for f, d in zip(freqs, durations)]
            cls._play(np.concatenate(waves))

except ImportError as exc:
    warnings.warn(f"Sound support disabled: {exc}", RuntimeWarning)
    Sound = DummySound
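The rewritten module keeps the `INFINITO_AUDIO_BACKEND` switch; a hedged sketch of selecting a backend before running any command that plays sounds (which commands actually emit audio is not shown in this diff):

```bash
# Valid values per the module: "auto" (default), "simpleaudio", "system".
# "system" routes playback through pw-play/paplay/aplay/ffplay instead of simpleaudio.
export INFINITO_AUDIO_BACKEND=system
```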

pyproject.toml (new file)

@@ -0,0 +1,48 @@
[build-system]
requires = ["setuptools>=68", "wheel"]
build-backend = "setuptools.build_meta"
[project]
name = "infinito-nexus"
version = "0.0.0"
description = "Infinito.Nexus"
readme = "README.md"
requires-python = ">=3.10"
license = { file = "LICENSE.md" }
dependencies = [
"numpy",
"ansible",
"colorscheme-generator @ https://github.com/kevinveenbirkenbach/colorscheme-generator/archive/refs/tags/v0.3.0.zip",
"bcrypt",
"ruamel.yaml",
"PyYAML",
"tld",
"passlib",
"requests",
]
[project.optional-dependencies]
audio = [
"simpleaudio",
]
[tool.setuptools]
# Non-src layout: explicitly control packaged modules
packages = { find = { where = ["."], include = [
"cli*",
"filter_plugins*",
"lookup_plugins*",
"module_utils*",
"library*",
], exclude = [
"roles*",
"assets*",
"docs*",
"templates*",
"logs*",
"tasks*",
"tests*",
"__pycache__*",
] } }
include-package-data = true
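Because `simpleaudio` now sits behind the `audio` extra, sound support becomes opt-in at install time; a minimal sketch (the quotes around the extra are only needed in some shells):

```bash
pip install -e .            # core install; Sound falls back to DummySound
pip install -e ".[audio]"   # additionally pulls in simpleaudio for real playback
```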


@@ -1,9 +0,0 @@
colorscheme-generator @ https://github.com/kevinveenbirkenbach/colorscheme-generator/archive/refs/tags/v0.3.0.zip
numpy
bcrypt
ruamel.yaml
tld
passlib
requests
ansible
pytest


@@ -1,9 +1,4 @@
collections:
  - name: kewlfft.aur
  - name: community.general
  - name: hetzner.hcloud
-yay:
-  - python-simpleaudio
-  - python-numpy
-pacman:
-  - ansible
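With the Python packages moved to pyproject.toml, requirements.yml now lists only Ansible collections, which are installed the usual Galaxy way; a short sketch:

```bash
ansible-galaxy collection install -r requirements.yml
```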

roles/dev-nix/TODO.md (new file)

@@ -0,0 +1,2 @@
# to-dos
- Implement better hash validation for security


@@ -1,14 +1,22 @@
---
-# Path to the installer script inside this role
-dev_nix_installer_source: "nix-install.sh"
-# Path where the installer will be copied on the target host
+# Nix version to install via official installer
+dev_nix_installer_version: "2.32.4"
+# Base URL for Nix releases
+dev_nix_installer_base_url: "https://releases.nixos.org/nix"
+# Full URL to the installer script (can be overridden if needed)
+dev_nix_installer_url: >-
+  {{ dev_nix_installer_base_url }}/nix-{{ dev_nix_installer_version }}/install
+# Full URL to the SHA256 checksum file
+dev_nix_installer_sha256_url: "{{ dev_nix_installer_url }}.sha256"
+# Path where the installer will be downloaded on the target host
dev_nix_installer_dest: "/usr/local/share/nix-install.sh"
-# Expected SHA256 of the installer file.
-# You MUST set this to the actual hash of files/nix-install.sh, e.g.:
-#   sha256sum roles/dev-nix/files/nix-install.sh
-dev_nix_installer_sha256: "CHANGE_ME_SHA256_OF_INSTALLER"
+# Will be filled at runtime from dev_nix_installer_sha256_url
+dev_nix_installer_sha256: ""
# Whether to drop a small shell snippet into /etc/profile.d to ensure
# Nix environment is available for login shells.
@@ -16,3 +24,11 @@ dev_nix_enable_shell_snippet: false
# Path of the profile.d snippet
dev_nix_shell_snippet_path: "/etc/profile.d/nix.sh"
+# Enable experimental features such as nix-command and flakes
+dev_nix_enable_experimental_features: true
+# List of experimental features to enable when dev_nix_enable_experimental_features is true
+dev_nix_experimental_features:
+  - nix-command
+  - flakes


@@ -0,0 +1,49 @@
---
# Install Nix differently depending on the target platform:
# - Arch-based systems: install via package manager
# - Non-Arch systems: use the official installer with SHA256 verification
# 1) Arch-based systems: just install the distro package
- name: Install Nix via package manager on Arch-based systems
community.general.pacman:
name: nix
state: present
become: true
when: ansible_facts.os_family == "Archlinux"
# 2) Non-Arch systems: delegate installer logic to a separate task file
- name: Include non-Arch installer logic
ansible.builtin.include_tasks: 02_non_arch_installer.yml
when: ansible_facts.os_family != "Archlinux"
# 3) Configure Nix experimental features (common for all platforms)
- name: Ensure Nix config directory exists
ansible.builtin.file:
path: /etc/nix
state: directory
mode: "0755"
when: dev_nix_enable_experimental_features | bool
become: true
- name: Deploy Nix configuration (nix.conf)
ansible.builtin.template:
src: "nix.conf.j2"
dest: "/etc/nix/nix.conf"
mode: "0644"
become: true
# 4) Optionally drop shell snippet for Nix
- name: Optionally drop shell snippet for Nix
ansible.builtin.copy:
dest: "{{ dev_nix_shell_snippet_path }}"
mode: "0644"
content: |
# Added by dev-nix Ansible role
if [ -e /nix/var/nix/profiles/default/etc/profile.d/nix-daemon.sh ]; then
. /nix/var/nix/profiles/default/etc/profile.d/nix-daemon.sh
fi
when: dev_nix_enable_shell_snippet | bool
become: true
# 5) Mark this role as "run once" in your global once-flag system
- include_tasks: utils/once/flag.yml


@@ -0,0 +1,37 @@
---
# Non-Arch installer logic:
# Download the official Nix installer and its SHA256 from releases.nixos.org
# and run the daemon (multi-user) installer.
# 1) Fetch the official SHA256 from releases.nixos.org on the control node
- name: Fetch official Nix installer SHA256
ansible.builtin.uri:
url: "{{ dev_nix_installer_sha256_url }}"
return_content: true
register: dev_nix_official_sha_response
delegate_to: localhost
run_once: true
- name: Set expected installer checksum from official SHA256
ansible.builtin.set_fact:
dev_nix_installer_sha256: >-
{{ dev_nix_official_sha_response.content.split()[0] | trim }}
run_once: true
# 2) Download installer script on the target and verify via checksum
- name: Download Nix installer script from official releases
ansible.builtin.get_url:
url: "{{ dev_nix_installer_url }}"
dest: "{{ dev_nix_installer_dest }}"
mode: "0755"
# get_url will verify the checksum and fail if it does not match
checksum: "sha256:{{ dev_nix_installer_sha256 }}"
become: true
# 3) Run Nix installer in daemon (multi-user) mode if Nix is not installed
- name: Run Nix installer in daemon (multi-user) mode if Nix is not installed
ansible.builtin.shell: >
"{{ dev_nix_installer_dest }}" --daemon
args:
creates: "/nix/store"
become: true
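The tasks above are roughly equivalent to the following manual steps on a non-Arch host; a sketch assuming the role defaults (version 2.32.4, daemon mode):

```bash
VERSION=2.32.4
BASE="https://releases.nixos.org/nix/nix-${VERSION}"

# Download the installer and its published SHA256
curl -fsSLo nix-install.sh "${BASE}/install"
SHA256="$(curl -fsSL "${BASE}/install.sha256" | awk '{print $1}')"

# Verify before executing (aborts on mismatch)
echo "${SHA256}  nix-install.sh" | sha256sum -c -

# Run the multi-user (daemon) installer
sh nix-install.sh --daemon
```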


@@ -1,44 +0,0 @@
---
# Install Nix using a locally stored installer script with SHA256 verification.
- name: Ensure Nix installer script is present on target
ansible.builtin.copy:
src: "{{ dev_nix_installer_source }}"
dest: "{{ dev_nix_installer_dest }}"
mode: "0755"
become: true
- name: Verify Nix installer SHA256 checksum
ansible.builtin.command: >
sh -c "sha256sum '{{ dev_nix_installer_dest }}' | awk '{print $1}'"
register: dev_nix_checksum_result
changed_when: false
become: true
- name: Fail if Nix installer checksum does not match
ansible.builtin.fail:
msg: >-
Nix installer checksum mismatch.
Expected '{{ dev_nix_installer_sha256 }}', got '{{ dev_nix_checksum_result.stdout }}'.
Refusing to execute the installer.
when: dev_nix_checksum_result.stdout != dev_nix_installer_sha256
# Nix multi-user (daemon) mode: creates /nix/store when successful.
- name: Run Nix installer in daemon (multi-user) mode if Nix is not installed
ansible.builtin.shell: >
"{{ dev_nix_installer_dest }}" --daemon
args:
creates: "/nix/store"
become: true
- name: Optionally drop shell snippet for Nix
ansible.builtin.copy:
dest: "{{ dev_nix_shell_snippet_path }}"
mode: "0644"
content: |
# Added by dev-nix Ansible role
if [ -e /nix/var/nix/profiles/default/etc/profile.d/nix-daemon.sh ]; then
. /nix/var/nix/profiles/default/etc/profile.d/nix-daemon.sh
fi
when: dev_nix_enable_shell_snippet | bool
become: true


@@ -1,5 +1,3 @@
---
-# Main entrypoint for the dev-nix role
-- name: Include installation tasks for Nix
-  ansible.builtin.include_tasks: install.yml
+- include_tasks: 01_core.yml
+  when: run_once_dev_nix is not defined


@@ -0,0 +1,12 @@
# Nix configuration file
# Managed by the {{ SOFTWARE_NAME }}dev-nix Ansible role
# Unix group containing the Nix build user accounts
build-users-group = nixbld
# Enable experimental features if configured
{% if dev_nix_enable_experimental_features %}
experimental-features = {{ dev_nix_experimental_features | join(" ") }}
{% endif %}
# (Optional) Add more global nix.conf options below
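With the role defaults above, the rendered file should contain roughly the following settings; a sketch, since the exact comment lines depend on the template variables (e.g. `SOFTWARE_NAME`):

```bash
# Quick check on a target host:
cat /etc/nix/nix.conf
# Expected, beyond the templated comments:
#   build-users-group = nixbld
#   experimental-features = nix-command flakes
```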


@@ -6,7 +6,7 @@
- name: update pkgmgr
  shell: |
    source ~/.venvs/pkgmgr/bin/activate
-    pkgmgr update pkgmgr
+    pkgmgr update pkgmgr --clone-mode shallow
  register: pkgmgr_update
  changed_when: "'already up to date' not in (pkgmgr_update.stdout | lower)"


@@ -2,9 +2,9 @@
  include_role:
    name: '{{ item }}'
  loop:
    - dev-git
    - dev-make
-    - dev-python-yaml
+    - dev-nix
- name: Ensure OpenSSH client is installed
  community.general.pacman:
@@ -27,7 +27,21 @@
    mode: '0755'
  become: true
-- name: Clone Kevin's Package Manager repository
+- name: Check if pkgmgr git repo already exists
+  stat:
+    path: "{{ PKGMGR_INSTALL_PATH }}/.git"
+  register: pkgmgr_git_repo
+  become: true
+- name: Remove legacy 'latest' tag from existing pkgmgr repo (if present)
+  command: git tag -d latest
+  args:
+    chdir: "{{ PKGMGR_INSTALL_PATH }}"
+  when: pkgmgr_git_repo.stat.exists
+  ignore_errors: true
+  become: true
+- name: Clone Kevin's Package Manager repository (always latest HEAD)
  git:
    repo: "{{ PKGMGR_REPO_URL }}"
    dest: "{{ PKGMGR_INSTALL_PATH }}"
@@ -52,7 +66,7 @@
  become: true
- name: "Update all repositories with pkgmgr"
-  command: "pkgmgr pull --all"
+  command: "pkgmgr update --all --clone-mode shallow"
  when: MODE_UPDATE | bool
- include_tasks: utils/once/flag.yml


@@ -1,29 +0,0 @@
# Installation Guide
1. **Navigate to the Docker Compose Directory**
Change into the directory where the Docker Compose files reside.
```bash
cd {{ PATH_DOCKER_COMPOSE_INSTANCES }}akaunting/
```
2. **Set Environment Variables**
Ensure timeouts are increased to handle long operations:
```bash
export COMPOSE_HTTP_TIMEOUT=600
export DOCKER_CLIENT_TIMEOUT=600
```
3. **Start Akaunting Service**
Run the setup command with the `AKAUNTING_SETUP` variable:
```bash
AKAUNTING_SETUP=true docker-compose -p akaunting up -d
```
4. **Finalizing Setup**
After verifying that the web interface works, restart services:
```bash
docker-compose down
docker-compose -p akaunting up -d
```
For further details, visit the [Akaunting Documentation](https://akaunting.com/) and the [Akaunting GitHub Repository](https://github.com/akaunting/docker).


@@ -1,29 +0,0 @@
# Administration
## track docker container status
```bash
watch -n 2 "docker ps -a | grep peertube"
```
## clean rebuild
```bash
cd {{ PATH_DOCKER_COMPOSE_INSTANCES }}peertube/ &&
docker-compose down
docker volume rm peertube_assets peertube_config peertube_data peertube_database peertube_redis
docker-compose up -d
```
## access terminal
```bash
docker-compose exec -it application /bin/bash
```
## update config
```bash
apt update && apt install nano && nano ./config/default.yaml
```
## get root pasword
```bash
docker logs peertube-application-1 | grep -A1 root
```


@@ -5,4 +5,4 @@
- name: "configure pgadmin servers"
  include_tasks: configuration.yml
-  when: applications | get_app_conf(application_id, 'server_mode', True) | bool
+  when: applications | get_app_conf(application_id, 'server_mode') | bool


@@ -3,6 +3,6 @@
    name: sys-stk-full-stateless
  vars:
    docker_compose_flush_handlers: true
    docker_git_repository_address: "https://github.com/kevinveenbirkenbach/roulette-wheel.git"
    docker_git_repository_pull: true
    docker_git_repository_branch: "master"


@@ -16,6 +16,8 @@
- name: "load docker, proxy for '{{ application_id }}'"
  include_role:
    name: sys-stk-full-stateless
+  vars:
+    application_id: "web-app-sphinx"
# Hack because it wasn't possible to fix an handler bug in pkgmgr install
- name: „Trigger“ docker compose up


@@ -1,71 +0,0 @@
# tests/integration/test_no_module_redirections_in_logs.py
import os
import glob
import re
import unittest
from collections import defaultdict
REDIR_RE = re.compile(r"redirecting \(type: modules\)\s+(\S+)\s+to\s+(\S+)", re.IGNORECASE)
class ModuleRedirectionLogTest(unittest.TestCase):
"""
Fail if logs/*.log contains Ansible module redirections like:
'redirecting (type: modules) ansible.builtin.pacman to community.general.pacman'
Rationale: These lookups add overhead and clutter logs. Use fully-qualified
collection names directly in tasks to improve performance and clarity.
"""
def test_no_module_redirections(self):
project_root = os.path.abspath(os.path.join(os.path.dirname(__file__), "..", ".."))
log_glob = os.path.join(project_root, "logs", "*.log")
files = sorted(glob.glob(log_glob))
if not files:
self.skipTest(f"No log files found at {log_glob}")
hits = []
mappings = defaultdict(int)
for path in files:
try:
with open(path, "r", encoding="utf-8", errors="replace") as fh:
for lineno, line in enumerate(fh, 1):
m = REDIR_RE.search(line)
if m:
src, dst = m.group(1), m.group(2)
hits.append((path, lineno, src, dst, line.strip()))
mappings[(src, dst)] += 1
except OSError as e:
self.fail(f"Cannot read log file {path}: {e}")
if hits:
# Build helpful failure message
suggestions = []
regex_hints = []
for (src, dst), count in sorted(mappings.items(), key=lambda x: (-x[1], x[0])):
suggestions.append(f"- Replace '{src}' with '{dst}' in your tasks ({count} occurrences).")
# Create VS Code regex for finding these in YAML
src_name = re.escape(src.split('.')[-1]) # only short module name
regex_hints.append(f"(?<!{re.escape(dst.rsplit('.',1)[0])}\\.){src_name}:")
examples = []
for i, (path, lineno, src, dst, text) in enumerate(hits[:10], 1):
examples.append(f"{i:02d}. {path}:{lineno}: {text}")
msg = (
f"Found {len(hits)} Ansible module redirections in logs/*.log.\n"
f"These slow down execution and clutter logs. "
f"Use fully-qualified module names to avoid runtime redirection.\n\n"
f"Suggested replacements:\n"
+ "\n".join(suggestions)
+ "\n\nExamples:\n"
+ "\n".join(examples)
+ "\n\nVS Code regex to find each occurrence in your code:\n"
+ "\n".join(f"- {hint}" for hint in sorted(set(regex_hints)))
+ "\n\nExample fix:\n"
f" # Instead of:\n"
f" pacman:\n"
f" # Use:\n"
f" community.general.pacman:\n"
)
self.fail(msg)


@@ -234,8 +234,8 @@ class TestRunAnsiblePlaybook(unittest.TestCase):
            "Expected 'make messy-build' when skip_build=False",
        )
        self.assertTrue(
-            any(call == ["make", "messy-test"] for call in calls),
-            "Expected 'make messy-test' when skip_tests=False",
+            any(call == ["make", "test-messy"] for call in calls),
+            "Expected 'make test-messy' when skip_tests=False",
        )
        self.assertTrue(
            any(
@@ -330,7 +330,7 @@ class TestRunAnsiblePlaybook(unittest.TestCase):
        # No cleanup, no build, no tests, no inventory validation
        self.assertFalse(any(call == ["make", "clean"] for call in calls))
        self.assertFalse(any(call == ["make", "messy-build"] for call in calls))
-        self.assertFalse(any(call == ["make", "messy-test"] for call in calls))
+        self.assertFalse(any(call == ["make", "test-messy"] for call in calls))
        self.assertFalse(
            any(
                isinstance(call, list)


@@ -10,7 +10,7 @@ import subprocess
class TestGenerateDefaultApplications(unittest.TestCase):
    def setUp(self):
        # Path to the generator script under test
-        self.script_path = Path(__file__).resolve().parents[5] / "cli" / "build" / "defaults" / "applications.py"
+        self.script_path = Path(__file__).resolve().parents[4] / "cli" / "setup" / "applications.py"
        # Create temp role structure
        self.temp_dir = Path(tempfile.mkdtemp())
        self.roles_dir = self.temp_dir / "roles"
@@ -32,7 +32,7 @@ class TestGenerateDefaultApplications(unittest.TestCase):
        shutil.rmtree(self.temp_dir)
    def test_script_generates_expected_yaml(self):
-        script_path = Path(__file__).resolve().parent.parent.parent.parent.parent.parent / "cli/build/defaults/applications.py"
+        script_path = Path(__file__).resolve().parent.parent.parent.parent.parent.parent / "cli/setup/applications.py"
        result = subprocess.run(
            [


@@ -45,7 +45,7 @@ class TestGenerateDefaultApplicationsUsers(unittest.TestCase):
        When a users.yml exists with defined users, the script should inject a 'users'
        mapping in the generated YAML, mapping each username to a Jinja2 reference.
        """
-        script_path = Path(__file__).resolve().parents[5] / "cli" / "build/defaults/applications.py"
+        script_path = Path(__file__).resolve().parents[4] / "cli" / "setup/applications.py"
        result = subprocess.run([
            "python3", str(script_path),
            "--roles-dir", str(self.roles_dir),


@@ -7,7 +7,7 @@ import yaml
from collections import OrderedDict
# Add cli/ to import path
-sys.path.insert(0, os.path.abspath(os.path.join(os.path.dirname(__file__), "../../../../..", "cli/build/defaults/")))
+sys.path.insert(0, os.path.abspath(os.path.join(os.path.dirname(__file__), "../../../../..", "cli/setup/")))
import users
@@ -159,7 +159,7 @@ class TestGenerateUsers(unittest.TestCase):
        out_file = tmpdir / "users.yml"
        # Resolve script path like in other tests (relative to repo root)
-        script_path = Path(__file__).resolve().parents[5] / "cli" / "build" / "defaults" / "users.py"
+        script_path = Path(__file__).resolve().parents[4] / "cli" / "setup" / "users.py"
        # Run generator
        result = subprocess.run(
@@ -215,7 +215,7 @@ class TestGenerateUsers(unittest.TestCase):
            yaml.safe_dump({"users": users_map}, f)
        out_file = tmpdir / "users.yml"
-        script_path = Path(__file__).resolve().parents[5] / "cli" / "build" / "defaults" / "users.py"
+        script_path = Path(__file__).resolve().parents[5] / "cli" / "setup" / "users.py"
        # First run
        r1 = subprocess.run(
@@ -303,7 +303,7 @@ class TestGenerateUsers(unittest.TestCase):
        )
        out_file = tmpdir / "users.yml"
-        script_path = Path(__file__).resolve().parents[5] / "cli" / "build" / "defaults" / "users.py"
+        script_path = Path(__file__).resolve().parents[5] / "cli" / "setup" / "users.py"
        result = subprocess.run(
            [


@@ -69,7 +69,7 @@ class TestMainHelpers(unittest.TestCase):
        """
        available = [
            (None, "deploy"),
-            ("build/defaults", "users"),
+            ("setup", "users"),
        ]
        main.show_full_help_for_all("/fake/cli", available)