Mirror of https://github.com/kevinveenbirkenbach/computer-playbook.git
Synced 2025-04-22 16:02:24 +02:00

Compare commits

No commits in common. "9107c1926a402f861536f14be756f4c53e5b2f5f" and "d1d19830b0beee69bd449a8436a095b81d2d0ee1" have entirely different histories.

9107c1926a...d1d19830b0
@@ -1,4 +1,4 @@
-# Features
+# Features 🚀
 
 **CyMaIS - Cyber Master Infrastructure Solution** revolutionizes IT infrastructure management, making it simpler, safer, and more adaptable for businesses of all sizes. Here’s how it can benefit your organization:
@@ -2,7 +2,33 @@
 
 CyMaIS is designed with security in mind. However, while following our guidelines can greatly improve your system’s security, no IT system can be 100% secure. Please report any vulnerabilities as soon as possible.
 
-Additional to the user securitry guidelines administrators have additional responsibilities to secure the entire system:
+---
+
+## For End Users
+
+For optimal personal security, we **strongly recommend** the following:
+
+- **Use a Password Manager**
+  Use a reliable password manager such as [KeePass](https://keepass.info/) 🔐. (Learn more about [password managers](https://en.wikipedia.org/wiki/Password_manager) on Wikipedia.) KeePass is available for both smartphones and PCs, and it can automatically generate strong, random passwords.
+
+- **Enable Two-Factor Authentication (2FA)**
+  Always enable 2FA whenever possible. Many password managers (like KeePass) can generate [TOTP](https://en.wikipedia.org/wiki/Time-based_One-Time_Password) tokens, adding an extra layer of security even if your password is compromised.
+  Synchronize your password database across devices using the [Nextcloud Client](https://nextcloud.com/) 📱💻.
+
+- **Use Encrypted Systems**
+  We recommend running CyMaIS only on systems with full disk encryption. For example, Linux distributions such as [Manjaro](https://manjaro.org/) (based on ArchLinux) with desktop environments like [GNOME](https://en.wikipedia.org/wiki/GNOME) provide excellent security. (Learn more about [disk encryption](https://en.wikipedia.org/wiki/Disk_encryption) on Wikipedia.)
+
+- **Beware of Phishing and Social Engineering**
+  Always verify email senders, avoid clicking on unknown links, and never share your passwords or 2FA codes with anyone. (Learn more about [Phishing](https://en.wikipedia.org/wiki/Phishing) and [Social Engineering](https://en.wikipedia.org/wiki/Social_engineering_(security)) on Wikipedia.)
+
+Following these guidelines will significantly enhance your personal security—but remember, no system is completely immune to risk.
+
+A tutorial on how to set up secure password management can be found [here](https://blog.veen.world/blog/2025/04/04/%f0%9f%9b%a1%ef%b8%8f-keepassxc-cymais-cloud-the-ultimate-guide-to-cross-device-password-security/)
+---
+
+## For Administrators
+
+Administrators have additional responsibilities to secure the entire system:
 
 - **Deploy on an Encrypted Server**
   It is recommended to install CyMaIS on an encrypted server to prevent hosting providers from accessing end-user data. For a practical guide on setting up an encrypted server, refer to the [Hetzner Arch LUKS repository](https://github.com/kevinveenbirkenbach/hetzner-arch-luks) 🔐. (Learn more about [disk encryption](https://en.wikipedia.org/wiki/Disk_encryption) on Wikipedia.)
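The encrypted-server recommendation above can be sanity-checked directly on the target host. A minimal sketch, assuming a standard LUKS setup like the one the Hetzner Arch LUKS guide produces (device names are illustrative only):

```bash
# List block devices with their filesystem signatures;
# encrypted volumes show up with FSTYPE crypto_LUKS.
lsblk -f

# Inspect the LUKS header of the (assumed) encrypted root partition.
# /dev/sda2 is a placeholder; substitute your actual device.
sudo cryptsetup luksDump /dev/sda2
```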
@@ -1,4 +1,4 @@
-# User Guide
+# User Guide 📖
 
 Welcome to **CyMaIS**! This guide is designed for **end-users** who want to use cloud services, email, and collaboration tools securely and efficiently. Whether you're an **enterprise user** or an **individual**, CyMaIS provides a wide range of services tailored to your needs.
@@ -1,4 +1,4 @@
-# Customer Guide
+# Customer Guide 📋
 
 Are you looking for a **reliable IT infrastructure** for your business or organization? **CyMaIS** is here to help!
@@ -1,12 +1,12 @@
-# Administrator Guide
+# Administrator Guide 🖥️
 
 This guide is for **system administrators** who are deploying and managing CyMaIS infrastructure.
 
 ## Setting Up CyMaIS 🏗️
 Follow these guides to install and configure CyMaIS:
-- [Setup Guide](SETUP_GUIDE.md)
-- [Configuration Guide](CONFIGURATION.md)
-- [Deployment Guide](DEPLOY.md)
+- [Setup Guide](07_SETUP_GUIDE.md)
+- [Configuration Guide](08_CONFIGURATION.md)
+- [Deployment Guide](09_DEPLOY.md)
 
 ## Key Responsibilities 🔧
 - **User Management** - Configure LDAP, Keycloak, and user permissions.
@@ -7,8 +7,8 @@ Explore CyMaIS Solutions
 ------------------------
 CyMaIS offers various solutions for IT infrastructure automation. Learn more about the available applications:
 
-- :doc:`../../../roles/application_glosar`
-- :doc:`../../../roles/application_categories`
+- :doc:`roles/application_glosar`
+- :doc:`roles/application_categories`
 
 For Developers
 --------------
@@ -18,7 +18,7 @@ Understanding Ansible Roles
 
 CyMaIS is powered by **Ansible** roles to automate deployments. Developers can explore the technical details of our roles here:
 
-- :doc:`../../../roles/ansible_role_glosar`
+- :doc:`roles/ansible_role_glosar`
 
 Contributing to CyMaIS
 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
@@ -37,8 +37,8 @@ Contribution Guidelines
 
 For detailed guidelines, refer to:
 
-- :doc:`../../../CONTRIBUTING`
-- :doc:`../../../CODE_OF_CONDUCT`
+- :doc:`CONTRIBUTING`
+- :doc:`CODE_OF_CONDUCT`
 
 Community & Support
 -------------------
@@ -1,4 +1,4 @@
-# Enterprise Solutions
+# Enterprise Solutions 🏢
 
 **CyMaIS** provides powerful **enterprise-grade IT infrastructure solutions**, enabling businesses to scale securely and efficiently.
@@ -1,4 +1,4 @@
-# Investor Guide
+# Investor Information 💰
 
 🚀 **CyMaIS is seeking investors** to expand its reach and continue development. With an increasing demand for automated IT solutions, **CyMaIS has the potential to revolutionize IT infrastructure management.**
@@ -1,4 +1,4 @@
-# Contact
+# Author
 
 <img src="https://cybermaster.space/wp-content/uploads/sites/7/2023/11/FVG_8364BW-scaled.jpg" width="300" style="float: right; margin-left: 30px;">
README.md (10 changed lines)

@@ -11,11 +11,9 @@ Welcome to **CyMaIS (Cyber Master Infrastructure Solution)**, a powerful automat
 CyMaIS leverages **Docker, Linux, and Ansible** to provide an automated and modular infrastructure solution. With more than **150 pre-configured roles**, it supports a wide range of applications, from cloud services to local server management and desktop workstation setups.
 
 ## Guides 📖
-- **[User Guide](docs/guides/user/Readme.md)** - For end-users accessing cloud apps like Nextcloud, Matrix, and more.
-- **[Administrator Guide](docs/guides/administrator/Readme.md)** - For system administrators deploying CyMaIS.
-- **[Customer Guide](docs/guides/customer/Readme.md)** - For customers which are interested in an infrastructure setup
-- **[Developer Guide](docs/guides/developer/index)** - For developers which are interested in participating
-- **[Investor Guide](docs/guides/investor/Readme.md)** - For investors which like to get a share in the project
+- **[User Guide](04_USER_GUIDE.md)** - For end-users accessing cloud apps like Nextcloud, Matrix, and more.
+- **[Administrator Guide](06_ADMINISTRATOR_GUIDE.md)** - For system administrators deploying CyMaIS.
+- **[Customer Guide](05_CUSTOMER_GUIDE.md)** - For customers who are interested in an infrastructure setup
 
 ## Key Features 🎯
 - **Automated IT deployment** 📦 - Pre-built roles for server and PC setups
@@ -24,7 +22,7 @@ CyMaIS leverages **Docker, Linux, and Ansible** to provide an automated and modu
 - **Backup & recovery solutions** 💾 - Automate data security and prevent loss
 - **Infrastructure monitoring & maintenance** 📊 - Keep your system running optimally
 
-More informations about the features you will find [here](docs/overview/Features.md).
+More information about the features can be found [here](01_FEATURES.md).
 
 ## Get Started 🚀
 1. **Install CyMaIS** via [Kevin's Package Manager](https://github.com/kevinveenbirkenbach/package-manager)
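A minimal sketch of that first step, assuming pkgmgr follows the usual install/shell subcommand pattern; only `pkgmgr shell cymais` is attested elsewhere in this changeset, so `pkgmgr install` is an assumption:

```bash
pkgmgr install cymais   # assumed subcommand: fetch CyMaIS via Kevin's Package Manager
pkgmgr shell cymais     # attested in docs/README.md: open a shell in the CyMaIS environment
```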
docs/.gitignore (new file, vendored, 8 lines)

assets/img/*
!assets/img/.gitkeep
output/*
!output/.gitkeep
generated/*
!generated/.gitkeep
requirements/*
!requirements/.gitkeep
docs/Dockerfile (new file, 41 lines)

ARG DOCKER_PYTHON_VERSION
FROM python:${DOCKER_PYTHON_VERSION}

ARG SPHINX_SOURCE_DIR
ARG SPHINX_OUTPUT_DIR
ARG SPHINX_EXEC_DIR
ARG SPHINX_DOCKER_EXEC_DIR
ARG SPHINX_SOURCE_DIR_RELATIVE

# Set the environment variables so they are available during build for Makefile
ENV SPHINX_SOURCE_DIR=${SPHINX_SOURCE_DIR}
ENV SPHINX_OUTPUT_DIR=${SPHINX_OUTPUT_DIR}
ENV SPHINX_REQUIREMENTS_DIR=${SPHINX_EXEC_DIR}/requirements


# Set the working directory
WORKDIR ${SPHINX_DOCKER_EXEC_DIR}

# Update and install make
RUN apt-get update && apt-get install -y make

# Copy the project files into the container
COPY ${SPHINX_SOURCE_DIR_RELATIVE} ${SPHINX_DOCKER_EXEC_DIR}

# Build the requirement files
RUN cd ${SPHINX_EXEC_DIR} && make extract-requirements

# Install required packages
RUN xargs -a ${SPHINX_REQUIREMENTS_DIR}/apt.txt apt-get install -y

# Install Python packages via requirements.txt
RUN pip install --upgrade pip && pip install -r ${SPHINX_REQUIREMENTS_DIR}/pip.txt

# Build the HTML documentation using Sphinx with the defined directories
RUN cd ${SPHINX_EXEC_DIR} && make html

# Expose port 8000 where the HTTP server will run
EXPOSE 8000

# Start a simple HTTP server to serve the built documentation
CMD python -m http.server 8000 --directory "${SPHINX_OUTPUT_DIR}html/"
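The Dockerfile above is parameterized entirely through build args. A hypothetical build-and-serve invocation with illustrative values for the directory args; only the Python image tag `3.9-slim` is attested (in the `defaults_applications` hunk at the end of this changeset), and the output dir keeps its trailing slash because the CMD concatenates `html/` onto it:

```bash
docker build \
  --build-arg DOCKER_PYTHON_VERSION=3.9-slim \
  --build-arg SPHINX_SOURCE_DIR=/app \
  --build-arg SPHINX_OUTPUT_DIR=/app/docs/output/ \
  --build-arg SPHINX_EXEC_DIR=/app/docs \
  --build-arg SPHINX_DOCKER_EXEC_DIR=/app \
  --build-arg SPHINX_SOURCE_DIR_RELATIVE=. \
  -f docs/Dockerfile -t cymais-docs .

docker run --rm -p 8000:8000 cymais-docs   # docs served at http://localhost:8000
```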
docs/Makefile (new file, 80 lines)

# PARAMETER (with default values)

# Directory which contains the Makefile
SPHINX_EXEC_DIR ?= .

# Directory from which the sources will be read
SPHINX_SOURCE_DIR ?= ../

# Directory which contains the built files
SPHINX_OUTPUT_DIR ?= ./output

# Args passed to the sphinx-build command
SPHINXOPTS ?= -c $(SPHINX_EXEC_DIR)

# CONSTANTS

# Sphinx build command
SPHINX_BUILD_COMMAND = sphinx-build

# Directory which contains the auto generated files
SPHINX_GENERATED_DIR = $(SPHINX_OUTPUT_DIR)/../generated

# Directory which contains the extracted requirement files
SPHINX_REQUIREMENTS_DIR = $(SPHINX_EXEC_DIR)/requirements

.PHONY: help install copy-images apidoc remove-generated html generate extract-requirements Makefile

extract-requirements:
	@echo "Creating requirement files"
	bash ./scripts/extract-requirements.sh "$(SPHINX_EXEC_DIR)/requirements.yml" "$(SPHINX_REQUIREMENTS_DIR)/apt.txt" "$(SPHINX_REQUIREMENTS_DIR)/pip.txt"

# Copy images before running any Sphinx command (except for help)
copy-images:
	@echo "Copying images from ../assets/img/ to ./assets/img/..."
	cp -vr ../assets/img/* ./assets/img/

# Generate reStructuredText files from Python modules using sphinx-apidoc
generate-apidoc:
	@echo "Running sphinx-apidoc..."
	sphinx-apidoc -f -o $(SPHINX_GENERATED_DIR)/modules $(SPHINX_SOURCE_DIR)

generate-yaml-index:
	@echo "Generating YAML index..."
	python generators/yaml_index.py --source-dir $(SPHINX_SOURCE_DIR) --output-file $(SPHINX_GENERATED_DIR)/yaml_index.rst

generate-ansible-roles:
	@echo "Generating Ansible roles documentation..."
	python generators/ansible_roles.py --roles-dir $(SPHINX_SOURCE_DIR)/roles --output-dir $(SPHINX_GENERATED_DIR)/roles
	@echo "Generating Ansible roles index..."
	python generators/index.py --roles-dir generated/roles --output-file $(SPHINX_SOURCE_DIR)/roles/ansible_role_glosar.rst --caption "Ansible Role Glosar"

generate-readmes:
	@echo "Creating required README.md files for the index..."
	python generators/readmes.py --generated-dir ./$(SPHINX_GENERATED_DIR)

generate: generate-apidoc generate-yaml-index generate-ansible-roles generate-readmes


remove-generated:
	@echo "Removing generated files..."
	- find $(SPHINX_GENERATED_DIR)/ -type f ! -name '.gitkeep' -delete

help:
	@$(SPHINX_BUILD_COMMAND) -M help "$(SPHINX_SOURCE_DIR)" "$(SPHINX_OUTPUT_DIR)" $(SPHINXOPTS) $(O)

html: copy-images generate
	@echo "Building Sphinx documentation..."
	$(SPHINX_BUILD_COMMAND) -M html "$(SPHINX_SOURCE_DIR)" "$(SPHINX_OUTPUT_DIR)" $(SPHINXOPTS)

just-html:
	@$(SPHINX_BUILD_COMMAND) -M html "$(SPHINX_SOURCE_DIR)" "$(SPHINX_OUTPUT_DIR)" $(SPHINXOPTS)


clean: remove-generated
	@$(SPHINX_BUILD_COMMAND) -M clean "$(SPHINX_SOURCE_DIR)" "$(SPHINX_OUTPUT_DIR)" $(SPHINXOPTS) $(O)

# Catch-all target: route all unknown targets to Sphinx using the new
# "make mode" option. $(O) is meant as a shortcut for $(SPHINXOPTS).
%: Makefile
	@$(SPHINX_BUILD_COMMAND) -M $@ "$(SPHINX_SOURCE_DIR)" "$(SPHINX_OUTPUT_DIR)" $(SPHINXOPTS) $(O)
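Every parameter above has a default, so a plain `make html` works from `docs/`; overrides go on the command line. The values below simply spell out the defaults explicitly:

```bash
# Build the docs, overriding (here: restating) the Makefile's default parameters.
make html SPHINX_SOURCE_DIR=../ SPHINX_OUTPUT_DIR=./output SPHINX_EXEC_DIR=.
```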
docs/README.md (new file, 56 lines)

# Documentation

CyMaIS uses [Sphinx](https://www.sphinx-doc.org/) to automatically generate its documentation and leverages the [Awesome Sphinx Theme](https://sphinxawesome.xyz/) for a sleek and responsive design. Enjoy a seamless, visually engaging experience 🚀✨.

## For Users

You can access the documentation [here](https://docs.cymais.cloud/) 🔗. Browse the latest updates and guides to get started.

## For Administrators

### Setup

#### On Localhost

To generate the documentation locally, run the following command:

```bash
pkgmgr shell cymais -c "make refresh"
```

This command performs the following steps:
- **Copy Images:** Before building, it copies the necessary image assets from `../assets/img/` to `./assets/img/` using the `copy-images` target.
- **Generate API Documentation:** It executes `sphinx-apidoc` (via the `generate-apidoc` target) to automatically generate reStructuredText files for all Python modules. These files are stored under a designated directory (e.g., `modules`), ensuring that every Python file is included in the documentation.
- **Build HTML Documentation:** Finally, it builds the HTML documentation using `sphinx-build` (triggered by the `html` target).

Once complete, you can view the documentation at the output location (e.g., [templates/html/index.html](templates/html/index.html)) 👀💻.

#### On Server

The same commands can be used on the server to ensure that documentation is always up to date. Make sure the server environment is properly configured with the necessary Python packages and assets.

### Additional Commands

- **`make copy-images`:**
  Copies image files from the assets directory into the local documentation directory. This ensures that all required images are available for the generated documentation.

- **`make generate-apidoc`:**
  Runs `sphinx-apidoc` to scan all Python files in the source directory and generate corresponding reStructuredText files. This automates the inclusion of all Python modules into the Sphinx documentation.

- **`make html`:**
  This target depends on the `copy-images` and `generate` targets. It first generates the API documentation and then builds the HTML documentation using `sphinx-build`. This is the standard target to produce the final, viewable documentation.

- **`make refresh`:**
  A custom target (typically defined as a combination of cleaning the previous build and then running `make html`) that ensures the documentation is regenerated from scratch with the latest changes.

### Debug

To debug and produce a log file, execute:

```bash
pkgmgr shell cymais -c "make refresh SPHINXOPTS='-v -c .' 2>&1 | tee debug.log"
```

This command increases the verbosity of the Sphinx build process and redirects all output to `debug.log`, which is useful for troubleshooting any issues during the documentation build.
docs/assets/img/.gitkeep (new empty file)

docs/assets/js/current-nav.js (new file, 102 lines)

document.addEventListener("DOMContentLoaded", function() {
  // Initialization: wait for window load and then trigger current nav detection.
  window.addEventListener("load", function() {
    console.log("Window loaded, initializing current nav...");
    initCurrentNav();
  });

  // Re-trigger when the hash changes.
  window.addEventListener("hashchange", function() {
    console.log("Hash changed, reinitializing current nav...");
    initCurrentNav();
  });

  function initCurrentNav() {
    // If Alpine.js is available and provides nextTick, use it.
    if (window.Alpine && typeof window.Alpine.nextTick === 'function') {
      window.Alpine.nextTick(processNav);
    } else {
      processNav();
    }
  }

  function processNav() {
    var currentHash = window.location.hash;
    console.log("initCurrentNav: Current hash:", currentHash);
    if (!currentHash) return;

    // Select all internal links within the .current-index container.
    var links = document.querySelectorAll('.current-index a.reference.internal');
    links.forEach(function(link) {
      var href = link.getAttribute("href");
      console.log("initCurrentNav: Checking link:", href);
      // If the link is hash-only (e.g. "#setup-guide")
      if (href && href.trim().startsWith("#")) {
        if (href.trim() === currentHash.trim()) {
          console.log("initCurrentNav: Match found for hash-only link:", href);
          document.querySelectorAll('.current-index a.reference.internal.current').forEach(function(link) {
            link.classList.remove("current");
          });
          link.classList.add("current");
          markAsCurrent(link);
        }
      }
      // Otherwise, if the link includes a file and a hash, compare the hash part.
      else if (href && href.indexOf('#') !== -1) {
        var parts = href.split('#');
        var linkHash = "#" + parts[1].trim();
        console.log("initCurrentNav: Extracted link hash:", linkHash);
        if (linkHash === currentHash.trim()) {
          console.log("initCurrentNav: Match found for link with file and hash:", href);
          markAsCurrent(link);
        }
      }
      else {
        console.log("initCurrentNav: No match for link:", href);
      }
    });

    // After processing links, open submenus only for those li elements marked as current.
    openCurrentSubmenus();
  }

  // Mark the link's parent li and all its ancestor li elements as current.
  function markAsCurrent(link) {
    var li = link.closest("li");
    if (!li) {
      console.log("markAsCurrent: No parent li found for link:", link);
      return;
    }
    li.classList.add("current");
    console.log("markAsCurrent: Marked li as current:", li);
    // If Alpine.js is used, set its "expanded" property to true.
    if (li.__x && li.__x.$data) {
      li.__x.$data.expanded = true;
      console.log("markAsCurrent: Set Alpine expanded on li:", li);
    }
    // Propagate upward: mark all ancestor li elements as current.
    var parentLi = li.parentElement.closest("li");
    while (parentLi) {
      parentLi.classList.add("current");
      if (parentLi.__x && parentLi.__x.$data) {
        parentLi.__x.$data.expanded = true;
      }
      console.log("markAsCurrent: Propagated current to ancestor li:", parentLi);
      parentLi = parentLi.parentElement.closest("li");
    }
  }

  // Open immediate submenu elements (the direct children with x-show) of li.current.
  function openCurrentSubmenus() {
    document.querySelectorAll('.current-index li.current').forEach(function(li) {
      // Only target immediate child elements that have x-show.
      li.querySelectorAll(":scope > [x-show]").forEach(function(elem) {
        if (elem.style.display === "none" || elem.style.display === "") {
          elem.style.display = "block";
          console.log("openCurrentSubmenus: Opened submenu element:", elem);
        }
      });
    });
  }
  window.initCurrentNav = initCurrentNav;
});
docs/conf.py (new file, 108 lines)

import sys
import logging

# Check if a verbose flag is present in the command line arguments.
if any(arg in sys.argv for arg in ["-v", "--verbose"]):
    logging_level = logging.DEBUG
else:
    logging_level = logging.INFO

logging.basicConfig(level=logging_level)

import os
sys.path.insert(0, os.path.abspath('.'))

project = 'CyMaIS - Cyber Master Infrastructure Solution'
copyright = '2025, Kevin Veen-Birkenbach'
author = 'Kevin Veen-Birkenbach'

# Highlighting for Jinja
from sphinx.highlighting import lexers
from pygments.lexers.templates import DjangoLexer

lexers['jinja'] = DjangoLexer()
lexers['j2'] = DjangoLexer()

# -- General configuration ---------------------------------------------------
templates_path = ['templates']
exclude_patterns = [
    'docs/build',
    'venv',
    'venv/**'
]

# -- Options for HTML output -------------------------------------------------
html_theme = 'sphinxawesome_theme'
html_static_path = ['assets']

html_sidebars = {
    '**': [
        'logo.html',
        'structure.html',  # Include your custom template
    ]
}

cymais_logo = "assets/img/logo.png"
html_favicon = "assets/img/favicon.ico"

html_theme_options = {
    "show_prev_next": False,
    "logo_light": cymais_logo,
    "logo_dark": cymais_logo,
}

source_suffix = {
    '.md': 'markdown',
    '.rst': 'restructuredtext',
    '.yml': 'restructuredtext',
    '.yaml': 'restructuredtext',
}

sys.path.insert(0, os.path.abspath('./extensions'))

extensions = [
    #'sphinx.ext.autosummary',
    'myst_parser',
    'extensions.local_file_headings',
    'extensions.local_subfolders',
    'extensions.roles_overview',
    'extensions.markdown_include',
    'sphinx.ext.autodoc',
    'sphinx.ext.napoleon',
]

autosummary_generate = True

myst_enable_extensions = [
    "colon_fence",
]

from docutils import nodes

logger = logging.getLogger(__name__)

def replace_assets_in_doctree(app, doctree, docname):
    # Replace asset references in image nodes
    for node in doctree.traverse(nodes.image):
        if "assets/" in node['uri']:
            new_uri = node['uri'].replace("assets/", "_static/")
            node['uri'] = new_uri
            logger.info("Replaced image URI in {}: {}".format(docname, new_uri))

    # Replace asset references in raw HTML nodes
    for node in doctree.traverse(nodes.raw):
        if node.get('format') == 'html' and "assets/" in node.astext():
            new_text = node.astext().replace("assets/", "_static/")
            node.children = [nodes.raw('', new_text, format='html')]
            logger.info("Replaced raw HTML assets in {}.".format(docname))

def setup(app):
    app.connect("doctree-resolved", replace_assets_in_doctree)

    python_domain = app.registry.domains.get('py')
    if python_domain is not None:
        directive = python_domain.directives.get('currentmodule')
        if directive is not None:
            directive.optional_arguments = 10
    return {'version': '1.0', 'parallel_read_safe': True}
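The verbosity probe at the top of conf.py keys off sphinx-build's own command line, so a single flag controls both layers. For example, run from `docs/`:

```bash
# -v raises sphinx-build's verbosity AND flips this conf.py to DEBUG logging;
# -c . makes sphinx-build read this conf.py from the current directory.
sphinx-build -M html ../ ./output -c . -v
```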
docs/extensions/__init__.py (new empty file)

docs/extensions/local_file_headings.py (new file, 61 lines)

import os
import sys
import logging as std_logging  # Use the standard logging module
from sphinx.util import logging  # Sphinx logging is used elsewhere if needed
from docutils.parsers.rst import Directive
from .nav_utils import natural_sort_key, extract_headings_from_file, group_headings, sort_tree, MAX_HEADING_LEVEL, DEFAULT_MAX_NAV_DEPTH

# Set up our logger based on command-line args.
logger = std_logging.getLogger(__name__)
if any(arg in sys.argv for arg in ["-v", "--verbose"]):
    logger.setLevel(std_logging.DEBUG)
else:
    logger.setLevel(std_logging.INFO)

DEFAULT_MAX_NAV_DEPTH = 4

def add_local_file_headings(app, pagename, templatename, context, doctree):
    logger.debug("add_local_file_headings called with pagename: %s", pagename)

    srcdir = app.srcdir
    directory = os.path.dirname(pagename)
    abs_dir = os.path.join(srcdir, directory)
    if not os.path.isdir(abs_dir):
        logger.warning("Directory %s not found for page %s.", abs_dir, pagename)
        context['local_md_headings'] = []
        return

    # Get only files with .md or .rst extensions.
    files = [f for f in os.listdir(abs_dir) if f.endswith('.md') or f.endswith('.rst')]
    # If an index file is present, remove any readme files (case-insensitive).
    files_lower = [f.lower() for f in files]
    if 'index.rst' in files_lower:
        files = [f for f in files if f.lower() not in ['readme.md']]

    file_items = []
    for file in files:
        filepath = os.path.join(abs_dir, file)
        headings = extract_headings_from_file(filepath, max_level=MAX_HEADING_LEVEL)
        basename, _ = os.path.splitext(file)
        # Set priority: index gets priority 0, otherwise 1.
        priority = 0 if basename.lower() == 'index' else 1
        for heading in headings:
            file_link = os.path.join(directory, basename)
            file_items.append({
                'level': heading['level'],
                'text': heading['text'],
                'link': file_link,
                'anchor': heading['anchor'],
                'priority': priority,
                'filename': basename
            })
    tree = group_headings(file_items)
    sort_tree(tree)

    logger.debug("Generated tree: %s", tree)
    context['local_md_headings'] = tree

def setup(app):
    app.add_config_value('local_nav_max_depth', DEFAULT_MAX_NAV_DEPTH, 'env')
    app.connect('html-page-context', add_local_file_headings)
    return {'version': '0.1', 'parallel_read_safe': True}
docs/extensions/local_subfolders.py (new file, 130 lines)

import os
from sphinx.util import logging
from .nav_utils import extract_headings_from_file, MAX_HEADING_LEVEL

logger = logging.getLogger(__name__)

CANDIDATES = ['index.rst', 'readme.md', 'main.rst']

def collect_folder_tree(dir_path, base_url):
    """
    Recursively collects the folder tree starting from the given directory.

    For each folder:
    - Hidden folders (names starting with a dot) are skipped.
    - A folder is processed only if it contains one of the representative files:
      index.rst, readme.md, or main.rst.
    - The first heading of the representative file is used as the folder title.
    - The representative file is not listed as a file in the folder.
    - All other Markdown and reStructuredText files are listed without sub-headings,
      using their first heading as the file title.
    """
    # Skip hidden directories
    if os.path.basename(dir_path).startswith('.'):
        return None

    # List all files in the current directory with .md or .rst extension
    files = [f for f in os.listdir(dir_path)
             if os.path.isfile(os.path.join(dir_path, f))
             and (f.endswith('.md') or f.endswith('.rst'))]

    # Find representative file for folder title using index or readme
    rep_file = None
    for candidate in CANDIDATES:
        for f in files:
            if f.lower() == candidate:
                rep_file = f
                break
        if rep_file:
            break

    # Skip this folder if no representative file exists
    if not rep_file:
        return None

    rep_path = os.path.join(dir_path, rep_file)
    headings = extract_headings_from_file(rep_path, max_level=MAX_HEADING_LEVEL)
    folder_title = headings[0]['text'] if headings else os.path.basename(dir_path)
    folder_link = os.path.join(base_url, os.path.splitext(rep_file)[0])

    # Remove the representative file from the list to avoid duplication,
    # and filter out any additional "readme.md" or "index.rst" files.
    files.remove(rep_file)
    files = [f for f in files if f.lower() not in CANDIDATES]

    # Process the remaining files in the current directory
    file_items = []
    for file in sorted(files, key=lambda s: s.lower()):
        file_path = os.path.join(dir_path, file)
        file_headings = extract_headings_from_file(file_path, max_level=MAX_HEADING_LEVEL)
        file_title = file_headings[0]['text'] if file_headings else file
        file_base = os.path.splitext(file)[0]
        file_link = os.path.join(base_url, file_base)
        file_items.append({
            'level': 1,
            'text': file_title,
            'link': file_link,
            'anchor': '',
            'priority': 1,
            'filename': file
        })

    # Process subdirectories (ignoring hidden ones)
    dir_items = []
    for item in sorted(os.listdir(dir_path), key=lambda s: s.lower()):
        full_path = os.path.join(dir_path, item)
        if os.path.isdir(full_path) and not item.startswith('.'):
            subtree = collect_folder_tree(full_path, os.path.join(base_url, item))
            if subtree:
                dir_items.append(subtree)

    # Combine files and subdirectories as children of the current folder
    children = file_items + dir_items

    return {
        'text': folder_title,
        'link': folder_link,
        'children': children,
        'filename': os.path.basename(dir_path)
    }

def mark_current(node, active):
    """
    Recursively mark nodes as current if the active page (pagename)
    matches the node's link or is a descendant of it.

    The function sets node['current'] = True if:
    - The node's link matches the active page exactly, or
    - The active page begins with the node's link plus a separator (indicating a child).
    Additionally, if any child node is current, the parent is marked as current.
    """
    is_current = False
    node_link = node.get('link', '').rstrip('/')
    active = active.rstrip('/')
    if node_link and (active == node_link or active.startswith(node_link + '/')):
        is_current = True

    # Recurse into children if they exist
    children = node.get('children', [])
    for child in children:
        if mark_current(child, active):
            is_current = True

    node['current'] = is_current
    return is_current

def add_local_subfolders(app, pagename, templatename, context, doctree):
    """
    Sets the 'local_subfolders' context variable with the entire folder tree
    starting from app.srcdir, and marks the tree with the 'current' flag up
    to the active page.
    """
    root_dir = app.srcdir
    folder_tree = collect_folder_tree(root_dir, '')
    if folder_tree:
        mark_current(folder_tree, pagename)
    context['local_subfolders'] = [folder_tree] if folder_tree else []

def setup(app):
    app.connect('html-page-context', add_local_subfolders)
    return {'version': '0.1', 'parallel_read_safe': True}
docs/extensions/markdown_include.py (new file, 80 lines)

import os
from docutils import nodes
from docutils.parsers.rst import Directive
from sphinx.util import logging

logger = logging.getLogger(__name__)

from myst_parser.parsers.sphinx_ import MystParser

class MarkdownIncludeDirective(Directive):
    required_arguments = 1  # Path to the Markdown file
    optional_arguments = 0
    final_argument_whitespace = True
    has_content = False

    def run(self):
        logger.info("Executing markdown-include directive")
        env = self.state.document.settings.env
        # Determine the absolute path of the file.
        rel_filename, filename = env.relfn2path(self.arguments[0])
        logger.info("Markdown file: %s", filename)
        if not os.path.exists(filename):
            error = self.state_machine.reporter.error(
                f'File not found: {filename}',
                nodes.literal_block(self.block_text, self.block_text),
                line=self.lineno)
            return [error]

        try:
            with open(filename, 'r', encoding='utf-8') as f:
                markdown_content = f.read()
        except Exception as e:
            error = self.state_machine.reporter.error(
                f'Error reading file {filename}: {e}',
                nodes.literal_block(self.block_text, self.block_text),
                line=self.lineno)
            return [error]

        # Parse the Markdown content with MystParser.
        parser = MystParser()
        from docutils.frontend import OptionParser
        from docutils.utils import new_document
        settings = OptionParser(components=(MystParser,)).get_default_values()
        # Attach the Sphinx environment to the settings so that myst_parser works.
        settings.env = self.state.document.settings.env
        doc = new_document(filename, settings=settings)
        parser.parse(markdown_content, doc)
        logger.info("Markdown parsing completed successfully")

        # Remove the first header (title) if it exists.
        if doc.children:
            first_section = doc.children[0]
            if isinstance(first_section, nodes.section) and first_section.children:
                first_child = first_section.children[0]
                if isinstance(first_child, nodes.title):
                    # If there are additional children, remove the title node.
                    if len(first_section.children) > 1:
                        first_section.pop(0)
                        logger.info("Removed first header from Markdown content")
                    else:
                        # If it's the only child, clear its content instead.
                        first_child.clear()
                        logger.info("Cleared text of first header from Markdown content")

            # Unwrap the first section if it no longer has a title.
            if isinstance(first_section, nodes.section):
                has_title = any(isinstance(child, nodes.title) and child.astext().strip()
                                for child in first_section.children)
                if not has_title:
                    # Remove the section wrapper so that its content does not create a TOC entry.
                    unwrapped = list(first_section.children)
                    # Replace the first section with its children.
                    doc.children = unwrapped + doc.children[1:]
                    logger.info("Unwrapped first section to avoid a TOC entry")

        return doc.children

def setup(app):
    app.add_directive("markdown-include", MarkdownIncludeDirective)
    return {'version': '0.1', 'parallel_read_safe': True}
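Once the extension is registered in conf.py, any .rst page can pull a Markdown file in through the directive. A hypothetical usage (the target page name is illustrative; the directive name and single path argument are exactly what `setup()` registers above):

```bash
# Append a markdown-include stanza to an existing .rst page.
cat >> some_page.rst <<'EOF'

.. markdown-include:: ../README.md
EOF
```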
docs/extensions/nav_utils.py (new file, 78 lines)

import os
import re
import yaml

DEFAULT_MAX_NAV_DEPTH = 4
MAX_HEADING_LEVEL = 0  # This can be overridden in your configuration

def natural_sort_key(text):
    return [int(c) if c.isdigit() else c.lower() for c in re.split(r'(\d+)', text)]

def extract_headings_from_file(filepath, max_level=MAX_HEADING_LEVEL):
    # If max_level is 0, set it to a very high value to effectively iterate infinitely
    if max_level == 0:
        max_level = 9999

    headings = []
    ext = os.path.splitext(filepath)[1].lower()
    try:
        with open(filepath, 'r', encoding='utf-8') as f:
            if ext == '.md':
                in_code_block = False
                for line in f:
                    if line.strip().startswith("```"):
                        in_code_block = not in_code_block
                        continue
                    if in_code_block:
                        continue
                    # Assuming markdown headings are defined with '#' characters
                    match = re.match(r'^(#{1,})(.*?)$', line)
                    if match:
                        level = len(match.group(1))
                        if level <= max_level:
                            heading_text = match.group(2).strip()
                            anchor = re.sub(r'\s+', '-', heading_text.lower())
                            anchor = re.sub(r'[^a-z0-9\-]', '', anchor)
                            headings.append({'level': level, 'text': heading_text, 'anchor': anchor})
            elif ext == '.rst':
                lines = f.readlines()
                for i in range(len(lines) - 1):
                    text_line = lines[i].rstrip("\n")
                    underline = lines[i+1].rstrip("\n")
                    if len(underline) >= 3 and re.fullmatch(r'[-=~\^\+"\'`]+', underline):
                        level = 1
                        heading_text = text_line.strip()
                        headings.append({'level': level, 'text': heading_text, 'anchor': ''})
    except Exception as e:
        print(f"Warning: Error reading {filepath}: {e}")
    if not headings:
        base = os.path.basename(filepath).lower()
        if base == 'index.rst':
            folder = os.path.dirname(filepath)
            readme_path = os.path.join(folder, 'README.md')
            if os.path.isfile(readme_path):
                try:
                    headings = extract_headings_from_file(readme_path, max_level)
                except Exception as e:
                    print(f"Warning: Error reading fallback README.md in {folder}: {e}")
    return headings

def group_headings(headings):
    tree = []
    stack = []
    for heading in headings:
        heading['children'] = []
        while stack and stack[-1]['level'] >= heading['level']:
            stack.pop()
        if stack:
            stack[-1]['children'].append(heading)
        else:
            tree.append(heading)
        stack.append(heading)
    return tree

def sort_tree(tree):
    tree.sort(key=lambda x: (x.get('priority', 1), natural_sort_key(x.get('filename', x['text']))))
    for item in tree:
        if item.get('children'):
            sort_tree(item['children'])
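A quick interactive check of the two tree helpers above, run from `docs/` (assumes PyYAML is installed, since nav_utils imports yaml; the heading dicts are toy data):

```bash
python - <<'EOF'
from extensions.nav_utils import group_headings, sort_tree

# Three flat headings from one file: one H1 followed by two H2s.
flat = [
    {'level': 1, 'text': 'Guide', 'anchor': 'guide', 'filename': 'guide'},
    {'level': 2, 'text': 'Setup', 'anchor': 'setup', 'filename': 'guide'},
    {'level': 2, 'text': 'Usage', 'anchor': 'usage', 'filename': 'guide'},
]
tree = group_headings(flat)  # nests both H2s under the H1 via the level stack
sort_tree(tree)              # sorts siblings by (priority, natural filename order)
print(tree[0]['text'], [c['text'] for c in tree[0]['children']])
# -> Guide ['Setup', 'Usage']
EOF
```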
docs/extensions/roles_overview.py (new file, 116 lines)

import os
import glob
import re
import yaml
from docutils import nodes
from sphinx.util import logging
from docutils.parsers.rst import Directive

logger = logging.getLogger(__name__)

class RolesOverviewDirective(Directive):
    """
    A directive to embed a roles overview as reStructuredText.

    It scans the roles directory (i.e. every folder under "roles") for a "meta/main.yml" file,
    reads the role's galaxy tags and description, and outputs an overview grouped by each tag.
    For each role, it attempts to extract a level-1 heading from its README.md as the title.
    If no title is found, the role folder name is used.
    The title is rendered as a clickable link to the role's README.md.
    """
    has_content = False

    def run(self):
        env = self.state.document.settings.env
        srcdir = env.srcdir
        roles_dir = os.path.join(srcdir, 'roles')
        if not os.path.isdir(roles_dir):
            logger.warning(f"Roles directory not found: {roles_dir}")
            error_node = self.state.document.reporter.error(
                "Roles directory not found.", line=self.lineno)
            return [error_node]

        # Gather role entries grouped by tag.
        categories = {}
        for role_path in glob.glob(os.path.join(roles_dir, '*')):
            if os.path.isdir(role_path):
                meta_path = os.path.join(role_path, 'meta', 'main.yml')
                if os.path.exists(meta_path):
                    try:
                        with open(meta_path, 'r', encoding='utf-8') as f:
                            data = yaml.safe_load(f)
                    except Exception as e:
                        logger.warning(f"Error reading YAML file {meta_path}: {e}")
                        continue

                    role_name = os.path.basename(role_path)
                    # Determine title from README.md if present.
                    readme_path = os.path.join(role_path, 'README.md')
                    title = role_name
                    if os.path.exists(readme_path):
                        try:
                            with open(readme_path, 'r', encoding='utf-8') as f:
                                for line in f:
                                    match = re.match(r'^#\s+(.*)$', line)
                                    if match:
                                        title = match.group(1).strip()
                                        break
                        except Exception as e:
                            logger.warning(f"Error reading README.md for {role_name}: {e}")

                    galaxy_info = data.get('galaxy_info', {})
                    tags = galaxy_info.get('galaxy_tags', [])
                    if not tags:
                        tags = ['uncategorized']
                    role_description = galaxy_info.get('description', '')
                    role_entry = {
                        'name': role_name,
                        'title': title,
                        'description': role_description,
                        'link': f'roles/{role_name}/README.md',
                        'tags': tags,
                    }
                    for tag in tags:
                        categories.setdefault(tag, []).append(role_entry)
                else:
                    logger.warning(f"meta/main.yml not found for role {role_path}")

        # Sort categories and roles alphabetically.
        sorted_categories = sorted(categories.items(), key=lambda x: x[0].lower())
        for tag, roles in sorted_categories:
            roles.sort(key=lambda r: r['name'].lower())

        # Build document structure.
        container = nodes.container()

        # For each category, create a section to serve as a large category heading.
        for tag, roles in sorted_categories:
            # Create a section for the category.
            cat_id = nodes.make_id(tag)
            category_section = nodes.section(ids=[cat_id])
            category_title = nodes.title(text=tag)
            category_section += category_title

            # For each role within the category, create a subsection.
            for role in roles:
                role_section_id = nodes.make_id(role['title'])
                role_section = nodes.section(ids=[role_section_id])
                # Create a title node with a clickable reference.
                role_title = nodes.title()
                reference = nodes.reference(text=role['title'], refuri=role['link'])
                role_title += reference
                role_section += role_title

                if role['description']:
                    para = nodes.paragraph(text=role['description'])
                    role_section += para

                category_section += role_section

            container += category_section

        return [container]

def setup(app):
    app.add_directive("roles-overview", RolesOverviewDirective)
    return {'version': '0.1', 'parallel_read_safe': True}
docs/generated/.gitkeep (new empty file)

docs/generators/ansible_roles.py (new file, 67 lines)

import os
import yaml
import argparse
import subprocess

def convert_md_to_rst(md_content):
    """Convert Markdown content to reStructuredText using Pandoc."""
    try:
        result = subprocess.run(
            ["pandoc", "-f", "markdown", "-t", "rst"],
            input=md_content.encode("utf-8"),
            capture_output=True,
            check=True
        )
        return result.stdout.decode("utf-8")
    except subprocess.CalledProcessError as e:
        print("Error converting Markdown to reStructuredText:", e)
        return md_content

def generate_ansible_roles_doc(roles_dir, output_dir):
    """Generates reStructuredText documentation for Ansible roles."""
    if not os.path.exists(output_dir):
        os.makedirs(output_dir)

    for role in os.listdir(roles_dir):
        role_path = os.path.join(roles_dir, role)
        meta_file = os.path.join(role_path, "meta/main.yml")
        readme_file = os.path.join(role_path, "README.md")

        if os.path.exists(meta_file):
            with open(meta_file, "r", encoding="utf-8") as f:
                meta_data = yaml.safe_load(f)

            role_doc = os.path.join(output_dir, f"{role}.rst")
            with open(role_doc, "w", encoding="utf-8") as f:
                # Main heading
                f.write(f"{role.capitalize()} Role\n")
                f.write("=" * (len(role) + 7) + "\n\n")

                f.write(f"**Description:** {meta_data.get('description', 'No description available')}\n\n")

                # Subheading for the variables
                f.write("Variables\n")
                f.write("---------\n\n")

                for key, value in meta_data.get('galaxy_info', {}).items():
                    f.write(f"- **{key}**: {value}\n")

                # Convert and append the README if present
                if os.path.exists(readme_file):
                    f.write("\nREADME\n")
                    f.write("------\n\n")
                    with open(readme_file, "r", encoding="utf-8") as readme:
                        markdown_content = readme.read()
                        rst_content = convert_md_to_rst(markdown_content)
                        f.write(rst_content)

    print(f"Ansible roles documentation has been generated in {output_dir}")

if __name__ == "__main__":
    parser = argparse.ArgumentParser(description="Generate documentation for Ansible roles.")
    parser.add_argument("--roles-dir", required=True, help="Directory containing Ansible roles.")
    parser.add_argument("--output-dir", required=True, help="Directory where documentation will be saved.")

    args = parser.parse_args()
    generate_ansible_roles_doc(args.roles_dir, args.output_dir)
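Standalone invocation, mirroring what the Makefile's `generate-ansible-roles` target runs (from `docs/`, with Pandoc on PATH for the README conversion; paths shown are the Makefile defaults resolved):

```bash
python generators/ansible_roles.py --roles-dir ../roles --output-dir generated/roles
```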
docs/generators/index.py (new file, 40 lines)

import os
import argparse

def generate_ansible_roles_index(roles_dir, output_file, caption: str):
    """Generates an index.rst file listing all .rst files in the given directory."""

    roles_dir = os.path.abspath(roles_dir)
    output_file = os.path.abspath(output_file)
    output_dir = os.path.dirname(output_file)

    if not os.path.exists(roles_dir):
        print(f"Error: Directory {roles_dir} does not exist.")
        return

    os.makedirs(output_dir, exist_ok=True)

    rst_files = [f for f in os.listdir(roles_dir) if f.endswith(".rst")]
    rst_files.sort()  # Sort alphabetically

    # Compute relative paths for correct linking
    rel_paths = [os.path.relpath(os.path.join(roles_dir, f), start=output_dir) for f in rst_files]

    with open(output_file, "w", encoding="utf-8") as f:
        f.write(f"{caption}\n===================\n\n")
        f.write(f".. toctree::\n   :maxdepth: 1\n   :caption: {caption}\n\n")

        for rel_path in rel_paths:
            file_name_without_ext = os.path.splitext(rel_path)[0]
            f.write(f"   {file_name_without_ext}\n")

    print(f"Index generated at {output_file}")

if __name__ == "__main__":
    parser = argparse.ArgumentParser(description="Generate an index for documentation.")
    parser.add_argument("--roles-dir", required=True, help="Directory containing .rst files.")
    parser.add_argument("--output-file", required=True, help="Path to the output index.rst file.")
    parser.add_argument("--caption", required=True, help="The index title")

    args = parser.parse_args()
    generate_ansible_roles_index(args.roles_dir, args.output_file, args.caption)
docs/generators/readmes.py (new file, 37 lines)

import os
import argparse

def create_readme_in_subdirs(generated_dir):
    """
    Creates a README.md file in each subdirectory of generated_dir.
    The README will contain a title based on the subdirectory name.
    """
    generated_dir = os.path.abspath(generated_dir)

    if not os.path.exists(generated_dir):
        print(f"Error: Directory {generated_dir} does not exist.")
        return

    for root, dirs, _ in os.walk(generated_dir):
        for subdir in dirs:
            subdir_path = os.path.join(root, subdir)
            readme_path = os.path.join(subdir_path, "README.md")

            folder_base_name = os.path.basename(subdir)

            readme_content = f"""\
# Auto Generated Technical Documentation: {folder_base_name}

This folder contains auto-generated technical role documentation for CyMaIS.
"""

            with open(readme_path, "w", encoding="utf-8") as f:
                f.write(readme_content)
            print(f"README.md created at {readme_path}")

if __name__ == "__main__":
    parser = argparse.ArgumentParser(description="Create README.md files in all subdirectories of the given directory.")
    parser.add_argument("--generated-dir", required=True, help="Path to the generated directory.")

    args = parser.parse_args()
    create_readme_in_subdirs(args.generated_dir)
docs/generators/yaml_index.py (new file, 51 lines)

import os
import argparse
import pathspec

def load_gitignore_patterns(source_dir):
    """Loads .gitignore patterns from the given source directory and returns a PathSpec object."""
    gitignore_path = os.path.join(source_dir, ".gitignore")
    if not os.path.exists(gitignore_path):
        return pathspec.PathSpec.from_lines("gitwildmatch", [])

    with open(gitignore_path, "r", encoding="utf-8") as f:
        patterns = f.readlines()

    return pathspec.PathSpec.from_lines("gitwildmatch", patterns)

def generate_yaml_index(source_dir, output_file):
    """Generates an index file listing all YAML files in the specified directory while respecting .gitignore rules."""

    yaml_files = []
    spec = load_gitignore_patterns(source_dir)  # Load .gitignore rules

    # Walk through the source directory and collect YAML files
    for root, _, files in os.walk(source_dir):
        for file in files:
            file_path = os.path.relpath(os.path.join(root, file), start=source_dir)

            if file.endswith(('.yml', '.yaml')) and not spec.match_file(file_path):
                yaml_files.append(os.path.join(root, file))

    # Create the output directory if it doesn't exist
    os.makedirs(os.path.dirname(output_file), exist_ok=True)

    # Write the YAML index to the output file
    with open(output_file, "w", encoding="utf-8") as f:
        f.write("YAML Files\n===========\n\n")
        f.write("This document lists all `.yaml` and `.yml` files found in the specified directory, excluding ignored files.\n\n")

        for file in sorted(yaml_files):
            relative_file_path = os.path.relpath(file, start=os.path.dirname(output_file))
            f.write(f".. literalinclude:: {relative_file_path}\n   :language: yaml\n   :linenos:\n\n")

    print(f"YAML index has been generated at {output_file}")

if __name__ == "__main__":
    parser = argparse.ArgumentParser(description="Generate an index for YAML files while respecting .gitignore.")
    parser.add_argument("--source-dir", required=True, help="Directory containing YAML files.")
    parser.add_argument("--output-file", required=True, help="Path to the output .rst file.")

    args = parser.parse_args()
    generate_yaml_index(args.source_dir, args.output_file)
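Standalone invocation, mirroring the Makefile's `generate-yaml-index` target (run from `docs/`; paths shown are the Makefile defaults resolved):

```bash
python generators/yaml_index.py --source-dir ../ --output-file generated/yaml_index.rst
```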
@@ -1,23 +0,0 @@
-# Security Guidelines
-
-CyMaIS is designed with security in mind. However, while following our guidelines can greatly improve your system’s security, no IT system can be 100% secure. Please report any vulnerabilities as soon as possible.
-
-For optimal personal security, we **strongly recommend** the following:
-
-- **Use a Password Manager**
-  Use a reliable password manager such as [KeePass](https://keepass.info/) 🔐. (Learn more about [password managers](https://en.wikipedia.org/wiki/Password_manager) on Wikipedia.) KeePass is available for both smartphones and PCs, and it can automatically generate strong, random passwords.
-
-- **Enable Two-Factor Authentication (2FA)**
-  Always enable 2FA whenever possible. Many password managers (like KeePass) can generate [TOTP](https://en.wikipedia.org/wiki/Time-based_One-Time_Password) tokens, adding an extra layer of security even if your password is compromised.
-  Synchronize your password database across devices using the [Nextcloud Client](https://nextcloud.com/) 📱💻.
-
-- **Use Encrypted Systems**
-  We recommend running CyMaIS only on systems with full disk encryption. For example, Linux distributions such as [Manjaro](https://manjaro.org/) (based on ArchLinux) with desktop environments like [GNOME](https://en.wikipedia.org/wiki/GNOME) provide excellent security. (Learn more about [disk encryption](https://en.wikipedia.org/wiki/Disk_encryption) on Wikipedia.)
-
-- **Beware of Phishing and Social Engineering**
-  Always verify email senders, avoid clicking on unknown links, and never share your passwords or 2FA codes with anyone. (Learn more about [Phishing](https://en.wikipedia.org/wiki/Phishing) and [Social Engineering](https://en.wikipedia.org/wiki/Social_engineering_(security)) on Wikipedia.)
-
-Following these guidelines will significantly enhance your personal security—but remember, no system is completely immune to risk.
-
-A tutorial how to setup secure password management you will find [here](https://blog.veen.world/blog/2025/04/04/%f0%9f%9b%a1%ef%b8%8f-keepassxc-cymais-cloud-the-ultimate-guide-to-cross-device-password-security/)
----
docs/output/.gitkeep (new empty file)

docs/requirements.yml (new file, 13 lines)

apt:
  make
  curl
  pandoc
pip:
  myst-parser
  sphinx
  sphinxawesome-theme
  docutils
  sphinx-jinja
  sphinxcontrib-yaml
  pathspec
  markdown2

docs/requirements/.gitkeep (new empty file)
38
docs/scripts/extract-requirements.sh
Normal file
38
docs/scripts/extract-requirements.sh
Normal file
@ -0,0 +1,38 @@
#!/bin/bash

# Check if correct number of arguments is given
if [[ $# -ne 3 ]]; then
    echo "Usage: $0 <input_file> <apt_output_file> <pip_output_file>"
    echo "Input: $0 <$1> <$2> <$3>"
    exit 1
fi

input_file="$1"
apt_file="$2"
pip_file="$3"

# Clear the output files
> "$apt_file"
> "$pip_file"

current_section=""

while IFS= read -r line; do
    [[ -z "$line" ]] && continue

    if [[ "$line" == apt:* ]]; then
        current_section="apt"
        continue
    elif [[ "$line" == pip:* ]]; then
        current_section="pip"
        continue
    fi

    package=$(echo "$line" | sed 's/^[[:space:]]*//')

    if [[ "$current_section" == "apt" ]]; then
        echo "$package" >> "$apt_file"
    elif [[ "$current_section" == "pip" ]]; then
        echo "$package" >> "$pip_file"
    fi
done < "$input_file"
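For context, a minimal invocation of the script above might look like this; the output paths under `docs/requirements/` are an assumption based on the `.gitkeep` placeholder added in this diff:

```bash
# Hypothetical paths; only the script and docs/requirements.yml are confirmed by this diff.
bash docs/scripts/extract-requirements.sh \
    docs/requirements.yml \
    docs/requirements/apt.txt \
    docs/requirements/pip.txt

# The split lists could then feed the actual installers:
xargs -a docs/requirements/apt.txt sudo apt-get install -y
pip install -r docs/requirements/pip.txt
```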
5 docs/templates/logo.html vendored Normal file
@ -0,0 +1,5 @@
<div class="sidebar-logo" style="text-align: center; margin-bottom: 1em;">
    <img src="{{ pathto("_static/img/logo.png", 1) }}" alt="Logo" style="max-width: 100%;">
</div>
65 docs/templates/structure.html vendored Normal file
@ -0,0 +1,65 @@
{% macro render_headings(headings, level=1) %}
<ul class="toctree-l{{ level }}" style="list-style: none; padding-left: 0; overflow-x: auto; white-space: nowrap;">
  {% for item in headings %}
  <li class="toctree-l{{ level }}{% if item.current %} current{% endif %}"
      {% if item.children %}
      x-data="{ expanded: {{ 'true' if item.current else 'false' }} }"
      {% endif %}
      style="white-space: nowrap;">
    <div class="menu-item" style="display: inline-flex; align-items: center; justify-content: space-between; width: 100%; white-space: nowrap;">
      <!-- Link and file open section -->
      <div style="display: inline-flex; align-items: center; white-space: nowrap;">
        <a class="reference internal{% if item.children %} expandable{% endif %}{% if item.current and not item.children %} current{% endif %}"
           href="{{ pathto(item.link).replace('#', '') }}{% if item.anchor %}#{{ item.anchor }}{% endif %}"
           style="text-decoration: none; white-space: nowrap;">
          {{ item.text }}
        </a>
      </div>
      <!-- Expand-Toggle Button -->
      {% if item.children %}
      <button @click.prevent.stop="expanded = !expanded" type="button" class="toggle-button"
              style="background: none; border: none; padding: 0; margin-left: auto;">
        <span x-show="!expanded">
          <svg fill="currentColor" height="18px" stroke="none" viewBox="0 0 24 24" width="18px"
               xmlns="http://www.w3.org/2000/svg">
            <path d="M10 6L8.59 7.41 13.17 12l-4.58 4.59L10 18l6-6z"></path>
          </svg>
        </span>
        <span x-show="expanded">▼</span>
      </button>
      {% endif %}
    </div>
    {% if item.children %}
    <div x-show="expanded">
      {{ render_headings(item.children, level+1) }}
    </div>
    {% endif %}
  </li>
  {% endfor %}
</ul>
{% endmacro %}

{% if local_md_headings or local_subfolders %}
<div class="local-md-headings">
  {% if local_md_headings %}
  <div class="current-index" x-data x-init="typeof initCurrentNav === 'function' && initCurrentNav()">
    <p class="caption" role="heading">
      <span class="caption-text">Current Index</span>
    </p>
    {{ render_headings(local_md_headings) }}
    <br />
  </div>
  {% endif %}
  {% if local_subfolders %}
  <div class="full-index">
    <p class="caption" role="heading">
      <span class="caption-text">Full Index</span>
    </p>
    {{ render_headings(local_subfolders) }}
    <br />
  </div>
  {% endif %}
</div>
{% endif %}
<script src="{{ pathto('_static/js/current-nav.js', 1) }}"></script>
@ -714,6 +714,9 @@ defaults_applications:
  ## Sphinx
  sphinx:
    version: "3.9-slim"                                                           # Python Docker image version (consumed by the DOCKER_PYTHON_VERSION build arg)
    repository_sphinx_source: "https://github.com/kevinveenbirkenbach/cymais.git" # Repository address to pull the source repository from
    sphinx_exec_dir_relative: "docs/"                                             # The relative path to the Sphinx Makefile folder from the source dir
    matomo_tracking_enabled: "{{matomo_tracking_enabled_default}}"                # Enables/disables Matomo tracking
    css_enabled: "{{css_enabled_default}}"                                        # Enables/disables the global CSS style
    landingpage_iframe_enabled: true                                              # The documentation should always be available in an iframe on the landing page
@ -2,4 +2,5 @@ collections:
  - name: kewlfft.aur
pacman:
  - ansible
  - python-passlib
pip:
  - passlib
@ -1,4 +1,3 @@
# Todos
- Implement auto password hash
- Implement auto memberof setup
- Create a Dockerfile (maybe in its own repository) with memberOf
- Implement auto memberof setup
@ -6,7 +6,7 @@ Sphinx is a powerful documentation generator originally created for Python proje
## Overview

This Docker Compose deployment leverages Ansible to automatically pull your source repository, build the documentation using Sphinx, and serve the generated HTML through a lightweight HTTP server. The entire process is containerized, which guarantees a consistent and isolated environment regardless of the host system. By default it uses [CyMaIS Sphinx](https://github.com/kevinveenbirkenbach/cymais-sphinx) to build the docs.
This Docker Compose deployment leverages Ansible to automatically pull your source repository, build the documentation using Sphinx, and serve the generated HTML through a lightweight HTTP server. The entire process is containerized, which guarantees a consistent and isolated environment regardless of the host system.
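The overview above compresses three steps; roughly, the role automates the following shell equivalent (a sketch with assumed paths, not the role's literal commands):

```bash
# 1. Pull the documentation sources (the role itself uses Ansible's git module).
git clone https://github.com/kevinveenbirkenbach/cymais.git volumes/source/

# 2. Build the image that runs Sphinx over the pulled sources.
docker compose build

# 3. Serve the generated HTML through the bundled lightweight HTTP server.
docker compose up -d
```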
## Purpose
@ -27,5 +27,4 @@ galaxy_info:
  documentation: "https://s.veen.world/cymais"
  logo:
    class: "fa-solid fa-book"
dependencies:
  - package-manager
dependencies: []
@ -4,26 +4,35 @@
  include_role:
    name: docker-compose

- name: install cymais-sphinx
  command:
    cmd: "pkgmgr install cymais-sphinx --clone-mode https"
  notify: docker compose project build and setup
- name: "Create {{ host_sphinx_source_dir_absolute }} directory"
  file:
    path: "{{ host_sphinx_source_dir_absolute }}"
    state: directory
    mode: '0755'

- name: update cymais for up to date docs
  command:
    cmd: "pkgmgr update cymais"
  notify: docker compose project build and setup

- name: Get path of cymais-sphinx using pkgmgr
  command: pkgmgr path cymais-sphinx
  register: path_cymais_sphinx_output
- name: "pull the source repository to build the Sphinx documentation from {{ applications.sphinx.repository_sphinx_source }} to {{ host_sphinx_source_dir_absolute }}"
  git:
    repo: "{{ applications.sphinx.repository_sphinx_source }}"
    dest: "{{ host_sphinx_source_dir_absolute }}"
    update: yes
    clone: yes
  notify: docker compose project build and setup
  become: true

- name: "include role nginx-domain-setup for {{application_id}}"
  include_role:
    name: nginx-domain-setup
  vars:
    domain: "{{ domains[application_id] }}"
    domain: "{{ domains[application_id] }}"
    http_port: "{{ ports.localhost.http[application_id] }}"

- name: "create {{ sphinx_host_dockerfile }}"
  copy:
    src: "{{ sphinx_control_node_dockerfile }}"
    dest: "{{ sphinx_host_dockerfile }}"
    mode: '770'
    force: yes
  notify: docker compose project build and setup

- name: "copy docker-compose.yml and env file"
  include_tasks: copy-docker-compose-and-env.yml
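After these tasks have run, the compose template below publishes the container on a loopback port; a quick smoke test might look like this (the port number is illustrative, the real value comes from `ports.localhost.http[application_id]`):

```bash
# 8000 is a placeholder for the configured localhost port.
curl -sI http://127.0.0.1:8000/ | head -n 1   # expect an "HTTP/1.1 200 OK" status line
```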
@ -1,8 +1,15 @@
services:
  application:
    build:
      context: {{ path_cymais_sphinx_output.stdout }}
      dockerfile: {{ path_cymais_sphinx_output.stdout }}/Dockerfile
      context: .
      dockerfile: Dockerfile
      args:
        SPHINX_SOURCE_DIR: {{docker_source_dir}}
        SPHINX_OUTPUT_DIR: {{docker_output_dir}}
        SPHINX_EXEC_DIR: {{docker_exec_dir}}
        SPHINX_DOCKER_EXEC_DIR: {{docker_app_dir}}
        SPHINX_SOURCE_DIR_RELATIVE: {{host_sphinx_source_dir_relative}}
        DOCKER_PYTHON_VERSION: {{applications[application_id].version}}
    ports:
      - "127.0.0.1:{{ports.localhost.http[application_id]}}:8000"
    healthcheck:
@ -1 +1,12 @@
application_id: "sphinx"
application_id: "sphinx"

host_sphinx_source_dir_relative: "volumes/source/"                                                            # Place where the sphinx source repository is stored on the host
host_sphinx_source_dir_absolute: "{{docker_compose.directories.instance}}{{host_sphinx_source_dir_relative}}" # Place where the sphinx source repository is stored on the host

docker_app_dir: "/app/"                                                                               # Folder in which the application is running
docker_source_dir: "{{docker_app_dir}}"                                                               # Folder which Sphinx scans for sources
docker_output_dir: "/output/"                                                                         # Folder to which the generated output is written
docker_exec_dir: "{{ [ docker_app_dir, applications.sphinx.sphinx_exec_dir_relative ] | path_join }}" # Folder which contains the Sphinx Makefile and logic

sphinx_host_dockerfile: "{{ docker_compose.directories.instance }}Dockerfile"                         # Path to the Dockerfile to build sphinx on the server
sphinx_control_node_dockerfile: "{{ [ playbook_dir, 'docs/Dockerfile' ] | path_join }}"               # Path to the Dockerfile on the control node
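Taken together, these variables describe where the build happens inside the container; conceptually it reduces to something like the following (a sketch assuming the standard Sphinx Makefile targets; the actual Dockerfile commands are not part of this diff):

```bash
cd /app/docs/                      # docker_exec_dir: docker_app_dir joined with sphinx_exec_dir_relative
make html BUILDDIR=/output/        # standard Sphinx Makefile target; HTML typically lands in /output/html
python -m http.server 8000 --directory /output/html   # serve the result on the container port mapped above
```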
@ -1,27 +0,0 @@
# Update pkgmgr

## Description

This role checks if the [package manager](https://github.com/kevinveenbirkenbach/package-manager) is available on the system. If so, it runs `pkgmgr update --all` to update all repositories managed by the `pkgmgr`.

## Overview

This role performs the following tasks:
- Checks if the `pkgmgr` command is available.
- If available, runs `pkgmgr update --all` to update all repositories.

## Purpose

The purpose of this role is to simplify system updates by using the `pkgmgr` package manager to handle all repository updates with a single command.

## Features

- **Conditional Execution**: Runs only if the `pkgmgr` command is found on the system.
- **Automated Updates**: Automatically runs `pkgmgr update --all` to update all repositories.

## License

CyMaIS NonCommercial License (CNCL)
[Learn More](https://s.veen.world/cncl)
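In plain shell terms, the conditional flow this role describes is simply (a sketch; the Ansible implementation follows further down in this diff):

```bash
# Run the bulk update only when the pkgmgr binary is actually on PATH.
if command -v pkgmgr >/dev/null 2>&1; then
    pkgmgr update --all
fi
```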
@ -1 +0,0 @@
pkgmgr_command: "pkgmgr"
@ -1,24 +0,0 @@
---
galaxy_info:
  author: "Kevin Veen-Birkenbach"
  description: "Checks if the pkgmgr command is available and runs 'pkgmgr update --all' to update all repositories."
  license: "CyMaIS NonCommercial License (CNCL)"
  license_url: "https://s.veen.world/cncl"
  company: |
    Kevin Veen-Birkenbach
    Consulting & Coaching Solutions
    https://www.veen.world
  min_ansible_version: "2.9"
  platforms:
    - name: Linux
      versions:
        - all
  galaxy_tags:
    - update
    - package-manager
    - pkgmgr
    - system
  repository: "https://s.veen.world/cymais"
  issue_tracker_url: "https://s.veen.world/cymaisissues"
  documentation: "https://s.veen.world/cymais"
dependencies: []
@ -1,2 +0,0 @@
- name: "Update all repositories with pkgmgr"
  command: "pkgmgr update --all"
@ -39,13 +39,4 @@
- name: "Update with pip"
  include_role:
    name: update-pip

- name: "Check if pkgmgr command is available"
  command: "which pkgmgr"
  register: pkgmgr_available
  ignore_errors: yes

- name: "Update all repositories using pkgmgr"
  include_role:
    name: update-pkgmgr
  when: pkgmgr_available.rc == 0
  when: pip_installed.rc == 0