30 Commits

Author SHA1 Message Date
3132aab2a5 Release version 1.1.0 2026-03-30 10:47:32 +02:00
3d1db1f8ba Updated mirrors 2026-03-30 10:46:39 +02:00
58872ced81 fix(ci): grant security-events: write to lint job
The lint-docker job in lint.yml requires security-events: write
for SARIF upload; must be explicitly granted to the caller job.

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-30 10:18:14 +02:00
13b3af3330 Entered and removed whitespaces in README.md 2026-03-30 10:16:53 +02:00
eca7084f4e fix(ci): grant security-events and packages permissions to security job
Reusable workflow calls inherit only explicitly granted permissions.
The nested security job requires packages: read and security-events: write
for CodeQL analysis.

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-30 10:16:30 +02:00
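The mechanism both permission fixes above rely on can be sketched as follows (an illustrative fragment, not the repository's actual workflow): a reusable workflow call inherits nothing, so the caller job must grant each permission explicitly.

```yaml
# Caller workflow: permissions are NOT inherited across a reusable
# workflow call; each one must be granted on the calling job itself.
jobs:
  security:
    uses: ./.github/workflows/security.yml
    permissions:
      contents: read
      packages: read          # needed by CodeQL to fetch packs
      security-events: write  # needed to upload analysis results
```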
6861b2c0eb fix(nav): prevent navbar items from wrapping to second line
- Set flex-wrap: nowrap on navbar-nav to keep all items in one row
- Add hidden overflow-x scroll (no visible scrollbar) as fallback
- Fix #navbar_logo taking up space when invisible via max-width transition

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-30 10:10:55 +02:00
66b1f0d029 feat(agents): add AGENTS.md and CLAUDE.md with pre-commit rules
- Add AGENTS.md: require make test before every non-doc commit and
  document the npm vendor asset workflow
- Add CLAUDE.md: instruct agents to read AGENTS.md at conversation start
- Add npm-install dependency to test-e2e Makefile target

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-30 09:59:09 +02:00
a29a0b1862 feat(vendor): replace CDN dependencies with local npm packages
Introduces a vendor build pipeline so all third-party browser assets
(Bootstrap, Bootstrap Icons, Font Awesome, marked, jQuery) are served
from local static files instead of external CDNs.

- Add app/package.json with vendor deps and postinstall/build scripts
- Add app/scripts/copy-vendor.js to copy assets to static/vendor/
- Update base.html.j2 to use url_for('static', ...) for all vendor assets
- Update Dockerfile to install Node.js/npm and run npm install
- Update .gitignore to exclude app/node_modules/ and app/static/vendor/

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-30 09:29:28 +02:00
252b50d2a7 feat: migrate to pyproject.toml, add test suites, split CI workflows
- Replace requirements.txt with pyproject.toml for modern Python packaging
- Add unit, integration, lint and security test suites under tests/
- Add utils/export_runtime_requirements.py and utils/check_hadolint_sarif.py
- Split monolithic CI into reusable lint.yml, security.yml and tests.yml
- Refactor ci.yml to orchestrate reusable workflows; publish on semver tag only
- Modernize Dockerfile: pin python:3.12-slim, install via pyproject.toml
- Expand Makefile with lint, security, test and CI targets
- Add test-e2e via act with portfolio container stop/start around run
- Fix navbar_logo_visibility.spec.js: win.fullscreen() → win.enterFullscreen()
- Set use_reloader=False in app.run() to prevent double-start in CI
- Add app/core.* and build artifacts to .gitignore
- Fix apt-get → sudo apt-get in tests.yml e2e job
- Fix pip install --ignore-installed to handle stale act cache

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-29 23:03:09 +02:00
2c61da9fc3 chore: remove comments from settings.json
Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-28 23:37:10 +01:00
2d8185b747 chore: add Claude Code project permissions settings
Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-28 23:16:29 +01:00
a47a5babce chore: remove unused imports 2026-03-28 19:08:57 +01:00
3f6c90cc3c Release version 1.0.0 2026-02-19 11:18:50 +01:00
69c4f15ce7 Prepare config.yaml from sample in CI before Cypress 2026-02-19 11:14:52 +01:00
56c1b0d0cd Fix Cypress action for repositories without lockfile 2026-02-19 11:13:12 +01:00
91e9caea48 Fix CI node cache path and npm install strategy 2026-02-19 11:11:11 +01:00
feb6af28ef Add CI workflow for tests and conditional image publish 2026-02-19 11:05:26 +01:00
f8c2b4236b Added d-flex to place logo next to brand 2025-07-21 14:24:28 +02:00
dc2626e020 Added test for log 2025-07-21 12:18:07 +02:00
46b0b744ca Added logo to navbar when in fullscreen 2025-07-21 11:39:59 +02:00
5f2e7ef696 Changed pkgmgr commands 2025-07-12 18:54:42 +02:00
152a85bfb8 Merge branch 'main' of github.com:kevinveenbirkenbach/portfolio 2025-07-12 18:53:03 +02:00
fdfe301868 Renamed PortWebUI to PortUI 2025-07-12 18:52:51 +02:00
cbfe1ed8ae Update README.md
Solved another layout bug
2025-07-12 18:27:47 +02:00
9470162236 Solved formatting bug 2025-07-12 18:26:51 +02:00
6a57fa1e00 Optimized README.md 2025-07-12 18:24:49 +02:00
ab67fc0b29 Renamed portfolio to PortWebUI 2025-07-12 18:22:19 +02:00
e18566d801 Solved some bugs 2025-07-09 22:20:58 +02:00
7bc0f32145 Added cypress tests 2025-07-08 17:16:57 +02:00
6ed3e60dd0 Solved two-tap fullscreen height bug 2025-07-08 14:39:13 +02:00
61 changed files with 2667 additions and 530 deletions

.claude/settings.json

@@ -0,0 +1,89 @@
{
"permissions": {
"allow": [
"Read",
"Edit",
"Write",
"Bash(git status*)",
"Bash(git log*)",
"Bash(git diff*)",
"Bash(git add*)",
"Bash(git commit*)",
"Bash(git checkout*)",
"Bash(git branch*)",
"Bash(git fetch*)",
"Bash(git stash*)",
"Bash(git -C:*)",
"Bash(make*)",
"Bash(python3*)",
"Bash(python*)",
"Bash(pip show*)",
"Bash(pip list*)",
"Bash(pip install*)",
"Bash(npm install*)",
"Bash(npm run*)",
"Bash(npx*)",
"Bash(docker pull*)",
"Bash(docker build*)",
"Bash(docker images*)",
"Bash(docker ps*)",
"Bash(docker inspect*)",
"Bash(docker logs*)",
"Bash(docker create*)",
"Bash(docker export*)",
"Bash(docker rm*)",
"Bash(docker rmi*)",
"Bash(docker stop*)",
"Bash(docker compose*)",
"Bash(docker-compose*)",
"Bash(docker container prune*)",
"Bash(grep*)",
"Bash(find*)",
"Bash(ls*)",
"Bash(cat*)",
"Bash(head*)",
"Bash(tail*)",
"Bash(wc*)",
"Bash(sort*)",
"Bash(tar*)",
"Bash(mkdir*)",
"Bash(cp*)",
"Bash(mv*)",
"Bash(jq*)",
"WebSearch",
"WebFetch(domain:github.com)",
"WebFetch(domain:raw.githubusercontent.com)",
"WebFetch(domain:api.github.com)",
"WebFetch(domain:docs.docker.com)",
"WebFetch(domain:pypi.org)",
"WebFetch(domain:docs.cypress.io)",
"WebFetch(domain:flask.palletsprojects.com)"
],
"ask": [
"Bash(git push*)",
"Bash(docker run*)",
"Bash(curl*)"
],
"deny": [
"Bash(git push --force*)",
"Bash(git reset --hard*)",
"Bash(rm -rf*)",
"Bash(sudo*)"
]
},
"sandbox": {
"filesystem": {
"allowWrite": [
".",
"/tmp"
],
"denyRead": [
"~/.ssh",
"~/.gnupg",
"~/.kube",
"~/.aws",
"~/.config/gcloud"
]
}
}
}

.github/workflows/ci.yml

@@ -0,0 +1,90 @@
name: CI
on:
pull_request:
push:
branches:
- "**"
tags-ignore:
- "**"
permissions:
contents: read
jobs:
security:
name: Run security workflow
uses: ./.github/workflows/security.yml
permissions:
contents: read
packages: read
security-events: write
tests:
name: Run test workflow
uses: ./.github/workflows/tests.yml
lint:
name: Run lint workflow
uses: ./.github/workflows/lint.yml
permissions:
contents: read
security-events: write
publish:
name: Publish image
runs-on: ubuntu-latest
needs:
- security
- tests
- lint
if: github.event_name == 'push'
permissions:
contents: read
packages: write
steps:
- name: Checkout repository
uses: actions/checkout@v6
with:
fetch-depth: 0
- name: Detect semver tag on current commit
id: semver
run: |
SEMVER_TAG="$(git tag --points-at "$GITHUB_SHA" | grep -E '^v[0-9]+\.[0-9]+\.[0-9]+$' | head -n 1 || true)"
if [ -n "$SEMVER_TAG" ]; then
{
echo "found=true"
echo "raw_tag=$SEMVER_TAG"
echo "version=${SEMVER_TAG#v}"
} >> "$GITHUB_OUTPUT"
else
echo "found=false" >> "$GITHUB_OUTPUT"
fi
- name: Compute image name
if: steps.semver.outputs.found == 'true'
id: image
run: echo "name=ghcr.io/$(echo "${GITHUB_REPOSITORY}" | tr '[:upper:]' '[:lower:]')" >> "$GITHUB_OUTPUT"
- name: Set up Docker Buildx
if: steps.semver.outputs.found == 'true'
uses: docker/setup-buildx-action@v3
- name: Login to GHCR
if: steps.semver.outputs.found == 'true'
uses: docker/login-action@v3
with:
registry: ghcr.io
username: ${{ github.actor }}
password: ${{ secrets.GITHUB_TOKEN }}
- name: Build and publish image
if: steps.semver.outputs.found == 'true'
uses: docker/build-push-action@v6
with:
context: .
file: ./Dockerfile
push: true
tags: ${{ steps.image.outputs.name }}:${{ steps.semver.outputs.version }}

.github/workflows/lint.yml

@@ -0,0 +1,77 @@
name: Lint
on:
workflow_call:
workflow_dispatch:
permissions:
contents: read
jobs:
lint-actions:
name: Lint GitHub Actions
runs-on: ubuntu-latest
steps:
- name: Checkout repository
uses: actions/checkout@v6
- name: Run actionlint
run: docker run --rm -v "$PWD:/repo" -w /repo rhysd/actionlint:latest
lint-python:
name: Lint Python
runs-on: ubuntu-latest
steps:
- name: Checkout repository
uses: actions/checkout@v6
- name: Set up Python
uses: actions/setup-python@v6
with:
python-version: "3.12"
- name: Install lint dependencies
run: |
python -m pip install --upgrade pip
pip install ".[dev]"
- name: Ruff lint
run: ruff check .
- name: Ruff format check
run: ruff format --check .
lint-docker:
name: Lint Dockerfile
runs-on: ubuntu-latest
permissions:
contents: read
security-events: write
steps:
- name: Checkout repository
uses: actions/checkout@v6
- name: Run hadolint
id: hadolint
continue-on-error: true
uses: hadolint/hadolint-action@2332a7b74a6de0dda2e2221d575162eba76ba5e5
with:
dockerfile: ./Dockerfile
format: sarif
output-file: hadolint-results.sarif
failure-threshold: warning
- name: Upload hadolint SARIF
if: always() && github.event_name == 'push'
uses: github/codeql-action/upload-sarif@v4
with:
sarif_file: hadolint-results.sarif
wait-for-processing: true
category: hadolint
- name: Fail on hadolint warnings
if: always()
run: python3 utils/check_hadolint_sarif.py hadolint-results.sarif

.github/workflows/security.yml

@@ -0,0 +1,48 @@
name: Security
on:
workflow_call:
permissions:
contents: read
jobs:
analyze:
name: Run security scan
runs-on: ${{ (matrix.language == 'swift' && 'macos-latest') || 'ubuntu-latest' }}
permissions:
contents: read
packages: read
security-events: write
strategy:
fail-fast: false
matrix:
include:
- language: actions
build-mode: none
- language: javascript-typescript
build-mode: none
- language: python
build-mode: none
steps:
- name: Checkout repository
uses: actions/checkout@v6
- name: Initialize CodeQL
uses: github/codeql-action/init@v4
with:
languages: ${{ matrix.language }}
build-mode: ${{ matrix.build-mode }}
queries: security-extended,security-and-quality
- name: Run manual build steps
if: matrix.build-mode == 'manual'
run: |
echo "No manual build is configured for this repository."
exit 1
- name: Perform CodeQL analysis
uses: github/codeql-action/analyze@v4
with:
category: /language:${{ matrix.language }}

.github/workflows/tests.yml

@@ -0,0 +1,194 @@
name: Tests
on:
workflow_call:
workflow_dispatch:
permissions:
contents: read
jobs:
test-lint:
name: Run lint tests
runs-on: ubuntu-latest
steps:
- name: Checkout repository
uses: actions/checkout@v6
- name: Set up Python
uses: actions/setup-python@v6
with:
python-version: "3.12"
- name: Run lint test suite
run: python -m unittest discover -s tests/lint -t .
test-integration:
name: Run integration tests
runs-on: ubuntu-latest
steps:
- name: Checkout repository
uses: actions/checkout@v6
- name: Set up Python
uses: actions/setup-python@v6
with:
python-version: "3.12"
- name: Install integration test dependencies
run: |
python -m pip install --upgrade pip
pip install --ignore-installed .
- name: Run integration test suite
run: python -m unittest discover -s tests/integration -t .
test-unit:
name: Run unit tests
runs-on: ubuntu-latest
steps:
- name: Checkout repository
uses: actions/checkout@v6
- name: Set up Python
uses: actions/setup-python@v6
with:
python-version: "3.12"
- name: Install unit test dependencies
run: |
python -m pip install --upgrade pip
pip install --ignore-installed .
- name: Run unit test suite
run: python -m unittest discover -s tests/unit -t .
security-python:
name: Run Python security checks
runs-on: ubuntu-latest
steps:
- name: Checkout repository
uses: actions/checkout@v6
- name: Set up Python
uses: actions/setup-python@v6
with:
python-version: "3.12"
- name: Install security dependencies
run: |
python -m pip install --upgrade pip
pip install --ignore-installed ".[dev]"
- name: Run Bandit
run: python -m bandit -q -ll -ii -r app main.py
- name: Export runtime requirements
run: python utils/export_runtime_requirements.py > runtime-requirements.txt
- name: Audit Python runtime dependencies
run: python -m pip_audit -r runtime-requirements.txt
test-security:
name: Run security guardrail tests
runs-on: ubuntu-latest
steps:
- name: Checkout repository
uses: actions/checkout@v6
- name: Set up Python
uses: actions/setup-python@v6
with:
python-version: "3.12"
- name: Install security test dependencies
run: |
python -m pip install --upgrade pip
pip install --ignore-installed .
- name: Run security test suite
run: python -m unittest discover -s tests/security -t .
e2e:
name: Run end-to-end tests
runs-on: ubuntu-latest
needs:
- test-lint
- test-unit
- test-integration
- security-python
- test-security
env:
FLASK_HOST: "127.0.0.1"
FLASK_PORT: "5001"
PORT: "5001"
steps:
- name: Checkout repository
uses: actions/checkout@v6
- name: Set up Python
uses: actions/setup-python@v6
with:
python-version: "3.12"
- name: Install Python dependencies
run: |
python -m pip install --upgrade pip
pip install --ignore-installed .
- name: Prepare app config for CI
run: cp app/config.sample.yaml app/config.yaml
- name: Set up Node.js
uses: actions/setup-node@v4
with:
node-version: "20"
cache: npm
cache-dependency-path: app/package.json
- name: Install Node dependencies
working-directory: app
run: npm install
- name: Install Cypress system dependencies
run: |
sudo apt-get update
sudo apt-get install -y \
libasound2t64 \
libatk-bridge2.0-0 \
libatk1.0-0 \
libatspi2.0-0t64 \
libcups2t64 \
libdrm2 \
libgbm1 \
libglib2.0-0t64 \
libgtk-3-0t64 \
libnotify4 \
libnspr4 \
libnss3 \
libpango-1.0-0 \
libpangocairo-1.0-0 \
libxcomposite1 \
libxdamage1 \
libxfixes3 \
libxkbcommon0 \
libxrandr2 \
libxss1 \
libxtst6 \
xauth \
xvfb
- name: Run Cypress tests
uses: cypress-io/github-action@v6
with:
working-directory: app
install: false
start: python app.py
wait-on: http://127.0.0.1:5001
wait-on-timeout: 120

.gitignore

@@ -2,3 +2,11 @@ app/config.yaml
*__pycache__*
app/static/cache/*
.env
app/cypress/screenshots/*
.ruff_cache/
app/node_modules/
app/static/vendor/
hadolint-results.sarif
build/
*.egg-info/
app/core.*

AGENTS.md

@@ -0,0 +1,13 @@
# Agent Instructions
## Pre-Commit Validation
- You MUST run `make test` before every commit whenever the staged change includes at least one file that is not `.md` or `.rst`, unless explicitly instructed otherwise.
- You MUST commit only after all tests pass.
- You MUST NOT commit automatically without explicit confirmation from the user.
## Vendor Assets
- Browser vendor assets (Bootstrap, Font Awesome, etc.) are managed via npm.
- Run `npm install` inside `app/` to populate `app/static/vendor/` before starting the dev server or running e2e tests.
- Never commit `app/node_modules/` or `app/static/vendor/` — both are gitignored and generated at build time.

CHANGELOG.md

@@ -0,0 +1,15 @@
## [1.1.0] - 2026-03-30
* *CI stabilization and modularization*: Split into reusable workflows (lint, security, tests) with correct permissions for CodeQL and SARIF uploads
* *Modern Python packaging*: Migration to pyproject.toml and updated Dockerfile using Python 3.12
* *Improved test coverage*: Added unit, integration, lint, security, and E2E tests using act
* *Local vendor assets*: Replaced external CDNs with npm-based local asset pipeline
* *Enhanced build workflow*: Extended Makefile with targets for test, lint, security, and CI plus vendor build process
* *Frontend fix*: Prevented navbar wrapping and improved layout behavior
* *Developer guidelines*: Introduced AGENTS.md and CLAUDE.md with enforced pre-commit rules
## [1.0.0] - 2026-02-19
* Official Release 🥳

CLAUDE.md

@@ -0,0 +1,5 @@
# CLAUDE.md
## Startup
You MUST read `AGENTS.md` and follow all instructions in it at the start of every conversation before doing anything else.

Dockerfile

@@ -1,14 +1,20 @@
# Base image for Python
FROM python:slim
FROM python:3.12-slim
ENV PYTHONDONTWRITEBYTECODE=1 \
PYTHONUNBUFFERED=1 \
FLASK_HOST=0.0.0.0
# hadolint ignore=DL3008
RUN apt-get update && apt-get install -y --no-install-recommends nodejs npm && rm -rf /var/lib/apt/lists/*
WORKDIR /tmp/build
COPY pyproject.toml README.md main.py ./
COPY app ./app
RUN python -m pip install --no-cache-dir .
# Set the working directory
WORKDIR /app
# Copy and install dependencies
COPY app/requirements.txt requirements.txt
RUN pip install --no-cache-dir -r requirements.txt
# Copy application code
COPY app/ .
RUN npm install --prefix /app
CMD ["python", "app.py"]

MIRRORS

@@ -0,0 +1,4 @@
git@github.com:kevinveenbirkenbach/port-ui.git
ssh://git@code.infinito.nexus:2201/kevinveenbirkenbach/port-ui.git
ssh://git@git.veen.world:2201/kevinveenbirkenbach/port-ui.git

Makefile

@@ -0,0 +1,174 @@
# Load environment variables from .env
ifneq (,$(wildcard .env))
include .env
# Export variables defined in .env
export $(shell sed 's/=.*//' .env)
endif
# Default port (can be overridden with PORT env var)
PORT ?= 5000
PYTHON ?= python3
ACT ?= act
# Default port (can be overridden with PORT env var)
.PHONY: build
build:
# Build the Docker image.
docker build -t application-portfolio .
.PHONY: build-no-cache
build-no-cache:
# Build the Docker image without cache.
docker build --no-cache -t application-portfolio .
.PHONY: up
up:
# Start the application using docker-compose with build.
docker-compose up -d --build --force-recreate
.PHONY: down
down:
# Stop and remove the 'portfolio' container, ignore errors, and bring down compose.
- docker stop portfolio || true
- docker rm portfolio || true
- docker-compose down
.PHONY: run-dev
run-dev:
# Run the container in development mode (hot-reload).
docker run -d \
-p $(PORT):$(PORT) \
--name portfolio \
-v $(PWD)/app/:/app \
-e FLASK_APP=app.py \
-e FLASK_ENV=development \
application-portfolio
.PHONY: run-prod
run-prod:
# Run the container in production mode.
docker run -d \
-p $(PORT):$(PORT) \
--name portfolio \
application-portfolio
.PHONY: logs
logs:
# Display the logs of the 'portfolio' container.
docker logs -f portfolio
.PHONY: dev
dev:
# Start the application in development mode using docker-compose.
FLASK_ENV=development docker-compose up -d
.PHONY: prod
prod:
# Start the application in production mode using docker-compose (with build).
docker-compose up -d --build
.PHONY: cleanup
cleanup:
# Remove all stopped Docker containers to reclaim space.
docker container prune -f
.PHONY: delete
delete:
# Force remove the 'portfolio' container if it exists.
- docker rm -f portfolio
.PHONY: browse
browse:
# Open the application in the browser at http://localhost:$(PORT)
chromium http://localhost:$(PORT)
.PHONY: install
install:
# Install runtime Python dependencies from pyproject.toml.
$(PYTHON) -m pip install -e .
.PHONY: install-dev
install-dev:
# Install runtime and developer dependencies from pyproject.toml.
$(PYTHON) -m pip install -e ".[dev]"
.PHONY: npm-install
npm-install:
# Install Node.js dependencies for browser tests.
cd app && npm install
.PHONY: lint-actions
lint-actions:
# Lint GitHub Actions workflows.
docker run --rm -v "$$PWD:/repo" -w /repo rhysd/actionlint:latest
.PHONY: lint-python
lint-python: install-dev
# Run Python lint and format checks.
$(PYTHON) -m ruff check .
$(PYTHON) -m ruff format --check .
.PHONY: lint-docker
lint-docker:
# Lint the Dockerfile.
docker run --rm -i hadolint/hadolint < Dockerfile
.PHONY: test-lint
test-lint:
# Run lint guardrail tests.
$(PYTHON) -m unittest discover -s tests/lint -t .
.PHONY: test-integration
test-integration: install
# Run repository integration tests.
$(PYTHON) -m unittest discover -s tests/integration -t .
.PHONY: test-unit
test-unit: install
# Run repository unit tests.
$(PYTHON) -m unittest discover -s tests/unit -t .
.PHONY: test-security
test-security: install
# Run repository security guardrail tests.
$(PYTHON) -m unittest discover -s tests/security -t .
.PHONY: lint
lint: lint-actions lint-python lint-docker test-lint
# Run the full lint suite.
.PHONY: security
security: install-dev test-security
# Run security checks.
$(PYTHON) -m bandit -q -ll -ii -r app main.py
$(PYTHON) utils/export_runtime_requirements.py > /tmp/portfolio-runtime-requirements.txt
$(PYTHON) -m pip_audit -r /tmp/portfolio-runtime-requirements.txt
.PHONY: test-e2e
test-e2e: npm-install
# Run Cypress end-to-end tests via act (stop portfolio container to free port first).
-docker stop portfolio 2>/dev/null || true
$(ACT) workflow_dispatch -W .github/workflows/tests.yml -j e2e
-docker start portfolio 2>/dev/null || true
.PHONY: test-workflow
test-workflow:
# Run the GitHub test workflow locally via act.
$(ACT) workflow_dispatch -W .github/workflows/tests.yml
.PHONY: lint-workflow
lint-workflow:
# Run the GitHub lint workflow locally via act.
$(ACT) workflow_dispatch -W .github/workflows/lint.yml
.PHONY: quality
quality: lint-workflow test-workflow
# Run the GitHub lint and test workflows locally via act.
.PHONY: ci
ci: lint security test-unit test-integration test-e2e
# Run the local CI suite.
.PHONY: test
test: ci
# Run the full validation suite.

README.md

@@ -1,86 +1,95 @@
# Portfolio CMS: Flask-based Portfolio Management 🚀
# PortUI 🖥️✨
[![GitHub Sponsors](https://img.shields.io/badge/Sponsor-GitHub%20Sponsors-blue?logo=github)](https://github.com/sponsors/kevinveenbirkenbach) [![Patreon](https://img.shields.io/badge/Support-Patreon-orange?logo=patreon)](https://www.patreon.com/c/kevinveenbirkenbach) [![Buy Me a Coffee](https://img.shields.io/badge/Buy%20me%20a%20Coffee-Funding-yellow?logo=buymeacoffee)](https://buymeacoffee.com/kevinveenbirkenbach) [![PayPal](https://img.shields.io/badge/Donate-PayPal-blue?logo=paypal)](https://s.veen.world/paypaldonate)
This software allows individuals and institutions to set up an easy portfolio/landingpage/homepage to showcase their projects and online presence. It is highly customizable via a YAML configuration file.
A lightweight, Docker-powered portfolio/landing-page generator—fully customizable via YAML! Showcase your projects, skills, and online presence in minutes.
## Features ✨
> 🚀 You can also pair PortUI with JavaScript for sleek, web-based desktop-style interfaces.
> 💻 Example in action: [CyMaIS.Cloud](https://cymais.cloud/) (demo)
> 🌐 Another live example: [veen.world](https://www.veen.world/) (Kevin's personal site)
- **Dynamic Navigation:** Easily create dropdown menus and nested links.
- **Customizable Cards:** Showcase your skills, projects, or services.
- **Cache Management:** Optimize your assets with automatic caching.
- **Responsive Design:** Beautiful on any device with Bootstrap.
- **Easy Configuration:** Update content using a YAML file.
- **Command Line Interface:** Manage Docker containers with the `portfolio` CLI.
---
## Access 🌐
## ✨ Key Features
### Local Access
Access the application locally at [http://127.0.0.1:5000](http://127.0.0.1:5000).
- **Dynamic Navigation**
Create dropdowns & nested menus with ease.
- **Customizable Cards**
Highlight skills, projects, or services—with icons, titles, and links.
- **Smart Cache Management**
Auto-cache assets for lightning-fast loading.
- **Responsive Design**
Built on Bootstrap; looks great on desktop, tablet & mobile.
- **YAML-Driven**
All content & structure defined in a simple `config.yaml`.
- **CLI Control**
Manage Docker containers via the `portfolio` command.
## Getting Started 🏁
---
### Prerequisites 📋
## 🌐 Quick Access
- Docker and Docker Compose installed on your system.
- Basic knowledge of Python and YAML for configuration.
- **Local Preview:**
[http://127.0.0.1:5000](http://127.0.0.1:5000)
### Installation 🛠️
---
#### Installation via git clone
## 🏁 Getting Started
1. **Clone the repository:**
### 🔧 Prerequisites
- Docker & Docker Compose
- Basic Python & YAML knowledge
### 🛠️ Installation via Git
1. **Clone & enter repo**
```bash
git clone <repository_url>
cd <repository_directory>
```
2. **Update the configuration:**
Create a `config.yaml` file. You can use `config.sample.yaml` as an example (see below for details on the configuration).
2. **Configure**
Copy `config.sample.yaml` → `config.yaml` & customize.
3. **Build & run**
3. **Build and run the Docker container:**
```bash
docker-compose up --build
```
4. **Browse**
Open [http://localhost:5000](http://localhost:5000)
4. **Access your portfolio:**
Open your browser and navigate to [http://localhost:5000](http://localhost:5000).
### Installation via Kevin's Package Manager
You can install the `portfolio` CLI using [Kevin's package manager](https://github.com/kevinveenbirkenbach/package-manager). Simply run:
### 📦 Installation via Kevin's Package Manager
```bash
pkgmgr install portfolio
pkgmgr install portui
```
This will install the CLI tool, making it available system-wide.
Once installed, the `portui` CLI is available system-wide.
### Available Commands
---
After installation, you can access the help information for the CLI by running:
## 🖥️ CLI Commands
```bash
portfolio --help
portui --help
```
This command displays detailed instructions on how to use the following commands:
* `build`: Build the Docker image
* `up`: Start containers (with build)
* `down`: Stop & remove containers
* `run-dev`: Dev mode (hot-reload)
* `run-prod`: Production mode
* `logs`: View container logs
* `dev`: Docker-Compose dev environment
* `prod`: Docker-Compose prod environment
* `cleanup`: Prune stopped containers
- **build:** Build the Docker image for the portfolio application.
- **up:** Start the application using docker-compose (with build).
- **down:** Stop and remove the Docker container.
- **run-dev:** Run the container in development mode with hot-reloading.
- **run-prod:** Run the container in production mode.
- **logs:** Display the logs of the running container.
- **dev:** Start the application in development mode using docker-compose.
- **prod:** Start the application in production mode using docker-compose.
- **cleanup:** Remove all stopped Docker containers to clean up your Docker environment.
---
## YAML Configuration Guide 🔧
## 🔧 YAML Configuration Guide
The portfolio is powered by a YAML configuration file (`config.yaml`). This file allows you to define the structure and content of your site, including cards, navigation, and company details.
### YAML Configuration Example 📄
Define your site's structure in `config.yaml`:
```yaml
accounts:
@@ -93,14 +102,9 @@ accounts:
description: Platforms where I share content.
icon:
class: fas fa-newspaper
children:
- name: Microblogs
description: Stay updated with my microblog posts.
icon:
class: fa-solid fa-pen-nib
children:
- name: Mastodon
description: Follow my updates on Mastodon.
description: Follow me on Mastodon.
icon:
class: fa-brands fa-mastodon
url: https://microblog.veen.world/@kevinveenbirkenbach
@@ -112,9 +116,10 @@ accounts:
text: I lead agile transformations and improve team dynamics through Scrum and Agile Coaching.
url: https://www.agile-coach.world
link_text: www.agile-coach.world
company:
titel: Kevin Veen-Birkenbach
subtitel: Consulting and Coaching Solutions
title: Kevin Veen-Birkenbach
subtitle: Consulting & Coaching Solutions
logo:
source: https://cloud.veen.world/s/logo_face_512x512/download
favicon:
@@ -127,26 +132,27 @@ company:
imprint_url: https://s.veen.world/imprint
```
### Understanding the `children` Key 🔍
* **`children`** enables multi-level menus.
* **`link`** references other YAML paths to avoid duplication.
The `children` key allows hierarchical nesting of elements. Each child can itself have children, enabling the creation of multi-level navigation menus or grouped content.
---
### Understanding the `link` Key 🔗
## 🚢 Production Deployment
The `link` key allows you to reference another part of the YAML configuration by its path, which helps avoid duplication and maintain consistency.
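A `link` reference might look like the fragment below (hypothetical syntax; the exact path format is defined by `ConfigurationResolver`, which is not shown in this diff):

```yaml
# One card defined in full, a second one reusing it by reference.
cards:
  - name: Mastodon
    # Hypothetical dotted-path reference into the accounts tree:
    link: accounts.media.microblogs.mastodon
```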
* Use a reverse proxy (NGINX/Apache).
* Secure with SSL/TLS.
* Swap to a production database if needed.
## Deployment 🚢
---
For production deployment, ensure to:
## 📜 License
- Use a reverse proxy like NGINX or Apache.
- Secure your site with SSL/TLS.
- Use a production-ready database if required.
Licensed under **GNU AGPLv3**. See [LICENSE](./LICENSE) for details.
## License 📜
---
This project is licensed under the GNU Affero General Public License Version 3. See the [LICENSE](./LICENSE) file for details.
## ✍️ Author
## Author ✍️
Created by [Kevin Veen-Birkenbach](https://www.veen.world/)
This software was created by [Kevin Veen-Birkenbach](https://www.veen.world/).
Enjoy building your portfolio! 🌟

app/.gitignore

@@ -0,0 +1,2 @@
node_modules/
package-lock.json

app/__init__.py

@@ -0,0 +1 @@
"""Portfolio UI web application package."""

app/app.py

@@ -1,18 +1,26 @@
import logging
import os
from flask import Flask, render_template
import yaml
import requests
from utils.configuration_resolver import ConfigurationResolver
import yaml
from flask import Flask, current_app, render_template
from markupsafe import Markup
try:
from app.utils.cache_manager import CacheManager
from app.utils.compute_card_classes import compute_card_classes
from app.utils.configuration_resolver import ConfigurationResolver
except ImportError: # pragma: no cover - supports running from the app/ directory.
from utils.cache_manager import CacheManager
from utils.compute_card_classes import compute_card_classes
import logging
logging.basicConfig(level=logging.DEBUG)
FLASK_ENV = os.getenv("FLASK_ENV", "production")
FLASK_PORT = int(os.getenv("PORT", 5000))
print(f"🔧 Starting app on port {FLASK_PORT}, FLASK_ENV={FLASK_ENV}")
from utils.configuration_resolver import ConfigurationResolver
from flask import Flask, render_template, current_app
from markupsafe import Markup
logging.basicConfig(level=logging.DEBUG)
FLASK_ENV = os.getenv("FLASK_ENV", "production")
FLASK_HOST = os.getenv("FLASK_HOST", "127.0.0.1")
FLASK_PORT = int(os.getenv("FLASK_PORT", os.getenv("PORT", 5000)))
print(f"Starting app on {FLASK_HOST}:{FLASK_PORT}, FLASK_ENV={FLASK_ENV}")
# Initialize the CacheManager
cache_manager = CacheManager()
@@ -20,10 +28,11 @@ cache_manager = CacheManager()
# Clear cache on startup
cache_manager.clear_cache()
def load_config(app):
"""Load and resolve the configuration from config.yaml."""
with open("config.yaml", "r") as f:
config = yaml.safe_load(f)
with open("config.yaml", "r", encoding="utf-8") as handle:
config = yaml.safe_load(handle)
if config.get("nasa_api_key"):
app.config["NASA_API_KEY"] = config["nasa_api_key"]
@@ -32,16 +41,27 @@ def load_config(app):
resolver.resolve_links()
app.config.update(resolver.get_config())
def cache_icons_and_logos(app):
"""Cache all icons and logos to local files."""
"""Cache all icons and logos to local files, with a source fallback."""
for card in app.config["cards"]:
icon = card.get("icon", {})
if icon.get("source"):
icon["cache"] = cache_manager.cache_file(icon["source"])
cached = cache_manager.cache_file(icon["source"])
icon["cache"] = cached or icon["source"]
company_logo = app.config["company"]["logo"]
cached = cache_manager.cache_file(company_logo["source"])
company_logo["cache"] = cached or company_logo["source"]
favicon = app.config["platform"]["favicon"]
cached = cache_manager.cache_file(favicon["source"])
favicon["cache"] = cached or favicon["source"]
platform_logo = app.config["platform"]["logo"]
cached = cache_manager.cache_file(platform_logo["source"])
platform_logo["cache"] = cached or platform_logo["source"]
app.config["company"]["logo"]["cache"] = cache_manager.cache_file(app.config["company"]["logo"]["source"])
app.config["platform"]["favicon"]["cache"] = cache_manager.cache_file(app.config["platform"]["favicon"]["source"])
app.config["platform"]["logo"]["cache"] = cache_manager.cache_file(app.config["platform"]["logo"]["source"])
# Initialize Flask app
app = Flask(__name__)
@@ -50,18 +70,22 @@ app = Flask(__name__)
load_config(app)
cache_icons_and_logos(app)
@app.context_processor
def utility_processor():
def include_svg(path):
full_path = os.path.join(current_app.root_path, 'static', path)
full_path = os.path.join(current_app.root_path, "static", path)
try:
with open(full_path, 'r', encoding='utf-8') as f:
svg = f.read()
return Markup(svg)
except IOError:
return Markup(f'<!-- SVG not found: {path} -->')
with open(full_path, "r", encoding="utf-8") as handle:
svg = handle.read()
# Trusted local SVG asset shipped with the application package.
return Markup(svg) # nosec B704
except OSError:
return ""
return dict(include_svg=include_svg)
@app.before_request
def reload_config_in_dev():
"""Reload config and recache icons before each request in development mode."""
@@ -69,22 +93,22 @@ def reload_config_in_dev():
load_config(app)
cache_icons_and_logos(app)
@app.route('/')
@app.route("/")
def index():
"""Render the main index page."""
cards = app.config["cards"]
lg_classes, md_classes = compute_card_classes(cards)
# fetch NASA APOD URL only if key present
apod_bg = None
api_key = app.config.get("NASA_API_KEY")
if api_key:
resp = requests.get(
"https://api.nasa.gov/planetary/apod",
params={"api_key": api_key}
params={"api_key": api_key},
timeout=10,
)
if resp.ok:
data = resp.json()
# only use if it's an image
if data.get("media_type") == "image":
apod_bg = data.get("url")
@@ -96,8 +120,14 @@ def index():
platform=app.config["platform"],
lg_classes=lg_classes,
md_classes=md_classes,
apod_bg=apod_bg
apod_bg=apod_bg,
)
if __name__ == "__main__":
app.run(debug=(FLASK_ENV == "development"), host="0.0.0.0", port=FLASK_PORT)
app.run(
debug=(FLASK_ENV == "development"),
host=FLASK_HOST,
port=FLASK_PORT,
use_reloader=False,
)
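The cache-or-fallback pattern repeated in `cache_icons_and_logos` above can be factored into one helper. A minimal sketch, assuming `cache_file` returns a local path on success or `None` on failure (`cached_or_source` is a hypothetical name, not part of the diff):

```python
def cached_or_source(cache_file, asset):
    """Set asset["cache"] to the cached local path, falling back to the
    original source URL when caching fails (cache_file returns None)."""
    cached = cache_file(asset["source"])
    asset["cache"] = cached or asset["source"]
    return asset["cache"]

# Hypothetical cache functions, for illustration only.
icon = {"source": "https://example.com/icon.png"}
cached_or_source(lambda src: "cache/icon.png", icon)   # uses the local copy
cached_or_source(lambda src: None, icon)               # falls back to the URL
```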

app/cypress.config.js

@@ -0,0 +1,19 @@
// cypress.config.js
const { defineConfig } = require('cypress');
module.exports = defineConfig({
e2e: {
// your app under test must already be running on this port
baseUrl: `http://localhost:${process.env.PORT || 5001}`,
defaultCommandTimeout: 60000,
pageLoadTimeout: 60000,
requestTimeout: 1500,
responseTimeout: 15000,
specPattern: 'cypress/e2e/**/*.spec.js',
supportFile: false,
setupNodeEvents(on, config) {
// here you could hook into events, but we don't need anything special
return config;
}
},
});
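Both app.py above and this Cypress config layer their port defaults the same way: a specific variable wins over the generic `PORT`, which wins over a hard-coded fallback. The precedence, sketched in Python (`resolve_port` is a hypothetical helper; the fallback value differs between the two files, 5000 vs 5001):

```python
import os

def resolve_port(env=None):
    """FLASK_PORT takes precedence, then PORT, then the hard-coded default,
    mirroring os.getenv("FLASK_PORT", os.getenv("PORT", 5000)) in app.py."""
    env = os.environ if env is None else env
    return int(env.get("FLASK_PORT", env.get("PORT", 5000)))
```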


@@ -0,0 +1,90 @@
// cypress/e2e/container.spec.js
describe('Custom Scroll & Container Resizing', () => {
beforeEach(() => {
// Assumes your app is running at baseUrl, and container.js is loaded on “/”
cy.visit('/');
});
it('on load, the scroll-container gets a positive height and proper overflow', () => {
// wait for our JS to run
cy.window().should('have.property', 'adjustScrollContainerHeight');
// Grab the inline style of .scroll-container
cy.get('.scroll-container')
.should('have.attr', 'style')
.then(style => {
// height:<number>px must be present
const m = style.match(/height:\s*(\d+(?:\.\d+)?)px/);
expect(m, 'height set').to.not.be.null;
expect(parseFloat(m[1]), 'height > 0').to.be.greaterThan(0);
// overflow shorthand should include both hidden & auto (order-insensitive)
expect(style).to.include('overflow:');
expect(style).to.match(/overflow:\s*(hidden\s+auto|auto\s+hidden)/);
});
});
it('on window resize, scroll-container height updates', () => {
// record original height
cy.get('.scroll-container')
.invoke('css', 'height')
.then(orig => {
// resize to a smaller viewport
cy.viewport(320, 480);
cy.wait(100); // allow resize handler to fire
cy.get('.scroll-container')
.invoke('css', 'height')
.then(newH => {
expect(parseFloat(newH), 'height changed on resize').to.not.equal(parseFloat(orig));
});
});
});
context('custom scrollbar thumb', () => {
beforeEach(() => {
// inject tall content to force scrolling
cy.get('.scroll-container').then($sc => {
$sc[0].innerHTML = '<div style="height:2000px">long</div>';
});
// re-run scrollbar setup
cy.window().invoke('updateCustomScrollbar');
});
it('shows a thumb with reasonable size & position', () => {
cy.get('#custom-scrollbar').should('have.css', 'opacity', '1');
cy.get('#scroll-thumb')
.should('have.css', 'height')
.then(h => {
const hh = parseFloat(h);
expect(hh).to.be.at.least(20);
// ensure thumb is smaller than container
cy.get('#custom-scrollbar')
.invoke('css', 'height')
.then(ch => {
expect(hh).to.be.lessThan(parseFloat(ch));
});
});
// scroll a bit and verify thumb.top changes
cy.get('.scroll-container').scrollTo(0, 200);
cy.wait(50);
cy.get('#scroll-thumb')
.invoke('css', 'top')
.then(t => {
expect(parseFloat(t)).to.be.greaterThan(0);
});
});
it('hides scrollbar when content fits', () => {
// remove overflow
cy.get('.scroll-container').then($sc => {
$sc[0].innerHTML = '<div style="height:10px">tiny</div>';
});
cy.window().invoke('updateCustomScrollbar');
cy.get('#custom-scrollbar').should('have.css', 'opacity', '0');
});
});
});
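The height assertion in the spec above extracts the pixel value from the inline `height: <n>px` declaration with a regex. The same extraction in Python, using the identical pattern (`parse_px_height` is a hypothetical name):

```python
import re

_PX = re.compile(r"height:\s*(\d+(?:\.\d+)?)px")

def parse_px_height(style):
    """Return the height in px from an inline style string, or None."""
    match = _PX.search(style)
    return float(match.group(1)) if match else None
```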


@@ -0,0 +1,85 @@
// cypress/e2e/fullscreen.spec.js
describe('Fullscreen Toggle', () => {
const ROOT = '/';
beforeEach(() => {
cy.visit(ROOT);
});
it('defaults to normal mode when no fullscreen param is present', () => {
// Body should not have fullscreen class
cy.get('body').should('not.have.class', 'fullscreen');
// URL should not include `fullscreen`
cy.url().should('not.include', 'fullscreen=');
// Header and footer should be visible (max-height > 0)
cy.get('header').should('have.css', 'max-height').and(value => {
expect(parseFloat(value)).to.be.greaterThan(0);
});
cy.get('footer').should('have.css', 'max-height').and(value => {
expect(parseFloat(value)).to.be.greaterThan(0);
});
});
it('initFullscreenFromUrl() picks up ?fullscreen=1 on load', () => {
cy.visit(`${ROOT}?fullscreen=1`);
cy.get('body').should('have.class', 'fullscreen');
cy.url().should('include', 'fullscreen=1');
// Header and footer should be collapsed (max-height == 0)
cy.get('header').should('have.css', 'max-height', '0px');
cy.get('footer').should('have.css', 'max-height', '0px');
});
it('enterFullscreen() adds fullscreen class, sets full width, and updates URL', () => {
cy.window().then(win => {
win.exitFullscreen(); // ensure starting state
win.enterFullscreen();
});
cy.get('body').should('have.class', 'fullscreen');
cy.url().should('include', 'fullscreen=1');
cy.get('.container, .container-fluid')
.should('have.class', 'container-fluid');
cy.get('header').should('have.css', 'max-height', '0px');
cy.get('footer').should('have.css', 'max-height', '0px');
});
it('exitFullscreen() removes fullscreen class, resets width, and URL param', () => {
// start in fullscreen
cy.window().invoke('enterFullscreen');
// then exit
cy.window().invoke('exitFullscreen');
cy.get('body').should('not.have.class', 'fullscreen');
cy.url().should('not.include', 'fullscreen=');
cy.get('.container, .container-fluid')
.should('have.class', 'container')
.and('not.have.class', 'container-fluid');
// Header and footer should be expanded again
cy.get('header').should('have.css', 'max-height').and(value => {
expect(parseFloat(value)).to.be.greaterThan(0);
});
cy.get('footer').should('have.css', 'max-height').and(value => {
expect(parseFloat(value)).to.be.greaterThan(0);
});
});
it('toggleFullscreen() toggles into and out of fullscreen', () => {
// Toggle into fullscreen
cy.window().invoke('toggleFullscreen');
cy.get('body').should('have.class', 'fullscreen');
cy.url().should('include', 'fullscreen=1');
// Toggle back
cy.window().invoke('toggleFullscreen');
cy.get('body').should('not.have.class', 'fullscreen');
cy.url().should('not.include', 'fullscreen=');
});
});


@@ -0,0 +1,61 @@
// cypress/e2e/fullwidth.spec.js
describe('Full-width Toggle', () => {
// test page must include your <div class="container"> wrapper
const ROOT = '/';
it('defaults to .container when no param is present', () => {
cy.visit(ROOT);
cy.get('.container, .container-fluid')
.should('have.class', 'container')
.and('not.have.class', 'container-fluid');
// URL should not include `fullwidth`
cy.url().should('not.include', 'fullwidth=');
});
it('initFullWidthFromUrl() picks up ?fullwidth=1 on load', () => {
cy.visit(`${ROOT}?fullwidth=1`);
cy.get('.container, .container-fluid')
.should('have.class', 'container-fluid')
.and('not.have.class', 'container');
cy.url().should('include', 'fullwidth=1');
});
it('setFullWidth(true) switches to container-fluid and updates URL', () => {
cy.visit(ROOT);
// call your global function
cy.window().invoke('setFullWidth', true);
cy.get('.container, .container-fluid')
.should('have.class', 'container-fluid')
.and('not.have.class', 'container');
cy.url().should('include', 'fullwidth=1');
});
it('setFullWidth(false) reverts to container and removes URL param', () => {
cy.visit(`${ROOT}?fullwidth=1`);
// now reset
cy.window().invoke('setFullWidth', false);
cy.get('.container, .container-fluid')
.should('have.class', 'container')
.and('not.have.class', 'container-fluid');
cy.url().should('not.include', 'fullwidth=1');
});
it('updateUrlFullWidth() toggles the query param without changing layout', () => {
cy.visit(ROOT);
// manually toggle URL only
cy.window().invoke('updateUrlFullWidth', true);
cy.url().should('include', 'fullwidth=1');
cy.window().invoke('updateUrlFullWidth', false);
cy.url().should('not.include', 'fullwidth=');
});
});


@@ -0,0 +1,46 @@
// cypress/e2e/iframe.spec.js
describe('Iframe integration', () => {
beforeEach(() => {
// Visit the app's base URL (configured in cypress.config.js)
cy.visit('/');
});
it('opens the iframe when an .iframe-link is clicked', () => {
// Find the first iframe-link on the page
cy.get('.iframe-link').first().then($link => {
const href = $link.prop('href');
// Click it
cy.wrap($link).click();
// The URL should now include ?iframe=<encoded href>
cy.url().should('include', 'iframe=' + encodeURIComponent(href));
// The <body> should have the "fullscreen" class
cy.get('body').should('have.class', 'fullscreen');
// And the <main> should contain a visible <iframe src="<href>">
cy.get('main iframe')
.should('have.attr', 'src', href)
.and('be.visible');
});
});
it('restores the original content when a .js-restore element is clicked', () => {
// First open the iframe
cy.get('.iframe-link').first().click();
// Then click the first .js-restore element (e.g. header or logo)
cy.get('.js-restore').first().click();
// The URL must no longer include the iframe parameter
cy.url().should('not.include', 'iframe=');
// The <body> should no longer have the "fullscreen" class
cy.get('body').should('not.have.class', 'fullscreen');
// And no <iframe> should remain inside <main>
cy.get('main iframe').should('not.exist');
});
});


@@ -0,0 +1,130 @@
// cypress/e2e/dynamic_popup.spec.js
describe('Dynamic Popup', () => {
const base = {
name: 'Test Item',
identifier: 'ABC123',
description: 'A simple description',
warning: '**Be careful!**',
info: '_Some info_',
url: null,
iframe: false,
icon: { class: 'fa fa-test' },
alternatives: [
{ name: 'Alt One', identifier: 'ALT1', icon: { class: 'fa fa-alt1' } }
],
children: [
{ name: 'Child One', identifier: 'CH1', icon: { class: 'fa fa-child1' } }
]
};
beforeEach(() => {
cy.visit('/');
cy.window().then(win => {
cy.stub(win.navigator.clipboard, 'writeText').resolves();
cy.stub(win, 'alert');
});
});
function open(item = {}) {
cy.window().invoke('openDynamicPopup', { ...base, ...item });
}
it('renders title with icon and text', () => {
open();
cy.get('#dynamicModalLabel')
.find('i.fa.fa-test')
.should('exist');
cy.get('#dynamicModalLabel')
.should('contain.text', 'Test Item');
});
it('falls back to plain text when no icon', () => {
open({ icon: null });
cy.get('#dynamicModalLabel')
.find('i')
.should('not.exist');
cy.get('#dynamicModalLabel')
.should('have.text', 'Test Item');
});
it('shows identifier when provided and populates input', () => {
open();
cy.get('#dynamicIdentifierBox').should('not.have.class', 'd-none');
cy.get('#dynamicModalContent').should('have.value', 'ABC123');
});
it('hides identifier box when none', () => {
open({ identifier: null });
cy.get('#dynamicIdentifierBox').should('have.class', 'd-none');
cy.get('#dynamicModalContent').should('have.value', '');
});
it('renders warning and info via marked', () => {
open();
cy.get('#dynamicModalWarning')
.should('not.have.class', 'd-none')
.find('#dynamicModalWarningText')
.should('contain.html', '<strong>Be careful!</strong>');
cy.get('#dynamicModalInfo')
.should('not.have.class', 'd-none')
.find('#dynamicModalInfoText')
.should('contain.html', '<em>Some info</em>');
});
it('hides warning/info when none provided', () => {
open({ warning: null, info: null });
cy.get('#dynamicModalWarning').should('have.class', 'd-none');
cy.get('#dynamicModalInfo').should('have.class', 'd-none');
});
it('shows description when no URL', () => {
open({ url: null, description: 'Only desc' });
cy.get('#dynamicDescriptionText')
.should('not.have.class', 'd-none')
.and('have.text', 'Only desc');
cy.get('#dynamicModalLink').should('have.class', 'd-none');
});
it('shows link when URL is provided', () => {
open({ url: 'https://example.com', description: 'Click me' });
cy.get('#dynamicModalLink').should('not.have.class', 'd-none');
cy.get('#dynamicModalLinkHref')
.should('have.attr', 'href', 'https://example.com')
.and('have.text', 'Click me');
});
it('populates alternatives and children lists', () => {
open();
cy.get('#dynamicAlternativesSection').should('not.have.class', 'd-none');
cy.get('#dynamicAlternativesList li')
.should('have.length', 1)
.first().contains('Alt One');
cy.get('#dynamicChildrenSection').should('not.have.class', 'd-none');
cy.get('#dynamicChildrenList li')
.should('have.length', 1)
.first().contains('Child One');
});
it('hides sections when no items', () => {
open({ alternatives: [], children: [] });
cy.get('#dynamicAlternativesSection').should('have.class', 'd-none');
cy.get('#dynamicChildrenSection').should('have.class', 'd-none');
});
it('clicking an “Open” button in a list re-opens the popup with that item', () => {
open();
cy.get('#dynamicAlternativesList button').click();
cy.get('#dynamicModalLabel')
.should('contain.text', 'Alt One');
});
it('copy button selects & copies identifier', () => {
open();
cy.get('#dynamicCopyButton').click();
cy.window().its('navigator.clipboard.writeText')
.should('have.been.calledWith', 'ABC123');
cy.window().its('alert')
.should('have.been.calledWith', 'Identifier copied to clipboard!');
});
});

View File

@@ -0,0 +1,32 @@
describe('Navbar Logo Visibility', () => {
beforeEach(() => {
cy.visit('/');
});
it('should have #navbar_logo present in the DOM', () => {
cy.get('#navbar_logo').should('exist');
});
it('should be invisible (opacity 0) by default', () => {
cy.get('#navbar_logo')
.should('exist')
.and('have.css', 'opacity', '0');
});
it('should become visible (opacity 1) after entering fullscreen', () => {
cy.window().then(win => {
win.enterFullscreen();
});
cy.get('#navbar_logo', { timeout: 4000 })
.should('have.css', 'opacity', '1');
});
it('should become invisible again (opacity 0) after exiting fullscreen', () => {
cy.window().then(win => {
win.enterFullscreen();
win.exitFullscreen();
});
cy.get('#navbar_logo', { timeout: 4000 })
.should('have.css', 'opacity', '0');
});
});

app/package.json

@@ -0,0 +1,16 @@
{
"dependencies": {
"@fortawesome/fontawesome-free": "^6.7.2",
"bootstrap": "5.2.2",
"bootstrap-icons": "1.9.1",
"jquery": "3.6.0",
"marked": "^4.3.0"
},
"devDependencies": {
"cypress": "^14.5.1"
},
"scripts": {
"build": "node scripts/copy-vendor.js",
"postinstall": "node scripts/copy-vendor.js"
}
}


@@ -1,3 +0,0 @@
flask
requests
pyyaml


@@ -0,0 +1,71 @@
'use strict';
/**
* Copies third-party browser assets from node_modules into static/vendor/
* so Flask can serve them without any CDN dependency.
* Runs automatically via the "postinstall" npm hook.
*/
const fs = require('fs');
const path = require('path');
const NM = path.join(__dirname, '..', 'node_modules');
const VENDOR = path.join(__dirname, '..', 'static', 'vendor');
function copyFile(src, dest) {
fs.mkdirSync(path.dirname(dest), { recursive: true });
fs.copyFileSync(src, dest);
}
function copyDir(src, dest) {
fs.mkdirSync(dest, { recursive: true });
for (const entry of fs.readdirSync(src, { withFileTypes: true })) {
const s = path.join(src, entry.name);
const d = path.join(dest, entry.name);
entry.isDirectory() ? copyDir(s, d) : fs.copyFileSync(s, d);
}
}
// Bootstrap CSS + JS bundle
copyFile(
path.join(NM, 'bootstrap', 'dist', 'css', 'bootstrap.min.css'),
path.join(VENDOR, 'bootstrap', 'css', 'bootstrap.min.css')
);
copyFile(
path.join(NM, 'bootstrap', 'dist', 'js', 'bootstrap.bundle.min.js'),
path.join(VENDOR, 'bootstrap', 'js', 'bootstrap.bundle.min.js')
);
// Bootstrap Icons CSS + embedded fonts
copyFile(
path.join(NM, 'bootstrap-icons', 'font', 'bootstrap-icons.css'),
path.join(VENDOR, 'bootstrap-icons', 'font', 'bootstrap-icons.css')
);
copyDir(
path.join(NM, 'bootstrap-icons', 'font', 'fonts'),
path.join(VENDOR, 'bootstrap-icons', 'font', 'fonts')
);
// Font Awesome Free CSS + webfonts
copyFile(
path.join(NM, '@fortawesome', 'fontawesome-free', 'css', 'all.min.css'),
path.join(VENDOR, 'fontawesome', 'css', 'all.min.css')
);
copyDir(
path.join(NM, '@fortawesome', 'fontawesome-free', 'webfonts'),
path.join(VENDOR, 'fontawesome', 'webfonts')
);
// marked browser UMD build (path varies by version)
const markedCandidates = [
path.join(NM, 'marked', 'marked.min.js'), // v4.x
path.join(NM, 'marked', 'lib', 'marked.umd.min.js'), // v5.x
path.join(NM, 'marked', 'dist', 'marked.min.js'), // v9+
];
const markedSrc = markedCandidates.find(p => fs.existsSync(p));
if (!markedSrc) throw new Error('marked: no browser UMD build found in node_modules');
copyFile(markedSrc, path.join(VENDOR, 'marked', 'marked.min.js'));
// jQuery
copyFile(
path.join(NM, 'jquery', 'dist', 'jquery.min.js'),
path.join(VENDOR, 'jquery', 'jquery.min.js')
);
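The marked-UMD probe above generalizes to a tiny first-existing-path helper. A Python sketch under the same assumption (the package layout moved across marked major versions, so several candidate paths are tried in order):

```python
import os

def first_existing(candidates):
    """Return the first candidate path that exists on disk, else None."""
    for path in candidates:
        if os.path.exists(path):
            return path
    return None
```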


@@ -111,6 +111,19 @@ div#navbarNavfooter li.nav-item {
margin-right: 6px;
}
/* Prevent nav items from wrapping to a second line */
div#navbarNavheader .navbar-nav,
div#navbarNavfooter .navbar-nav {
flex-wrap: nowrap;
overflow-x: auto;
scrollbar-width: none; /* Firefox */
}
div#navbarNavheader .navbar-nav::-webkit-scrollbar,
div#navbarNavfooter .navbar-nav::-webkit-scrollbar {
display: none; /* Chrome/Safari */
}
main, footer, header, nav {
position: relative;
box-shadow:
@@ -168,11 +181,19 @@ iframe{
}
#navbar_logo {
/* start invisible but in the layout (d-none will actually hide it) */
opacity: 0;
transition: opacity var(--anim-duration) ease-in-out;
max-width: 0;
overflow: hidden;
transition: opacity var(--anim-duration) ease-in-out,
max-width var(--anim-duration) ease-in-out;
}
#navbar_logo.visible {
opacity: 1 !important;
max-width: 300px;
}
/* 1. Make sure headers and footers can collapse */
header,
footer {


@@ -41,14 +41,11 @@ function enterFullscreen() {
setFullWidth(true);
updateUrlFullscreen(true);
// fade in logo… (unchanged)
// Only make the logo visible now
const logo = document.getElementById('navbar_logo');
if (logo) {
logo.classList.remove('d-none');
requestAnimationFrame(() => logo.style.opacity = '1');
logo.classList.add('visible');
}
// now recalc in lock-step with the CSS collapse animation
recalcWhileCollapsing();
}
@@ -57,19 +54,11 @@ function exitFullscreen() {
setFullWidth(false);
updateUrlFullscreen(false);
// fade out logo… (unchanged)
// Now hide it again
const logo = document.getElementById('navbar_logo');
if (logo) {
logo.style.opacity = '0';
logo.addEventListener('transitionend', function handler(e) {
if (e.propertyName === 'opacity') {
logo.classList.add('d-none');
logo.removeEventListener('transitionend', handler);
logo.classList.remove('visible');
}
});
}
// recalc while header/footer expand back
recalcWhileCollapsing();
}


@@ -1,5 +1,16 @@
// Global variables to store elements and original state
let mainElement, originalContent, originalMainStyle, container, customScrollbar, scrollbarContainer;
let currentIframeUrl = null;
// === Auto-open iframe if URL parameter is present ===
window.addEventListener('DOMContentLoaded', () => {
const paramUrl = new URLSearchParams(window.location.search).get('iframe');
if (paramUrl) {
currentIframeUrl = paramUrl;
enterFullscreen();
openIframe(paramUrl);
}
});
// Synchronize the height of the iframe to match the scroll-container or main element
function syncIframeHeight() {
@@ -23,8 +34,6 @@ function syncIframeHeight() {
// Function to open a URL in an iframe (jQuery version with a 1500 ms fade)
function openIframe(url) {
enterFullscreen();
var $container = scrollbarContainer ? $(scrollbarContainer) : null;
var $customScroll = customScrollbar ? $(customScrollbar) : null;
var $main = $(mainElement);
@@ -35,7 +44,9 @@ function openIframe(url) {
if ($customScroll) promises.push($customScroll.fadeOut(1500).promise());
$.when.apply($, promises).done(function() {
// now that scroll areas are hidden, go fullscreen
enterFullscreen();
// create the iframe if it doesn't exist yet
var $iframe = $main.find('iframe');
if ($iframe.length === 0) {
originalMainStyle = $main.attr('style') || null;
@@ -62,36 +73,31 @@ function openIframe(url) {
});
}
// Function to restore the original content (jQuery version with a 1500 ms fade)
/**
* Restore the original <main> content and exit fullscreen.
*/
function restoreOriginal() {
var $main = $(mainElement);
var $iframe = $main.find('iframe');
var $container = scrollbarContainer ? $(scrollbarContainer) : null;
var $customScroll = customScrollbar ? $(customScrollbar) : null;
// Exit fullscreen (collapse header/footer and run recalcs)
exitFullscreen();
if ($iframe.length) {
// Fade the iframe out over 1500 ms, then remove it and fade the original back in
$iframe.fadeOut(1500, function() {
$iframe.remove();
// Replace <main> innerHTML with the snapshot we took on load
mainElement.innerHTML = originalContent;
if ($container) $container.fadeIn(1500);
if ($customScroll) $customScroll.fadeIn(1500);
// Reset any inline styles on mainElement
if (originalMainStyle !== null) {
$main.attr('style', originalMainStyle);
mainElement.setAttribute('style', originalMainStyle);
} else {
$main.removeAttr('style');
mainElement.removeAttribute('style');
}
// Remove the URL parameter
var newUrl = new URL(window.location);
newUrl.searchParams.delete('iframe');
window.history.pushState({}, '', newUrl);
});
}
}
// Re-run height adjustments for scroll container & thumb
adjustScrollContainerHeight();
updateCustomScrollbar();
// Clear iframe state and URL param
currentIframeUrl = null;
history.replaceState(null, '', window.location.pathname);
}
// Initialize event listeners after DOM content is loaded
document.addEventListener("DOMContentLoaded", function() {
@@ -103,41 +109,25 @@ document.addEventListener("DOMContentLoaded", function() {
customScrollbar = document.getElementById("custom-scrollbar");
scrollbarContainer = container.querySelector(".scroll-container");
// Attach click handlers to links that should open in an iframe
document.querySelectorAll(".iframe-link").forEach(link => {
link.addEventListener("click", function(event) {
event.preventDefault(); // prevent full page navigation
openIframe(this.href);
updateUrlFullWidth(true);
});
});
document.querySelectorAll(".js-restore").forEach(el => {
el.style.cursor = "pointer";
el.addEventListener("click", restoreOriginal);
});
// On full page load, check URL parameters to auto-open an iframe
window.addEventListener("load", function() {
const params = new URLSearchParams(window.location.search);
const iframeUrl = params.get('iframe');
if (iframeUrl) {
openIframe(iframeUrl);
}
});
});
// Handle browser back/forward navigation
window.addEventListener('popstate', function(event) {
const params = new URLSearchParams(window.location.search);
const iframeUrl = params.get('iframe');
if (iframeUrl) {
openIframe(iframeUrl);
} else {
// === Close iframe & exit fullscreen when any .js-restore is clicked ===
document.querySelectorAll('.js-restore').forEach(el => {
el.style.cursor = 'pointer';
el.addEventListener('click', () => {
// first collapse header/footer and recalc container
exitFullscreen();
// then fade out and remove the iframe, fade content back
restoreOriginal();
}
// clear stored URL and reset browser address
currentIframeUrl = null;
history.replaceState(null, '', window.location.pathname);
});
});
});
/**
@@ -181,3 +171,19 @@ function observeIframeNavigation() {
}
}, 500);
}
// Remember the URL, open the iframe, enter fullscreen, AND set the URL param immediately
document.querySelectorAll(".iframe-link").forEach(link => {
link.addEventListener("click", function(event) {
event.preventDefault();
currentIframeUrl = this.href;
enterFullscreen();
openIframe(currentIframeUrl);
// Update the browser URL right away
const newUrl = new URL(window.location);
newUrl.searchParams.set('iframe', currentIframeUrl);
window.history.replaceState({ iframe: currentIframeUrl }, '', newUrl);
});
});

View File

@@ -9,22 +9,19 @@
href="{% if platform.favicon.cache %}{{ url_for('static', filename=platform.favicon.cache) }}{% endif %}"
>
<!-- Bootstrap CSS only -->
<link href="https://cdn.jsdelivr.net/npm/bootstrap@5.2.2/dist/css/bootstrap.min.css" rel="stylesheet" integrity="sha384-Zenh87qX5JnK2Jl0vWa8Ck2rdkQ2Bzep5IDxbcnCeuOxjzrPF/et3URy9Bv1WTRi" crossorigin="anonymous">
<link href="{{ url_for('static', filename='vendor/bootstrap/css/bootstrap.min.css') }}" rel="stylesheet">
<!-- Bootstrap JavaScript Bundle with Popper -->
<script src="https://cdn.jsdelivr.net/npm/bootstrap@5.2.2/dist/js/bootstrap.bundle.min.js" integrity="sha384-OERcA2EqjJCMA+/3y+gxIOqMEjwtxJY7qPCqsdltbNJuaOe923+mo//f6V8Qbsw3" crossorigin="anonymous"></script>
<script src="{{ url_for('static', filename='vendor/bootstrap/js/bootstrap.bundle.min.js') }}"></script>
<!-- Bootstrap Icons -->
<link rel="stylesheet" href="https://cdn.jsdelivr.net/npm/bootstrap-icons@1.9.1/font/bootstrap-icons.css">
<link rel="stylesheet" href="{{ url_for('static', filename='vendor/bootstrap-icons/font/bootstrap-icons.css') }}">
<!-- Fontawesome -->
<script src="https://kit.fontawesome.com/56f96da298.js" crossorigin="anonymous"></script>
<link rel="stylesheet" href="{{ url_for('static', filename='vendor/fontawesome/css/all.min.css') }}">
<!-- Markdown -->
<script src="https://cdn.jsdelivr.net/npm/marked/marked.min.js"></script>
<script src="{{ url_for('static', filename='vendor/marked/marked.min.js') }}"></script>
<link rel="stylesheet" href="{{ url_for('static', filename='css/default.css') }}">
<link rel="stylesheet" href="{{ url_for('static', filename='css/custom_scrollbar.css') }}">
<!-- JQuery -->
<script
src="https://code.jquery.com/jquery-3.6.0.min.js"
crossorigin="anonymous">
</script>
<script src="{{ url_for('static', filename='vendor/jquery/jquery.min.js') }}"></script>
</head>
<body
{% if apod_bg %}

View File

@@ -54,7 +54,7 @@
</button>
<div class="collapse navbar-collapse" id="navbarNav{{menu_type}}">
{% if menu_type == "header" %}
<a class="navbar-brand d-flex align-items-center d-none js-restore" id="navbar_logo" href="#">
<a class="navbar-brand align-items-center d-flex js-restore" id="navbar_logo" href="#">
<img
src="{{ url_for('static', filename=platform.logo.cache) }}"
alt="{{ platform.titel }}"

app/utils/__init__.py Normal file
View File

@@ -0,0 +1 @@
"""Utilities used by the Portfolio UI web application."""

View File

@@ -1,7 +1,9 @@
import os
import hashlib
import requests
import mimetypes
import os
import requests
class CacheManager:
def __init__(self, cache_dir="static/cache"):
@@ -9,8 +11,7 @@ class CacheManager:
self._ensure_cache_dir_exists()
def _ensure_cache_dir_exists(self):
if not os.path.exists(self.cache_dir):
os.makedirs(self.cache_dir)
os.makedirs(self.cache_dir, exist_ok=True)
def clear_cache(self):
if os.path.exists(self.cache_dir):
@@ -20,8 +21,10 @@ class CacheManager:
os.remove(path)
def cache_file(self, file_url):
# generate a short hash for filename
hash_suffix = hashlib.blake2s(file_url.encode('utf-8'), digest_size=8).hexdigest()
hash_suffix = hashlib.blake2s(
file_url.encode("utf-8"),
digest_size=8,
).hexdigest()
parts = file_url.rstrip("/").split("/")
base = parts[-2] if parts[-1] == "download" else parts[-1]
@@ -31,7 +34,7 @@ class CacheManager:
except requests.RequestException:
return None
content_type = resp.headers.get('Content-Type', '')
content_type = resp.headers.get("Content-Type", "")
ext = mimetypes.guess_extension(content_type.split(";")[0].strip()) or ".png"
filename = f"{base}_{hash_suffix}{ext}"
full_path = os.path.join(self.cache_dir, filename)
@@ -41,5 +44,4 @@ class CacheManager:
for chunk in resp.iter_content(1024):
f.write(chunk)
# return path relative to /static/
return f"cache/{filename}"
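The filename derivation in `cache_file` above can be exercised standalone with only the standard library (the URL and Content-Type below are illustrative):

```python
import hashlib
import mimetypes

file_url = "https://example.com/logo/download"

# Short, stable 8-byte hash of the URL -> 16 hex characters.
hash_suffix = hashlib.blake2s(file_url.encode("utf-8"), digest_size=8).hexdigest()

# "download" endpoints use the parent path segment as the base name.
parts = file_url.rstrip("/").split("/")
base = parts[-2] if parts[-1] == "download" else parts[-1]

# Derive the extension from the Content-Type header, falling back to ".png".
content_type = "image/svg+xml; charset=utf-8"
ext = mimetypes.guess_extension(content_type.split(";")[0].strip()) or ".png"

print(f"{base}_{hash_suffix}{ext}")  # e.g. logo_<16 hex chars>.svg
```

Because the hash depends only on the URL, re-caching the same URL always produces the same filename.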

View File

@@ -32,7 +32,7 @@ def compute_card_classes(cards):
lg_classes.append("col-lg-6")
else:
lg_classes.append("col-lg-4")
# md classes: If the number of cards is even or if not the last card, otherwise "col-md-12"
# Use a full-width last card on medium screens only when the total count is odd.
md_classes = []
for i in range(num_cards):
if num_cards % 2 == 0 or i < num_cards - 1:
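The medium-breakpoint rule from the rewritten comment can be sketched as a standalone helper (hypothetical function name, mirroring the loop shown):

```python
def md_classes_for(num_cards):
    # Cards pair up on medium screens; with an odd count, the last card
    # spans the full row instead of leaving a half-empty column.
    return [
        "col-md-6" if num_cards % 2 == 0 or i < num_cards - 1 else "col-md-12"
        for i in range(num_cards)
    ]

print(md_classes_for(2))  # ['col-md-6', 'col-md-6']
print(md_classes_for(3))  # ['col-md-6', 'col-md-6', 'col-md-12']
```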

View File

@@ -1,4 +1,3 @@
from pprint import pprint
class ConfigurationResolver:
"""
A class to resolve `link` entries in a nested configuration structure.
@@ -14,19 +13,6 @@ class ConfigurationResolver:
"""
self._recursive_resolve(self.config, self.config)
def __load_children(self,path):
"""
Check whether the children should be loaded explicitly instead of the parent.
"""
return path.split('.').pop() == "children"
def _replace_in_dict_by_dict(self, dict_origine, old_key, new_dict):
if old_key in dict_origine:
# Remove the old key
old_value = dict_origine.pop(old_key)
# Add the new key-value pairs
dict_origine.update(new_dict)
def _replace_in_list_by_list(self, list_origine, old_element, new_elements):
index = list_origine.index(old_element)
list_origine[index : index + 1] = new_elements
@@ -43,10 +29,17 @@ class ConfigurationResolver:
for key, value in list(current_config.items()):
if key == "children":
if value is None or not isinstance(value, list):
raise ValueError(f"Expected 'children' to be a list, but got {type(value).__name__} instead.")
raise ValueError(
"Expected 'children' to be a list, but got "
f"{type(value).__name__} instead."
)
for item in value:
if "link" in item:
loaded_link = self._find_entry(root_config, self._mapped_key(item['link']), False)
loaded_link = self._find_entry(
root_config,
self._mapped_key(item["link"]),
False,
)
if isinstance(loaded_link, list):
self._replace_in_list_by_list(value, item, loaded_link)
else:
@@ -55,15 +48,24 @@ class ConfigurationResolver:
self._recursive_resolve(value, root_config)
elif key == "link":
try:
loaded = self._find_entry(root_config, self._mapped_key(value), False)
loaded = self._find_entry(
root_config, self._mapped_key(value), False
)
if isinstance(loaded, list) and len(loaded) > 2:
loaded = self._find_entry(root_config, self._mapped_key(value), False)
loaded = self._find_entry(
root_config, self._mapped_key(value), False
)
current_config.clear()
current_config.update(loaded)
except Exception as e:
raise ValueError(
f"Error resolving link '{value}': {str(e)}. "
f"Current part: {key}, Current config: {current_config}" + (f", Loaded: {loaded}" if 'loaded' in locals() or 'loaded' in globals() else "")
f"Current part: {key}, Current config: {current_config}"
+ (
f", Loaded: {loaded}"
if "loaded" in locals() or "loaded" in globals()
else ""
)
)
else:
self._recursive_resolve(value, root_config)
@@ -72,7 +74,9 @@ class ConfigurationResolver:
self._recursive_resolve(item, root_config)
def _get_children(self, current):
if isinstance(current, dict) and ("children" in current and current["children"]):
if isinstance(current, dict) and (
"children" in current and current["children"]
):
current = current["children"]
return current
@@ -81,8 +85,13 @@ class ConfigurationResolver:
def _find_by_name(self, current, part):
return next(
(item for item in current if isinstance(item, dict) and self._mapped_key(item.get("name", "")) == part),
None
(
item
for item in current
if isinstance(item, dict)
and self._mapped_key(item.get("name", "")) == part
),
None,
)
def _find_entry(self, config, path, children):
@@ -90,46 +99,44 @@ class ConfigurationResolver:
Finds an entry in the configuration by a dot-separated path.
Supports both dictionaries and lists with `children` navigation.
"""
parts = path.split('.')
parts = path.split(".")
current = config
for part in parts:
if isinstance(current, list):
# If 'children' is explicitly declared, just load the children
if part != "children":
# Look for a matching name in the list
found = self._find_by_name(current, part)
if found:
current = found
print(
f"Matching entry for '{part}' in list. Path so far: {' > '.join(parts[:parts.index(part)+1])}. "
f"Matching entry for '{part}' in list. Path so far: "
f"{' > '.join(parts[: parts.index(part) + 1])}. "
f"Current list: {current}"
)
else:
raise ValueError(
f"No matching entry for '{part}' in list. Path so far: {' > '.join(parts[:parts.index(part)+1])}. "
f"No matching entry for '{part}' in list. Path so far: "
f"{' > '.join(parts[: parts.index(part) + 1])}. "
f"Current list: {current}"
)
elif isinstance(current, dict):
# Case-insensitive dictionary lookup
key = next((k for k in current if self._mapped_key(k) == part), None)
# If no matching key was found, search the children
if key is None:
if "children" not in current:
raise KeyError(
f"No 'children' found in current dictionary. Path so far: {' > '.join(parts[:parts.index(part)+1])}. "
"No 'children' found in current dictionary. Path so far: "
f"{' > '.join(parts[: parts.index(part) + 1])}. "
f"Current dictionary: {current}"
)
# The following line looks buggy: why are the children always loaded, and not only when 'children' is set?
current = self._find_by_name(current["children"], part)
if not current:
raise KeyError(
f"Key '{part}' not found in dictionary. Path so far: {' > '.join(parts[:parts.index(part)+1])}. "
f"Key '{part}' not found in dictionary. Path so far: "
f"{' > '.join(parts[: parts.index(part) + 1])}. "
f"Current dictionary: {current}"
)
else:
current = current[key]
else:
raise ValueError(
f"Invalid path segment '{part}'. Current type: {type(current)}. "
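The dot-separated path walk that `_find_entry` performs can be illustrated with a simplified dictionary-only version (config and names are illustrative; the real method also matches list entries by name and navigates `children`):

```python
def find_entry(config, path):
    # Walk a nested dict following "a.b.c" style paths.
    current = config
    for part in path.split("."):
        if not isinstance(current, dict) or part not in current:
            raise KeyError(f"Invalid path segment '{part}'")
        current = current[part]
    return current

config = {"header": {"children": {"projects": {"url": "https://example.com"}}}}
print(find_entry(config, "header.children.projects.url"))  # https://example.com
```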

View File

@@ -5,14 +5,11 @@ services:
build:
context: .
dockerfile: Dockerfile
image: application-portfolio
container_name: portfolio
ports:
- "${PORT:-5000}:${PORT:-5000}"
env_file:
- .env
volumes:
- ./app:/app
- ./.env:/app/.env
environment:
- PORT=${PORT:-5000}
- FLASK_ENV=${FLASK_ENV:-production}
restart: unless-stopped

View File

@@ -1,2 +1,2 @@
PORT=5000
PORT=5001
FLASK_ENV=production
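The PORT change in this `.env` interacts with the `int(os.getenv("PORT", 5000))` fallback used in main.py; a quick standard-library sketch of that precedence:

```python
import os

os.environ.pop("PORT", None)
print(int(os.getenv("PORT", 5000)))   # 5000: fallback when PORT is unset

os.environ["PORT"] = "5001"
print(int(os.getenv("PORT", 5000)))   # 5001: the .env value wins
```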

303
main.py
View File

@@ -1,300 +1,79 @@
#!/usr/bin/env python3
"""
main.py - A CLI tool for managing the Portfolio CMS Docker application.
This script provides commands to build and run the Docker container for the
portfolio application. It mimics the functionality of a Makefile with additional
explanatory text using argparse.
Commands:
build - Build the Docker image.
up - Start the application using docker-compose (with build).
down - Stop and remove the running container.
run-dev - Run the container in development mode (with hot-reloading).
run-prod - Run the container in production mode.
logs - Display the logs of the running container.
dev - Start the application in development mode using docker-compose.
prod - Start the application in production mode using docker-compose.
cleanup - Remove all stopped containers.
main.py - Proxy to Makefile targets for managing the Portfolio CMS Docker application.
Automatically generates CLI commands based on the Makefile definitions.
"""
import argparse
import re
import subprocess
import sys
import os
from dotenv import load_dotenv
from pathlib import Path
dotenv_path = Path(__file__).resolve().parent / ".env"
MAKEFILE_PATH = Path(__file__).resolve().parent / "Makefile"
if dotenv_path.exists():
load_dotenv(dotenv_path)
else:
print(f"⚠️ Warning: No .env file found at {dotenv_path}")
PORT = int(os.getenv("PORT", 5000))
def run_command(command, dry_run=False, env=None):
"""Utility function to run a shell command."""
def load_targets(makefile_path):
"""
Parse the Makefile to extract targets and their help comments.
Assumes each target is defined as 'name:' and that the following line,
once its indentation is stripped, starts with '#' and provides the help text.
"""
targets = []
pattern = re.compile(r"^([A-Za-z0-9_\-]+):")
with open(makefile_path, "r", encoding="utf-8") as handle:
lines = handle.readlines()
for idx, line in enumerate(lines):
m = pattern.match(line)
if m:
name = m.group(1)
help_text = ""
if idx + 1 < len(lines):
next_line = lines[idx + 1].lstrip()
if next_line.startswith("#"):
help_text = next_line.lstrip("# ").strip()
targets.append((name, help_text))
return targets
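`load_targets` can be exercised in isolation against an inline Makefile string (the targets below are illustrative):

```python
import re

MAKEFILE = (
    "build:\n"
    "\t# Build the Docker image.\n"
    "\tdocker build -t application-portfolio .\n"
    "\n"
    "up:\n"
    "\t# Start the application using docker-compose.\n"
    "\tdocker-compose up --build\n"
)

pattern = re.compile(r"^([A-Za-z0-9_\-]+):")
lines = MAKEFILE.splitlines()
targets = []
for idx, line in enumerate(lines):
    m = pattern.match(line)
    if m:
        help_text = ""
        # The comment on the line after the target becomes its help text.
        if idx + 1 < len(lines) and lines[idx + 1].lstrip().startswith("#"):
            help_text = lines[idx + 1].lstrip().lstrip("# ").strip()
        targets.append((m.group(1), help_text))

print(targets)
```

Recipe lines are never matched because the regex is anchored at column zero and recipes start with a tab.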
def run_command(command, dry_run=False):
"""Utility to run shell commands."""
print(f"Executing: {' '.join(command)}")
if dry_run:
print("Dry run enabled: command not executed.")
return
try:
subprocess.check_call(command, env=env)
subprocess.check_call(command)
except subprocess.CalledProcessError as e:
print(f"Error: Command failed with exit code {e.returncode}")
sys.exit(e.returncode)
def build(args):
"""
Build the Docker image for the portfolio application.
Command:
docker build -t application-portfolio .
This command creates a Docker image named 'application-portfolio'
from the Dockerfile in the current directory.
"""
command = ["docker", "build", "-t", "application-portfolio", "."]
run_command(command, args.dry_run)
def up(args):
"""
Start the application using docker-compose with build.
Command:
docker-compose up --build
This command uses docker-compose to build (if necessary) and start
all defined services. It is useful for quickly starting your
development or production environment.
"""
command = ["docker-compose", "up", "--build"]
run_command(command, args.dry_run)
def down(args):
"""
Stop and remove the Docker container named 'portfolio'.
Commands:
docker stop portfolio
docker rm portfolio
These commands stop the running container and remove it from your Docker host.
The '-' prefix is used to ignore errors if the container is not running.
"""
command_stop = ["docker", "stop", "portfolio"]
command_rm = ["docker", "rm", "portfolio"]
run_command(command_stop, args.dry_run)
run_command(command_rm, args.dry_run)
def run_dev(args):
"""
Run the container in development mode with hot-reloading.
Command:
docker run -d -p 5000:5000 --name portfolio -v $(pwd)/app/:/app \
-e FLASK_APP=app.py -e FLASK_ENV=development application-portfolio
This command starts the container in detached mode (-d), maps port 5000,
mounts the local 'app/' directory into the container, and sets environment
variables to enable Flask's development mode.
"""
current_dir = os.getcwd()
volume_mapping = f"{current_dir}/app/:/app"
command = [
"docker", "run", "-d",
"-p", f"{PORT}:{PORT}",
"--name", "portfolio",
"-v", volume_mapping,
"-e", "FLASK_APP=app.py",
"-e", "FLASK_ENV=development",
"application-portfolio"
]
run_command(command, args.dry_run)
def run_prod(args):
"""
Run the container in production mode.
Command:
docker run -d -p 5000:5000 --name portfolio application-portfolio
This command starts the container in detached mode, mapping port 5000,
and runs the production version of the portfolio application.
"""
command = [
"docker", "run", "-d",
"-p", f"{PORT}:5000",
"--name", "portfolio",
"application-portfolio"
]
run_command(command, args.dry_run)
def logs(args):
"""
Display the logs of the 'portfolio' container.
Command:
docker logs -f portfolio
This command follows the logs (using -f) of the running container,
which is helpful for debugging and monitoring.
"""
command = ["docker", "logs", "-f", "portfolio"]
run_command(command, args.dry_run)
def dev(args):
"""
Run the application in development mode using docker-compose.
"""
env = os.environ.copy()
env["FLASK_ENV"] = "development"
command = ["docker-compose", "up", "-d"]
print("▶️ Starting in development mode (FLASK_ENV=development)")
run_command(command, args.dry_run, env=env)
def prod(args):
"""
Run the application in production mode using docker-compose.
Command:
docker-compose up --build
This command builds the Docker image if needed and starts the application
using docker-compose for a production environment.
"""
command = ["docker-compose", "up", "--build"]
run_command(command, args.dry_run)
def cleanup(args):
"""
Remove all stopped Docker containers.
Command:
docker container prune -f
This command cleans up your Docker environment by forcefully removing
all stopped containers. It is useful to reclaim disk space and remove
unused containers.
"""
command = ["docker", "container", "prune", "-f"]
run_command(command, args.dry_run)
def delete_portfolio_container(dry_run=False):
"""
Force remove the portfolio container if it exists.
"""
print("Checking if 'portfolio' container exists to delete...")
command = ["docker", "rm", "-f", "portfolio"]
run_command(command, dry_run)
def browse(args):
"""
Open http://localhost:5000 in Chromium browser.
Command:
chromium http://localhost:5000
This command launches the Chromium browser to view the running application.
"""
command = ["chromium", f"http://localhost:{PORT}"]
run_command(command, args.dry_run)
def main():
parser = argparse.ArgumentParser(
description="CLI tool to manage the Portfolio CMS Docker application."
description="CLI proxy to Makefile targets for Portfolio CMS Docker app"
)
parser.add_argument(
"--dry-run",
action="store_true",
help="Print the commands without executing them."
)
parser.add_argument(
"--delete",
action="store_true",
help="Delete the existing 'portfolio' container before running the command."
help="Print the generated Make command without executing it.",
)
subparsers = parser.add_subparsers(
title="Commands",
description="Available commands to manage the application",
dest="command"
title="Available commands",
dest="command",
required=True,
)
# Browse command
parser_browse = subparsers.add_parser(
"browse", help="Open application in Chromium browser."
)
parser_browse.set_defaults(func=browse)
# Build command
parser_build = subparsers.add_parser(
"build", help="Build the Docker image."
)
parser_build.set_defaults(func=build)
# Up command (docker-compose up)
parser_up = subparsers.add_parser(
"up", help="Start the application using docker-compose (with build)."
)
parser_up.set_defaults(func=up)
# Down command
parser_down = subparsers.add_parser(
"down", help="Stop and remove the Docker container."
)
parser_down.set_defaults(func=down)
# Run-dev command
parser_run_dev = subparsers.add_parser(
"run-dev", help="Run the container in development mode (with hot-reloading)."
)
parser_run_dev.set_defaults(func=run_dev)
# Run-prod command
parser_run_prod = subparsers.add_parser(
"run-prod", help="Run the container in production mode."
)
parser_run_prod.set_defaults(func=run_prod)
# Logs command
parser_logs = subparsers.add_parser(
"logs", help="Display the logs of the running container."
)
parser_logs.set_defaults(func=logs)
# Dev command (docker-compose with FLASK_ENV)
parser_dev = subparsers.add_parser(
"dev", help="Start the application in development mode using docker-compose."
)
parser_dev.set_defaults(func=dev)
# Prod command (docker-compose production)
parser_prod = subparsers.add_parser(
"prod", help="Start the application in production mode using docker-compose."
)
parser_prod.set_defaults(func=prod)
# Cleanup command
parser_cleanup = subparsers.add_parser(
"cleanup", help="Remove all stopped Docker containers."
)
parser_cleanup.set_defaults(func=cleanup)
targets = load_targets(MAKEFILE_PATH)
for name, help_text in targets:
sp = subparsers.add_parser(name, help=help_text)
sp.set_defaults(target=name)
args = parser.parse_args()
cmd = ["make", args.target]
run_command(cmd, dry_run=args.dry_run)
if args.command is None:
parser.print_help()
sys.exit(1)
if args.delete:
delete_portfolio_container(args.dry_run)
# Execute the chosen subcommand function
args.func(args)
if __name__ == "__main__":
main()
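The generated-subcommand pattern in the new `main()` reduces to a few lines of argparse (target list hard-coded here for illustration):

```python
import argparse

targets = [("build", "Build the Docker image."), ("up", "Start via docker-compose.")]

parser = argparse.ArgumentParser(description="CLI proxy to Makefile targets")
parser.add_argument("--dry-run", action="store_true")
sub = parser.add_subparsers(title="Available commands", dest="command", required=True)
for name, help_text in targets:
    # Each Makefile target becomes its own subcommand that remembers its name.
    sp = sub.add_parser(name, help=help_text)
    sp.set_defaults(target=name)

args = parser.parse_args(["build"])
print(["make", args.target])  # ['make', 'build']
```

`required=True` on the subparsers makes argparse itself reject a missing command, replacing the old manual `if args.command is None` check.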

pyproject.toml Normal file
View File

@@ -0,0 +1,44 @@
[build-system]
requires = ["setuptools>=69"]
build-backend = "setuptools.build_meta"
[project]
name = "portfolio-ui"
version = "1.1.0"
description = "A lightweight YAML-driven portfolio and landing-page generator."
readme = "README.md"
requires-python = ">=3.12"
dependencies = [
"flask",
"pyyaml",
"requests",
]
[project.optional-dependencies]
dev = [
"bandit",
"pip-audit",
"ruff",
]
[tool.setuptools]
py-modules = ["main"]
[tool.setuptools.packages.find]
include = ["app", "app.*"]
[tool.setuptools.package-data]
app = [
"config.sample.yaml",
"templates/**/*.j2",
"static/css/*.css",
"static/js/*.js",
]
[tool.ruff]
target-version = "py312"
line-length = 88
extend-exclude = ["app/static/cache", "build"]
[tool.ruff.lint]
select = ["E", "F", "I"]

View File

@@ -1 +0,0 @@
python-dotenv

tests/__init__.py Normal file
View File

@@ -0,0 +1 @@

View File

@@ -0,0 +1 @@

View File

@@ -0,0 +1,54 @@
import tomllib
import unittest
from pathlib import Path
class TestPythonPackaging(unittest.TestCase):
def setUp(self) -> None:
self.repo_root = Path(__file__).resolve().parents[2]
self.pyproject_path = self.repo_root / "pyproject.toml"
with self.pyproject_path.open("rb") as handle:
self.pyproject = tomllib.load(handle)
def test_pyproject_defines_build_system_and_runtime_dependencies(self):
build_system = self.pyproject["build-system"]
project = self.pyproject["project"]
self.assertEqual(build_system["build-backend"], "setuptools.build_meta")
self.assertIn("setuptools>=69", build_system["requires"])
self.assertGreaterEqual(
set(project["dependencies"]),
{"flask", "pyyaml", "requests"},
)
self.assertEqual(project["requires-python"], ">=3.12")
def test_pyproject_defines_dev_dependencies_and_package_contents(self):
project = self.pyproject["project"]
setuptools_config = self.pyproject["tool"]["setuptools"]
package_find = setuptools_config["packages"]["find"]
package_data = setuptools_config["package-data"]["app"]
self.assertGreaterEqual(
set(project["optional-dependencies"]["dev"]),
{"bandit", "pip-audit", "ruff"},
)
self.assertEqual(setuptools_config["py-modules"], ["main"])
self.assertEqual(package_find["include"], ["app", "app.*"])
self.assertIn("config.sample.yaml", package_data)
self.assertIn("templates/**/*.j2", package_data)
self.assertIn("static/css/*.css", package_data)
self.assertIn("static/js/*.js", package_data)
def test_legacy_requirements_files_are_removed(self):
self.assertFalse((self.repo_root / "requirements.txt").exists())
self.assertFalse((self.repo_root / "requirements-dev.txt").exists())
self.assertFalse((self.repo_root / "app" / "requirements.txt").exists())
def test_package_init_files_exist(self):
self.assertTrue((self.repo_root / "app" / "__init__.py").is_file())
self.assertTrue((self.repo_root / "app" / "utils" / "__init__.py").is_file())
if __name__ == "__main__":
unittest.main()

View File

@@ -0,0 +1,43 @@
import unittest
from pathlib import Path
import yaml
SKIP_DIR_NAMES = {".git", ".ruff_cache", "__pycache__", "node_modules"}
SKIP_FILES = {"app/config.yaml"}
YAML_SUFFIXES = {".yml", ".yaml"}
class TestYamlSyntax(unittest.TestCase):
def test_all_repository_yaml_files_are_valid(self):
repo_root = Path(__file__).resolve().parents[2]
invalid_files = []
for path in repo_root.rglob("*"):
if not path.is_file() or path.suffix not in YAML_SUFFIXES:
continue
relative_path = path.relative_to(repo_root).as_posix()
if relative_path in SKIP_FILES:
continue
if any(part in SKIP_DIR_NAMES for part in path.parts):
continue
try:
with path.open("r", encoding="utf-8") as handle:
yaml.safe_load(handle)
except yaml.YAMLError as error:
invalid_files.append((relative_path, str(error).splitlines()[0]))
except Exception as error:
invalid_files.append((relative_path, f"Unexpected error: {error}"))
self.assertFalse(
invalid_files,
"Found invalid YAML files:\n"
+ "\n".join(f"- {path}: {error}" for path, error in invalid_files),
)
if __name__ == "__main__":
unittest.main()

tests/lint/__init__.py Normal file
View File

@@ -0,0 +1 @@

View File

@@ -0,0 +1,90 @@
#!/usr/bin/env python3
import ast
import unittest
from pathlib import Path
class TestTestFilesContainUnittestTests(unittest.TestCase):
def setUp(self) -> None:
self.repo_root = Path(__file__).resolve().parents[2]
self.tests_dir = self.repo_root / "tests"
self.assertTrue(
self.tests_dir.is_dir(),
f"'tests' directory not found at: {self.tests_dir}",
)
def _iter_test_files(self) -> list[Path]:
return sorted(self.tests_dir.rglob("test_*.py"))
def _file_contains_runnable_unittest_test(self, path: Path) -> bool:
source = path.read_text(encoding="utf-8")
try:
tree = ast.parse(source, filename=str(path))
except SyntaxError as error:
raise AssertionError(f"SyntaxError in {path}: {error}") from error
testcase_aliases = {"TestCase"}
unittest_aliases = {"unittest"}
for node in tree.body:
if isinstance(node, ast.Import):
for import_name in node.names:
if import_name.name == "unittest":
unittest_aliases.add(import_name.asname or "unittest")
elif isinstance(node, ast.ImportFrom) and node.module == "unittest":
for import_name in node.names:
if import_name.name == "TestCase":
testcase_aliases.add(import_name.asname or "TestCase")
def is_testcase_base(base: ast.expr) -> bool:
if isinstance(base, ast.Name) and base.id in testcase_aliases:
return True
if isinstance(base, ast.Attribute) and base.attr == "TestCase":
return (
isinstance(base.value, ast.Name)
and base.value.id in unittest_aliases
)
return False
for node in tree.body:
if isinstance(node, (ast.FunctionDef, ast.AsyncFunctionDef)) and (
node.name.startswith("test_")
):
return True
for node in tree.body:
if not isinstance(node, ast.ClassDef):
continue
if not any(is_testcase_base(base) for base in node.bases):
continue
for item in node.body:
if isinstance(item, (ast.FunctionDef, ast.AsyncFunctionDef)) and (
item.name.startswith("test_")
):
return True
return False
def test_all_test_py_files_contain_runnable_tests(self) -> None:
test_files = self._iter_test_files()
self.assertTrue(test_files, "No test_*.py files found under tests/")
offenders = []
for path in test_files:
if not self._file_contains_runnable_unittest_test(path):
offenders.append(path.relative_to(self.repo_root).as_posix())
self.assertFalse(
offenders,
"These test_*.py files do not define any unittest-runnable tests:\n"
+ "\n".join(f"- {path}" for path in offenders),
)
if __name__ == "__main__":
unittest.main()
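The AST scan above can be demonstrated on an inline module (simplified: only top-level `TestCase` classes with `test_` methods are checked, without the alias tracking the real test does):

```python
import ast

SOURCE = """
import unittest

class TestDemo(unittest.TestCase):
    def test_something(self):
        self.assertTrue(True)
"""

tree = ast.parse(SOURCE)
found = any(
    isinstance(node, ast.ClassDef)
    and any(
        isinstance(item, ast.FunctionDef) and item.name.startswith("test_")
        for item in node.body
    )
    for node in tree.body
)
print(found)  # True
```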

View File

@@ -0,0 +1,25 @@
import unittest
from pathlib import Path
class TestTestFileNaming(unittest.TestCase):
def test_all_python_files_use_test_prefix(self):
tests_root = Path(__file__).resolve().parents[1]
invalid_files = []
for path in tests_root.rglob("*.py"):
if path.name == "__init__.py":
continue
if not path.name.startswith("test_"):
invalid_files.append(path.relative_to(tests_root).as_posix())
self.assertFalse(
invalid_files,
"The following Python files do not start with 'test_':\n"
+ "\n".join(f"- {path}" for path in invalid_files),
)
if __name__ == "__main__":
unittest.main()

View File

@@ -0,0 +1 @@

View File

@@ -0,0 +1,57 @@
import subprocess
import unittest
from pathlib import Path
import yaml
class TestConfigHygiene(unittest.TestCase):
def setUp(self) -> None:
self.repo_root = Path(__file__).resolve().parents[2]
self.sample_config_path = self.repo_root / "app" / "config.sample.yaml"
def _is_tracked(self, path: str) -> bool:
result = subprocess.run(
["git", "ls-files", "--error-unmatch", path],
cwd=self.repo_root,
check=False,
capture_output=True,
text=True,
)
return result.returncode == 0
def _find_values_for_key(self, data, key_name: str):
if isinstance(data, dict):
for key, value in data.items():
if key == key_name:
yield value
yield from self._find_values_for_key(value, key_name)
elif isinstance(data, list):
for item in data:
yield from self._find_values_for_key(item, key_name)
def test_runtime_only_files_are_ignored_and_untracked(self):
gitignore_lines = (
(self.repo_root / ".gitignore").read_text(encoding="utf-8").splitlines()
)
self.assertIn("app/config.yaml", gitignore_lines)
self.assertIn(".env", gitignore_lines)
self.assertFalse(self._is_tracked("app/config.yaml"))
self.assertFalse(self._is_tracked(".env"))
def test_sample_config_keeps_the_nasa_api_key_placeholder(self):
with self.sample_config_path.open("r", encoding="utf-8") as handle:
sample_config = yaml.safe_load(handle)
nasa_api_keys = list(self._find_values_for_key(sample_config, "nasa_api_key"))
self.assertEqual(
nasa_api_keys,
["YOUR_REAL_KEY_HERE"],
"config.sample.yaml should only contain the documented NASA API key "
"placeholder.",
)
if __name__ == "__main__":
unittest.main()
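The recursive key search used by `_find_values_for_key` can be sketched as a free function (config values below are placeholders):

```python
def find_values(data, key_name):
    # Recursively yield every value stored under key_name, at any nesting depth.
    if isinstance(data, dict):
        for key, value in data.items():
            if key == key_name:
                yield value
            yield from find_values(value, key_name)
    elif isinstance(data, list):
        for item in data:
            yield from find_values(item, key_name)

config = {
    "apod": {"nasa_api_key": "YOUR_REAL_KEY_HERE"},
    "other": [{"nasa_api_key": "ANOTHER_PLACEHOLDER"}],
}
print(list(find_values(config, "nasa_api_key")))
# ['YOUR_REAL_KEY_HERE', 'ANOTHER_PLACEHOLDER']
```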

View File

@@ -0,0 +1,43 @@
import unittest
from pathlib import Path
import yaml
ALLOWED_URL_PREFIXES = ("https://", "mailto:", "tel:")
URL_KEYS = {"url", "imprint", "imprint_url"}
class TestSampleConfigUrls(unittest.TestCase):
def setUp(self) -> None:
repo_root = Path(__file__).resolve().parents[2]
sample_config_path = repo_root / "app" / "config.sample.yaml"
with sample_config_path.open("r", encoding="utf-8") as handle:
self.sample_config = yaml.safe_load(handle)
def _iter_urls(self, data, path="root"):
if isinstance(data, dict):
for key, value in data.items():
next_path = f"{path}.{key}"
if key in URL_KEYS and isinstance(value, str):
yield next_path, value
yield from self._iter_urls(value, next_path)
elif isinstance(data, list):
for index, item in enumerate(data):
yield from self._iter_urls(item, f"{path}[{index}]")
def test_sample_config_urls_use_safe_schemes(self):
invalid_urls = [
f"{path} -> {url}"
for path, url in self._iter_urls(self.sample_config)
if not url.startswith(ALLOWED_URL_PREFIXES)
]
self.assertFalse(
invalid_urls,
"The sample config contains URLs with unsupported schemes:\n"
+ "\n".join(f"- {entry}" for entry in invalid_urls),
)
if __name__ == "__main__":
unittest.main()
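The scheme check works because `str.startswith` accepts a tuple of prefixes, so no loop over the allowed schemes is needed:

```python
ALLOWED_URL_PREFIXES = ("https://", "mailto:", "tel:")

urls = [
    "https://example.com/projects",
    "mailto:hello@example.com",
    "http://insecure.example.com",
]
# A single startswith call tests all allowed prefixes at once.
invalid = [url for url in urls if not url.startswith(ALLOWED_URL_PREFIXES)]
print(invalid)  # ['http://insecure.example.com']
```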

tests/unit/__init__.py Normal file
View File

@@ -0,0 +1 @@
"""Unit test package for Portfolio UI."""

View File

@@ -0,0 +1,72 @@
import unittest
from pathlib import Path
from tempfile import TemporaryDirectory
from unittest.mock import Mock, patch

import requests

from app.utils.cache_manager import CacheManager


class TestCacheManager(unittest.TestCase):
    def test_init_creates_cache_directory(self):
        with TemporaryDirectory() as temp_dir:
            cache_dir = Path(temp_dir) / "cache"
            self.assertFalse(cache_dir.exists())
            CacheManager(str(cache_dir))
            self.assertTrue(cache_dir.is_dir())

    def test_clear_cache_removes_files_but_keeps_subdirectories(self):
        with TemporaryDirectory() as temp_dir:
            cache_dir = Path(temp_dir) / "cache"
            nested_dir = cache_dir / "nested"
            nested_dir.mkdir(parents=True)
            file_path = cache_dir / "icon.png"
            file_path.write_bytes(b"icon")
            manager = CacheManager(str(cache_dir))
            manager.clear_cache()
            self.assertFalse(file_path.exists())
            self.assertTrue(nested_dir.is_dir())

    @patch("app.utils.cache_manager.requests.get")
    def test_cache_file_downloads_and_stores_response(self, mock_get):
        with TemporaryDirectory() as temp_dir:
            manager = CacheManager(str(Path(temp_dir) / "cache"))
            response = Mock()
            response.headers = {"Content-Type": "image/svg+xml; charset=utf-8"}
            response.iter_content.return_value = [b"<svg>ok</svg>"]
            response.raise_for_status.return_value = None
            mock_get.return_value = response
            cached_path = manager.cache_file("https://example.com/logo/download")
            self.assertIsNotNone(cached_path)
            self.assertTrue(cached_path.startswith("cache/logo_"))
            self.assertTrue(cached_path.endswith(".svg"))
            stored_file = Path(manager.cache_dir) / Path(cached_path).name
            self.assertEqual(stored_file.read_bytes(), b"<svg>ok</svg>")
            mock_get.assert_called_once_with(
                "https://example.com/logo/download",
                stream=True,
                timeout=5,
            )

    @patch("app.utils.cache_manager.requests.get")
    def test_cache_file_returns_none_when_request_fails(self, mock_get):
        with TemporaryDirectory() as temp_dir:
            manager = CacheManager(str(Path(temp_dir) / "cache"))
            mock_get.side_effect = requests.RequestException("network")
            cached_path = manager.cache_file("https://example.com/icon.png")
            self.assertIsNone(cached_path)


if __name__ == "__main__":
    unittest.main()


@@ -0,0 +1,49 @@
import json
import unittest
from pathlib import Path
from tempfile import TemporaryDirectory

from utils import check_hadolint_sarif


class TestCheckHadolintSarif(unittest.TestCase):
    def test_main_returns_zero_for_clean_sarif(self):
        sarif_payload = {
            "runs": [
                {
                    "results": [],
                }
            ]
        }
        with TemporaryDirectory() as temp_dir:
            sarif_path = Path(temp_dir) / "clean.sarif"
            sarif_path.write_text(json.dumps(sarif_payload), encoding="utf-8")
            exit_code = check_hadolint_sarif.main([str(sarif_path)])
            self.assertEqual(exit_code, 0)

    def test_main_returns_one_for_warnings_or_errors(self):
        sarif_payload = {
            "runs": [
                {
                    "results": [
                        {"level": "warning"},
                        {"level": "error"},
                    ],
                }
            ]
        }
        with TemporaryDirectory() as temp_dir:
            sarif_path = Path(temp_dir) / "warnings.sarif"
            sarif_path.write_text(json.dumps(sarif_payload), encoding="utf-8")
            exit_code = check_hadolint_sarif.main([str(sarif_path)])
            self.assertEqual(exit_code, 1)


if __name__ == "__main__":
    unittest.main()


@@ -0,0 +1,39 @@
import unittest

from app.utils.compute_card_classes import compute_card_classes


class TestComputeCardClasses(unittest.TestCase):
    def test_single_card_uses_full_width_classes(self):
        lg_classes, md_classes = compute_card_classes([{"title": "One"}])
        self.assertEqual(lg_classes, ["col-lg-12"])
        self.assertEqual(md_classes, ["col-md-12"])

    def test_two_cards_split_evenly(self):
        lg_classes, md_classes = compute_card_classes([{}, {}])
        self.assertEqual(lg_classes, ["col-lg-6", "col-lg-6"])
        self.assertEqual(md_classes, ["col-md-6", "col-md-6"])

    def test_three_cards_use_thirds(self):
        lg_classes, md_classes = compute_card_classes([{}, {}, {}])
        self.assertEqual(lg_classes, ["col-lg-4", "col-lg-4", "col-lg-4"])
        self.assertEqual(md_classes, ["col-md-6", "col-md-6", "col-md-12"])

    def test_five_cards_use_balanced_large_layout(self):
        lg_classes, md_classes = compute_card_classes([{}, {}, {}, {}, {}])
        self.assertEqual(
            lg_classes,
            ["col-lg-6", "col-lg-6", "col-lg-4", "col-lg-4", "col-lg-4"],
        )
        self.assertEqual(
            md_classes,
            ["col-md-6", "col-md-6", "col-md-6", "col-md-6", "col-md-12"],
        )


if __name__ == "__main__":
    unittest.main()

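The four cases above imply a layout rule: on medium screens cards pair up, with a trailing odd card spanning the row; on large screens cards sit three to a row, with a remainder of two rendered as a leading pair of half-width cards. A hypothetical sketch matching only the tested counts (the real `app/utils/compute_card_classes.py` may handle other counts differently; the remainder-of-one behaviour here is a guess):

```python
def compute_card_classes(cards):
    """Sketch of Bootstrap column classes for a card grid; assumed, not the real code."""
    n = len(cards)
    if n == 1:
        return ["col-lg-12"], ["col-md-12"]
    # md: pairs of half-width cards; a trailing odd card spans the full row.
    md = ["col-md-6"] * n
    if n % 2 == 1:
        md[-1] = "col-md-12"
    # lg: rows of three third-width cards; a remainder of two becomes a
    # leading row of two half-width cards (remainder-of-one is untested).
    leading_pair = 2 if n % 3 == 2 else 0
    lg = ["col-lg-6"] * leading_pair + ["col-lg-4"] * (n - leading_pair)
    return lg, md
```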

@@ -0,0 +1,74 @@
import unittest

from app.utils.configuration_resolver import ConfigurationResolver


class TestConfigurationResolver(unittest.TestCase):
    def test_resolve_links_replaces_mapping_link_with_target_object(self):
        config = {
            "profiles": [
                {"name": "Mastodon", "url": "https://example.com/@user"},
            ],
            "featured": {"link": "profiles.mastodon"},
        }
        resolver = ConfigurationResolver(config)
        resolver.resolve_links()
        self.assertEqual(
            resolver.get_config()["featured"],
            {"name": "Mastodon", "url": "https://example.com/@user"},
        )

    def test_resolve_links_expands_children_link_to_list_entries(self):
        config = {
            "accounts": {
                "children": [
                    {"name": "Matrix", "url": "https://matrix.example"},
                    {"name": "Signal", "url": "https://signal.example"},
                ]
            },
            "navigation": {
                "children": [
                    {"link": "accounts.children"},
                ]
            },
        }
        resolver = ConfigurationResolver(config)
        resolver.resolve_links()
        self.assertEqual(
            resolver.get_config()["navigation"]["children"],
            [
                {"name": "Matrix", "url": "https://matrix.example"},
                {"name": "Signal", "url": "https://signal.example"},
            ],
        )

    def test_resolve_links_rejects_non_list_children(self):
        config = {"navigation": {"children": {"name": "Invalid"}}}
        resolver = ConfigurationResolver(config)
        with self.assertRaises(ValueError):
            resolver.resolve_links()

    def test_find_entry_handles_case_and_space_insensitive_paths(self):
        config = {
            "Social Networks": {
                "children": [
                    {"name": "Friendica", "url": "https://friendica.example"},
                ]
            }
        }
        resolver = ConfigurationResolver(config)
        entry = resolver._find_entry(config, "socialnetworks.friendica", False)
        self.assertEqual(entry["url"], "https://friendica.example")


if __name__ == "__main__":
    unittest.main()


@@ -0,0 +1,45 @@
import unittest
from pathlib import Path
from tempfile import TemporaryDirectory
from unittest.mock import patch

from utils import export_runtime_requirements


class TestExportRuntimeRequirements(unittest.TestCase):
    def test_load_runtime_requirements_reads_project_dependencies(self):
        pyproject_content = """
[project]
dependencies = [
    "flask",
    "requests>=2",
]
""".lstrip()
        with TemporaryDirectory() as temp_dir:
            pyproject_path = Path(temp_dir) / "pyproject.toml"
            pyproject_path.write_text(pyproject_content, encoding="utf-8")
            requirements = export_runtime_requirements.load_runtime_requirements(
                pyproject_path
            )
            self.assertEqual(requirements, ["flask", "requests>=2"])

    def test_main_prints_requirements_from_selected_pyproject(self):
        pyproject_content = """
[project]
dependencies = [
    "pyyaml",
]
""".lstrip()
        with TemporaryDirectory() as temp_dir:
            pyproject_path = Path(temp_dir) / "pyproject.toml"
            pyproject_path.write_text(pyproject_content, encoding="utf-8")
            with patch("builtins.print") as mock_print:
                exit_code = export_runtime_requirements.main([str(pyproject_path)])
            self.assertEqual(exit_code, 0)
            mock_print.assert_called_once_with("pyyaml")

tests/unit/test_main.py (new file, 72 lines)

@@ -0,0 +1,72 @@
import subprocess
import unittest
from pathlib import Path
from tempfile import TemporaryDirectory
from unittest.mock import patch

import main as portfolio_main


class TestMainCli(unittest.TestCase):
    def test_load_targets_parses_help_comments(self):
        makefile_content = """
.PHONY: foo bar
foo:
\t# Run foo
\t@echo foo
bar:
\t@echo bar
""".lstrip()
        with TemporaryDirectory() as temp_dir:
            makefile_path = Path(temp_dir) / "Makefile"
            makefile_path.write_text(makefile_content, encoding="utf-8")
            targets = portfolio_main.load_targets(makefile_path)
            self.assertEqual(targets, [("foo", "Run foo"), ("bar", "")])

    @patch("main.subprocess.check_call")
    def test_run_command_executes_subprocess(self, mock_check_call):
        portfolio_main.run_command(["make", "lint"])
        mock_check_call.assert_called_once_with(["make", "lint"])

    @patch("main.sys.exit", side_effect=SystemExit(7))
    @patch(
        "main.subprocess.check_call",
        side_effect=subprocess.CalledProcessError(7, ["make", "lint"]),
    )
    def test_run_command_exits_with_subprocess_return_code(
        self,
        _mock_check_call,
        mock_sys_exit,
    ):
        with self.assertRaises(SystemExit) as context:
            portfolio_main.run_command(["make", "lint"])
        self.assertEqual(context.exception.code, 7)
        mock_sys_exit.assert_called_once_with(7)

    @patch("main.run_command")
    @patch("main.load_targets", return_value=[("lint", "Run lint suite")])
    def test_main_dispatches_selected_target(
        self, _mock_load_targets, mock_run_command
    ):
        with patch("sys.argv", ["main.py", "lint"]):
            portfolio_main.main()
        mock_run_command.assert_called_once_with(["make", "lint"], dry_run=False)

    @patch("main.run_command")
    @patch("main.load_targets", return_value=[("lint", "Run lint suite")])
    def test_main_passes_dry_run_flag(self, _mock_load_targets, mock_run_command):
        with patch("sys.argv", ["main.py", "--dry-run", "lint"]):
            portfolio_main.main()
        mock_run_command.assert_called_once_with(["make", "lint"], dry_run=True)


if __name__ == "__main__":
    unittest.main()


@@ -0,0 +1,28 @@
#!/usr/bin/env python3
"""Fail when a hadolint SARIF report contains warnings or errors."""
from __future__ import annotations

import json
import sys
from pathlib import Path


def main(argv: list[str] | None = None) -> int:
    args = argv if argv is not None else sys.argv[1:]
    sarif_path = Path(args[0] if args else "hadolint-results.sarif")
    with sarif_path.open("r", encoding="utf-8") as handle:
        sarif = json.load(handle)
    results = sarif.get("runs", [{}])[0].get("results", [])
    levels = [result.get("level", "") for result in results]
    warnings = sum(1 for level in levels if level == "warning")
    errors = sum(1 for level in levels if level == "error")
    print(f"SARIF results: total={len(results)} warnings={warnings} errors={errors}")
    return 1 if warnings + errors > 0 else 0


if __name__ == "__main__":
    raise SystemExit(main())


@@ -0,0 +1,30 @@
#!/usr/bin/env python3
"""Print runtime dependencies from pyproject.toml, one per line."""
import sys
import tomllib
from pathlib import Path

DEFAULT_PYPROJECT_PATH = Path(__file__).resolve().parents[1] / "pyproject.toml"


def load_runtime_requirements(
    pyproject_path: Path = DEFAULT_PYPROJECT_PATH,
) -> list[str]:
    with pyproject_path.open("rb") as handle:
        pyproject = tomllib.load(handle)
    return list(pyproject["project"]["dependencies"])


def main(argv: list[str] | None = None) -> int:
    args = argv if argv is not None else sys.argv[1:]
    pyproject_path = Path(args[0]) if args else DEFAULT_PYPROJECT_PATH
    for requirement in load_runtime_requirements(pyproject_path):
        print(requirement)
    return 0


if __name__ == "__main__":
    raise SystemExit(main())