Huge role refactoring/cleanup. Follow-up commits will probably be needed, since some bugs are bound to surface. Still important for the long run, and also for automatic docs/help/slideshow generation.

This commit is contained in:
2025-07-08 23:43:13 +02:00
parent 6b87a049d4
commit 563d5fd528
1242 changed files with 2301 additions and 1355 deletions

View File

@@ -0,0 +1,4 @@
## Restart all services
```bash
docker restart elk_logstash_1 && docker restart elk_elasticsearch_1 && docker restart elk_kibana_1
```

View File

@@ -0,0 +1,34 @@
# ELK Stack
## Warning
For security reasons, this role is currently not recommended. If you prefer to keep your logs safe without relying on external servers, consider using an alternative tool.
## Overview
This Ansible role deploys and configures an [ELK Stack](https://en.wikipedia.org/wiki/Elastic_stack) (comprising [Elasticsearch](https://en.wikipedia.org/wiki/Elasticsearch), [Logstash](https://en.wikipedia.org/wiki/Elastic_stack), and [Kibana](https://en.wikipedia.org/wiki/Kibana)) using [Docker Compose](https://en.wikipedia.org/wiki/Docker_Compose). The ELK Stack is widely used for centralized log collection, analysis, and visualization of log and machine-generated data.
## Description
This role performs the following tasks:
- **Setup & Configuration:** Installs and configures the three main components—Elasticsearch, Logstash, and Kibana.
- **Template-Driven Adjustments:** Adapts configuration files through templates and variables.
- **Docker Integration:** Deploys the stack using Docker Compose, integrating it into your containerized environment.
- **Service Management:** Handles service restarts and updates through Ansible handlers.
## Purpose
The ELK Stack is primarily used for:
- **Centralized Log Management:** Consolidating logs from various systems into one location.
- **Real-Time Troubleshooting:** Quickly diagnosing issues through live log analysis.
- **Performance Monitoring:** Tracking system performance and identifying anomalies.
- **Security Analysis:** Detecting and investigating security incidents based on log data.
## Features
- **Centralized Log Management:** Collects and aggregates logs from disparate systems.
- **Real-Time Analysis:** Leverages Elasticsearch for fast data search and analytics.
- **Flexible Data Pipelines:** Processes and transforms log data with Logstash.
- **Interactive Visualization:** Creates dashboards and visual reports with Kibana.
- **Scalable & Extensible:** Easily integrates additional tools and custom configurations via templates.
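The tasks above are triggered by including the role from a playbook. A minimal sketch of such a playbook follows; the `elk` host group, the role name, and the inline password are assumptions for illustration and not defined by this role itself:

```yaml
# Hypothetical playbook sketch; host group, role name, and the
# elastic_search_password value are placeholders for your inventory.
- hosts: elk
  become: true
  vars:
    elastic_search_password: "changeme"  # replace with a vaulted secret
  roles:
    - elk
```

In practice the password should come from Ansible Vault rather than plain vars, since it is rendered into the Kibana and Logstash config templates.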
## Credits 📝
Developed and maintained by **Kevin Veen-Birkenbach**.
For more information, visit [www.veen.world](https://www.veen.world).
Part of the [CyMaIS Project](https://github.com/kevinveenbirkenbach/cymais).
License: [CyMaIS NonCommercial License (CNCL)](https://s.veen.world/cncl)

View File

@@ -0,0 +1,2 @@
# Todo
- implement

View File

@@ -0,0 +1,2 @@
---
docker_elk_compose_path: "/srv/github.com/kevinveenbirkenbach/web-app-elk/"

View File

@@ -0,0 +1,2 @@
# https://www.elastic.co/guide/en/elasticsearch/reference/current/vm-max-map-count.html
vm.max_map_count=262144
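Elasticsearch refuses to start in production mode when this limit is too low. The effective value can be verified at runtime by reading procfs; a minimal sketch (the function names are illustrative, not part of this role):

```python
def read_max_map_count(path: str = "/proc/sys/vm/max_map_count") -> int:
    """Return the kernel's current vm.max_map_count value."""
    with open(path) as f:
        return int(f.read().strip())

# Minimum Elasticsearch expects, per the linked documentation.
ES_REQUIRED = 262144

def meets_elasticsearch_requirement() -> bool:
    """True if the running kernel already satisfies the ES bootstrap check."""
    return read_max_map_count() >= ES_REQUIRED
```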

View File

@@ -0,0 +1,7 @@
---
- name: recreate web-app-elk
  command:
    cmd: docker-compose up -d --force-recreate
    chdir: "{{ docker_elk_compose_path }}"
  environment:
    COMPOSE_HTTP_TIMEOUT: 600

View File

@@ -0,0 +1,23 @@
---
galaxy_info:
  author: "Kevin Veen-Birkenbach"
  description: "Deploys and configures an ELK Stack (Elasticsearch, Logstash, and Kibana) with Docker Compose, providing centralized log collection, analysis, and visualization."
  license: "CyMaIS NonCommercial License (CNCL)"
  license_url: "https://s.veen.world/cncl"
  company: |
    Kevin Veen-Birkenbach
    Consulting & Coaching Solutions
    https://www.veen.world
  min_ansible_version: "2.9"
  platforms:
    - name: Docker
      versions:
        - "latest"
  galaxy_tags:
    - elk
    - docker
    - log-management
    - administration
  repository: "https://s.veen.world/cymais"
  issue_tracker_url: "https://s.veen.world/cymaisissues"
  documentation: "https://s.veen.world/cymais"

View File

@@ -0,0 +1,53 @@
---
- name: "include role webserver-proxy-domain for {{ application_id }}"
  include_role:
    name: webserver-proxy-domain
  vars:
    domain: "{{ domains | get_domain(application_id) }}"
    http_port: "{{ ports.localhost.http[application_id] }}"

- name: create elasticsearch-sysctl.conf
  copy:
    src: "elasticsearch-sysctl.conf"
    dest: /etc/sysctl.d/elasticsearch-sysctl.conf
    owner: root
    group: root

- name: set vm.max_map_count=262144
  command:
    cmd: sysctl -w vm.max_map_count=262144

- name: "create {{ docker_elk_compose_path }}"
  file:
    path: "{{ docker_elk_compose_path }}"
    state: directory
    mode: '0755'

- name: git pull web-app-elk
  git:
    repo: "https://github.com/kevinveenbirkenbach/web-app-elk.git"
    dest: "{{ docker_elk_compose_path }}"
    update: yes
  notify: recreate web-app-elk
  ignore_errors: true

- name: copy docker-compose.yml
  template:
    src: docker-compose.yml.j2
    dest: "{{ docker_elk_compose_path }}docker-compose.yml"
  notify: recreate web-app-elk

- name: copy elasticsearch.yml
  template:
    src: elasticsearch.yml.j2
    dest: "{{ docker_elk_compose_path }}elasticsearch/config/elasticsearch.yml"
  notify: recreate web-app-elk

- name: copy kibana.yml
  template:
    src: kibana.yml.j2
    dest: "{{ docker_elk_compose_path }}kibana/config/kibana.yml"
  notify: recreate web-app-elk

- name: copy logstash.yml
  template:
    src: logstash.yml.j2
    dest: "{{ docker_elk_compose_path }}logstash/config/logstash.yml"
  notify: recreate web-app-elk

- name: copy logstash.conf
  template:
    src: logstash.conf.j2
    dest: "{{ docker_elk_compose_path }}logstash/pipeline/logstash.conf"
  notify: recreate web-app-elk

View File

@@ -0,0 +1,67 @@
{% include 'roles/docker-compose/templates/base.yml.j2' %}
  elasticsearch:
    build:
      context: elasticsearch/
      args:
        ELK_VERSION: $ELK_VERSION
    volumes:
      - type: bind
        source: ./elasticsearch/config/elasticsearch.yml
        target: /usr/share/elasticsearch/config/elasticsearch.yml
        read_only: true
      - type: volume
        source: elasticsearch
        target: /usr/share/elasticsearch/data
    ports:
      - "9200:9200"
      - "9300:9300"
    environment:
      ES_JAVA_OPTS: "-Xmx256m -Xms256m"
      ELASTIC_PASSWORD: "{{ elastic_search_password }}"
      # Use single-node discovery to disable production mode and avoid bootstrap checks.
      # see: https://www.elastic.co/guide/en/elasticsearch/reference/current/bootstrap-checks.html
      discovery.type: single-node
  logstash:
    build:
      context: logstash/
      args:
        ELK_VERSION: $ELK_VERSION
    volumes:
      - type: bind
        source: ./logstash/config/logstash.yml
        target: /usr/share/logstash/config/logstash.yml
        read_only: true
      - type: bind
        source: ./logstash/pipeline
        target: /usr/share/logstash/pipeline
        read_only: true
    ports:
      - "5044:5044"
      - "5000:5000/tcp"
      - "5000:5000/udp"
      - "9600:9600"
    environment:
      LS_JAVA_OPTS: "-Xmx256m -Xms256m"
    depends_on:
      - elasticsearch
  kibana:
    build:
      context: kibana/
      args:
        ELK_VERSION: $ELK_VERSION
    volumes:
      - type: bind
        source: ./kibana/config/kibana.yml
        target: /usr/share/kibana/config/kibana.yml
        read_only: true
    ports:
      - "127.0.0.1:{{ ports.localhost.http[application_id] }}:5601"
    depends_on:
      - elasticsearch
{% include 'roles/docker-compose/templates/volumes.yml.j2' %}
  elasticsearch:
{% include 'roles/docker-compose/templates/networks.yml.j2' %}

View File

@@ -0,0 +1,13 @@
---
## Default Elasticsearch configuration from Elasticsearch base image.
## https://github.com/elastic/elasticsearch/blob/master/distribution/docker/src/docker/config/elasticsearch.yml
#
cluster.name: "web-app-cluster"
network.host: 0.0.0.0
## X-Pack settings
## see https://www.elastic.co/guide/en/elasticsearch/reference/current/setup-xpack.html
#
xpack.license.self_generated.type: basic
xpack.security.enabled: true
xpack.monitoring.collection.enabled: true

View File

@@ -0,0 +1,13 @@
---
## Default Kibana configuration from Kibana base image.
## https://github.com/elastic/kibana/blob/master/src/dev/build/tasks/os_packages/docker_generator/templates/kibana_yml.template.ts
#
server.name: kibana
server.host: 0.0.0.0
elasticsearch.hosts: [ "http://elasticsearch:9200" ]
monitoring.ui.container.elasticsearch.enabled: true
## X-Pack security credentials
#
elasticsearch.username: elastic
elasticsearch.password: "{{ elastic_search_password }}"

View File

@@ -0,0 +1,20 @@
input {
  beats {
    port => 5044
  }
  tcp {
    port => 5000
  }
}

## Add your filters / logstash plugins configuration here

output {
  elasticsearch {
    hosts => "elasticsearch:9200"
    user => "elastic"
    password => "{{ elastic_search_password }}"
    ecs_compatibility => disabled
  }
}
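The `tcp` input on port 5000 accepts line-delimited events from arbitrary clients. A small Python sketch of a client for it follows; the host, port, and field names are assumptions matching the compose file above, not part of this role:

```python
import json
import socket
from datetime import datetime, timezone

def make_event(message: str, **fields) -> bytes:
    """Build one newline-terminated JSON event for the Logstash tcp input."""
    event = {
        "@timestamp": datetime.now(timezone.utc).isoformat(),
        "message": message,
        **fields,
    }
    return (json.dumps(event) + "\n").encode("utf-8")

def send_event(payload: bytes, host: str = "localhost", port: int = 5000) -> None:
    """Ship a prepared event to the tcp input (port 5000 per the pipeline above)."""
    with socket.create_connection((host, port), timeout=5) as sock:
        sock.sendall(payload)

# Example (requires the stack to be running):
# send_event(make_event("deployment finished", service="web-app-elk"))
```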

View File

@@ -0,0 +1,12 @@
---
## Default Logstash configuration from Logstash base image.
## https://github.com/elastic/logstash/blob/master/docker/data/logstash/config/logstash-full.yml
#
http.host: "0.0.0.0"
xpack.monitoring.elasticsearch.hosts: [ "http://elasticsearch:9200" ]
## X-Pack security credentials
#
xpack.monitoring.enabled: true
xpack.monitoring.elasticsearch.username: elastic
xpack.monitoring.elasticsearch.password: "{{ elastic_search_password }}"

View File

@@ -0,0 +1 @@

View File

@@ -0,0 +1 @@
application_id: elk