Graylog Log Aggregation Cluster
Prerequisites
1. Update all servers:
sudo apt update && sudo apt upgrade -y
2. Ensure OpenJDK (Java 8 or higher) is installed on Server 2 for Elasticsearch:
sudo apt install openjdk-11-jdk -y
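To confirm the runtime is available, you can check the installed version afterwards (the output should show OpenJDK 11 or newer):
java -version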
Step 1: Set Up MongoDB on Server 3
1. Install MongoDB (MongoDB 4.4 publishes no packages for Debian 12 "bookworm", so the Debian 10 "buster" repository is used here):
wget -qO - https://www.mongodb.org/static/pgp/server-4.4.asc | sudo apt-key add -
echo "deb [ arch=amd64 ] https://repo.mongodb.org/apt/debian buster/mongodb-org/4.4 main" | sudo tee /etc/apt/sources.list.d/mongodb-org-4.4.list
sudo apt update
sudo apt install -y mongodb-org
2. Start MongoDB and Enable It to Start on Boot:
sudo systemctl start mongod
sudo systemctl enable mongod
3. Create Graylog Database User: Open the MongoDB shell:
mongo
Run the following commands to create a user for Graylog:
use graylog
db.createUser({
  user: "graylog",
  pwd: "password", // Replace with a secure password
  roles: [{ role: "readWrite", db: "graylog" }]
})
exit
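Out of the box the mongodb-org package binds mongod to 127.0.0.1 only and does not enforce authentication, so Graylog on Server 1 cannot reach this database yet. A minimal sketch of the relevant /etc/mongod.conf changes, assuming the default package paths and that <Server3_IP> is Server 3's own address:
# /etc/mongod.conf (excerpt)
net:
  port: 27017
  bindIp: 127.0.0.1,<Server3_IP>   # also listen on the interface Graylog will connect to
security:
  authorization: enabled           # require the graylog user credentials created above
Restart MongoDB and verify that the new credentials work:
sudo systemctl restart mongod
mongo --host <Server3_IP> -u graylog -p 'password' --authenticationDatabase graylog graylog --eval 'db.stats()'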
Step 2: Install and Configure Elasticsearch on Server 2
1. Download and Install Elasticsearch:
wget https://artifacts.elastic.co/downloads/elasticsearch/elasticsearch-7.10.2-amd64.deb
sudo dpkg -i elasticsearch-7.10.2-amd64.deb
2. Configure Elasticsearch:
Edit the Elasticsearch configuration file:
sudo nano /etc/elasticsearch/elasticsearch.yml
Add the following settings (discovery.type: single-node lets a standalone node bootstrap itself without any cluster discovery configuration):
cluster.name: graylog-cluster
node.name: graylog-node
network.host: 0.0.0.0
discovery.type: single-node
xpack.security.enabled: false
3. Start and Enable Elasticsearch:
sudo systemctl start elasticsearch
sudo systemctl enable elasticsearch
4. Verify Elasticsearch: Open a browser or use curl:
curl -X GET "http://<Server2_IP>:9200"
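A successful response is a small JSON document containing the cluster name and version. You can also query cluster health; on a single node the status should report green or yellow:
curl -X GET "http://<Server2_IP>:9200/_cluster/health?pretty"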
Step 3: Install and Configure Graylog on Server 1
1. Install Required Packages:
sudo apt install apt-transport-https openjdk-11-jre-headless uuid-runtime pwgen -y
2. Download and Install Graylog:
wget https://packages.graylog2.org/repo/packages/graylog-4.2-repository_latest.deb
sudo dpkg -i graylog-4.2-repository_latest.deb
sudo apt update && sudo apt install graylog-server -y
3. Generate and Configure Secrets:
- Generate a secret for password_secret:
pwgen -N 1 -s 96
- Hash the admin password (use only the 64-character hex digest from the output, not the trailing "-"):
echo -n "yourpassword" | sha256sum
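As a convenience, both values can be captured in shell variables before editing the configuration file; a small sketch (the password below is only an example):
GRAYLOG_SECRET=$(pwgen -N 1 -s 96)
GRAYLOG_ROOT_SHA2=$(echo -n 'yourpassword' | sha256sum | cut -d ' ' -f1)
echo "password_secret = $GRAYLOG_SECRET"
echo "root_password_sha2 = $GRAYLOG_ROOT_SHA2"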
4. Edit the Graylog Configuration File:
sudo nano /etc/graylog/server/server.conf
Update the following settings:
password_secret = <generated_secret>
root_password_sha2 = <hashed_password>
root_email = "admin@example.com"
root_timezone = UTC
http_bind_address = 0.0.0.0:9000
elasticsearch_hosts = http://<Server2_IP>:9200
mongodb_uri = mongodb://graylog:<password>@<Server3_IP>:27017/graylog
Replace <password> with the password you set for the graylog MongoDB user in Step 1.
5. Start and Enable Graylog:
sudo systemctl daemon-reload
sudo systemctl enable graylog-server
sudo systemctl start graylog-server
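Startup takes a minute or two while Graylog connects to MongoDB and Elasticsearch. Watching the server log shows when it is ready; look for a line similar to "Graylog server up and running.":
sudo tail -f /var/log/graylog-server/server.log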
6. Access Graylog: Open a web browser, go to http://<Server1_IP>:9000 and log in with the username admin and the plain-text password you hashed in the previous step (not the SHA-256 hash itself).
Step 4: Install Filebeat as a Log Collector
1. Install Filebeat on any server from which you want to collect logs:
curl -L -O https://artifacts.elastic.co/downloads/beats/filebeat/filebeat-7.10.2-amd64.deb
sudo dpkg -i filebeat-7.10.2-amd64.deb
2. Configure Filebeat to Forward Logs to Graylog:
Edit the Filebeat configuration file:
sudo nano /etc/filebeat/filebeat.yml
Update the configuration (and comment out the default output.elasticsearch section, since Filebeat allows only one output to be enabled at a time):
filebeat.inputs:
- type: log
  enabled: true
  paths:
    - /var/log/syslog
    - /var/log/auth.log
output.logstash:
  hosts: ["<Server1_IP>:5044"]
3. Start and Enable Filebeat:
sudo systemctl enable filebeat
sudo systemctl start filebeat
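Filebeat can validate its configuration before starting; the connection to Graylog itself can only be tested once the Beats input from Step 5 is listening:
sudo filebeat test config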
Step 5: Configure Graylog to Receive Logs from Filebeat
- Set Up an Input in Graylog:
  - Go to System > Inputs in the Graylog web interface.
  - Choose Beats and click Launch new input.
  - Set it to listen on port 5044 and give it a descriptive name.
- Check Logs:
  - After configuring the input, you should start seeing logs from the Filebeat agents on the Search page in Graylog.
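To confirm the input is listening and reachable, check the port on Server 1 and run Filebeat's output test on a collecting host (it should report the connection to <Server1_IP>:5044 as OK):
sudo ss -tlnp | grep 5044
sudo filebeat test output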
Step 6: Create Graylog Streams and Alerts
- Set Up Streams:
  - Go to Streams > Create Stream in Graylog.
  - Name your stream and set rules to filter specific log messages.
  - Assign the stream to a particular log source, such as all logs arriving from Filebeat.
- Set Up Alerts:
  - Go to Alerts > Manage Alerts in Graylog.
  - Create a new alert for your stream, for example one that fires when logs matching specific criteria exceed a threshold.
- Create Dashboards:
  - Use the Dashboard feature to visualize data. Add widgets to create a real-time dashboard for key logs and metrics.
Conclusion
You now have a Graylog Aggregation Cluster on Debian 12 that collects logs from multiple servers, stores them in Elasticsearch, and uses Graylog’s search and visualization tools for monitoring and analysis. This setup is scalable; simply add more servers running Filebeat for log collection, or add multiple Elasticsearch nodes for greater storage capacity.