ELK Stack For DevOps

The ELK Stack consists of Elasticsearch, Logstash, and Kibana, which together enable real-time searching, analysis, and visualization of log data, all essential for DevOps. This document provides a step-by-step guide to installing and configuring each component on a Linux system, covering prerequisites, installation commands, and configuration settings. It also covers optional security measures and advanced setups for more reliable log management and data analysis.


ELK Stack Overview for DevOps

ELK Stack is a collection of three open-source tools: Elasticsearch, Logstash, and Kibana.
Together, they provide a powerful platform for searching, analyzing, and visualizing log data
in real-time, which is essential for monitoring and troubleshooting in DevOps.

Components of ELK Stack:

1. Elasticsearch:
o A distributed search and analytics engine designed for scalability and speed.
o Stores, searches, and analyzes big volumes of data quickly.
o Uses a RESTful API and can scale horizontally by adding more nodes.
2. Logstash:
o A server-side data processing pipeline.
o Ingests data from multiple sources simultaneously, transforms it, and sends it
to a “stash” like Elasticsearch.
o Supports a wide range of inputs (e.g., logs, metrics) and outputs (e.g.,
Elasticsearch, files, email).
3. Kibana:
o A visualization layer that works on top of Elasticsearch.
o Provides real-time visualization of data indexed in Elasticsearch.
o Features include dashboards, charts, and maps, which help in interpreting the
log data.

Installation of ELK Stack

Here’s how to install and configure each component on a Linux system.

Prerequisites:

A Linux-based server (Ubuntu or CentOS recommended).
Java 8 or later is required for Elasticsearch and Logstash.
Minimum 4 GB RAM; 8 GB or more is recommended for a production environment.

Step 1: Install Java (OpenJDK 8)


# Update your system
sudo apt update

# Install OpenJDK 8
sudo apt install openjdk-8-jdk -y

# Verify the Java installation
java -version

Step 2: Install Elasticsearch


# Import the Elasticsearch PGP key
wget -qO - https://artifacts.elastic.co/GPG-KEY-elasticsearch | sudo apt-key add -

# Add the Elasticsearch repository to your sources list
sudo sh -c 'echo "deb https://artifacts.elastic.co/packages/7.x/apt stable main" > /etc/apt/sources.list.d/elastic-7.x.list'

# Update the package index and install Elasticsearch
sudo apt update
sudo apt install elasticsearch -y

# Start and enable the Elasticsearch service
sudo systemctl start elasticsearch
sudo systemctl enable elasticsearch

# Verify that Elasticsearch is running
curl -X GET "localhost:9200/"

Step 3: Install Logstash


bash
Copy code
# Install Logstash
sudo apt install logstash -y

# Start and enable Logstash service


sudo systemctl start logstash
sudo systemctl enable logstash

Step 4: Install Kibana


# Install Kibana
sudo apt install kibana -y

# Start and enable the Kibana service
sudo systemctl start kibana
sudo systemctl enable kibana

# Configure Kibana by editing the configuration file (optional)
sudo nano /etc/kibana/kibana.yml
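For reference, a minimal kibana.yml for a single-node setup might look like the following sketch. The values are examples only; set server.host to "0.0.0.0" only if you intend to expose Kibana beyond localhost.

```yaml
# /etc/kibana/kibana.yml (example values)
server.port: 5601
server.host: "0.0.0.0"
elasticsearch.hosts: ["http://localhost:9200"]
```

After editing, restart Kibana with sudo systemctl restart kibana so the changes take effect.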

Step 5: Access Kibana Dashboard

Open your browser and go to http://<your-server-ip>:5601.
You’ll see the Kibana dashboard, where you can start configuring your data visualization.

Step 6: Configure Logstash Pipeline

Create a configuration file for Logstash to define input, filter, and output settings.

# Create a new Logstash configuration file
sudo nano /etc/logstash/conf.d/logstash.conf
Sample Logstash Configuration:

input {
  beats {
    port => 5044
  }
}

filter {
  grok {
    match => { "message" => "%{COMBINEDAPACHELOG}" }
  }
  date {
    match => [ "timestamp", "dd/MMM/yyyy:HH:mm:ss Z" ]
  }
}

output {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "logs-%{+YYYY.MM.dd}"
  }
  stdout { codec => rubydebug }
}

Input: Defines where Logstash collects data from (e.g., Beats, files).
Filter: Where you parse and transform the incoming data.
Output: Defines where Logstash sends the data, typically Elasticsearch.
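As a concrete illustration, here is a sample Apache access-log line and the fields the %{COMBINEDAPACHELOG} grok pattern would extract from it (field names follow the standard pattern; output abridged):

```plaintext
# Input line:
127.0.0.1 - frank [10/Oct/2000:13:55:36 -0700] "GET /apache_pb.gif HTTP/1.0" 200 2326 "http://example.com/start.html" "Mozilla/4.08"

# Extracted fields (abridged):
clientip  => "127.0.0.1"
auth      => "frank"
timestamp => "10/Oct/2000:13:55:36 -0700"
verb      => "GET"
request   => "/apache_pb.gif"
response  => "200"
bytes     => "2326"
```

The date filter then parses the extracted timestamp field into the event's @timestamp.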

Step 7: Start and Monitor Logstash


# Restart Logstash so it loads the new configuration file
sudo systemctl restart logstash

# Monitor Logstash logs
sudo journalctl -u logstash

Step 8: Test the ELK Stack

Send some logs to Logstash, and check whether they appear in Elasticsearch by using Kibana.
You can do this by manually creating a log file or by using a shipper such as Filebeat to send logs to Logstash.
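For a quick end-to-end check, you can generate a sample Apache-style log line yourself and point Filebeat (or a Logstash file input) at it. The path /tmp/elk-test.log is just an example:

```shell
# Write a sample Apache combined-format log line that the grok filter above can parse
LOGFILE=/tmp/elk-test.log
echo '127.0.0.1 - frank [10/Oct/2000:13:55:36 -0700] "GET /apache_pb.gif HTTP/1.0" 200 2326 "http://example.com/start.html" "Mozilla/4.08"' >> "$LOGFILE"

# Confirm the line was written
tail -n 1 "$LOGFILE"
```

Once the line is shipped through Logstash, search for it in Kibana under the logs-* index pattern.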

Step 9: Security (Optional)

Securing Elasticsearch: Configure user authentication, HTTPS, and other security features.
Securing Kibana: Enable basic authentication and restrict access using a reverse proxy or X-Pack.

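As a sketch of the Elasticsearch side, basic authentication in the 7.x series can be turned on with a single elasticsearch.yml setting (X-Pack basic security):

```yaml
# /etc/elasticsearch/elasticsearch.yml (security sketch)
xpack.security.enabled: true
```

After restarting Elasticsearch, built-in user passwords can be set with the bundled tool, e.g. sudo /usr/share/elasticsearch/bin/elasticsearch-setup-passwords interactive; Kibana then needs matching credentials in kibana.yml.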
Step 10: Advanced Setup (Optional)

 Cluster Setup: For high availability, set up multiple nodes for Elasticsearch and
configure them in a cluster.
 Data Backup and Restore: Use tools like Snapshot and Restore for backup purposes.
 Monitoring and Alerting: Set up X-Pack for advanced monitoring and alerting.
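A minimal three-node cluster sketch in elasticsearch.yml might look like this; node names and IP addresses are placeholders, and each node gets its own node.name:

```yaml
# /etc/elasticsearch/elasticsearch.yml (example for node es-node-1)
cluster.name: elk-cluster
node.name: es-node-1
network.host: 0.0.0.0
discovery.seed_hosts: ["10.0.0.1", "10.0.0.2", "10.0.0.3"]
cluster.initial_master_nodes: ["es-node-1", "es-node-2", "es-node-3"]
```

The cluster.initial_master_nodes list is only used when bootstrapping a brand-new cluster; existing clusters ignore it.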

Conclusion:

The ELK stack is essential for a DevOps environment, providing a robust solution for log
management and data analysis. By centralizing logs, ELK enables teams to diagnose issues,
monitor application performance, and ensure system reliability efficiently.

ELK Stack Installation Script for Amazon Linux


#!/bin/bash

# Update system packages
sudo yum update -y

# Install Java (OpenJDK 11)
sudo amazon-linux-extras install java-openjdk11 -y

# Add the Elasticsearch/Kibana repository to yum
sudo rpm --import https://artifacts.elastic.co/GPG-KEY-elasticsearch

sudo tee /etc/yum.repos.d/elasticsearch.repo <<EOF
[elasticsearch-7.x]
name=Elasticsearch repository for 7.x packages
baseurl=https://artifacts.elastic.co/packages/7.x/yum
gpgcheck=1
gpgkey=https://artifacts.elastic.co/GPG-KEY-elasticsearch
enabled=1
autorefresh=1
type=rpm-md
EOF

# Install Elasticsearch, Logstash, and Kibana
sudo yum install elasticsearch logstash kibana -y

# Start and enable the Elasticsearch service
sudo systemctl start elasticsearch
sudo systemctl enable elasticsearch

# Start and enable the Kibana service
sudo systemctl start kibana
sudo systemctl enable kibana

# Start and enable the Logstash service
sudo systemctl start logstash
sudo systemctl enable logstash

# Configure the firewall to allow Kibana access (port 5601)
sudo firewall-cmd --zone=public --add-port=5601/tcp --permanent
sudo firewall-cmd --reload

# Display Elasticsearch status
curl -X GET "localhost:9200/"

echo "ELK stack installation completed successfully!"
echo "Access Kibana at http://<your-server-ip>:5601"

How to Run the Script:

1. Save the Script:
o Save the above script to a file on your Amazon Linux instance, for example, install-elk.sh.
2. Make the Script Executable:
chmod +x install-elk.sh
3. Run the Script:
sudo ./install-elk.sh
