In this article, we will learn how to monitor Python application logs with the Elastic Stack. Logging is crucial for monitoring and debugging applications: logs help you understand how your Python application is behaving. The Elastic Stack (Elasticsearch, Kibana, Filebeat) makes it easy to collect, store, and visualize logs in real time. In this guide, you'll learn how to ship Python logs to Elasticsearch and visualize them in Kibana.
Prerequisites
- AWS Account with Ubuntu 24.04 LTS EC2 Instance.
- At least 2 CPU cores and 4 GB of RAM for smooth performance.
- Python and pip installed.
Step #1: Setting Up the Ubuntu EC2 Instance
Update the Package List to ensure you have the latest versions.
sudo apt update

Install Python, pip, and virtual environments.
sudo apt install python3 python3-pip -y

sudo apt install python3-venv -y

Step #2: Set Up a Python Application with Logging
Now that Python is installed, let’s create our logging application.
First create a Virtual Environment.
python3 -m venv ~/python-app/venv

Activate the Virtual Environment.
source ~/python-app/venv/bin/activate

Install the Required Python Package, ecs-logging, which enables structured logging in the Elastic Common Schema (ECS) format.
pip install ecs-logging

Navigate to the project directory.
cd ~/python-app

Create the Python Application File.
nano elvis.py

Add the following script into elvis.py.
#!/usr/bin/env python3
import logging
import time
from random import randint

import ecs_logging

# Set up a logger that writes ECS-formatted JSON lines to elvis.json
logger = logging.getLogger("app")
logger.setLevel(logging.DEBUG)
handler = logging.FileHandler('elvis.json')
handler.setFormatter(ecs_logging.StdlibFormatter())  # structured ECS logging
logger.addHandler(handler)

# Sample log messages
messages = [
    "Elvis has left the building.",
    "Elvis has left the oven on.",
    "Elvis has two left feet.",
    "Elvis was left out in the cold.",
    "Elvis was left holding the baby.",
    "Elvis left the cake out in the rain.",
    "Elvis came out of left field.",
    "Elvis exited stage left.",
    "Elvis took a left turn.",
    "Elvis left no stone unturned.",
    "Elvis picked up where he left off.",
    "Elvis's train has left the station."
]

# Generate log entries at varying levels and intervals
while True:
    random1 = randint(0, 15)
    random2 = randint(1, 10)
    if random1 > 11:
        random1 = 0  # clamp out-of-range values to a valid message index
    # The extra dict stores the message under the ECS field http.request.body.content
    if random1 <= 4:
        logger.info(messages[random1], extra={"http.request.body.content": messages[random1]})
    elif 5 <= random1 <= 8:
        logger.warning(messages[random1], extra={"http.request.body.content": messages[random1]})
    elif 9 <= random1 <= 10:
        logger.error(messages[random1], extra={"http.request.body.content": messages[random1]})
    else:
        logger.critical(messages[random1], extra={"http.request.body.content": messages[random1]})
    time.sleep(random2)  # wait 1-10 seconds before the next entry

Run the Python Script.
python3 elvis.py

Let it run for a few seconds, then stop it with CTRL + C. This script continuously generates log messages and writes them to elvis.json.
Verify the Generated Logs.
cat elvis.json
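
Because each line of elvis.json is a self-contained ECS JSON document, the file is easy to process programmatically. As a quick sanity check, here is a minimal Python sketch that counts how many entries were written at each log level (it assumes the ecs-logging default field layout, where the level appears as a top-level "log.level" key):
import json
from collections import Counter

# Tally log entries per level; each line in elvis.json is one ECS JSON document.
counts = Counter()
with open("elvis.json") as f:
    for line in f:
        event = json.loads(line)
        # ecs-logging writes the level as a top-level "log.level" key;
        # fall back to a nested "log": {"level": ...} shape just in case.
        level = event.get("log.level") or event.get("log", {}).get("level")
        counts[level] += 1

print(dict(counts))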

Step #3: Install and Configure Elasticsearch
Now that the Python application is running and generating logs, let's set up the Elastic Stack to monitor them.
Import the Elasticsearch GPG key.
wget -qO - https://artifacts.elastic.co/GPG-KEY-elasticsearch | sudo gpg --dearmor -o /usr/share/keyrings/elasticsearch-keyring.gpg

Add the Elasticsearch repository.
echo "deb [signed-by=/usr/share/keyrings/elasticsearch-keyring.gpg] https://artifacts.elastic.co/packages/8.x/apt stable main" | sudo tee /etc/apt/sources.list.d/elastic-8.x.list

Now let's update the package list again so the newly added repository is picked up.
sudo apt update

Install Elasticsearch.
sudo apt install -y elasticsearch

Modify the Elasticsearch configuration to allow remote access.
sudo nano /etc/elasticsearch/elasticsearch.yml

Find the network.host setting, uncomment it, and set it to 0.0.0.0 so Elasticsearch binds to all available IP addresses. In the discovery section, uncomment discovery.seed_hosts and leave the list empty, since this single-node setup has no other nodes to discover.

network.host: 0.0.0.0
discovery.seed_hosts: []

For a basic setup (not recommended for production), disable security features.
xpack.security.enabled: false

Save and exit the editor.
Enable and start Elasticsearch.
sudo systemctl enable elasticsearch
sudo systemctl start elasticsearch

Check the status of Elasticsearch to ensure it is running.
sudo systemctl status elasticsearch

Send a GET request to check if Elasticsearch is running and responding. If successful, you should see a JSON response with cluster information.
curl -X GET "localhost:9200"

You can also access it in a browser at http://<your-public-ip>:9200, where 9200 is the default port for Elasticsearch.
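If you prefer to script this check in Python, here is a minimal sketch using only the standard library (it assumes security is disabled, as configured above):
import json
import urllib.request

# Query the cluster health endpoint; no auth is needed with xpack.security off.
with urllib.request.urlopen("http://localhost:9200/_cluster/health") as resp:
    health = json.loads(resp.read())

# A fresh single-node cluster typically reports "green" or "yellow".
print(health["status"])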

Step #4: Install and Configure Kibana
Kibana provides visualization for Elasticsearch data. Install Kibana on the system.
sudo apt install -y kibana

Open the Kibana configuration file for editing.
sudo nano /etc/kibana/kibana.yml

Uncomment and adjust the following lines to bind Kibana to all IP addresses and connect it to Elasticsearch.
server.port: 5601
server.host: "0.0.0.0"
elasticsearch.hosts: ["http://localhost:9200"]

Enable and start Kibana.
sudo systemctl enable kibana
sudo systemctl start kibana

Check the status of Kibana.
sudo systemctl status kibana

Access the Kibana interface by navigating to http://<your-server-ip>:5601 in your web browser. This will open the Kibana dashboard, where you can start exploring your data.
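If you want to script a readiness check instead of opening the browser, Kibana also exposes a status API. Here is a minimal sketch (the exact response shape can vary between Kibana versions):
import json
import urllib.request

# Probe Kibana's status API; it returns JSON describing overall health.
with urllib.request.urlopen("http://localhost:5601/api/status") as resp:
    status = json.loads(resp.read())

# On Kibana 8.x the overall state includes a "level" such as "available".
print(status["status"]["overall"])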

Step #5: Install and Configure Filebeat
Filebeat collects and forwards log data to Elasticsearch or Logstash. Install Filebeat on the system.
sudo apt install -y filebeat

Edit the Filebeat configuration file to ship the Python application logs from ~/python-app/elvis.json to Elasticsearch.
sudo nano /etc/filebeat/filebeat.yml

Update the filebeat.inputs and output.elasticsearch sections as shown below.
filebeat.inputs:
- type: log
  enabled: true
  paths:
    - /home/ubuntu/python-app/elvis.json

output.elasticsearch:
  hosts: ["localhost:9200"]

Test the configuration.
sudo filebeat test config
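
You can also verify that Filebeat can reach Elasticsearch before starting the service:
sudo filebeat test output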

Run the Filebeat setup, which loads the index template and the sample Kibana dashboards.
sudo filebeat setup

Start and enable the Filebeat service.
sudo systemctl enable filebeat
sudo systemctl start filebeat

Check the status of Filebeat.
sudo systemctl status filebeat

Ensure Elasticsearch is receiving data from Filebeat by checking the indices.
curl -XGET "localhost:9200/_cat/indices?v"
You should see output indicating the presence of indices created by Filebeat.
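
You can also pull a few of the most recent shipped events straight from Elasticsearch. Here is a minimal Python sketch, assuming the default Filebeat index pattern filebeat-* (on Filebeat 8.x the underlying indices back a data stream and show up under .ds-filebeat-... names):
import json
import urllib.request

# Fetch the three newest Filebeat events, sorted by timestamp (newest first).
url = "http://localhost:9200/filebeat-*/_search?size=3&sort=@timestamp:desc"
with urllib.request.urlopen(url) as resp:
    hits = json.loads(resp.read())["hits"]["hits"]

for hit in hits:
    print(hit["_source"].get("message"))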

Step #6: Run the Python Application
Run the Python script again to start the application and generate fresh log entries.
python3 elvis.py

Let it run for a few seconds, then stop it with CTRL + C.
Now go back to Kibana. Scroll down and click the Logs option under Observability in the left-hand navigation menu. If the menu is collapsed, click the Expand icon at the bottom left to reveal the options.

Kibana displays log data from the last 15 minutes, visualized as a histogram along with the individual log messages below. (You may need to adjust the time range.)
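To narrow the stream to specific entries, you can type a KQL filter into the search bar, for example log.level : "error" to show only error-level messages.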

You can also expand any entry to see the details of your Python application logs, including the cloud provider metadata and other fields that Filebeat adds automatically.


Conclusion:
You have successfully set up a Python application with structured logging and configured Elasticsearch, Kibana, and Filebeat to monitor and visualize its logs. With this setup, logs from your Python application are ingested into Elasticsearch by Filebeat and made available in Kibana for real-time analysis, helping you troubleshoot issues and improve application performance. Whether you're debugging or monitoring, this stack makes logging simple and effective.
Related Articles:
How to Install Elastic Stack on Ubuntu 24.04 LTS
Install Elastic Stack on Amazon Linux 2
Send Apache2 Logs to Elastic Stack and Filebeat