In this article we will discuss how to set up an ELK stack to properly monitor your dockerized application’s logs.

All the source code is available here.


Let's start with the basics.

What is the Elastic Stack?

The Elastic Stack, widely known as ELK, allows us to reliably and securely ingest data from multiple sources and in different formats using the first component of the stack: Logstash.

Then we can store, search, and analyze the data with Elasticsearch, and finally visualize it in real time using Kibana, the stack's data visualization dashboard.

The Elastic Stack can be applied to a variety of use cases, such as security analytics, business analytics, and metrics and log aggregation and analysis.

What are Beats?

Beats are the newest members of the ELK Stack. They are lightweight data shippers that push logs either to Logstash for further processing and transformation or directly to Elasticsearch. They sit on your servers, run alongside your containers, or are deployed as functions, and then centralize data in Elasticsearch.

In this tutorial we will use Filebeat, an open-source, lightweight shipper for logs written in Go. It uses a backpressure-sensitive protocol when sending data so it can cope with higher volumes of data.

For more info, check the official documentation from Elastic.

Architecture

After this theoretical introduction to the different components of the ELK stack, let's dive deeper into the setup of our architecture, as illustrated in the diagram below.

Figure 1: Architecture

We will create a single-endpoint FastAPI application running in a Docker container that generates logs and saves them to files. Filebeat then reads those files and ships them to Elasticsearch.
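
As a rough sketch, such an application could look like the following. The endpoint, log path, and log format here are illustrative assumptions, not taken from the repository:

```python
import logging
import os

from fastapi import FastAPI

# Write logs to a file that Filebeat can pick up from a shared volume.
# The directory and format are illustrative choices.
os.makedirs("logs", exist_ok=True)
logging.basicConfig(
    filename="logs/app.log",
    level=logging.INFO,
    format="%(asctime)s %(levelname)s %(name)s %(message)s",
)
logger = logging.getLogger(__name__)

app = FastAPI()


@app.get("/")
def read_root():
    # Every request appends a log line that Filebeat will ship onward.
    logger.info("Root endpoint was called")
    return {"message": "Hello from the logging demo"}
```

Running this with uvicorn and hitting the endpoint appends a line to logs/app.log, which is exactly the kind of file Filebeat watches.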

Setup

  1. Clone this GitHub repository, change into the corresponding directory, and run the following command: docker-compose up -d

    This will start a Docker container for each system shown in the architecture above. A sketch of what such a compose file can look like follows these steps.

  2. Go to http://127.0.0.1:8000/docs, where you will see the Swagger UI of our Python application, and try out the endpoint to generate some logs.

  3. Access the Kibana dashboard in your web browser at: http://127.0.0.1:5601

    The first thing to do is configure the Elasticsearch indices that Kibana can display. The file filebeat.yml, found under /filebeat/filebeat.yml, specifies the index used for our application; a minimal sketch of such a configuration is also shown after these steps.

    For further exploration of the possible Filebeat configuration options, you can check this filebeat.reference.yml
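
For step 1, here is a sketch of what a compose file wiring these containers together typically looks like. The image versions, service names, and the shared log volume are assumptions rather than the repository's actual file:

```yaml
version: "3.8"

services:
  app:
    build: .                      # the FastAPI application image
    ports:
      - "8000:8000"
    volumes:
      - app-logs:/app/logs        # share the log files with Filebeat

  elasticsearch:
    image: docker.elastic.co/elasticsearch/elasticsearch:7.17.0
    environment:
      - discovery.type=single-node
    ports:
      - "9200:9200"

  kibana:
    image: docker.elastic.co/kibana/kibana:7.17.0
    environment:
      - ELASTICSEARCH_HOSTS=http://elasticsearch:9200
    ports:
      - "5601:5601"
    depends_on:
      - elasticsearch

  filebeat:
    image: docker.elastic.co/beats/filebeat:7.17.0
    volumes:
      - ./filebeat/filebeat.yml:/usr/share/filebeat/filebeat.yml:ro
      - app-logs:/var/log/app:ro  # read the application's log files
    depends_on:
      - elasticsearch

volumes:
  app-logs:
```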
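
And for step 3, a minimal filebeat.yml along these lines tells Filebeat which files to read and which index to write to. The paths and index name are placeholders, so check the repository's file for the actual values:

```yaml
# Read log files from the directory the application writes into.
filebeat.inputs:
  - type: log
    enabled: true
    paths:
      - /var/log/app/*.log

# Ship events directly to Elasticsearch under a custom index.
output.elasticsearch:
  hosts: ["elasticsearch:9200"]
  index: "app-logs-%{+yyyy.MM.dd}"

# A custom index requires an explicit template name and pattern,
# and index lifecycle management must be disabled for it to apply.
setup.template.name: "app-logs"
setup.template.pattern: "app-logs-*"
setup.ilm.enabled: false
```

In Kibana you would then create an index pattern matching app-logs-* to browse the shipped documents.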

That’s it! You can now visualize the logs generated by our application in the Kibana interface in real time and apply your own filters for further analysis.

With this we have reached the end of this post. I hope you enjoyed it!

Recap

In this article we discussed the building blocks of an ELK stack and how they interact to collect logs from a Python application running as a Docker container, which is essential for further troubleshooting and monitoring.

Happy learning!

Resources

https://www.elastic.co/beats/filebeat

https://www.elastic.co/guide/en/beats/filebeat/current/filebeat-reference-yml.html