Tracking the flow of events among various microservices requires distributed logging, an essential aspect of microservices design. In a microservices architecture, applications are broken down into smaller, independent services that communicate over a network. To understand system behavior and performance, log data from each service needs to be collected and analyzed.
Distributed logging involves gathering log information from multiple services and centralising it in one place. Tools such as ElasticSearch and Kibana are commonly used for this: logs are shipped to a central store where they can be searched and correlated. A closely related practice is distributed tracing, where frameworks provide APIs for developers to instrument their applications and record trace data describing requests and responses as they flow through the system.
This approach helps identify the root cause of issues and optimize overall system performance by capturing monitoring data from various services. It also enables real-time system monitoring and problem resolution.
However, distributed logging can be challenging. The large volume of log data generated by microservices can be overwhelming to manage and analyze. Additionally, ensuring a reliable and scalable logging infrastructure to collect data from many services can be difficult.
Implementing Serilog
Let’s implement a simple console application to simulate logging.
dotnet new console -n DistributedLoggingDemo
cd DistributedLoggingDemo
Serilog can be installed with the following NuGet packages:
dotnet add package Serilog
dotnet add package Serilog.Sinks.Console
dotnet add package Serilog.Sinks.Elasticsearch
dotnet add package Serilog.Formatting.Compact
After installing the Serilog NuGet packages, we can configure the logger in our application.
using System;
using Serilog;
using Serilog.Formatting.Compact;

class Program
{
    static void Main(string[] args)
    {
        Log.Logger = new LoggerConfiguration()
            .MinimumLevel.Debug()
            .WriteTo.Console(new CompactJsonFormatter())
            .WriteTo.Elasticsearch(new Serilog.Sinks.Elasticsearch.ElasticsearchSinkOptions(new Uri("http://localhost:9200"))
            {
                AutoRegisterTemplate = true,
                IndexFormat = "distributed-logs-{0:yyyy.MM.dd}"
            })
            .CreateLogger();

        try
        {
            Log.Information("Application Starting");

            // Simulate application work
            for (int i = 0; i < 10; i++)
            {
                Log.Information("Processing item {ItemNumber}", i);
            }

            Log.Information("Application Ending");
        }
        catch (Exception ex)
        {
            Log.Fatal(ex, "Application failed");
        }
        finally
        {
            Log.CloseAndFlush();
        }
    }
}
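In a real microservices setup, each service should also tag its events so they can be correlated across services in Kibana. A minimal sketch of how this might look with Serilog's enrichers (the property names ServiceName and CorrelationId are illustrative choices for this example, not anything Serilog requires):

```csharp
using System;
using Serilog;
using Serilog.Context;
using Serilog.Formatting.Compact;

class EnrichedLoggerExample
{
    static void Main()
    {
        Log.Logger = new LoggerConfiguration()
            // Attach a fixed property to every event emitted by this service.
            .Enrich.WithProperty("ServiceName", "orders-service")
            // Pick up ambient properties pushed via LogContext below.
            .Enrich.FromLogContext()
            .WriteTo.Console(new CompactJsonFormatter())
            .CreateLogger();

        // Attach a per-request property; every event inside this scope
        // carries the same CorrelationId, so related events can be
        // filtered together in Kibana.
        using (LogContext.PushProperty("CorrelationId", Guid.NewGuid()))
        {
            Log.Information("Handling request");
        }

        Log.CloseAndFlush();
    }
}
```

In a web service, the CorrelationId would typically come from an incoming request header rather than a fresh Guid, so that one identifier follows a request across service boundaries.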
Execute the application by running:
dotnet run
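Because the console sink uses CompactJsonFormatter, each line written to the console is a single JSON event. You should see output roughly like the following (field values abbreviated; your timestamps will differ):

```json
{"@t":"2024-05-01T10:00:00.0000000Z","@mt":"Processing item {ItemNumber}","ItemNumber":3}
```

Here `@t` is the timestamp, `@mt` is the message template, and structured properties such as `ItemNumber` are kept as separate fields, which is what makes them searchable once they reach ElasticSearch.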
Setting Up Kibana and ElasticSearch
ElasticSearch
ElasticSearch is a distributed, RESTful search and analytics engine capable of addressing a growing number of use cases. As the heart of the Elastic Stack, it centrally stores your data so you can discover the expected and uncover the unexpected.
- Scalability: ElasticSearch is designed for horizontal scalability, allowing it to handle large volumes of data and queries.
- Real-Time Search and Analytics: ElasticSearch provides real-time search capabilities and powerful analytics features, making it ideal for applications requiring quick insights from large datasets.
- Full-Text Search: ElasticSearch is known for its powerful full-text search capabilities, including support for complex queries and multi-language search.
- Integration with the Elastic Stack: ElasticSearch seamlessly integrates with other tools in the Elastic Stack, such as Kibana, Logstash, and Beats, providing a comprehensive solution for data ingestion, storage, analysis, and visualisation.
Kibana
Kibana is an open-source data visualisation and exploration tool used for log and time-series analytics, application monitoring, and operational intelligence use cases. It is a part of the Elastic Stack and provides a powerful interface to visualise data stored in ElasticSearch.
- Interactive Visualisations: Kibana allows you to create dynamic and interactive visualisations like histograms, line graphs, pie charts, and maps.
- Dashboards: You can build and share dashboards that provide real-time insights into your data, enabling you to monitor your applications and infrastructure effectively.
- Exploratory Data Analysis: Kibana provides powerful tools for exploring and analysing your data, including filtering, querying, and aggregating data in various ways.
- User-Friendly Interface: Kibana’s intuitive, user-friendly interface makes it easy to visualise and explore your data, even for non-technical users.
- Security and Access Controls: With Elastic Security, Kibana allows you to secure your data and define access controls, ensuring that only authorised users can access sensitive information.
Now we will write a docker-compose file that pulls the Kibana and ElasticSearch images from Docker Hub, runs the containers, and sets up the network and environment they need to communicate.
version: '3.7'
services:
  elasticsearch:
    image: docker.elastic.co/elasticsearch/elasticsearch:7.17.0
    container_name: elasticsearch
    environment:
      - xpack.monitoring.enabled=true
      - xpack.watcher.enabled=false
      - "ES_JAVA_OPTS=-Xms512m -Xmx512m"
      - discovery.type=single-node
    ports:
      - "9200:9200"
    volumes:
      - elasticsearch-data:/usr/share/elasticsearch/data
    networks:
      - elastic
  kibana:
    image: docker.elastic.co/kibana/kibana:7.17.0
    container_name: kibana
    ports:
      - "5601:5601"
    environment:
      - ELASTICSEARCH_HOSTS=http://elasticsearch:9200
    depends_on:
      - elasticsearch
    networks:
      - elastic
networks:
  elastic:
    driver: bridge
volumes:
  elasticsearch-data:
We can run the file by executing a simple command:
docker-compose up -d
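Before moving on to Kibana, it is worth confirming that ElasticSearch is reachable (assuming the default port mapping in the compose file above):

```shell
# Should return a JSON document with the cluster name and version
curl http://localhost:9200

# List indices; a distributed-logs-* index appears once the app has logged
curl "http://localhost:9200/_cat/indices?v"
```

If the first command fails, give the container a minute to start; ElasticSearch can take a while to become ready after `docker-compose up`.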
Visualising Logs in Kibana
Once you have your logs being sent to ElasticSearch, you can use Kibana to visualise and analyse these logs. Kibana provides a powerful and user-friendly interface for exploring your data. Follow these steps to set up Kibana and start visualising your logs:
Step 1: Open Kibana in Your Browser
Open your browser and navigate to Kibana, which should be running on http://localhost:5601.
Step 2: Create an Index Pattern
To visualise your logs, you need to create an index pattern in Kibana:
- Go to Management: In the Kibana sidebar, click on Management.
- Select Stack Management: Under the Management section, click on Stack Management.
- Index Patterns: In the Stack Management section, click on Index Patterns.
Step 3: Create a New Index Pattern
- Create Index Pattern: Click on the Create index pattern button.
- Define Index Pattern: Enter distributed-logs-* in the Index pattern field. This pattern matches the index names created by Serilog in ElasticSearch.
- Set Time Field: Select @timestamp from the Time field dropdown. This field is automatically added by the ElasticSearch sink and is used by Kibana to filter and sort log entries by time.
- Save Index Pattern: Click the Create index pattern button to save your new index pattern.
Step 4: Navigate to Discover
- Go to Discover: In the Kibana sidebar, click on Discover. This is where you can explore and query your log data.
- Select Index Pattern: Ensure that the newly created distributed-logs-* index pattern is selected from the index pattern dropdown at the top left.
- View Logs: You should now see your logs in the Discover view. You can use the search bar at the top to filter logs, and the time filter to narrow down the logs to a specific timeframe.
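In the Discover search bar you can filter events with Kibana's query language (KQL). A couple of example queries for the demo app's logs (exact field names depend on the sink's formatter settings; with the default ElasticSearch sink formatter, structured properties such as ItemNumber are nested under fields):

```
level: Information
fields.ItemNumber >= 5
```

The first query shows only events at the Information level; the second uses a range filter on the structured ItemNumber property, which is only possible because Serilog logged it as a separate field rather than baking it into the message text.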
Conclusion
In this blog, we set up a distributed logging system using ElasticSearch, Kibana, and Serilog. This setup allows you to centralise logs from multiple sources, making it easier to monitor and analyse your distributed applications. With the power of ElasticSearch and Kibana, you can create insightful visualisations and dashboards to gain deeper insights into your system’s behaviour.