AWS – The ELK stack

ELK stands for Elasticsearch-Logstash-Kibana, a popular combination of open source products for ingesting, indexing, and visualizing logs in an AWS account:

  • Elasticsearch: Based on Apache Lucene, this is a scalable indexing service that is custom built to handle full-text searching. It includes a number of flexible algorithms to help you optimize your search queries.
  • Logstash: A project that enables high rates of data ingestion and includes plugins that can handle most of the common log file formats in use by mainstream applications.
  • Kibana: A user interface tool that provides a means of visualizing data.

The Amazon Elasticsearch Service is a fully managed implementation of Elasticsearch, with built-in Kibana and supported integrations with Logstash.

How to do it…

In this recipe, you will learn how to create an Elasticsearch domain, configure the CloudWatch Logs agent on an EC2 instance, and stream those logs to Elasticsearch so that you can search them with Kibana:

  1. Log in to your account and go to the Elasticsearch dashboard. Click Create a new domain:

Amazon Elasticsearch Service
  2. Choose Development and testing to limit the size of the cluster. Be aware that this recipe will not fall under the free tier! There will be charges associated with setting up a test domain:

Development and testing domain
  3. Give the domain a unique name. Stick with the default instance type:

Elasticsearch domain name
  4. On the next screen, configure the network. The simplest choice for a quick test is Public access, but of course for a production application, you will want to lock down access to the domain. If you choose VPC access (Recommended), be warned that gaining access to your Kibana endpoint will involve a VPN or a proxy server of some kind:

Elasticsearch VPC configuration
  5. The last step is to set up your domain access policy. To limit access to a single user, select Allow or deny access to one or more AWS accounts or IAM users, or simply allow open access (do not make this selection for production systems!). After you complete the setup wizard by clicking Confirm on the final screen, the domain will take a while to become active. When it does, you will have access to the Kibana URL:

Active Elasticsearch domain with Kibana URL
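The account-restricted option in the access policy step above translates into a resource-based IAM policy on the domain. A minimal sketch allowing a single IAM user full access follows; the account ID, user name, region, and domain name are all placeholders you would replace with your own:

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Principal": {
        "AWS": "arn:aws:iam::111122223333:user/log-viewer"
      },
      "Action": "es:*",
      "Resource": "arn:aws:es:us-east-1:111122223333:domain/my-test-domain/*"
    }
  ]
}
```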
  6. Now, you need some logs to ingest into Elasticsearch. The quickest way to do this is to log in to one of your EC2 instances. (Create an Amazon Linux instance using the Launch an instance recipe in Chapter 4, AWS Compute, if you don’t already have a running instance.)
  7. Add an instance profile to the instance that has the CloudWatchAgentAdminPolicy so that the instance will have permission to write to CloudWatch Logs.
  8. Log in to the instance using SSM Session Manager, as we did in Chapter 4, AWS Compute, and run the following commands to install and start the CloudWatch Logs agent on the instance:
$ sudo yum update -y
$ sudo yum install -y awslogs
$ sudo service awslogs start
  9. The default settings should be sufficient to send system logs to CloudWatch, but you can edit the configuration file if you wish:
$ vim /etc/awslogs/awslogs.conf
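For reference, the stanza that the awslogs package ships with for the system log looks roughly like the following; the exact defaults may differ by package version, so treat this as a sketch of the format rather than a definitive listing:

```ini
[/var/log/messages]
datetime_format = %b %d %H:%M:%S
file = /var/log/messages
buffer_duration = 5000
log_stream_name = {instance_id}
initial_position = start_of_file
log_group_name = /var/log/messages
```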
  10. You should see a new log group show up in the CloudWatch Logs console. If you don’t, check the error logs in /var/log/awslogs.log:

CloudWatch Log Groups
  11. Select the new /var/log/messages log group and then select Stream to Amazon Elasticsearch Service from the Actions dropdown menu:

Stream to Amazon Elasticsearch Service option
  12. Select your domain and create a new role that will be used by a Lambda function that CloudWatch creates to broker the ingestion. Add the AWSLambdaVPCAccessExecutionRole policy to the default role that is created for you:

Start streaming logs to Elasticsearch
Note that streaming data from CloudWatch to Elasticsearch can result in high-usage charges! Be sure to set a budget and monitor your charges carefully.
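For context, the AWSLambdaVPCAccessExecutionRole managed policy that you attach here grants roughly the following permissions at the time of writing: the CloudWatch Logs actions let the broker function write its own execution logs, and the EC2 network interface actions let it attach to a VPC when needed:

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": [
        "logs:CreateLogGroup",
        "logs:CreateLogStream",
        "logs:PutLogEvents",
        "ec2:CreateNetworkInterface",
        "ec2:DescribeNetworkInterfaces",
        "ec2:DeleteNetworkInterface"
      ],
      "Resource": "*"
    }
  ]
}
```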
  13. Click Next, then choose Common Log Format on the next screen. Click Next on the following two screens, then click Start Streaming.
  14. Now, you are ready to visualize your logs in Kibana. Click the Kibana URL on the Elasticsearch domain console:

Kibana
  15. Click Discover and then create an index pattern for CloudWatch Logs (cwl-*):

Create an index pattern
  16. On the next screen, choose @timestamp as a filter, and then after the index pattern is created, go back to the Discover tab to start querying your data:

Search logs with Kibana
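A couple of Lucene query strings to try in the Kibana search bar. The @message field name below is an assumption based on the default mapping produced by the CloudWatch streaming function; check the actual field names listed under your cwl-* index pattern before relying on them:

```
@message:"error"                       lines containing the word "error"
@message:ssh* AND NOT @message:cron    wildcard combined with boolean operators
```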
  17. Once you have explored what’s possible with Kibana, delete the Elasticsearch domain in order to prevent future charges associated with the resources that you created in this recipe.

How it works…

When you configure an Elasticsearch domain, an instance is provisioned for you behind the scenes to run the open source software that comprises Elasticsearch and Kibana. The CloudWatch Logs agent that runs on the instances you want to monitor watches log files according to the configuration that you specify and sends the logs to CloudWatch in batches. CloudWatch, in turn, passes those log entries on to the Elasticsearch domain. Once they have been ingested by Elasticsearch, you can query and search them with Kibana.

One important thing to keep in mind with this logging solution is that application logs on your instances often contain sensitive data, so be sure to safeguard every part of this solution in the same way that you would safeguard customer databases. Many infamous data breaches are the result of web application developers logging things such as usernames and passwords, and then shipping those log files to an unprotected system. Don’t be one of those administrators who assume that log files are innocuous and don’t deserve rigorous data protection controls!

In the preceding recipe, you had the option of configuring an open domain that allows public access. While this makes completing the recipe easy, for production applications you should lock your domain down to a VPC and apply the lessons you have learned about security to limit access to a small subset of your users.

There’s more…

AWS released a feature called CloudWatch Logs Insights that provides much of the functionality offered by Kibana. There may be use cases where the ELK stack is still the right choice for your application, but in many cases, CloudWatch Logs Insights will get the job done with much less cost and complexity. The Insights feature provides a custom query language that gives you a flexible way to build powerful visualizations of your CloudWatch Logs:

CloudWatch Logs Insights

As you can see from the preceding screenshot, a simple query produces a visualization of Lambda function latency characteristics.
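A sketch of the kind of Insights query behind such a chart: it filters for the REPORT lines that the Lambda runtime writes at the end of each invocation and charts duration statistics over five-minute bins. Run it against your own function's log group:

```
filter @type = "REPORT"
| stats avg(@duration), max(@duration), pct(@duration, 95) by bin(5m)
```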
