ELK Stack for Parsing your Logs

In this tutorial we will look at parsing your syslog files, storing them, and displaying them on an interactive website. We will use the ELK Stack for this purpose. Before jumping into the ELK Stack, let's see what each component means:

E - Elasticsearch: An open source search and analytics engine that stores data and relies on Apache Lucene for searching.

L - Logstash: An open source data collection engine with real-time pipelining capabilities. Logstash can dynamically unify data from different sources and normalize it into destinations of your choice.

K - Kibana: An analytics and visualization platform designed to work with Elasticsearch, which can be used to search, view, and interact with data stored in Elasticsearch indices.

Let's set up the nodes to send and receive log files.

Installing Elasticsearch:

Installing Logstash:

Configuring Logstash:

First, generate an SSL certificate so the forwarder can verify the Logstash server. Open /etc/ssl/openssl.cnf, find the [ v3_ca ] section, and add this line under it (replacing logstash_server_ip with your Logstash server's IP address):
subjectAltName = IP: logstash_server_ip

Then generate the certificate and key:
cd /etc/pki/tls
sudo openssl req -config /etc/ssl/openssl.cnf -x509 -days 3650 -batch -nodes -newkey rsa:2048 -keyout private/logstash-forwarder.key -out certs/logstash-forwarder.crt

This creates the certificate and private key used by Logstash and the Logstash forwarder.

Configuration File:
Create a Logstash configuration file in:
sudo nano /etc/logstash/conf.d/<configfile name>
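A Logstash pipeline has three sections: an input (where logs come from), a filter (how they are parsed), and an output (where they go). The sketch below receives syslog events over an SSL connection using the certificate generated above, parses them with a grok pattern, and writes them to Elasticsearch. The port, certificate paths, and field names are assumptions; adjust them to your setup.

```
input {
  # Receive events from the forwarder over SSL
  # (port 5043 and the certificate paths are assumptions)
  lumberjack {
    port => 5043
    type => "syslog"
    ssl_certificate => "/etc/pki/tls/certs/logstash-forwarder.crt"
    ssl_key => "/etc/pki/tls/private/logstash-forwarder.key"
  }
}

filter {
  if [type] == "syslog" {
    # Split a raw syslog line into timestamp, host, program, pid, and message
    grok {
      match => { "message" => "%{SYSLOGTIMESTAMP:syslog_timestamp} %{SYSLOGHOST:syslog_hostname} %{DATA:syslog_program}(?:\[%{POSINT:syslog_pid}\])?: %{GREEDYDATA:syslog_message}" }
    }
    # Use the timestamp from the log line as the event's timestamp
    date {
      match => [ "syslog_timestamp", "MMM  d HH:mm:ss", "MMM dd HH:mm:ss" ]
    }
  }
}

output {
  # Index parsed events into the local Elasticsearch instance
  elasticsearch {
    hosts => ["localhost:9200"]
  }
}
```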

Save the file and restart Logstash. The receiving node should now be able to parse syslog messages and store them in Elasticsearch. In the next tutorial, we will configure the sending node to push syslog files to our ELK server and also set up Kibana to view the data graphically. For more information on Logstash configuration, visit this post.

The article was originally published at MicroPyramid blog
