1. Elastic Stack: analysis of security logs. Introduction

With sales of the Splunk logging and analytics platform having ended in Russia, the question arose: what can it be replaced with? After spending some time getting acquainted with different options, I settled on a serious, grown-up solution: the ELK stack. This system takes time to set up, but in return you get a very powerful platform for analyzing security posture and responding quickly to information security incidents in the organization. In this series of articles we will look at the basic (or maybe not so basic) features of the ELK stack, see how logs can be parsed, how to build graphs and dashboards, and what interesting things can be done using logs from a Check Point firewall and the OpenVAS security scanner as examples. To start with, let's look at what the ELK stack actually is and what components it consists of.

"ELK stack" is short for three open source projects: Elasticsearch, logstash ΠΈ kibana. Developed by Elastic along with all related projects. Elasticsearch is the core of the whole system, which combines the functions of a database, search and analytical system. Logstash is a server-side data processing pipeline that receives data from multiple sources at the same time, parses the log, and then sends it to the Elasticsearch database. Kibana allows users to visualize data using charts and graphs in Elasticsearch. You can also administer the database through Kibana. Let's take a closer look at each system separately.

Logstash

Logstash is a utility for processing log events from various sources. It can extract fields and their values from a message, and it can also filter and edit the data. After all these manipulations, Logstash forwards the events to the final data store. The utility is configured only through configuration files.
A typical Logstash configuration is one or more files, each consisting of incoming data streams (input), filters for that data (filter), and outgoing streams (output). In its simplest form (which does nothing at all) a configuration file looks like this:

input {
}

filter {
}

output {
}

In input, we configure which port the logs arrive on and over which protocol, or from which folder to read new or constantly rotated files. In filter, we set up the log parser: splitting messages into fields, editing values, adding new parameters or removing old ones; filter gives you full control over every message that reaches Logstash, with a huge number of editing options. In output, we configure where the already parsed log is sent: if the destination is Elasticsearch, a JSON request with the fields and their values is sent; for debugging, you can also print to stdout or write to a file.
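To make this more concrete, here is a minimal sketch of what a filled-in configuration might look like. The port, the grok pattern, the field names and the index name are all invented for this example and will differ in a real setup:

input {
  # receive syslog messages over UDP on an arbitrary port
  udp {
    port => 5514
    type => "syslog"
  }
}

filter {
  # hypothetical pattern: pull the source IP and the action out of the message text
  grok {
    match => { "message" => "src=%{IP:src} action=%{WORD:action}" }
  }
  # drop a field we do not need
  mutate {
    remove_field => [ "syslog_pri" ]
  }
}

output {
  # send the parsed event to a local Elasticsearch instance, one index per day
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "checkpoint-%{+YYYY.MM.dd}"
  }
  # duplicate to stdout while debugging
  stdout { codec => rubydebug }
}

Each of the three sections can hold several plugins at once, which is what makes it possible to collect and route logs from many different devices with a single Logstash instance.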

Elasticsearch

Elasticsearch started out as a full-text search solution, but it came with extras such as easy scaling and replication, which made the product very convenient and a good fit for high-load projects with large amounts of data. Elasticsearch is a non-relational (NoSQL) store and search engine for JSON documents, built on top of Lucene full-text search. It runs on the Java Virtual Machine, so the system requires a large amount of CPU and RAM to work.
Each incoming message, whether it comes from Logstash or directly through the API, is indexed as a "document" - roughly the analogue of a row in a relational SQL table. All documents are stored in an index, which is analogous to a database in SQL.

An example of a document in the database:

{
  "_index": "checkpoint-2019.10.10",
  "_type": "_doc",
  "_id": "yvNZcWwBygXz5W1aycBy",
  "_version": 1,
  "_score": null,
  "_source": {
    "layer_uuid": [
      "dae7f01c-4c98-4c3a-a643-bfbb8fcf40f0",
      "dbee3718-cf2f-4de0-8681-529cb75be9a6"
    ],
    "outzone": "External",
    "layer_name": [
      "TSS-Standard Security",
      "TSS-Standard Application"
    ],
    "time": "1565269565",
    "dst": "103.5.198.210",
    "parent_rule": "0",
    "host": "10.10.10.250",
    "ifname": "eth6"
  }
}

All work with the database is done through JSON requests to the REST API, which return either documents from an index or statistics in a question-and-answer format. Kibana, a web service, was written to visualize all these responses.
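As a small sketch of such a request (the index name and the field are simply taken from the example document above, nothing about them is mandatory), a search for all documents with a particular destination address could look like this:

curl -X GET "localhost:9200/checkpoint-2019.10.10/_search" -H 'Content-Type: application/json' -d'
{
  "query": {
    "match": { "dst": "103.5.198.210" }
  }
}'

Elasticsearch answers with another JSON document containing the matching hits, and it is exactly these answers that Kibana turns into tables, graphs and dashboards.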

Kibana

Kibana allows you to search the data and request statistics from the Elasticsearch database, and on top of those answers it builds plenty of nice graphs and dashboards. The system also provides administration functionality for the Elasticsearch database; we will look at this service in more detail in the following articles. For now, let's show examples of dashboards that can be built for the Check Point firewall and the OpenVAS vulnerability scanner.
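Before the dashboards, a quick taste of the search side: a query typed into the Kibana search bar in Lucene syntax might look like the line below; the field names are taken from the Check Point document shown earlier and serve only as an illustration.

dst:"103.5.198.210" AND outzone:External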

An example of a Check Point dashboard:

An example of an OpenVAS dashboard:

Conclusion

We have looked at what the ELK stack consists of and got a little acquainted with its main products. In the following articles we will separately cover writing a Logstash configuration file, setting up dashboards in Kibana, API requests, automation, and much more!

So stay tuned (Telegram, Facebook, VK, TS Solution Blog, Yandex Zen).

Source: habr.com
