Each week, a new “Do You Know” is posted on our Elastic Technical Knowledge Hub to share useful knowledge that improves your observability with Elasticsearch. These topics originate from day-to-day challenges we have solved for our clients. A step-by-step description helps you successfully implement solutions that improve the performance of your deployment and get the best monitoring of your applications using dashboards and alerting.
This week I will discuss how you can set up robust monitoring in Elasticsearch within a few minutes using a handful of API calls.
Background
It is important that a new data source is connected to Elastic in a robust way. This includes setting up Index Lifecycle Management (ILM), an index template, an alias, a data view, an Elasticsearch endpoint, and an ingest pipeline. With these components in place, you can easily make adjustments to your index, such as changing mappings, setting retention times, performing rollovers, and reindexing.
Below I list a number of API calls that allow you to set up your monitoring quickly and conveniently.
Solution
Step 1
Setting up the Index Life Cycle Management (ILM)
In Kibana, go to the menu and choose Dev Tools. Run the following PUT request to create your ILM policy. I have defined a rollover in the hot phase (after 7 days, or when a primary shard reaches 50 GB) and a delete phase (after 183 days).
PUT _ilm/policy/<given name>
{
  "policy": {
    "phases": {
      "hot": {
        "min_age": "0ms",
        "actions": {
          "rollover": {
            "max_age": "7d",
            "max_primary_shard_size": "50gb"
          },
          "set_priority": {
            "priority": 100
          }
        }
      },
      "delete": {
        "min_age": "183d",
        "actions": {
          "delete": {
            "delete_searchable_snapshot": true
          }
        }
      }
    }
  }
}
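To check that the policy was stored as intended, you can retrieve it with a GET request on the same policy name:

```
GET _ilm/policy/<given name>
```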
Step 2
Setting up the Index Template
In Kibana, go to the menu and choose Dev Tools. Run the following PUT request to create your index template. I have defined the linked ILM policy name, the index pattern, and the rollover_alias name.
PUT _index_template/template_<given name>
{
  "template": {
    "settings": {
      "index": {
        "lifecycle": {
          "name": "<ILM_name>",
          "rollover_alias": "<given name>"
        }
      }
    },
    "mappings": {
      "_size": {
        "enabled": true
      }
    }
  },
  "index_patterns": [
    "<index_name>*"
  ],
  "composed_of": []
}
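Before indexing any data, you can ask Elasticsearch which template, settings, and mappings would be applied to a new index matching your pattern. The index name in the simulate request below is a hypothetical example:

```
POST _index_template/_simulate_index/<index_name>-2024.01.01-1
```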
Step 3
Setting up the Rollover Alias Name to include date and number in your index name
In Kibana, go to the menu and choose Dev Tools. Run the following PUT request to create the first index behind your rollover alias, with a date and sequence number in its name.
PUT %3C<given name>-%7Bnow%2Fm%7Byyyy.MM.dd%7D%7D-1%3E
{
  "settings": {
    "index.lifecycle.name": "<ILM name>",
    "index.lifecycle.rollover_alias": "<given name>"
  },
  "aliases": {
    "<given name>": {
      "is_write_index": true
    }
  }
}
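The cryptic characters in the PUT URL above are simply the URL-encoded form of Elasticsearch’s date-math index name syntax. As a sketch, the encoded name can be reproduced with Python’s standard library (the alias my-logs is a hypothetical example):

```python
from urllib.parse import quote

# Date-math index name: resolves "now", formats it as yyyy.MM.dd,
# and appends the initial sequence number -1.
date_math_name = "<my-logs-{now/m{yyyy.MM.dd}}-1>"

# Encode every reserved character so the name can be used in a request path.
encoded = quote(date_math_name, safe="")
print(encoded)
# -> %3Cmy-logs-%7Bnow%2Fm%7Byyyy.MM.dd%7D%7D-1%3E
```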
Step 4
Perform a rollover on your alias name to check that your index name has the following naming structure: <index_name>-yyyy.MM.dd-00001.
POST <rollover_alias_name>/_rollover
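If the rollover succeeds, Elasticsearch returns a response along these lines (the index names shown are illustrative):

```
{
  "acknowledged": true,
  "shards_acknowledged": true,
  "old_index": "my-logs-2024.01.01-1",
  "new_index": "my-logs-2024.01.01-000002",
  "rolled_over": true,
  "dry_run": false
}
```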
Step 5
First, I create an ingest pipeline. In the Kibana menu go to: Stack Management → Ingest Pipelines → Create pipeline → New pipeline. Give your pipeline a name, such as ip_new, and click Create pipeline.
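If you prefer Dev Tools over the UI, the same empty pipeline can be created with a single PUT request (the description text is an illustrative placeholder):

```
PUT _ingest/pipeline/ip_new
{
  "description": "Pipeline for incoming documents",
  "processors": []
}
```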
Step 6
Now I am going to create my Elasticsearch endpoint. We need three values to construct the endpoint:
- Elasticsearch endpoint – copy the Elasticsearch endpoint from your deployment
- Index name – this is a given index name into which you want to send the logging
- Ingest pipeline name – this is a given name; in our example that is ip_new
Creating the endpoint. The endpoint is built up like this:
https://<Elasticsearch endpoint>/<index name>/_doc/?pipeline=ip_new
You can test the endpoint by sending documents to the endpoint using Postman for instance.
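As a sketch of what such a test could look like outside Postman, the snippet below constructs the endpoint URL and the request that would send one document. The host, index name, and document fields are hypothetical placeholders, and the request is only built here, not sent:

```python
import json
from urllib import request

# Hypothetical placeholders -- substitute your own deployment values.
es_endpoint = "my-deployment.es.europe-west1.gcp.cloud.es.io"
index_name = "my-logs"
pipeline = "ip_new"

# Build the endpoint exactly as described above:
# https://<Elasticsearch endpoint>/<index name>/_doc/?pipeline=<pipeline name>
url = f"https://{es_endpoint}/{index_name}/_doc/?pipeline={pipeline}"

doc = {"message": "hello from the endpoint test", "service": "demo"}
req = request.Request(
    url,
    data=json.dumps(doc).encode("utf-8"),
    headers={"Content-Type": "application/json"},  # add Authorization in practice
    method="POST",
)
print(req.full_url)
# -> https://my-deployment.es.europe-west1.gcp.cloud.es.io/my-logs/_doc/?pipeline=ip_new
```

Calling `urlopen(req)` would then deliver the document through the pipeline, provided valid credentials are added.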
The data that is sent to the endpoint will always pass through your ingest pipeline. Elastic provides a large number of processors for data manipulation, such as renaming fields, grokking payloads, or removing fields.
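As an example of such manipulations, the pipeline created earlier could be extended with a rename and a remove processor; the field names used here are illustrative assumptions:

```
PUT _ingest/pipeline/ip_new
{
  "description": "Rename and clean up incoming fields",
  "processors": [
    {
      "rename": {
        "field": "msg",
        "target_field": "message",
        "ignore_missing": true
      }
    },
    {
      "remove": {
        "field": "debug_info",
        "ignore_missing": true
      }
    }
  ]
}
```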
Step 7
After data is sent to the Elastic endpoint, a data view needs to be created in order to see your data. In the Kibana menu go to: Stack Management → Data Views → Create data view. Now type your given alias name in the index pattern field (always use a wildcard (*) at the end of your index pattern, so that all related indices are grouped in that data view). Finally, give your data view a name and a timestamp field.
On the Discover page of Kibana you will now find the data that was sent to the Elastic endpoint.
Need help with your Elastic challenges? Contact our experts.
With our 25+ Elastic certified consultants, Devoteam is your partner for developing and implementing Monitoring & Observability solutions that facilitate optimal IT control, from Business & IT Operations dashboards to centralized logging and proactive alerting.