
Do you know: how to construct an Elastic Endpoint to receive data in your ingest pipeline?

Each week, a new “Do You Know” is posted on our Elastic Technical Knowledge Hub to share useful knowledge for improving observability with Elasticsearch. These topics originate from day-to-day challenges we have solved for our clients. A stepwise description helps you successfully implement solutions that improve the performance of your deployment and give you the best monitoring of your applications using dashboards and alerting.

This week I will discuss: how to construct an Elastic Endpoint to receive data in your ingest pipeline.

Background

Data can enter Elasticsearch in different ways. One of them is through ingest pipelines. Ingest pipelines allow you to perform common transformations on your data before indexing. Before the data can be sent to Elasticsearch, you need to create an ingest pipeline and construct an Elastic Endpoint. Here, I discuss how to do that.

Solution

Firstly, I create an ingest pipeline. Go to Kibana. In the menu, go to: Stack Management → Ingest Pipelines → Create pipeline → New pipeline. Give your pipeline a name, like ip_new, and click on Create pipeline.
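If you prefer to script this step instead of using the Kibana UI, the same pipeline can be created through the Elasticsearch ingest pipeline API. Below is a minimal sketch using Python and the requests library; the deployment URL and API key are hypothetical placeholders, not values from this article.

import requests

# Hypothetical placeholders: replace with your own deployment URL and API key.
ELASTIC_ENDPOINT = "https://my-deployment.es.europe-west1.gcp.cloud.es.io"
API_KEY = "<your-api-key>"

# Create (or overwrite) the ingest pipeline "ip_new" with an empty processor list;
# processors can be added later in Kibana or via this same API.
response = requests.put(
    f"{ELASTIC_ENDPOINT}/_ingest/pipeline/ip_new",
    headers={"Authorization": f"ApiKey {API_KEY}"},
    json={"description": "My new ingest pipeline", "processors": []},
)
print(response.status_code, response.json())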

Now, I am going to construct my Elastic Endpoint. We need three values to construct the endpoint:

  • Elasticsearch Endpoint: copy the Elasticsearch Endpoint from your deployment;
  • Index name: the name of the index in which you want to store the logging;
  • Ingest pipeline name: the name of the pipeline you just created; in our example that is ip_new.

The endpoint is built up like this:

https://<Elasticsearch endpoint>/<index name>/_doc/?pipeline=ip_new

You can test the endpoint by sending documents to it, using Postman for instance; a small scripted alternative is sketched below.
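As an alternative to Postman, here is a minimal sketch that posts one test document to the endpoint with Python and requests. The endpoint URL, API key, and index name are hypothetical placeholders, and the document body is purely illustrative.

import requests

ELASTIC_ENDPOINT = "https://my-deployment.es.europe-west1.gcp.cloud.es.io"  # hypothetical
API_KEY = "<your-api-key>"                                                  # hypothetical
INDEX_NAME = "my-logs"                                                      # hypothetical

# POST a single document to the index; the pipeline query parameter makes sure
# the document passes through the ip_new ingest pipeline before it is indexed.
response = requests.post(
    f"{ELASTIC_ENDPOINT}/{INDEX_NAME}/_doc?pipeline=ip_new",
    headers={"Authorization": f"ApiKey {API_KEY}"},
    json={"@timestamp": "2024-01-01T12:00:00Z", "message": "test document"},
)
print(response.status_code, response.json())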

The data that is sent to the endpoint will always pass through your ingest pipeline. Elastic provides a large number of processors for data manipulation, such as renaming fields, grokking payloads, or removing fields.
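To give an idea of what such manipulations could look like, below is a hedged sketch that updates ip_new with a grok, a rename, and a remove processor via the ingest pipeline API. The field names and the grok pattern are illustrative assumptions, not taken from a real payload.

import requests

ELASTIC_ENDPOINT = "https://my-deployment.es.europe-west1.gcp.cloud.es.io"  # hypothetical
API_KEY = "<your-api-key>"                                                  # hypothetical

# Update ip_new with three common processors: grok parses the raw message,
# rename changes a field name, and remove drops a field we no longer need.
pipeline_body = {
    "description": "Parse, rename and clean incoming documents",
    "processors": [
        {"grok": {"field": "message", "patterns": ["%{IP:client.ip} %{WORD:http.method}"]}},
        {"rename": {"field": "http.method", "target_field": "http.request.method"}},
        {"remove": {"field": "message"}},
    ],
}

response = requests.put(
    f"{ELASTIC_ENDPOINT}/_ingest/pipeline/ip_new",
    headers={"Authorization": f"ApiKey {API_KEY}"},
    json=pipeline_body,
)
print(response.status_code, response.json())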

After data is sent to the Elastic Endpoint, a Data View needs to be created to visualize your data.

In the Kibana menu, navigate to: Stack Management → Data Views → Create data view. Then, type the index name you used in the Elastic Endpoint into the index pattern field (always use a wildcard (*) at the end of your index pattern to group all related indices in that Data View).

Finally, give your Data View a name and a timestamp field. On the Discover page of Kibana, you will now find the data that was sent to the Elastic Endpoint.
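The Data View can also be created programmatically through Kibana's data views API. The sketch below assumes a Kibana 8.x deployment; the Kibana URL, API key, index pattern, and Data View name are hypothetical placeholders.

import requests

KIBANA_ENDPOINT = "https://my-deployment.kb.europe-west1.gcp.cloud.es.io"  # hypothetical
API_KEY = "<your-api-key>"                                                 # hypothetical

# Create the Data View programmatically instead of via the Kibana UI.
# The index pattern ends with a wildcard so all related indices are grouped.
response = requests.post(
    f"{KIBANA_ENDPOINT}/api/data_views/data_view",
    headers={"Authorization": f"ApiKey {API_KEY}", "kbn-xsrf": "true"},
    json={
        "data_view": {
            "title": "my-logs*",
            "name": "My logs",
            "timeFieldName": "@timestamp",
        }
    },
)
print(response.status_code, response.json())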

Need help with your Elastic challenges? Contact our experts.

With our 25+ Elastic certified consultants, Devoteam is your partner for developing and implementing Monitoring & Observability solutions that facilitate optimal IT control, from Business & IT Operations dashboards to centralized logging and proactive alerting.