Sending Monitor Results to Elasticsearch

From ver. 6.0 onward, Hinemos is equipped with a feature called the Log Transfer feature (also known as the Hub feature).

It allows operators to transfer data accumulated in Hinemos Manager, such as monitor results and logs, to external big-data infrastructures.


By default, Hinemos is set to transfer the data to fluentd.

In this article, we will try to transfer the collected data to fluentd and Elasticsearch.


First, prepare the following servers.

  • <Server1>: Installed with Hinemos Manager
  • <Server2>: Installed with fluentd (ver. 1.2.3)

→Install fluentd on Server2 and launch td-agent.

Install fluentd by referring to the following website.

  • <Server3>: Installed with Elasticsearch and Kibana

→Prepare a server with Elasticsearch (ver. 5.6) and Kibana (ver. 5.6) installed for visualizing the collected data.

Install both tools by referring to the following website.

Install the following plug-ins on Server2 for transferring data from fluentd to Elasticsearch.
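The plug-in list itself is not reproduced here. Assuming the commonly used fluent-plugin-elasticsearch output plug-in, it can be installed into td-agent's bundled Ruby as follows:

```
$ sudo td-agent-gem install fluent-plugin-elasticsearch
```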

1. Setting for Sending Data to fluentd

Specify the data to transfer, the transfer interval, the destination, and so on in Hinemos Manager.

This time, we will use the default setting to send the data to fluentd, and then forward that data to Elasticsearch.

Launch Hinemos Client and open the “Hub” perspective. The Hub[Transfer] view will be displayed on the right-hand side.

Press the “Create” button and add the following URL.

2. Setting for Sending Data from fluentd to Elasticsearch

In Elasticsearch, searches run faster when the amount of data held in each index is kept small, so it pays to subdivide the indices.

We will set the following conditions when transferring the collected data from Hinemos through fluentd.

  • Each index will only include data from a single transfer setting
  • Subdivide the index by dividing the collected data by their facility ID
  • Switch to a new index every day

Add the following setting to “/etc/td-agent/td-agent.conf”.
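The article's exact configuration is not reproduced here, but a minimal sketch of such a td-agent.conf entry might look like the following, assuming the fluent-plugin-elasticsearch output plug-in and Server3's address. The tag (taken from the HTTP request path) carries the index name, so splitting it with `${tag[...]}` placeholders yields one index per transfer setting, facility ID, and day:

```
# Minimal sketch, NOT the article's exact settings.
# Assumes fluent-plugin-elasticsearch is installed.
<source>
  @type http          # the URL path of each request becomes the fluentd tag
  port 8888
</source>

<match hs.**>
  @type elasticsearch
  host <Server3 IP>
  port 9200
  # tag = "hs.event.<facility_id>.<yyyymd>"; drop the leading "hs."
  index_name ${tag[1]}.${tag[2]}.${tag[3]}
  type_name ${tag[1]}.${tag[2]}.${tag[3]}
  <buffer tag>
    flush_interval 10s
  </buffer>
</match>
```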

The format of the destination URL which matches this setting is as follows.

Note: For x, y, z, and w, specify an arbitrary character string or a variable of the transfer setting.

The prefix “hs” of the path will be used by fluentd to identify the data sent from Hinemos.
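As an illustration only (the article's concrete URL is not reproduced here): with fluentd's in_http input, the request path becomes the tag, so a destination URL following this format would look something like the line below, with the host and port assumed:

```
http://<Server2 IP>:8888/hs.x.y.z.w
```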

An index in the following format will be created in Elasticsearch when a request is sent to the URL above.

Confirm that the transferred data are output to the td-agent log file.

Since we have set the index name and type name to “event.#[FACILITY_ID].#[YEAR]#[MONTH]#[DAY]”,

a log with the following format will be output.
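To see why the date portion later reads “201897”, note that the year, month, and day are concatenated without zero-padding. A hypothetical helper (not part of Hinemos) illustrating the expansion:

```python
# Hypothetical illustration (not Hinemos code) of how the pattern
# "event.#[FACILITY_ID].#[YEAR]#[MONTH]#[DAY]" expands into an index name.
def expand_index_name(facility_id: str, year: int, month: int, day: int) -> str:
    # Month and day are concatenated without zero-padding,
    # so September 7, 2018 becomes "201897", not "20180907".
    return f"event.{facility_id}.{year}{month}{day}"

print(expand_index_name("server1", 2018, 9, 7))  # -> event.server1.201897
```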

3. Confirming the Data Transferred to Elasticsearch

To confirm that the appropriate data have been transferred, query Elasticsearch's search API.

This time, we executed the following command to confirm that the monitor results were transferred properly.

$ curl -XGET 'http://<Server3 IP>:9200/event.server1.201897/_search?pretty'

4. Displaying Data on Kibana

Open a browser and enter the following address to connect to Kibana.
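The address itself is not shown here; Kibana listens on port 5601 by default, so it would typically take the following form (host placeholder assumed):

```
http://<Server3 IP>:5601
```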

Select “Management” → “Index Patterns”.

Register the index name (index pattern) to display.

When registration is complete, move to “Discover” screen to confirm that the corresponding logs are displayed.

Since index patterns support partial (wildcard) matching, you can include a date in the pattern to display only the logs output on that date.

Move to the “Discover” screen. Select the index pattern registered with the date.

By pressing the “_index” button, you can display the registered index name.

You can perform a narrower search by pressing the ⊕ button next to the index name.

For example, by selecting “event.server1.201897” you can display only the associated logs.


That’s it for today! Thank you for reading.