
Storing and Aggregating Time Series Data With Elastic Search


In this article, see a tutorial on how to store and aggregate time series data with Elasticsearch.


Time series data can grow very large, and the number of records grows with the granularity level. At one-minute granularity, a single instance produces 60 records per hour.

For example, say we want to store the CPU percentage of a device every minute, and we keep data for the last 30 days.

Total no. of records = 1 (device) * 30 (days) * 24 (hours) * 60 (minutes) = 43,200 records

In a typical use case, we may need to store data for thousands of devices for more than 90 days. Imagine the number of records we would have to persist, and how we would scale querying that much data with aggregations.

So, to store time series data efficiently, we have to tweak the Elasticsearch index settings so they work well with aggregations and reduce storage.

Before going any further, one decision deserves a pause: disabling the _source field spares us some space, but make this decision wisely based on the importance of the data you are storing. There are limitations; for example, you cannot see the actual documents anymore, only aggregated results.

Let's talk about what we are trying to achieve. Suppose I have minute-by-minute metrics for my device for the last 90 days. From this data, I want the metric value for the last month as a daily average; in other words, an aggregation over all 30 days of data points.

First, let's build the optimized Elasticsearch index that stores the time series data.

JSON

PUT index_name_example1
{
    "settings": {
        "analysis": {
            "normalizer": {
                "lower_case_norm": {
                    "type": "custom",
                    "char_filter": [],
                    "filter": [
                        "lowercase"
                    ]
                }
            }
        }
    },
    "mappings": {
        "type_name_example1": {
            "dynamic_templates": [{
                    "strings": {
                        "match": "*",
                        "match_mapping_type": "string",
                        "mapping": {
                            "type": "keyword",
                            "doc_values": true
                        }
                    }
                }
            ],
            "_source": {
                "enabled": false
            },
            "properties": {
                "deviceName": {
                    "type": "text",
                    "fields": {
                        "normalize": {
                            "type": "keyword",
                            "normalizer": "lower_case_norm"
                        }
                    }
                },
                "metricName": {
                    "type": "text",
                    "fields": {
                        "normalize": {
                            "type": "keyword",
                            "normalizer": "lower_case_norm"
                        }
                    }
                },
                "timeStamp": {
                    "type": "long"
                },
                "utilization": {
                    "type": "float"
                }
            }
        }
    }
}


In this mapping, we define 4 fields:

  • deviceName — the name of the device
  • metricName — the name of the metric we are storing
  • timeStamp — the time point at which the metric value was recorded
  • utilization — the actual value of the metric
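
As an illustration, a document indexed into this mapping might look like the following (the values here are hypothetical). Note that with _source disabled you cannot retrieve this document back verbatim, but the doc_values stored for each field still support aggregations.

JSON

POST index_name_example1/type_name_example1
{
    "deviceName": "MyDevice",
    "metricName": "cpu_percentage",
    "timeStamp": 1546300860000,
    "utilization": 42.5
}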

This wraps up the schema setup. Now let's see how to query the data to get aggregated results.

Below is the Elasticsearch query to get the daily average value of the metric over a one-month time range.

JSON

GET index_name_example1/type_name_example1/_search
{
    "query": {
        "bool": {
            "must": [{
                    "range": {
                        "timeStamp": {
                            "gte": 1546300800000,
                            "lt": 1548979199000
                        }
                    }
                },
                {
                    "term": {
                        "deviceName.normalize": {
                            "value": "mydevice"
                        }
                    }
                }
            ]
        }
    },
    "aggs": {
        "byday": {
            "date_histogram": {
                "field": "timeStamp",
                "interval": "1d"
            },
            "aggs": {
                "dailyAvg": {
                    "avg": {
                        "field": "utilization"
                    }
                }
            }
        }
    },
    "size": 0
}


Basically, we are using a date_histogram aggregation with a sub-aggregation, providing the interval at which the data is bucketed.

We can also use different intervals, such as hourly, weekly, or monthly.
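
For instance, switching the rollup to hourly only requires changing the interval in the aggregation clause. A sketch, reusing the field names from above:

JSON

"aggs": {
    "byhour": {
        "date_histogram": {
            "field": "timeStamp",
            "interval": "1h"
        },
        "aggs": {
            "hourlyAvg": {
                "avg": {
                    "field": "utilization"
                }
            }
        }
    }
}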

Following the above process, we can store time series data in Elasticsearch with optimized storage.

Topics:
database ,elastic search ,metrics monitoring ,tutorial

Opinions expressed by DZone contributors are their own.
