Observability Glossary

Aggregation

Aggregation is the process of taking a large dataset and summarizing it into a set of pre-defined statistics such as average, sum, or count. It is especially important when dealing with large volumes of data from many sources, as it reduces the cost of processing and storing the entire dataset.

Aggregation can be applied to metrics to provide a high-level overview of system health and performance. For instance, key infrastructure metrics such as CPU and memory utilization or error rates are aggregated into summary statistics like average, sum, and count across different dimensions. This gives you an overall view of application performance without examining each individual data point.
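As a minimal sketch, the snippet below groups hypothetical CPU-utilization samples by a host dimension and computes the average, sum, and count per group. The data and names are illustrative, not from any particular platform:

```python
from collections import defaultdict
from statistics import mean

# Hypothetical raw metric samples: (host, cpu_utilization_percent)
samples = [
    ("web-1", 72.0), ("web-1", 65.5), ("web-2", 91.0),
    ("web-2", 88.5), ("db-1", 40.0), ("db-1", 43.5),
]

# Group samples by the "host" dimension
by_host = defaultdict(list)
for host, value in samples:
    by_host[host].append(value)

# Compute summary statistics (average, sum, count) per dimension value
for host, values in sorted(by_host.items()):
    print(f"{host}: avg={mean(values):.1f}% sum={sum(values):.1f} count={len(values)}")
```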

It is important to note that heavy pre-aggregation is prominent in traditional monitoring, whereas modern observability platforms do not aggregate data in advance: they store all the data at full granularity and aggregate only at query time. This gives you access to the full dataset, enabling you to run any calculation on it, without the restrictions of pre-aggregated data.
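To illustrate the difference, the sketch below uses hypothetical request latencies: once the data has been pre-aggregated into an average and a count, a new question such as a 95th percentile can no longer be answered, whereas raw data retained at full granularity supports it at query time:

```python
from statistics import mean, quantiles

# Hypothetical raw request latencies in milliseconds
latencies = [12, 15, 14, 210, 13, 16, 500, 14, 15, 13]

# Pre-aggregation: only the chosen summaries survive; raw points are discarded
pre_aggregated = {"avg": mean(latencies), "count": len(latencies)}

# Query-time aggregation: with raw data retained, any new calculation is
# still possible, e.g. a p95 the pre-aggregated summary cannot answer
p95 = quantiles(latencies, n=100)[94]

print(pre_aggregated)         # {'avg': 82.2, 'count': 10}
print(f"p95 = {p95:.1f} ms")  # recoverable only from the raw data
```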
