Downsampling is a common technique used in time-series data analysis and monitoring systems. For example, in a system that collects high-resolution metrics data every second, downsampling may be applied to reduce the data to 1-minute intervals for long-term storage. This enables cost savings in storage and faster query performance when analyzing historical data.
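As a rough sketch of the 1-second-to-1-minute example above, the code below buckets per-second metric points into 60-second windows and keeps the average of each window. The data and the `downsample` helper are hypothetical, not any particular monitoring system's API.

```python
from statistics import mean

# Hypothetical per-second samples: (timestamp_in_seconds, value)
samples = [(t, 100 + (t % 7)) for t in range(0, 300)]  # 5 minutes of 1 s data

def downsample(points, interval=60):
    """Average values into fixed-size time buckets (e.g. 60 s -> 1 min)."""
    buckets = {}
    for ts, value in points:
        # Align each timestamp to the start of its bucket
        buckets.setdefault(ts - ts % interval, []).append(value)
    return [(bucket, mean(values)) for bucket, values in sorted(buckets.items())]

minute_points = downsample(samples)
print(len(samples), "raw points ->", len(minute_points), "downsampled points")
# 300 raw points -> 5 downsampled points
```

Averaging is only one choice of aggregation; real systems often also keep min, max, and count per bucket so that peaks are not smoothed away.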
Explore related concepts
Distributed Tracing
Distributed Tracing is a method for tracking the flow of requests through your application. It enables you to follow the journey of a request as it travels across multiple services, so you can see where things might be going wrong.
Sampling
Sampling is the process of collecting and analyzing only a subset of a larger dataset in order to make inferences or observations about the whole. The goal is to select a representative sample that captures the key features of the entire dataset without the cost of processing and storing all of it.
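To illustrate, the sketch below draws a 1% simple random sample from a synthetic dataset and compares its mean to the full dataset's mean; the dataset and percentages are made up for the example.

```python
import random

random.seed(42)  # fixed seed so the sketch is reproducible

# Hypothetical dataset: one million latency measurements in milliseconds
dataset = [random.gauss(200, 50) for _ in range(1_000_000)]

# Keep a 1% simple random sample instead of the full dataset
sample = random.sample(dataset, k=len(dataset) // 100)

full_mean = sum(dataset) / len(dataset)
sample_mean = sum(sample) / len(sample)
print(f"full mean: {full_mean:.1f} ms, sample mean: {sample_mean:.1f} ms")
```

With a representative sample, aggregate statistics stay close to the true values while storage and compute drop by the sampling ratio.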
Observability
Observability is the ability to understand how your application is working and behaving in production through telemetry data (logs, metrics, traces, wide events, etc.). It enables you to detect, diagnose, and resolve issues in your app before they impact your users.