Mirror of https://github.com/VictoriaMetrics/VictoriaMetrics.git (synced 2024-11-21 14:44:00 +00:00)
docs/Single-server-VictoriaMetrics.md: clarify that the storage size depends on the number of samples per series
parent c9229e3c0b
commit 65b4ae95e3

1 changed file with 2 additions and 1 deletion
docs/Single-server-VictoriaMetrics.md

@@ -1113,7 +1113,8 @@ A rough estimation of the required resources for ingestion path:
 * Storage space: less than a byte per data point on average. So, ~260GB is required for storing a month-long insert stream
   of 100K data points per second.
-  The actual storage size heavily depends on data randomness (entropy). Higher randomness means higher storage size requirements.
+  The actual storage size heavily depends on data randomness (entropy) and the average number of samples per time series.
+  Higher randomness means higher storage size requirements. A lower average number of samples per time series means higher storage requirements.
   Read [this article](https://medium.com/faun/victoriametrics-achieving-better-compression-for-time-series-data-than-gorilla-317bc1f95932)
   for details.
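As a sanity check on the ~260GB figure in the hunk above: 100K samples/second over a 30-day month is about 2.6×10^11 samples, and at less than one byte per sample that comes to roughly 260GB. Below is a minimal Go sketch of that arithmetic; the one-byte-per-sample value is an assumed upper bound taken from the doc text, since the actual per-sample size varies with data entropy and with the average number of samples per series.

```go
package main

import "fmt"

func main() {
	const (
		samplesPerSecond = 100_000          // ingestion rate from the doc
		secondsPerMonth  = 3600 * 24 * 30   // 30-day month
		bytesPerSample   = 1.0              // assumed upper bound: "less than a byte per data point"
	)
	totalSamples := float64(samplesPerSecond * secondsPerMonth)
	totalBytes := totalSamples * bytesPerSample
	fmt.Printf("samples per month: %.2e\n", totalSamples)       // ~2.59e11
	fmt.Printf("storage estimate: ~%.0f GB\n", totalBytes/1e9) // ~259 GB
}
```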