Xeitor
October 6, 2025, 6:20pm
Hi there, I am investigating the best way to work with a couple of measurements of ever-increasing counters from some networking devices sampled via SNMP. I was looking into the counter_agg function, but I am not sure if it will work correctly for my use case, since the signature is:
counter_agg(ts TIMESTAMPTZ, value DOUBLE PRECISION [, bounds TSTZRANGE]) RETURNS CounterSummary
and since the counters are 64-bit, I need an unsigned numeric column that supports all possible counter values.
From the docs:
```sql
counter_agg(
    ts TIMESTAMPTZ,
    value DOUBLE PRECISION¹,
    bounds TSTZRANGE DEFAULT NULL
) RETURNS CounterSummary
```
An aggregate that produces a `CounterSummary` from timestamps and associated values.
##### ¹ Note that the `value` is currently only accepted as a `DOUBLE PRECISION` number as most people use that for counters, even though other numeric types (i.e. `BIGINT`) might sometimes be more intuitive. If you store a value as a different numeric type you can cast to `DOUBLE PRECISION` on input to the function.
### Required Arguments²
|Name| Type |Description|
|---|---|---|
| `ts` | `TIMESTAMPTZ` | The time at each point |
| `value` | `DOUBLE PRECISION` | The value at each point to use for the counter aggregate|
##### ² Note that `ts` and `value` can be `null`, however the aggregate is not evaluated on `null` values and will return `null`, but it will not error on `null` inputs.
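For reference, this is roughly the kind of query I have in mind (just a sketch; the table and column names below, like `snmp_samples` and `in_octets`, are made up, and the cast follows the note above):

```sql
-- Sketch only: snmp_samples(device_id, ts TIMESTAMPTZ, in_octets BIGINT)
-- is a made-up table holding the raw SNMP samples.
SELECT
    device_id,
    time_bucket('15 minutes', ts) AS bucket,
    counter_agg(ts, in_octets::double precision) AS octets_summary
FROM snmp_samples
GROUP BY device_id, bucket;
```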
It says that I may cast to double precision, but if I did, wouldn’t it be possible to lose precision if the counters get large enough?
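As far as I understand, `DOUBLE PRECISION` can only represent integers exactly up to 2^53, so anything above that would be silently rounded, e.g.:

```sql
-- 2^53 + 1 = 9007199254740993 is, as far as I know, the first 64-bit
-- integer that DOUBLE PRECISION cannot represent exactly.
SELECT 9007199254740993::bigint                    AS exact_value,
       9007199254740993::double precision::bigint  AS after_cast;
-- exact_value: 9007199254740993
-- after_cast:  9007199254740992
```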