Hello everyone,
I apologize if I’m posting in the wrong section, but I’ve been stuck for almost a week on my academic project involving time-series data.
The goal of my project is to demonstrate the effectiveness of TimescaleDB hypertables compared to standard PostgreSQL tables when handling time-series data.
The idea is to take real data (for example, call history) and create two identical tables, except that one will be a regular table and the other will be a hypertable. After inserting around 30,000 rows into both tables, I’ll test performance using pgbench with INSERT and SELECT queries.
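To make the setup concrete, here is a minimal sketch of what I mean by "two identical tables" (table and column names are just placeholders for my call-history data):

```sql
-- Regular PostgreSQL table
CREATE TABLE calls_plain (
    call_time   TIMESTAMPTZ NOT NULL,
    caller      TEXT,
    callee      TEXT,
    duration_s  INTEGER
);

-- Same schema, then converted into a TimescaleDB hypertable
-- partitioned on the time column
CREATE TABLE calls_hyper (
    call_time   TIMESTAMPTZ NOT NULL,
    caller      TEXT,
    callee      TEXT,
    duration_s  INTEGER
);
SELECT create_hypertable('calls_hyper', 'call_time');
```

Both tables then receive the same 30,000 rows before benchmarking.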
The problem is that I’m not getting good results. Despite multiple tests, the standard table seems to outperform the hypertable.
I’m using the following pgbench parameters for my tests: -c 2, -j 2, -R 1200, -T 60 (two clients, two worker threads, a 1,200 TPS rate limit, 60-second runs — my computer is not very powerful).
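For reference, this is roughly how I run each benchmark (assuming the letters above map to the usual pgbench flags; the script name and database name are placeholders for my own files):

```shell
# -n skips pgbench's built-in vacuum step, which is needed
# when running a custom script instead of the default workload
pgbench -c 2 -j 2 -R 1200 -T 60 -n -f select_test.sql mydb
```

I run the same script once against the standard table and once against the hypertable, changing only the table name in the SQL file.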
Could anyone help me understand why the standard table is performing better in this case, or offer some advice on how to proceed with this project? I’m a bit lost.
To recap: the goal is to compare two identical tables (one standard, one hypertable) by running the same SQL queries after inserting the same data, then analyzing the performance in terms of TPS, latency, etc.
Thanks a lot for your help!