
In that specific case, you probably want to roll up that time-series data as it gets older, while keeping the full dataset in a flat file store for data science, etc., if you need it.

You probably never need a millisecond-granularity data point from 6 months ago in your database.
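A minimal sketch of that kind of rollup in Python, assuming rows of `(timestamp_ms, value)` (the function and bucket size are illustrative, not any particular database's API): average millisecond-granularity samples into coarser time buckets.

```python
from collections import defaultdict

def roll_up(rows, bucket_ms=60_000):
    """Average (timestamp_ms, value) samples into coarser buckets.

    Returns a list of (bucket_start_ms, mean_value) sorted by time.
    """
    sums = defaultdict(lambda: [0.0, 0])  # bucket -> [sum, count]
    for ts, value in rows:
        bucket = (ts // bucket_ms) * bucket_ms
        acc = sums[bucket]
        acc[0] += value
        acc[1] += 1
    return sorted((b, s / n) for b, (s, n) in sums.items())

# Millisecond samples spanning two one-minute buckets.
samples = [(0, 1.0), (30_000, 3.0), (60_000, 10.0)]
print(roll_up(samples))  # [(0, 2.0), (60000, 10.0)]
```

In practice you'd run this (or the database's native downsampling) as a scheduled job, deleting the raw rows once the rollup is written.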



They probably shouldn't be rows at all. They're effectively low-frequency sound files. I'd probably store them in Parquet and use a foreign data wrapper (FDW) in Postgres.
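A sketch of what that setup could look like using the third-party parquet_fdw extension (the server, table, column names, and file path here are all made up for illustration):

```sql
-- Assumes the parquet_fdw extension is installed; names are illustrative.
CREATE EXTENSION parquet_fdw;
CREATE SERVER parquet_srv FOREIGN DATA WRAPPER parquet_fdw;

-- Expose an on-disk Parquet file as a queryable table.
CREATE FOREIGN TABLE sensor_readings (
    recorded_at timestamp,
    value       double precision
)
SERVER parquet_srv
OPTIONS (filename '/data/sensor_readings.parquet');
```

Queries against `sensor_readings` then read the Parquet file directly, so the hot relational store stays small while the history stays queryable.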


I have time series of 2-D 64x64 sensor data, resulting in a few billion values that I'm trying to cram into some custom Parquet layout. I'm often surprised that it's 2021 and we're still stuck with tabular data, with n-dimensional arrays often not even considered.
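One common workaround is to flatten each frame into a fixed-width row and carry the shape as metadata; a stdlib-only sketch of that mapping (the actual Parquet writing, e.g. via pyarrow, is left out, and the helper name is made up):

```python
def frames_to_rows(frames, shape=(64, 64)):
    """Flatten 2-D frames into (frame_index, flat_values) rows.

    Each 64x64 frame becomes one 4096-wide row; the shape is kept
    separately so readers can reconstruct the arrays.
    """
    width = shape[0] * shape[1]
    rows = []
    for i, frame in enumerate(frames):
        flat = [v for line in frame for v in line]
        assert len(flat) == width, "frame does not match declared shape"
        rows.append((i, flat))
    return {"shape": shape, "rows": rows}

# Two tiny 2x2 "frames" stand in for 64x64 sensor data.
table = frames_to_rows([[[1, 2], [3, 4]], [[5, 6], [7, 8]]], shape=(2, 2))
print(table["rows"][1])  # (1, [5, 6, 7, 8])
```

The cost is that the array structure is gone from the file's point of view, which is exactly the "stuck in tabular data" complaint above.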


Thermal or depth map camera? Really depends on how you want to process and query it. At 16k per frame, I'd store each one sequentially. Do you need to only look at a single pixel across a million frames? Or do you process groups of 20-100 frames at a time?
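The access pattern really does drive the layout: reading a single pixel across a million frames favors a pixel-major (transposed) layout, while processing groups of frames favors frame-major. A toy sketch of that transpose, with flattened 2x2 frames standing in for 64x64 (function name is illustrative):

```python
def to_pixel_major(frames):
    """Transpose frame-major storage into per-pixel time series.

    frames[t] is a flat list of pixel values for frame t; the result
    maps pixel index -> that pixel's values across all frames.
    """
    n_pixels = len(frames[0])
    return [[frame[p] for frame in frames] for p in range(n_pixels)]

frames = [[1, 2, 3, 4], [5, 6, 7, 8]]  # two flattened 2x2 frames
print(to_pixel_major(frames))  # [[1, 5], [2, 6], [3, 7], [4, 8]]
```

In a columnar format you'd get this for free by making each pixel a column; storing frames sequentially instead optimizes the groups-of-frames case.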



