
Hmmm, is the data set something that would fit fairly naturally into a series of SQLite databases?

100GB is way too large for the project I'm working on at the moment (dbhub.io): even a handful of people downloading something that size would nuke our sponsorship budget, since we're just starting out (still pre-launch).

However, if we gain traction and become cash positive, data sets this size would be good to cater to. :)



I used to keep it in SQLite (much easier to distribute than PostgreSQL). It worked a lot better than many other options I tried. However, rebuilding the database from updated data would take more than a day, and some queries were too slow.
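For anyone hitting the same rebuild bottleneck, the usual SQLite mitigations are wrapping the bulk load in a single transaction and creating indexes only after the data is in. A minimal sketch (the table name, schema, and file name are hypothetical, not from the original project):

```python
import os
import sqlite3

def rebuild(db_path, rows):
    """Rebuild a SQLite database from scratch; rows is an iterable of (ts, value) tuples."""
    if os.path.exists(db_path):
        os.remove(db_path)  # full rebuild: start from an empty file
    con = sqlite3.connect(db_path)
    con.execute("PRAGMA journal_mode = OFF")   # no rollback journal needed for a throwaway rebuild
    con.execute("PRAGMA synchronous = OFF")    # safe here: on failure we simply rebuild again
    con.execute("CREATE TABLE readings (ts INTEGER, value REAL)")
    with con:  # one transaction for the whole bulk load, instead of one per row
        con.executemany("INSERT INTO readings VALUES (?, ?)", rows)
    # Building the index once, after the load, is much faster than maintaining it per insert.
    con.execute("CREATE INDEX idx_readings_ts ON readings (ts)")
    con.close()

rebuild("example.db", [(i, i * 0.5) for i in range(100_000)])
```

The per-row autocommit overhead is usually what makes naive rebuilds take days; batching it into one transaction is the single biggest win.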

Switching to PostgreSQL sped things up, at the cost of requiring a separate database process, dealing with PostgreSQL's weird access control, and adding an inconvenient extra step of loading the data with COPY commands.


Hmmm, sounds like the data itself would be feasible then. SQLite could be considered just a data transport container for this purpose. :)
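On the consumer side, treating a downloaded SQLite file as a transport container is pleasantly simple: open it read-only via a URI so queries can never mutate the download. A sketch with a hypothetical file and table (the first half just builds a stand-in for the downloaded data set):

```python
import sqlite3

# Build a small stand-in for a downloaded data set (file and table names hypothetical).
con = sqlite3.connect("dataset.db")
con.execute("DROP TABLE IF EXISTS readings")
con.execute("CREATE TABLE readings (ts INTEGER, value REAL)")
con.execute("INSERT INTO readings VALUES (1, 3.14)")
con.commit()
con.close()

# A consumer opens the file read-only; any write attempt raises OperationalError.
ro = sqlite3.connect("file:dataset.db?mode=ro", uri=True)
print(ro.execute("SELECT COUNT(*) FROM readings").fetchone()[0])
try:
    ro.execute("INSERT INTO readings VALUES (2, 2.71)")
except sqlite3.OperationalError as exc:
    print("write rejected:", exc)
```

The `mode=ro` URI parameter is built into SQLite itself, so this works the same from any language binding, which is part of what makes the format attractive as a distribution container.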



