Saturday 29 August 2020

What’s the best way to tail a huge log database table?

I maintain a Node API that logs to a Postgres table, and I have nearly two years of logs saved there, so you can imagine how huge it has gotten. My first thought was to run a migration that issues a query to delete older records, but a migration only runs once because of Sequelize's meta table. Then I thought about setting up a cron job to delete old records every 24 hours, which sounds better to me. But should an API be running a cron job? Would it block anything? I'm also open to other suggestions.
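One way the 24-hour delete could look from inside the API process is a minimal sketch like the following. It assumes a table named logs with a created_at timestamp column, and a node-postgres Pool created elsewhere in the app (both names are placeholders, not from the original post). The DELETE itself executes on the Postgres server, so the Node event loop is only occupied for the duration of the awaited query, not the whole deletion.

```javascript
// Hypothetical sketch: prune rows older than a retention window from a
// "logs" table. Table and column names are assumptions for illustration.
const RETENTION_DAYS = 30;

function buildPruneQuery(table, retentionDays) {
  // The retention window is passed as a bound parameter; the table name
  // cannot be parameterized in SQL, so it is interpolated from a trusted
  // constant, never from user input.
  return {
    text: `DELETE FROM ${table} WHERE created_at < now() - ($1 || ' days')::interval`,
    values: [String(retentionDays)],
  };
}

// Schedule the prune once a day inside the API process. Between runs
// nothing is blocked; during a run, only the awaited query occupies Node.
function startPruneJob(pool) {
  const DAY_MS = 24 * 60 * 60 * 1000;
  return setInterval(async () => {
    try {
      const { text, values } = buildPruneQuery('logs', RETENTION_DAYS);
      const res = await pool.query(text, values);
      console.log(`pruned ${res.rowCount} old log rows`);
    } catch (err) {
      // Never let a failed prune crash the API.
      console.error('log prune failed', err);
    }
  }, DAY_MS);
}
```

Whether a long DELETE blocks anything depends mostly on Postgres, not Node: a huge single DELETE can bloat the table and hold locks longer than deleting in smaller batches inside a loop would.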

Submitted August 29, 2020 at 09:34PM by Turdsonahook
