Redis bigkeys analysis: trivial automated data extraction
From #320 (closed):
Using the `redis-cli --bigkeys` tool, periodically extract a report from Redis. Start hourly, then ramp the frequency up or down based on how the data changes over time. Keep reports for 5 years (total size will still only be in the tens of MB).
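The hourly schedule (with the small random splay described below) could be expressed as a systemd timer along these lines; unit names here are placeholders, not settled decisions:

```ini
# bigkeys-extract.timer — hypothetical unit name
[Unit]
Description=Hourly redis-cli --bigkeys extraction

[Timer]
OnCalendar=hourly
# Small random splay so the nodes don't all start at once
RandomizedDelaySec=300
Persistent=true

[Install]
WantedBy=timers.target
```

The matching `bigkeys-extract.service` would be a oneshot unit whose `ExecStart` runs the extraction script.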
Run it on a secondary Redis node, not the primary (the scan is CPU-intensive). Achieve this by attempting to run on all Redis nodes with a small random splay (via a systemd timer), giving up immediately on the primary, and using a shared lock (held in Redis, for relative simplicity) to decide which of the two secondaries runs. Put the output into a GCS bucket in JSON format.
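The run-on-one-secondary logic above could be sketched as a script like the following. The bucket name, lock key, and object layout are illustrative assumptions; note that `redis-cli --bigkeys` emits plain text, so converting to JSON would be a separate parsing step:

```shell
#!/usr/bin/env bash
# Sketch only: assumes redis-cli and gsutil on PATH; bucket name is hypothetical.
set -euo pipefail

BUCKET="gs://example-redis-bigkeys"   # hypothetical bucket name
LOCK_KEY="bigkeys:extract:lock"
LOCK_TTL=3300                         # just under the hourly interval

object_path() {
  # Derive the GCS object path from hostname and UTC timestamp.
  printf 'bigkeys/%s/%s.json' "$1" "$2"
}

main() {
  # Small random splay so the nodes don't all start simultaneously.
  sleep $(( RANDOM % 60 ))

  # Give up immediately on the primary.
  role=$(redis-cli INFO replication | awk -F: '/^role:/ {gsub(/\r/,""); print $2}')
  [[ "$role" == "slave" ]] || exit 0

  # Shared lock in Redis distinguishes the two secondaries:
  # SET NX EX succeeds on exactly one node per interval.
  got=$(redis-cli SET "$LOCK_KEY" "$(hostname)" NX EX "$LOCK_TTL")
  [[ "$got" == "OK" ]] || exit 0

  ts=$(date -u +%Y%m%dT%H%M%SZ)
  out=$(mktemp)
  # --bigkeys prints plain text; a JSON conversion step would go here.
  redis-cli --bigkeys > "$out"
  gsutil cp "$out" "$BUCKET/$(object_path "$(hostname)" "$ts")"
  rm -f "$out"
}
# In the systemd service this would end with: main "$@"
```

Taking the lock *before* the scan (rather than after) means only one secondary pays the CPU cost per interval; the TTL releases the lock automatically if a node dies mid-run.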
Write some simple CLI tooling to make the reports easy to fetch and view. It needn't be super-slick; it just needs to be ergonomic to start with. We can enhance it as our needs become clearer, but we need a base to work from.
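A first cut at the fetch/view tooling could be a thin wrapper over `gsutil`; the bucket name and object layout here are the same illustrative assumptions, not settled decisions:

```shell
#!/usr/bin/env bash
# Minimal fetch/view CLI sketch; bucket and layout are hypothetical.
set -euo pipefail

BUCKET="gs://example-redis-bigkeys"   # hypothetical bucket name

gcs_url() {
  # Map a report object name (e.g. "bigkeys/node1/<ts>.json") to a full GCS URL.
  printf '%s/%s' "$BUCKET" "$1"
}

cmd_list() {
  # Timestamped object names sort lexically, so newest reports come last.
  gsutil ls -r "$BUCKET/bigkeys/" | sort
}

cmd_show() {
  gsutil cat "$(gcs_url "$1")" | "${PAGER:-less}"
}

main() {
  case "${1:-}" in
    list) cmd_list ;;
    show) cmd_show "${2:?object name required}" ;;
    *) echo "usage: bigkeys {list|show <object>}" >&2; exit 2 ;;
  esac
}
# Invoke as: main "$@"
```

Two subcommands (`list` and `show`) are probably enough for a base to work from; filtering, diffing, and graphing can be layered on once usage patterns are clearer.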
Possible/probable extension: create a weekly confidential issue (scalability tracker) containing a simple report, perhaps with basic graphs, trends, etc.