Remove metadata from paused indexing jobs and therefore allow de-duplication
In !30621 (merged) we implemented the ability to pause indexing jobs.
This uses a ZSET to store the data in Redis. The ZSET should de-duplicate the jobs, except that the jobs also store metadata (i.e. context) alongside the key. This metadata is always unique, so we often end up storing the same job multiple times.
This is unnecessary, since the queue is already idempotent and the jobs will be de-duplicated anyway as soon as we re-queue them later.
The best option is to remove the metadata from the sorted set and attach new metadata when re-queuing. This will save a significant amount of Redis memory during long periods of paused indexing.