Add support for multiple parallel event stream consumers
Currently, background jobs that process the event stream only support a single consumer for each "task". For example, we can only have one consumer with the job of sending new topic links to the Embedly API. This is fine while the event stream's volume is low enough for a single consumer to keep up, but it will eventually need to support multiple consumers.
Redis can manage all of the hard parts of this itself through its concept of "consumer groups". From our end, we mostly just need a way to start and manage multiple consumer services for the same task, and give them each a unique consumer name that they'll keep on restart. Right now the consumer name is just hardcoded as `<consumer group name>-1`, so we'll probably want to be able to have `-2`, `-3`, etc.
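As a rough illustration of the direction this could take, here is a minimal sketch of one consumer in a Redis consumer group, assuming the `redis-py` client. The `consumer_name` helper, the `run_consumer` function, and the stream/group names are all hypothetical, not existing code:

```python
def consumer_name(group: str, index: int) -> str:
    """Build a stable, unique consumer name like "topic_embedly_extractor-2".

    The index would be assigned by whatever starts the consumer services
    (hypothetical), so each consumer keeps the same name across restarts.
    """
    return f"{group}-{index}"


def run_consumer(group: str, index: int, handler, stream: str = "events") -> None:
    """Sketch: consume from a Redis stream as one member of a consumer group."""
    import redis  # deferred import so the name helper works without redis-py

    r = redis.Redis()
    try:
        # Create the group once; later consumers for the same task reuse it.
        r.xgroup_create(stream, group, id="0", mkstream=True)
    except redis.ResponseError:
        pass  # group already exists

    name = consumer_name(group, index)
    while True:
        # Redis tracks pending entries per consumer name, so new entries are
        # spread across consumers and unacked ones can later be reclaimed.
        for _stream, messages in r.xreadgroup(
            group, name, {stream: ">"}, count=10, block=5000
        ):
            for message_id, fields in messages:
                handler(fields)  # task-specific processing
                r.xack(stream, group, message_id)
```

With this shape, running a second consumer for the same task would just be `run_consumer("topic_embedly_extractor", 2, handler)`, and Redis handles distributing entries between `-1` and `-2`.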