Working with queues

We have a similar use case in our system where we’re receiving a large number of requests
(millions) that need to be processed asynchronously by a Temporal workflow. Our initial thought was to abandon Redis streams and rely solely on Temporal task queues, but we were worried about the Event History size and event count limits for long-running Workflows in Temporal.

After some research, we’ve come up with a few solutions that we’re considering:

  • Using a dedicated worker that consumes messages from the Redis stream and starts a new Workflow Execution for each message. This worker would need the appropriate Redis connection configuration and would have to keep up with the rate of incoming messages; we could also put a rate limiter on it so it never overwhelms itself or the Temporal cluster (a sketch of this approach follows the list).
  • Using a separate process that consumes messages from the Redis stream and enqueues them into a Temporal task queue. This process could be rate-limited to ensure that it doesn’t enqueue messages faster than they can be consumed by the Temporal worker.
  • Using the Continue-As-New feature to close the current Workflow Execution and create a new Workflow Execution in a single atomic operation, so no single run's Event History grows unbounded (a second sketch below).
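
For the first two options, here is a minimal Go sketch of a dedicated consumer that reads from a Redis stream via a consumer group, rate-limits itself, and starts one Workflow per stream entry. The stream name `requests`, group `wf-starters`, task queue `request-processing`, and the `ProcessRequest` workflow are placeholders for illustration, not anything from our actual system:

```go
package main

import (
	"context"
	"log"
	"time"

	"github.com/redis/go-redis/v9"
	"go.temporal.io/sdk/client"
	"golang.org/x/time/rate"
)

func main() {
	ctx := context.Background()

	rdb := redis.NewClient(&redis.Options{Addr: "localhost:6379"})

	c, err := client.Dial(client.Options{HostPort: "localhost:7233"})
	if err != nil {
		log.Fatalln("unable to connect to Temporal:", err)
	}
	defer c.Close()

	// Create the consumer group if it does not exist yet (ignore BUSYGROUP errors).
	_ = rdb.XGroupCreateMkStream(ctx, "requests", "wf-starters", "0").Err()

	// Cap workflow starts (e.g. 50/s) so the consumer never overwhelms the cluster.
	limiter := rate.NewLimiter(rate.Limit(50), 50)

	for {
		// Read a batch via the consumer group; ">" means entries not yet
		// delivered to any consumer in this group.
		res, err := rdb.XReadGroup(ctx, &redis.XReadGroupArgs{
			Group:    "wf-starters",
			Consumer: "starter-1",
			Streams:  []string{"requests", ">"},
			Count:    100,
			Block:    5 * time.Second,
		}).Result()
		if err == redis.Nil {
			continue // no new entries within the block timeout
		}
		if err != nil {
			log.Println("XREADGROUP failed:", err)
			continue
		}

		for _, stream := range res {
			for _, msg := range stream.Messages {
				_ = limiter.Wait(ctx) // rate-limit workflow starts

				// Derive the Workflow ID from the stream entry ID so that a
				// crash between start and ack only causes a duplicate start,
				// which Temporal rejects for an already-running Workflow ID.
				_, err := c.ExecuteWorkflow(ctx, client.StartWorkflowOptions{
					ID:        "request-" + msg.ID,
					TaskQueue: "request-processing",
				}, "ProcessRequest", msg.Values)
				if err != nil {
					log.Println("failed to start workflow for", msg.ID, err)
					continue // leave unacked; it stays in the pending entries list
				}
				rdb.XAck(ctx, "requests", "wf-starters", msg.ID)
			}
		}
	}
}
```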
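For the Continue-As-New option, the usual pattern is to have the Workflow process a bounded number of batches and then return a Continue-As-New error before the Event History gets large. A sketch, assuming a hypothetical `ProcessBatch` activity that handles one batch and returns the next cursor (empty when the queue is drained):

```go
package batching

import (
	"time"

	"go.temporal.io/sdk/workflow"
)

// ConsumeQueueWorkflow drains the backlog in bounded batches. After
// maxBatchesPerRun batches it Continues-As-New, which atomically closes this
// Execution and starts a fresh one with an empty Event History, carrying the
// cursor forward.
func ConsumeQueueWorkflow(ctx workflow.Context, cursor string) error {
	ao := workflow.ActivityOptions{StartToCloseTimeout: 10 * time.Minute}
	ctx = workflow.WithActivityOptions(ctx, ao)

	const maxBatchesPerRun = 500 // keep each run well under the Event History limits

	for i := 0; i < maxBatchesPerRun; i++ {
		var next string
		if err := workflow.ExecuteActivity(ctx, "ProcessBatch", cursor).Get(ctx, &next); err != nil {
			return err
		}
		if next == "" {
			return nil // backlog drained; let this Execution complete
		}
		cursor = next
	}

	// Close this run and start a new one with the current cursor.
	return workflow.NewContinueAsNewError(ctx, ConsumeQueueWorkflow, cursor)
}
```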

We’re still in the process of evaluating these solutions and testing their performance, and we’re open to any other suggestions or insights on how to handle this use case. We’re particularly interested in hearing about other people’s experience with using Redis streams in conjunction with Temporal.

Edit: it is a slow queue. The queue fills up and it can then take months before we clear it. We can't really afford to process it faster (we have a few hundred API clients running concurrently at most) and don't need to go faster than this.