Reliable writes to a datastore

Hi folks. I am building a proof-of-concept using Temporal. At $employer, we need to write ~100 messages/s (peak) to a datastore asynchronously with retries. Typically, we would put this into RabbitMQ and have a worker consume it, then hand-roll retries, alerting, tracing, etc.
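For context, this is roughly the plumbing we hand-roll today around the queue consumer (names and numbers are illustrative, not our actual code):

```python
import time

def write_with_retry(write, payload, max_attempts=5, base_delay=0.0):
    # Hand-rolled exponential backoff: the kind of plumbing Temporal's
    # built-in activity retry policies would replace.
    for attempt in range(1, max_attempts + 1):
        try:
            return write(payload)
        except Exception:
            if attempt == max_attempts:
                raise
            time.sleep(base_delay * 2 ** (attempt - 1))

# Example: a write that fails twice with a transient error, then succeeds.
calls = {"n": 0}
def flaky_write(payload):
    calls["n"] += 1
    if calls["n"] < 3:
        raise IOError("transient datastore error")
    return f"stored:{payload}"
```

On top of this we still need alerting and tracing, which is the part we are hoping Temporal absorbs.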

At a high level, some of the options I can think of are:

  1. Start a new workflow for each distinct business entity. Clients signal workflows based on the business entity in each message. (EDITED)
  2. Create a pool of workflows that read from another queue (RabbitMQ). Each workflow reads a message from the queue, reliably writes it to a remote DB, then calls ContinueAsNew to pick up the next message.
  3. A long-running workflow that receives messages as signals and executes write activities. I suspect this needs to ContinueAsNew after a while to keep the history size bounded.
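To make option 3 concrete, here is a plain-Python sketch of the control flow I have in mind (not real Temporal SDK code; the history budget and event accounting are illustrative): a loop drains queued signals, runs the write "activity" for each, and hands the backlog to a fresh run once a history-event budget is spent, mirroring ContinueAsNew.

```python
from collections import deque

def run_writer_workflow(signals, write_activity, history_budget=1000):
    # Drain queued signals; once the simulated history budget is spent,
    # return the remaining backlog for a fresh run (ContinueAsNew).
    pending = deque(signals)
    events = 0
    while pending:
        write_activity(pending.popleft())
        events += 2  # roughly one signal event plus one activity completion
        if events >= history_budget and pending:
            return ("continue_as_new", list(pending))
    return ("completed", [])
```

The real version would also need to handle signals that arrive while draining, but the shape is the same.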

Would love to hear your thoughts.


I think you can avoid RabbitMQ, as Temporal internally has queues and guarantees that signals will be processed in the order they arrive.

Your workflow could be as simple as receiving a signal (signals are queued automatically) and running an activity that writes to the database (with the necessary retry options).

I think the moneybatch and accounttransfer samples could be very close to what you are looking for.

Hi Madhu. Option 1 and 3 are similar to the moneybatch example. The difference between 1 and 3 is that…

With option 1, one would create one workflow execution per account (i.e., the business entity). With option 3, one would create a single workflow execution for all accounts. With option 1, you create more workflow executions, but each execution receives few (or zero) signals. With option 3, you create a single workflow execution across all accounts, which receives around 100 signals per second.
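Option 1 hinges on deriving a deterministic workflow ID from the business entity, so that every signal for the same entity lands on the same execution. A small sketch of that routing (plain Python; the `writer-` naming is hypothetical):

```python
def workflow_id_for(entity_id):
    # Deterministic ID: signaling (or signal-with-start) for the same
    # entity always targets the same execution.
    return f"writer-{entity_id}"

def route(messages):
    # Group incoming (entity_id, payload) messages by target execution.
    buckets = {}
    for entity_id, payload in messages:
        buckets.setdefault(workflow_id_for(entity_id), []).append(payload)
    return buckets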

Having lots of workflows that do nothing seems… weird. Though it’s been said that Temporal is great at handling a large number of concurrent workflows.

Thank you for your advice!

I’m not sure what value Temporal gives you (beyond unlimited exponential retries) over RabbitMQ for a single-activity workflow. Temporal shines when you need additional capabilities besides executing a single activity per workflow.

Have you looked at your business process end to end? What is the component that publishes to the queue/starts workflows? Can it be part of the overall workflow?

That’s good insight. The overall business process is routing SMS messages. I am not entirely familiar with all the components. Based on what I know, there are a few steps:

  • Cleansing, enriching, and transforming the input
  • Choosing an SMS router
  • Calling the router’s delivery API
  • Receiving a possible delivery report
    – Forwarding a copy of the delivery report to the SMS sender
  • Receiving a possible response (e.g., stop sending SMS to this number)
    – Recording the response in the datastore

The entire flow may be modeled in Temporal. Each execution has a relatively short lifetime with few events, but there can be a lot of executions.
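The steps above can be sketched as a single sequential flow (plain Python, not SDK code; every function in `deps` stands in for a hypothetical activity):

```python
def sms_workflow(message, deps):
    # Illustrative end-to-end flow; each deps[...] entry is a stand-in
    # for an activity implementation, not a real API.
    cleaned = deps["cleanse"](message)          # cleanse/enrich/transform
    router = deps["choose_router"](cleaned)     # pick an SMS router
    receipt = deps["deliver"](router, cleaned)  # call the delivery API
    report = deps["await_report"](receipt)      # may be None
    if report is not None:
        deps["forward_report"](report)          # copy to the SMS sender
    return receipt
```

In Temporal each of these would be an activity (or a signal wait, for the delivery report and response), so retries and visibility come for free at every step.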

I would certainly model the whole process as a workflow. This way Temporal would greatly simplify your implementation.

Each execution has a relatively short lifetime with few events, but there can be a lot of executions.

What peak rate of executions per second do you anticipate?