I'm not sure I understand the difference between these worker options:
MaxConcurrentWorkflowTaskPollers and MaxConcurrentWorkflowTaskExecutionSize
I see pollers need to be at least 2, and I wasn't able to disable sticky execution, so are there two pollers, one polling the sticky queue and one the non-sticky queue? But then when I set the concurrent execution size to 1, the worker processed no workflows. I had to set it to 2 as well for it to process any workflows.
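For reference, here is a minimal sketch of where these two options live in the Temporal Go SDK's `worker.Options`. The task queue name and the rest of the setup are placeholders, and the values shown just mirror the ones discussed above:

```go
package main

import (
	"log"

	"go.temporal.io/sdk/client"
	"go.temporal.io/sdk/worker"
)

func main() {
	c, err := client.Dial(client.Options{})
	if err != nil {
		log.Fatal(err)
	}
	defer c.Close()

	w := worker.New(c, "my-task-queue", worker.Options{
		// Number of goroutines polling the server for workflow tasks.
		// With sticky execution enabled (the default), this must be at
		// least 2: both the sticky and the normal task queue need a poller.
		MaxConcurrentWorkflowTaskPollers: 2,
		// Upper bound on workflow tasks this worker executes concurrently.
		MaxConcurrentWorkflowTaskExecutionSize: 2,
	})
	_ = w // register workflows/activities and call w.Run(...) here
}
```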
Does it mean, I cannot ensure I get serial processing of workflows in a given task queue?
I wouldn't touch these configuration options for non-performance-related reasons.
Temporal doesn't support serial processing of workflows, as such processing of long-running workflows would severely limit system throughput.
What are you trying to achieve from the business point of view?
I was wondering about a use case that requires serial processing of updates. Imagine a banking application where someone tries to send money to X first and then to Y. They may not have sufficient balance for both transfers, and sending money to X may be more important than sending money to Y, so it would be odd for the user to see that the transfer to Y succeeded while the transfer to X failed with insufficient funds because they were executed out of order. I understand global ordering with multiple producers can be difficult, but can there at least be ordering or serial processing with a single producer? It seems like no, but I just wanted to get your thoughts.
Such a use case can be solved by relying on the uniqueness of workflows by WorkflowID.
The WorkflowID can be the ID of the sender. In this case each transfer request can be a signal (or SignalWithStart) to the workflow with that ID. The workflow can then process these signals according to your business requirements, for example one by one.
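As a sketch of this pattern in the Go SDK: one long-running "entity" workflow per sender receives transfer requests as signals and handles them strictly in order. `TransferRequest`, `executeTransfer`, and the `"transfer"` signal name are hypothetical names, not Temporal APIs:

```go
package transfers

import (
	"context"
	"time"

	"go.temporal.io/sdk/workflow"
)

// TransferRequest is a hypothetical signal payload.
type TransferRequest struct {
	To     string
	Amount int64
}

// executeTransfer is a hypothetical activity performing the actual transfer.
func executeTransfer(ctx context.Context, req TransferRequest) error { return nil }

// SenderWorkflow runs as one instance per sender (WorkflowID = sender ID)
// and processes transfer signals one by one, in arrival order.
func SenderWorkflow(ctx workflow.Context) error {
	ch := workflow.GetSignalChannel(ctx, "transfer")
	for {
		var req TransferRequest
		ch.Receive(ctx, &req) // blocks until the next signal arrives
		actx := workflow.WithActivityOptions(ctx, workflow.ActivityOptions{
			StartToCloseTimeout: time.Minute,
		})
		if err := workflow.ExecuteActivity(actx, executeTransfer, req).Get(actx, nil); err != nil {
			// Per business rules: fail the workflow, or record and continue.
			workflow.GetLogger(ctx).Error("transfer failed", "error", err)
		}
		// In production, periodically return workflow.NewContinueAsNewError(...)
		// to keep the event history bounded.
	}
}
```

Clients would submit requests with `c.SignalWithStartWorkflow(ctx, senderID, "transfer", req, startOptions, SenderWorkflow)`, which starts the sender's workflow if it isn't running and otherwise just delivers the signal to the existing one.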
Interesting. Thank you for sharing this idea.
Can I get more details on this? For instance, is the workflow running forever in a loop waiting for signals, and can signals carry metadata, i.e. the parameters needed by the workflow's business logic? E.g. transfer N to X or Y.
The workflow waits for signals in a loop; when a signal is received, it executes the business logic (either inline or through a child workflow). This StackOverflow post has a very simplified example.
Thanks for the example. I'm wondering whether this pattern is still possible if the workflow has to return a result from processing a signal. E.g. can a client find out whether the workflow has handled a signal and what the result was? With a regular workflow execution I get a WorkflowRun, which gives me the result. Is there something similar here?
We are working on the synchronous update feature right now. It will be similar to a signal but would support waiting for its result. In the meantime, you can use the synchronous proxy pattern.
I looked at the synchronous proxy pattern. It seems to me that it works when requests and responses alternate, like in the example. But if there are multiple requests at the same time, I'm not sure the corresponding responses will be read: since the channel is a FIFO, whoever reads the channel first may read the first response rather than the response related to their request. Am I understanding this correctly?
The idea is that the proxy workflow still sends signals to the entity workflow that serializes requests.
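To make that concrete, here is one possible sketch of the proxy side in the Go SDK. The payload types, the `"transfer"`/`"response"` signal names, and the `"sender-"` WorkflowID scheme are all assumptions for illustration:

```go
package transfers

import (
	"go.temporal.io/sdk/workflow"
)

// Transfer and ProxiedRequest are hypothetical payload types.
type Transfer struct {
	From   string
	To     string
	Amount int64
}

type ProxiedRequest struct {
	Transfer Transfer
	CallerID string // the proxy's own WorkflowID, used as a reply address
}

// ProxyWorkflow is started once per request. It forwards the request to the
// long-running entity workflow and blocks until that workflow signals the
// result back, so the client simply waits on the proxy's workflow result
// via WorkflowRun.Get.
func ProxyWorkflow(ctx workflow.Context, req Transfer) (string, error) {
	myID := workflow.GetInfo(ctx).WorkflowExecution.ID
	if err := workflow.SignalExternalWorkflow(
		ctx, "sender-"+req.From, "", // entity WorkflowID; empty RunID
		"transfer", ProxiedRequest{Transfer: req, CallerID: myID},
	).Get(ctx, nil); err != nil {
		return "", err
	}
	var result string
	workflow.GetSignalChannel(ctx, "response").Receive(ctx, &result)
	return result, nil
}
```

This also addresses the FIFO concern above: each request gets its own proxy workflow with a unique WorkflowID and therefore its own `"response"` channel, so concurrent requests cannot read each other's responses. The entity workflow replies to whichever `CallerID` came with the request, while still processing the requests themselves one by one.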