It says “Sets the maximum number of concurrent Workflow Task Executions the Worker can have.” On first reading (and when implementing with the Java SDK), we assumed that a configured number of workflows would be executed one by one. That is, if we set this option to 1, workflows would be executed in FIFO order as they are taken from the task queue. In practice, however, workflows just quickly schedule activities and are then replaced by the next workflows, so several workflows are effectively being executed at the same time. While that is great for performance (similar to “green threads”), this behavior is not helpful when the order of workflow execution is important (e.g. batching).
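For reference, this is roughly how we configured the worker (a minimal sketch only; it assumes an already-constructed WorkflowClient, and the task queue name and workflow implementation class are placeholders):

```java
import io.temporal.client.WorkflowClient;
import io.temporal.worker.Worker;
import io.temporal.worker.WorkerFactory;
import io.temporal.worker.WorkerOptions;

public class WorkerSetup {
  // Assumes an already-constructed WorkflowClient; "my-task-queue" is a placeholder.
  public static void startWorker(WorkflowClient client) {
    WorkerFactory factory = WorkerFactory.newInstance(client);

    // Limits how many workflow *tasks* this worker processes in parallel,
    // not how many workflow *executions* run start-to-finish one at a time.
    WorkerOptions options = WorkerOptions.newBuilder()
        .setMaxConcurrentWorkflowTaskExecutionSize(1)
        .build();

    Worker worker = factory.newWorker("my-task-queue", options);
    // worker.registerWorkflowImplementationTypes(MyWorkflowImpl.class);
    factory.start();
  }
}
```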
Also, maybe this is worth mentioning in the JavaDoc for this parameter?
BTW, in my case I have a large input payload that cannot be passed to a single workflow. That’s why I’m splitting it externally into batches and starting a workflow per batch. Having a simple way to process these batches in FIFO order would greatly simplify this task, but it looks like you are right and I will need a more complex implementation to synchronize these workflows.
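To illustrate, this is roughly what the external splitting looks like (a sketch only; `BatchWorkflow`, `Batch`, and `process` are hypothetical names for my own types, not SDK API):

```java
import io.temporal.client.WorkflowClient;
import io.temporal.client.WorkflowOptions;
import java.util.List;

public class BatchStarter {
  // BatchWorkflow and Batch are hypothetical placeholders for the real types.
  public static void startPerBatch(WorkflowClient client, List<Batch> batches) {
    for (int i = 0; i < batches.size(); i++) {
      BatchWorkflow workflow = client.newWorkflowStub(
          BatchWorkflow.class,
          WorkflowOptions.newBuilder()
              .setTaskQueue("batch-task-queue")   // placeholder task queue
              .setWorkflowId("batch-" + i)        // deterministic id per batch
              .build());
      // Starts each workflow asynchronously; the server decides scheduling,
      // so start order does not guarantee FIFO completion order.
      WorkflowClient.start(workflow::process, batches.get(i));
    }
  }
}
```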
I would model processing the payload as a batch workflow. Depending on your requirements, you can choose one of the possible batch workflow implementations from the samples.
After some thinking, it looks like it would be easier to invert the control flow. A simple workflow that pulls the next batches from the service, processes them, and periodically performs CONTINUE_AS_NEW would have FIFO/atomicity/singleton guarantees, according to:
The Continue-As-New feature enables developers to complete the current Workflow Execution and start a new one atomically.
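A minimal sketch of that inverted pattern, assuming a hypothetical `BatchActivities` interface with `fetchBatch`/`processBatch` methods and a `Batch` type (the per-run threshold and timeout are arbitrary):

```java
import io.temporal.activity.ActivityOptions;
import io.temporal.workflow.Workflow;
import io.temporal.workflow.WorkflowInterface;
import io.temporal.workflow.WorkflowMethod;
import java.time.Duration;

@WorkflowInterface
interface BatchProcessingWorkflow {
  @WorkflowMethod
  void processFrom(int nextBatchIndex);
}

class BatchProcessingWorkflowImpl implements BatchProcessingWorkflow {

  // BatchActivities (fetchBatch / processBatch) is a hypothetical activity
  // interface standing in for whatever service calls are actually made.
  private final BatchActivities activities = Workflow.newActivityStub(
      BatchActivities.class,
      ActivityOptions.newBuilder()
          .setStartToCloseTimeout(Duration.ofMinutes(5))
          .build());

  // Limit work per run so the event history stays small before continue-as-new.
  private static final int BATCHES_PER_RUN = 100;

  @Override
  public void processFrom(int nextBatchIndex) {
    for (int i = 0; i < BATCHES_PER_RUN; i++) {
      Batch batch = activities.fetchBatch(nextBatchIndex); // pull the next batch in order
      if (batch == null) {
        return; // nothing left; the workflow completes
      }
      activities.processBatch(batch);
      nextBatchIndex++;
    }
    // Close this run and atomically start a new one from the next index.
    Workflow.continueAsNew(nextBatchIndex);
  }
}
```

Because there is only one such workflow execution pulling batches, they are fetched and processed strictly in order, and the periodic continue-as-new keeps the event history from growing without bound.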