We are doing some performance testing where a main flow calls a subflow (as a child flow). The subflow is executed in a loop; when we increased the loop count from 500 to 600, it started failing.
The workflow is terminated with the message "Workflow history size / count exceeds limit". Screenshot attached.
Current implementation
The payload is sent to the main flow execution.
When the subflow is invoked, the complete payload is sent again to the subflow, since nested subflows are possible.
I guess this is the main reason for the history size increase.
I'm thinking of the solution below. Can you please validate it?
Solution 1
Maintain the complete payload in the main workflow.
Implement a query method in the main flow that takes a subflow name as an argument and returns that subflow's payload, cached in a flow instance variable.
When a child flow requires the subflow payload, it uses the main workflow's query method to fetch it.
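Assuming this is the Temporal Java SDK, the query method would be exposed with a `@QueryMethod` annotation on the main workflow interface (and note that a child workflow cannot query its parent directly from workflow code; the query would go through an activity holding a `WorkflowClient`). SDK specifics aside, the caching contract can be sketched in plain Java; `MainFlow` and the payload shape are hypothetical names for illustration:

```java
import java.util.HashMap;
import java.util.Map;

// Hypothetical sketch: the main flow keeps the full payload and hands out
// only the named slice a subflow asks for, instead of forwarding everything.
class MainFlow {
    // Full payload, keyed by subflow name, kept only in the main flow.
    private final Map<String, String> payloadBySubflow = new HashMap<>();

    MainFlow(Map<String, String> fullPayload) {
        payloadBySubflow.putAll(fullPayload);
    }

    // In Temporal this would be a @QueryMethod on the workflow interface.
    String getSubflowPayload(String subflowName) {
        return payloadBySubflow.get(subflowName);
    }
}
```

The point is that only the small subflow name crosses the workflow boundary; the bulk of the payload stays cached in one place.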
We have a case where subflow1 can be executed in parallel (using child flows), with the work distributed across, say, 10 child flows, each handling 10 iterations. I am using a Promise to complete the child flows as below.
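In the Temporal Java SDK this fan-out would typically use `Async.function(child::run, ...)` plus `Promise.allOf(...)` over child workflow stubs (an assumption, since the actual code isn't shown). This plain-Java stand-in just shows the shape, with `CompletableFuture` playing the role of Temporal's `Promise`:

```java
import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.CompletableFuture;

// Stand-in for the described fan-out: 10 "child flows", each handling
// 10 iterations, all awaited before the parent continues.
class FanOut {
    // One child flow's share of the work; returns how many iterations it ran.
    static int runChild(int start, int count) {
        int done = 0;
        for (int i = start; i < start + count; i++) {
            done++; // per-iteration work would go here
        }
        return done;
    }

    static int runAll(int children, int perChild) {
        List<CompletableFuture<Integer>> futures = new ArrayList<>();
        for (int c = 0; c < children; c++) {
            final int start = c * perChild;
            futures.add(CompletableFuture.supplyAsync(() -> runChild(start, perChild)));
        }
        // Wait for every child to finish, like Promise.allOf(...).get().
        return futures.stream().mapToInt(CompletableFuture::join).sum();
    }
}
```

Note that each child workflow start and completion adds events to the parent's history, so this fan-out pattern also contributes to the history growth described above.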
If the payload is large, we recommend storing it in an external store (like S3) and passing only references to it through workflow and activity arguments.
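The reference-passing idea can be sketched as follows; an in-memory map stands in for the external store (S3 in the recommendation above), and the class and method names are illustrative, not a real API. Workflow and activity arguments would then carry only the small key, never the 2 MB payload:

```java
import java.util.HashMap;
import java.util.Map;
import java.util.UUID;

// Toy stand-in for an external blob store: upload once, pass the small
// reference through workflow/activity arguments, fetch on demand.
class PayloadStore {
    private final Map<String, byte[]> blobs = new HashMap<>();

    // Upload the payload (in practice, from an activity) and return a ref.
    String put(byte[] payload) {
        String ref = UUID.randomUUID().toString();
        blobs.put(ref, payload);
        return ref;
    }

    // Child flows resolve the ref only when they actually need the bytes.
    byte[] get(String ref) {
        return blobs.get(ref);
    }
}
```

Since only the reference string appears in workflow arguments, the event history records a few dozen bytes per child workflow call instead of the full payload.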
One question I have: will this also work if a child flow crashes?
I'm not sure I understand the question correctly. What do you mean by "child flow crashes"? If the worker process that hosts the child workflow crashes, recovery will be seamless and nothing needs to be done. If the child workflow throws an exception, the parent workflow will need to handle its failure.
Thanks @maxim. The payload is not more than 2 MB, but as we are passing it to child flows that execute in a loop, the history size keeps increasing. Just to confirm my question: is it better to pass only a reference to the child flows, and have them get the subflow/childflow payload from the parent flow via a query method?