TS SDK signals and continueAsNew

Hi, a question around signals (using the TS SDK). If I receive a ton of signals on a workflow that each need to be further handled by one or two activities, what's the best practice for using continueAsNew without losing any signals?

I’m using the documented way to listen to signals:
import { setHandler } from '@temporalio/workflow';
// dispatchSignal, Args, and DispatchPayload are defined elsewhere in my code

export async function signals(args: Args) {
  const { taskId, integrationId, connection, teamId } = args;

  const pendingDispatches: DispatchPayload[] = [];
  setHandler(dispatchSignal, (dispatch) => {
    pendingDispatches.push(dispatch);
  });
  // ... iterate on pending dispatches and continueAsNew if too many loops
}

Let's say signals are coming in faster than they can be processed (e.g. 10,000 arrive while the current run is still working through its backlog). Would I need to pass the items to the next run? I'm pretty sure the argument size would be too high. Can I somehow stop listening to signals and pick them up again on the next run?
Does the application need to be remodeled? (I'm sure there are ways; I'm just curious whether there's a more direct solution.)

The application has to be remodeled. Temporal guarantees that signals are not lost, but too high a rate of signals to a single workflow might prevent that workflow from ever completing. This particular issue will be fixed at some point.

+1 to what Maxim wrote.
If the signal rate were lower, your suggested approach would work.
See the sample in our docs, Workflows in TypeScript | Temporal Documentation, which demonstrates this pattern.
That example could be extended to carry signals to the next run when calling continueAsNew.
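Sketched out, that extension might look roughly like this (not production code; it assumes your Args type gains an optional carriedDispatches field, and processDispatch stands in for one of your activities):

import { setHandler, condition, continueAsNew, proxyActivities } from '@temporalio/workflow';
import type * as activities from './activities';
// dispatchSignal, Args, DispatchPayload as in your snippet above

const { processDispatch } = proxyActivities<typeof activities>({
  startToCloseTimeout: '1 minute',
});

export async function signals(args: Args) {
  // Seed the buffer with any signals carried over from the previous run
  const pendingDispatches: DispatchPayload[] = args.carriedDispatches ?? [];
  setHandler(dispatchSignal, (dispatch) => {
    pendingDispatches.push(dispatch);
  });

  for (let processed = 0; processed < 1000; processed++) {
    // Blocks until at least one signal is buffered
    await condition(() => pendingDispatches.length > 0);
    await processDispatch(pendingDispatches.shift()!);
  }

  // Roll over, handing any still-buffered signals to the next run
  await continueAsNew<typeof signals>({ ...args, carriedDispatches: pendingDispatches });
}

Note Maxim's caveat still applies: this only works while the carried buffer stays well within payload size limits, which a sustained high signal rate will eventually exceed.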

Thanks for your responses. I'll remodel the application to create batches with a maximum size of around 1,000, each spawning its own workflow that handles only that batch's signals.
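Roughly what I have in mind (just a sketch; batchWorkflow and the workflow ID scheme are placeholders):

import { setHandler, condition, startChild, continueAsNew } from '@temporalio/workflow';

const BATCH_SIZE = 1000;
const BATCHES_PER_RUN = 100;

export async function batcher(args: Args) {
  const buffer: DispatchPayload[] = [];
  setHandler(dispatchSignal, (dispatch) => {
    buffer.push(dispatch);
  });

  const results: Promise<unknown>[] = [];
  for (let batchNo = 0; batchNo < BATCHES_PER_RUN; batchNo++) {
    // Wait until a full batch has accumulated (a timeout could flush partial batches)
    await condition(() => buffer.length >= BATCH_SIZE);
    const batch = buffer.splice(0, BATCH_SIZE);
    // Each batch is handled by its own child workflow
    const child = await startChild(batchWorkflow, {
      workflowId: `batch-${args.taskId}-${batchNo}`,
      args: [batch],
    });
    results.push(child.result());
  }
  // Let in-flight children finish, then keep this run's history bounded;
  // anything left in the buffer would need the same carry-over treatment as above
  await Promise.all(results);
  await continueAsNew<typeof batcher>(args);
}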

What is your use case? If we understand what you are trying to achieve, we might be able to give a better recommendation.

ELT/data pipeline: a root workflow (datapipeline) with one child workflow/activity (extract) and another child (loader), where the extract workflow signals the loader workflow with new records to be processed.
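Wired up roughly like this (a sketch; the names and ID scheme are illustrative):

import { startChild } from '@temporalio/workflow';

export async function datapipeline(args: Args) {
  // Start the loader first so the extractor knows which workflow to signal
  const loader = await startChild(loaderWorkflow, {
    workflowId: `loader-${args.taskId}`,
    args: [args],
  });
  const extractor = await startChild(extractWorkflow, {
    workflowId: `extract-${args.taskId}`,
    args: [{ ...args, loaderWorkflowId: loader.workflowId }],
  });
  await Promise.all([extractor.result(), loader.result()]);
}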

We moved to persisting data from the extract workflow/activity directly to our DB and only signaling batches of items to be further processed, @maxim.
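Concretely, something along these lines (a sketch; fetchPage, persistRecords, and batchReadySignal are stand-ins for our real activities and signal):

import { proxyActivities, getExternalWorkflowHandle, defineSignal } from '@temporalio/workflow';
import type * as activities from './activities';

const { fetchPage, persistRecords } = proxyActivities<typeof activities>({
  startToCloseTimeout: '5 minutes',
});

export const batchReadySignal = defineSignal<[{ recordIds: string[] }]>('batchReady');

export async function extractWorkflow(args: ExtractArgs) {
  const loader = getExternalWorkflowHandle(args.loaderWorkflowId);
  for (let page = 0; ; page++) {
    const records = await fetchPage(args, page);
    if (records.length === 0) break;
    // The activity writes the raw records to our DB and returns only their IDs
    const recordIds = await persistRecords(records);
    // Signal the loader with a small batch of references, not the records themselves
    await loader.signal(batchReadySignal, { recordIds });
  }
}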
