What is the best way to exchange large amounts of data between Activities without running into "Complete result exceeds size limit" error?

What is happening under the hood in Temporal, such that I hit the 2MB BlobSizeLimit when passing a large object into, or returning one from, workflow.ExecuteActivity()?

Will the workflow be blocked, or will the activity fail with a non-retryable error?

And secondly, what would you recommend I do to resolve this error, given that I need to pass relatively large amounts of data (~2MB to 20MB when serialized to JSON) between different Temporal Activities?

There are two approaches:

  1. Store large objects in some external blob store like S3 and pass references as inputs and outputs of activities.
  2. Cache large objects in process memory or on local disk. Use the Go SDK session feature to route activities to the host that holds the cache. In this case the workflow has to account for the host going down, in which case the whole sequence must be redone on another host. See the fileprocessing sample, which demonstrates this pattern.
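The first approach is sometimes called the claim-check pattern: activities exchange a small reference key while the payload itself lives in external storage, so only the key is recorded in workflow history. A minimal sketch of the idea, with a hypothetical in-memory `BlobStore` standing in for S3 and plain functions standing in for the activity bodies (in real code you would use the AWS SDK and register these as Temporal activities):

```go
package main

import (
	"fmt"
	"sync"
)

// BlobStore stands in for an external store such as S3.
// It is an in-memory map here so the sketch runs on its own.
type BlobStore struct {
	mu    sync.Mutex
	blobs map[string][]byte
	next  int
}

func NewBlobStore() *BlobStore {
	return &BlobStore{blobs: map[string][]byte{}}
}

// Put stores the payload and returns a small reference key.
func (s *BlobStore) Put(data []byte) string {
	s.mu.Lock()
	defer s.mu.Unlock()
	s.next++
	key := fmt.Sprintf("blob-%d", s.next)
	s.blobs[key] = data
	return key
}

// Get resolves a reference key back to the payload.
func (s *BlobStore) Get(key string) []byte {
	s.mu.Lock()
	defer s.mu.Unlock()
	return s.blobs[key]
}

// ProduceActivity plays the role of a Temporal activity that
// writes its large result to the blob store and returns only
// the key, keeping the recorded result well under 2MB.
func ProduceActivity(store *BlobStore, large []byte) (string, error) {
	return store.Put(large), nil
}

// ConsumeActivity receives the key as its input and fetches
// the real payload from the store itself.
func ConsumeActivity(store *BlobStore, key string) (int, error) {
	return len(store.Get(key)), nil
}

func main() {
	store := NewBlobStore()
	payload := make([]byte, 5<<20) // ~5MB, over the 2MB limit
	key, err := ProduceActivity(store, payload)
	if err != nil {
		panic(err)
	}
	n, err := ConsumeActivity(store, key)
	if err != nil {
		panic(err)
	}
	// Only the short key crossed the activity boundary.
	fmt.Println(key, n)
}
```

The trade-off is that the workflow history no longer contains the data itself, so you become responsible for the lifecycle of the blobs (cleanup, retention, access from replays).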