Insight on Complete result exceeds size limit

Hi all,

I’m trying to manage my workflow with some activities that return about 1MB of data.
When the data is larger than 1MB, I receive this "Complete result exceeds size limit" error, but if I remove some data from my result so that it is smaller than 1MB, everything works fine.
I read about "Find cause of 'Complete result exceeds size limit' error" and found the BlobSizeLimit, which is set to 2MB, but I can’t find how to configure it.
Why do I get the error when the data is around 1MB? Is there a particular reason for this?
2MB is a perfect fit for all my scenarios.

Thx a lot.

You can configure it in your dynamic config via “limit.blobSize.error”. The value is an int (in bytes) with a default value of 2097152 (2MB) - temporal/constants.go at master · temporalio/temporal · GitHub
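For reference, a minimal sketch of what the entries could look like in a file-based dynamic config YAML (the key names come from the constants above; the 4MB/1MB values here are just illustrative, not recommendations):

```yaml
# dynamicconfig/development.yaml (example path)
limit.blobSize.error:
  - value: 4194304   # hard limit in bytes (4MB); requests above this are rejected
    constraints: {}
limit.blobSize.warn:
  - value: 1048576   # warning threshold in bytes (1MB); logs a warning only
    constraints: {}
```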

If the actual input argument size exceeds the warning size (limit.blobSize.warn, set to 1/2 MB by default) you should get the warning message “Blob size exceeds limit”. This message is confusing and I will open a PR to change it into something better. If it exceeds the max limit, you should get the invalid argument error “Blob data size exceeds limit.”. Which one are you getting?

Thx @tihomir for the answer.
Now it’s clear how to configure it.
Let me explain the use case that hit this error.
I have a lot of sequential activities within my workflow. One of them returns a 1.06MB DTO, and I’m working with the base configuration. This is the error I got, but it’s quite strange because the limit is 2MB, correct? Then I tried removing some entries to get a 0.99MB DTO, and everything worked fine.
Can you explain why I hit this error with the base configuration?
Thx a lot!

Yes, the default is set to 2MB. Would you mind opening an issue here: Issues · temporalio/temporal · GitHub
and describing the issue so the server team can take a look? Make sure you include the server version in the issue. Thanks.

Also, at what point are you measuring the size of your input? Are you measuring it before it is serialized and sent to the server? If so, that would not be an accurate measurement. The server checks the size limit of the serialized inputs.
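To make the distinction concrete, here is a small sketch (assuming JSON serialization, which is what Temporal's default payload converter uses for plain objects; the DTO is illustrative) showing that the size the server checks is the serialized byte length, not the in-memory size:

```python
import json

def serialized_size_bytes(result) -> int:
    """Size of the JSON-serialized payload in bytes -- this is the number
    the server compares against limit.blobSize.error, not len() of your
    in-memory object."""
    return len(json.dumps(result).encode("utf-8"))

# Non-ASCII characters are escaped during JSON serialization, so the
# wire size can be noticeably larger than the in-memory string length.
dto = {"entries": ["é" * 100]}
print(len(dto["entries"][0]), "characters in memory")
print(serialized_size_bytes(dto), "bytes on the wire")
```

A DTO that looks like "about 1MB" when you count characters or rows can therefore serialize to more than that, which may explain seeing the error near the boundary.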

Large payloads negatively affect service performance, so I would recommend storing your result in a blob store like S3 and passing the reference as the activity result.
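The pass-a-reference pattern can be sketched like this (a minimal illustration: `InMemoryBlobStore` stands in for a real store such as S3, and the function names are hypothetical, not part of any Temporal SDK):

```python
import json
import uuid

BLOB_SIZE_LIMIT = 2 * 1024 * 1024  # server default for limit.blobSize.error (2MB)

class InMemoryBlobStore:
    """Stand-in for an external blob store such as S3."""
    def __init__(self):
        self._blobs = {}

    def put(self, data: bytes) -> str:
        key = str(uuid.uuid4())
        self._blobs[key] = data
        return key

    def get(self, key: str) -> bytes:
        return self._blobs[key]

store = InMemoryBlobStore()

def wrap_activity_result(result: dict) -> dict:
    """Return the result inline when it is small; otherwise offload the
    payload to the blob store and return only a small reference."""
    data = json.dumps(result).encode("utf-8")
    if len(data) <= BLOB_SIZE_LIMIT:
        return {"inline": result}
    return {"blob_ref": store.put(data)}

def resolve_activity_result(payload: dict) -> dict:
    """In the next activity, rehydrate the real result from the reference."""
    if "inline" in payload:
        return payload["inline"]
    return json.loads(store.get(payload["blob_ref"]).decode("utf-8"))
```

With this approach only the small reference payload passes through the server, so the blob size limits never apply to the large result itself.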