Continue-as-new overhead, suitability of using Temporal

Hi all,

We’re considering using Temporal for a use case with 500k–1M workflows, each running a polling loop at a variable rate ranging from once every few minutes to once every few seconds — roughly 2 polls per workflow per minute on average. On each iteration, a workflow would run a single activity that hits a gRPC endpoint and returns a small amount of data (~300 bytes). The relevant “state” of the workflow would be determined entirely by the latest activity response.

Would it be more efficient to have such workflows continue-as-new on every iteration (passing the latest activity result as input to the new run), or should we let them run for many iterations before continuing-as-new?

Overall, how well-suited is Temporal to this usage pattern? We’re debating between Temporal and Orleans. Orleans seems like it might put less load on the underlying nodes and database, but we’re a small team with zero .NET experience and a good amount of Go experience, so we have a strong preference for Temporal as long as scaling it won’t be significantly harder than scaling Orleans.


There is no need to use continue-as-new for simple polling. The standard pattern is to implement the polling inside a single activity and let the server-side retry policy act as the poll timer: the activity returns an error whenever the polled resource isn’t ready, and Temporal re-invokes it at the retry interval. Because activity retries are not recorded as individual events, the workflow history stays tiny regardless of how long the polling runs, so continue-as-new is rarely needed at all:
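A minimal Go sketch of that pattern, assuming the Temporal Go SDK (`go.temporal.io/sdk`); the function names, timeouts, and the `callGRPCEndpoint` helper are illustrative placeholders, not anything from the original post:

```go
package app

import (
	"context"
	"errors"
	"time"

	"go.temporal.io/sdk/temporal"
	"go.temporal.io/sdk/workflow"
)

// callGRPCEndpoint is a hypothetical stub standing in for the real gRPC call.
// It returns the polled payload and whether the result is ready yet.
func callGRPCEndpoint(ctx context.Context) (string, bool, error) {
	return "", false, nil // replace with the actual client call
}

// PollActivity performs a single poll attempt. Returning an error when the
// resource isn't ready triggers a server-side retry, i.e. the next poll.
func PollActivity(ctx context.Context) (string, error) {
	result, ready, err := callGRPCEndpoint(ctx)
	if err != nil {
		return "", err
	}
	if !ready {
		return "", errors.New("not ready yet")
	}
	return result, nil
}

// PollingWorkflow runs the "loop" as one activity whose retry policy is the
// poll timer: BackoffCoefficient of 1 keeps the interval constant.
func PollingWorkflow(ctx workflow.Context) (string, error) {
	ao := workflow.ActivityOptions{
		// Timeout for a single poll attempt, not the whole polling loop.
		StartToCloseTimeout: 10 * time.Second,
		RetryPolicy: &temporal.RetryPolicy{
			InitialInterval:    30 * time.Second, // the poll interval
			BackoffCoefficient: 1.0,              // no exponential backoff
		},
	}
	ctx = workflow.WithActivityOptions(ctx, ao)

	var result string
	err := workflow.ExecuteActivity(ctx, PollActivity).Get(ctx, &result)
	return result, err
}
```

With this shape the workflow history holds only a handful of events (one `ActivityTaskScheduled`/`Completed` pair per successful poll outcome, not per attempt), so history growth — the usual reason to continue-as-new — mostly disappears for the polling itself.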