Sentry/Publish event to Kafka in WorkflowInboundInterceptor

Hi!

I want to achieve the following:

  • Whenever a workflow fails/succeeds, I want to send an event to Kafka
  • Whenever a workflow fails, I want to log to Sentry
  • In the future, I would want to extend this to other “interesting” points in the workflow execution

I see that the workflow inbound interceptor needs to be deterministic; however, I am unclear whether to use IsReplaying or to model this as an activity (which seems like overkill).

In general, is IsReplaying positionally aware with respect to the call site in the workflow? Or will it be true throughout, even if a path within the workflow is being taken for the first time after a partial replay?

Please assist with the best pattern for achieving this. I see a lot of different opinions on this forum on this topic, so I wanted to know the best pattern for the current SDK.

Thanks!

It is OK to use IsReplaying to filter out duplicates. There are a few things to watch for when doing this:

  • You have to ignore all errors and continue when publishing is not possible. Another option is to panic on a publishing error; this fails the current workflow task. The task is retried, but this is not recommended for long outages.
  • If publishing takes longer than one second, it will trip the deadlock detector. So you have to either ensure lower latency or publish asynchronously from a separate goroutine.

Two questions:

Would using an activity instead yield the same result?

Also, when you say goroutine, do you mean workflow.Go or vanilla “go”? I assume the latter, since it’s guarded by IsReplaying, so determinism shouldn’t matter.

Correct?

When using an activity, it is retried according to its own retry options, and the workflow can decide whether to continue or wait for the activity to complete.

Vanilla “go”.

Thanks! That worked well

One last question if you are still around

If I use an activity in my interceptor, I don’t need to worry about IsReplaying etc., right?

Yes, the interceptor executes with the same semantics as the rest of the workflow code.
