Reproducing Broker-Like Decoupled Asynchronous Messaging

We currently have a fairly typical architecture in which microservices communicate asynchronously via RabbitMQ exchanges, and we have the usual difficulties with maintenance, scalability, registries, and so on.

But one advantage we have is that publishers are decoupled from consumers, which can come and go as they please with publishers none the wiser. Meanwhile, from my initial reading of the documentation on the state machines managed by workflows, that kind of decoupling doesn’t seem possible, because you can’t manage state for a consumer that may not exist yet.

I am just curious what the community thinks about this, as I am sure my mental model needs shifting; I need to unlearn what I have learned, as Yoda put it. How do I provide data to independent consumers that will all do different things, including consumers that come online later to do something new with the data, without changing my workflow code?

Thanks.

There are many valid cases when one-way notifications with decoupled consumers are needed. Temporal doesn’t yet provide any direct support for delivering them.

In my experience, the vast majority of the time such one-way notifications are abused to implement business-level flows through choreography. In that case, I believe orchestration is a much better fit, and Temporal is the best technology for implementing such workflows.

I recommend rethinking your architecture from the use-case point of view. Are your services coordinating to achieve some common goal? Then orchestration is a better approach. Do they consume events completely independently of each other? Then publish/subscribe can be a better option.
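
If it turns out you need both, one common workaround (Temporal has no built-in pub/sub delivery) is to orchestrate the business flow as a workflow and have a final activity publish a one-way notification to your existing broker for any truly independent consumers. Here is a rough sketch using the Go SDK; the Order type, the activity names, and the broker call are placeholders for this example, not anything Temporal provides:

package sample

import (
	"context"
	"time"

	"go.temporal.io/sdk/workflow"
)

// Order is a hypothetical payload type for this example.
type Order struct {
	ID string
}

// OrderWorkflow orchestrates the business flow as a sequence of activities,
// then emits a one-way notification for any independent consumers.
func OrderWorkflow(ctx workflow.Context, order Order) error {
	ctx = workflow.WithActivityOptions(ctx, workflow.ActivityOptions{
		StartToCloseTimeout: time.Minute,
	})

	if err := workflow.ExecuteActivity(ctx, ReserveInventory, order).Get(ctx, nil); err != nil {
		return err
	}
	if err := workflow.ExecuteActivity(ctx, ChargePayment, order).Get(ctx, nil); err != nil {
		return err
	}
	// Decoupled consumers subscribe to the broker, not to this workflow.
	return workflow.ExecuteActivity(ctx, PublishOrderCompleted, order).Get(ctx, nil)
}

// ReserveInventory and ChargePayment stand in for real business activities.
func ReserveInventory(ctx context.Context, order Order) error { return nil }
func ChargePayment(ctx context.Context, order Order) error    { return nil }

// PublishOrderCompleted would publish to the existing RabbitMQ exchange;
// replace the body with your broker client's publish call.
func PublishOrderCompleted(ctx context.Context, order Order) error { return nil }

This way the workflow owns the business-level flow, while consumers that may not exist yet still get their notifications through the broker.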

Thank you for this information. At first blush, I think there is room for both in my situation. I really like the solution provided by Temporal, and I will think more about where I can apply it.


@maxim May I ask if this is still not supported? I have a very similar use case: I need to implement a streaming data indexing pipeline that fetches new user reviews in real time and processes each review in multiple steps, persisting the review to a different store at each step, e.g. step 1 persists to a primary document DB, step 2 persists to an Elasticsearch index, and step 3 persists to a vector index. It should be possible for other product teams to build additional streaming pipelines that retrieve processed reviews at different steps for specific use cases. With solutions like Kafka this is straightforward: each step just emits a message to a topic and lets other future pipelines subscribe to that topic. How do we do that in Temporal?

With Temporal, it’s even more straightforward. You write a workflow that executes these activities (step1, step2, step3) in sequence. Something like:

ExecuteActivity(step1)
ExecuteActivity(step2)
ExecuteActivity(step3)

And you get perfect visibility into what’s going on with each workflow execution.
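
Fleshed out a bit, a sketch in the Go SDK could look like the following; the Review type and the activity names (IndexToPrimaryDB, IndexToElasticsearch, IndexToVectorStore) are placeholders for your own steps:

package reviews

import (
	"context"
	"time"

	"go.temporal.io/sdk/workflow"
)

// Review is a placeholder for your review payload.
type Review struct {
	ID   string
	Text string
}

// IndexReviewWorkflow persists one review to each store in sequence;
// every step is an activity with its own retries and timeouts.
func IndexReviewWorkflow(ctx workflow.Context, review Review) error {
	ctx = workflow.WithActivityOptions(ctx, workflow.ActivityOptions{
		StartToCloseTimeout: time.Minute,
	})

	// step 1: primary document DB
	if err := workflow.ExecuteActivity(ctx, IndexToPrimaryDB, review).Get(ctx, nil); err != nil {
		return err
	}
	// step 2: Elasticsearch index
	if err := workflow.ExecuteActivity(ctx, IndexToElasticsearch, review).Get(ctx, nil); err != nil {
		return err
	}
	// step 3: vector index
	return workflow.ExecuteActivity(ctx, IndexToVectorStore, review).Get(ctx, nil)
}

// The activities below are stubs; fill in the real persistence calls.
func IndexToPrimaryDB(ctx context.Context, r Review) error     { return nil }
func IndexToElasticsearch(ctx context.Context, r Review) error { return nil }
func IndexToVectorStore(ctx context.Context, r Review) error   { return nil }

Each review would get its own workflow execution, started by whatever process fetches the reviews, so the progress, failures, and retries of every step are visible per review.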