How will Temporal store the workflow status if we read the workflow definition from JSON/YAML in code?

Hi Team,

I am actively exploring Temporal and have some questions about reading the workflow definition from JSON/YAML in code. I am pretty new to Temporal, so please correct me if this is not the right direction.

Currently we are investigating whether we can store the workflow definition as JSON/YAML configuration and translate it to code.

Let’s assume the workflow definition looks like this (it is a very rough draft; please treat it as a general idea). It has an entry node id (root_node_id), and each node's next_node_ids points to the next steps to execute.

{
  "id": 1,
  "root_node_id": 1,
  "nodes": {
    "1": {
      "next_node_ids": [2,3],
      "params": []
    },
    "2": {
      "next_node_ids": [],
      "params": []
    },
    "3": {
      "next_node_ids": [],
      "params": []
    }
  }
}

If we want to read this kind of workflow definition in code, the @WorkflowMethod will have a "while" loop that keeps iterating until there are no next_node_ids left. The pseudocode looks like this:

@Override
public void execute() {
  // construct workflowSchemas from the json file and get the entry node id
  List<Long> nodeIds = Collections.singletonList(workflowSchemas.getRootNodeId());

  while (!nodeIds.isEmpty()) {
    // collect the ids of the nodes to process in the next iteration
    // (must be a mutable list, Arrays.asList() would throw on addAll)
    List<Long> todoIds = new ArrayList<>();
    nodeIds.forEach(
        id -> {
          // get the current node
          Node node = workflowSchemas.getNodes().get(id.toString());
          // process this node
          ... ...

          // find the next node ids and add them to todoIds
          todoIds.addAll(node.getNextNodeIds());
        });
    // continue the while loop with the next set of node ids
    nodeIds = todoIds;
  }
}
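
For reference, here is a minimal sketch of the POJO shape the pseudocode above assumes, e.g. deserialized from the JSON with Jackson (the WorkflowSchema and Node names are just my own illustration, not anything Temporal-specific):

import java.util.List;
import java.util.Map;

// Hypothetical POJOs mirroring the JSON definition above.
public class WorkflowSchema {
  private long id;
  private long rootNodeId;
  private Map<String, Node> nodes;

  public long getRootNodeId() { return rootNodeId; }
  public Map<String, Node> getNodes() { return nodes; }

  public static class Node {
    private List<Long> nextNodeIds;
    private List<String> params;

    public List<Long> getNextNodeIds() { return nextNodeIds; }
    public List<String> getParams() { return params; }
  }
}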

With that, I have some questions:

(1) If we have multiple workflow definitions (multiple JSON files), can we use a generic workflow to process all of them?
(2) With the "while" loop, how does Temporal store the workflow status? Does it still work the same way as a normal workflow written as code?

Let me know if there is a better way to handle dynamic workflow definitions, and also let me know if any more clarification is needed for this question. Thanks!

Hi @Maggie, I think there are different ways you can solve this; here is just my personal opinion:

a) If you are using the Java SDK, you can take advantage of the DynamicWorkflow and DynamicActivity interfaces. Here is also a sample that uses them. This can be useful if you decide to create a single workflow that handles multiple JSON definitions, especially since it can register dynamic signal and query handlers that could themselves be defined inside your JSON/YAML definitions (see the sketch after this list).

b) You can write the DSL (JSON) parsing as a library.

c) I assume each of your “nodes” would be some work that has to be done inside an activity/child workflow. In that case it would be useful to move some of the activity options (like timeouts / task queue) into your JSON definitions as well, so you can define them independently for each activity if needed (also covered in the sketch below).
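
To make (a) and (c) a bit more concrete, here is a rough, non-authoritative sketch of a single DynamicWorkflow interpreter that reads per-node activity options from the definition. It assumes the WorkflowSchema/Node POJOs from your question plus hypothetical per-node fields (getActivityType, getStartToCloseTimeoutSeconds, getTaskQueue) that would come from your JSON, not from the Temporal SDK:

import io.temporal.activity.ActivityOptions;
import io.temporal.common.converter.EncodedValues;
import io.temporal.workflow.ActivityStub;
import io.temporal.workflow.DynamicSignalHandler;
import io.temporal.workflow.DynamicWorkflow;
import io.temporal.workflow.Workflow;
import java.time.Duration;
import java.util.ArrayList;
import java.util.Collections;
import java.util.List;

// Sketch only: getActivityType(), getStartToCloseTimeoutSeconds() and getTaskQueue()
// are assumed fields of your Node definition, not Temporal APIs.
public class DslInterpreterWorkflow implements DynamicWorkflow {

  @Override
  public Object execute(EncodedValues args) {
    // The parsed definition arrives as the workflow input, so it is recorded
    // in the workflow history for this execution.
    WorkflowSchema schema = args.get(0, WorkflowSchema.class);

    // Dynamic signal handler, e.g. for signals declared in the definition.
    Workflow.registerListener(
        new DynamicSignalHandler() {
          @Override
          public void handle(String signalName, EncodedValues encodedArgs) {
            // route by signalName based on the definition
          }
        });

    List<Long> nodeIds = Collections.singletonList(schema.getRootNodeId());
    while (!nodeIds.isEmpty()) {
      List<Long> todoIds = new ArrayList<>();
      for (Long id : nodeIds) {
        WorkflowSchema.Node node = schema.getNodes().get(id.toString());

        // Per-node activity options taken from the JSON definition (point (c)).
        ActivityOptions options =
            ActivityOptions.newBuilder()
                .setStartToCloseTimeout(
                    Duration.ofSeconds(node.getStartToCloseTimeoutSeconds()))
                .setTaskQueue(node.getTaskQueue())
                .build();

        // Invoke the activity by the type name stored in the node (untyped stub).
        ActivityStub stub = Workflow.newUntypedActivityStub(options);
        stub.execute(node.getActivityType(), Void.class, node.getParams());

        todoIds.addAll(node.getNextNodeIds());
      }
      nodeIds = todoIds;
    }
    return null;
  }
}

On the worker side a single registration, e.g. worker.registerWorkflowImplementationTypes(DslInterpreterWorkflow.class), lets this one implementation handle every workflow type that has no explicitly registered class, so one interpreter can serve all of your definitions.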

To answer your questions: for (1), yes you could. You can also opt for generating the base workflow from your JSON definition (code generation), or you can have a generic workflow defined that takes your JSON definition as an input parameter to its workflow method; it's up to you. For (2), that should be fine; workflow execution ends when the workflow method completes, so you can have looping structures in it.
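
For the "generic workflow that takes your JSON definition as input" option, a minimal sketch could look like this (the interface, task queue, and workflow id names are made up for illustration; client is assumed to be a WorkflowClient and schemaParsedFromJson the deserialized definition):

import io.temporal.client.WorkflowClient;
import io.temporal.client.WorkflowOptions;
import io.temporal.workflow.WorkflowInterface;
import io.temporal.workflow.WorkflowMethod;

// One regular workflow type whose only input is the parsed definition.
@WorkflowInterface
public interface DslWorkflow {
  @WorkflowMethod
  void execute(WorkflowSchema definition);
}

// Client side: start the same workflow type with a different definition per JSON file.
DslWorkflow wf =
    client.newWorkflowStub(
        DslWorkflow.class,
        WorkflowOptions.newBuilder()
            .setTaskQueue("dsl-task-queue")   // illustrative task queue name
            .setWorkflowId("definition-1")    // illustrative workflow id
            .build());
wf.execute(schemaParsedFromJson);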

We also have a Go SDK sample that uses a DSL, and we are currently working on adding one for the Java SDK as well.

Hope this helps.

Thanks @tihomir! This is helpful! Some follow-up questions:

(1) You mentioned “you can also opt for generating the base workflow from your JSON definition (code generation)”. Do you mean we can explicitly convert the JSON definition to code? Is there any example I can take a look at?

(2) Glad the loop structures work. I'm wondering if we could instead recursively call execute() from my question description to do this loop?

Also, really looking forward to a Java DSL example! Thanks again!

I would avoid the code generation approach. Writing a single interpreter workflow for your JSON is the way to go.

Also, consider that a single node in your JSON can be implemented as an object with unlimited complexity. For example, it can call multiple activities and child workflows, and wait on various external events in the form of signals.

Got it. @maxim Can you elaborate more on “consider that a single node in your JSON can be implemented as an object with unlimited complexity”? Are you suggesting we can generalize a single node in the JSON to be any entity in the workflow (an activity, including waits, or a child workflow)?

Yes, exactly. One naive mistake is to always map your JSON node to a single activity/child workflow. But in many cases, your users care about higher-level steps than their actual implementation. For example, some DSL can have a “deploy ML model” step which internally can be modeled as dozens of activities and child workflows.
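
As a sketch of what such a higher-level step could look like inside the interpreter (DeploymentActivities and ValidationWorkflow are hypothetical interfaces, invented here only to illustrate the idea):

import io.temporal.activity.ActivityInterface;
import io.temporal.activity.ActivityOptions;
import io.temporal.workflow.ChildWorkflowOptions;
import io.temporal.workflow.Workflow;
import io.temporal.workflow.WorkflowInterface;
import io.temporal.workflow.WorkflowMethod;
import java.time.Duration;

@ActivityInterface
interface DeploymentActivities {
  void buildImage(String modelId);
  void pushToRegistry(String modelId);
  void rolloutToCluster(String modelId);
}

@WorkflowInterface
interface ValidationWorkflow {
  @WorkflowMethod
  void validate(String modelId);
}

// A single "deploy ML model" node expands into several activities plus a child workflow.
// This code runs inside the interpreter workflow when it reaches that node.
class DeployModelNodeHandler {
  private final DeploymentActivities activities =
      Workflow.newActivityStub(
          DeploymentActivities.class,
          ActivityOptions.newBuilder()
              .setStartToCloseTimeout(Duration.ofMinutes(10))
              .build());

  void run(String modelId) {
    activities.buildImage(modelId);
    activities.pushToRegistry(modelId);
    activities.rolloutToCluster(modelId);

    // Delegate validation to a child workflow.
    ValidationWorkflow validation =
        Workflow.newChildWorkflowStub(
            ValidationWorkflow.class, ChildWorkflowOptions.newBuilder().build());
    validation.validate(modelId);

    // The node could also block on an external approval signal here,
    // e.g. via a flag set by a signal handler and Workflow.await(...).
  }
}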

Thanks Maxim for the elaboration! @maxim @tihomir, two last follow-up questions:

(1) “For the loop structures, Temporal can still restore the workflow status.” For this part, is there any documentation I can look at to better understand the underlying logic?
(2) Just to clarify: even though we can use an interpreter to translate the JSON file, once a workflow starts we can't change the JSON file without versioning; otherwise the workflow will run into issues. Is that the case?

(1) It is not only the loop structures. All data inside the workflow function, including local variables and threads that are waiting on blocking calls, is fully durable.

(2) It is up to your implementation of the interpreter. You can implement various migration strategies per your business requirements if needed.
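
For example, a minimal sketch of one such strategy, assuming your definition carries an explicit schemaVersion field (that field is an assumption about your JSON, not a Temporal feature): pass the parsed definition as workflow input so each execution keeps the version it started with, and branch on the field inside the interpreter.

// Sketch only: getSchemaVersion() is a hypothetical field of your JSON definition.
public void execute(WorkflowSchema definition) {
  // The definition is a workflow input, so it is recorded in the workflow history;
  // editing the JSON file later does not change executions that already started.
  if (definition.getSchemaVersion() >= 2) {
    // interpret nodes with the newer semantics
  } else {
    // interpret nodes with the original semantics
  }
  // Changes to the interpreter code itself (rather than the JSON) can additionally
  // be gated with the SDK's Workflow.getVersion(...) API.
}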
