Implementing DSL workflows

Hi,
I am trying to explore Temporal for DSL-driven workflows defined in JSON/YAML. I went through the sample example - samples-java/src/main/java/io/temporal/samples/dsl at main · temporalio/samples-java · GitHub

In my case, I have multiple workflows with a number of activities and subworkflows.

I have a few queries and need an understanding of the following:

  1. Do I need to write different interpreters for each DSL/workflow separately, or can I have the same interpreter for different DSL definitions?
  2. Is there some way to define the activity input and output parameters in the DSL? The reason behind asking this is that, as per the sample code, JsonNode is always defined as the activity output, which might differ for different operations.
  3. Also, how can we handle the workflow output so it is a custom Java object based on the workflow type or definition, instead of a generic one?
  4. Can we have multiple activities as part of the dynamic workflow implementation? If yes, then how do we create a dynamic activity stub? It would be helpful if there is a sample that can be referred to.
  5. Is there any sample code having multiple JSON definitions and application logic to run all the workflows?

Let me know if there is a better way to solve all these dynamic workflow use cases.

Thanks.

  1. Do I need to write different interpreters for each DSL/workflow separately, or can I have the same interpreter for different DSL definitions?

By interpreter you mean a workflow that executes the instructions from the json/yaml DSL definition? Then yes, you can. The Java SDK DynamicWorkflow interface is, imo, perfectly suited for that. It would allow you to use the same interpreter implementation to execute multiple DSL-based workflows.
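
A minimal sketch of what that single interpreter could look like, assuming a hypothetical DefinitionRegistry that holds your parsed DSL definitions (the class and helper names are made up, not from the sample):

import io.temporal.common.converter.EncodedValues;
import io.temporal.workflow.DynamicWorkflow;
import io.temporal.workflow.Workflow;

// One interpreter implementation serves every DSL-based workflow type.
public class DslInterpreterWorkflow implements DynamicWorkflow {
  @Override
  public Object execute(EncodedValues args) {
    // The workflow type tells us which json/yaml definition to interpret.
    String type = Workflow.getInfo().getWorkflowType();
    // Hypothetical: look up the parsed DSL definition for this type and run its steps.
    // WorkflowDefinition definition = DefinitionRegistry.get(type);
    // return runSteps(definition, args);
    return null;
  }
}

// Registered once with the worker, it handles any workflow type that has no explicit implementation:
// worker.registerWorkflowImplementationTypes(DslInterpreterWorkflow.class);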

  2. Is there some way to define the activity input and output parameters in the DSL? The reason behind asking this is that, as per the sample code, JsonNode is always defined as the activity output, which might differ for different operations.

Depends on the DSL. BPMN, for example, has data input and data output mappings for each “Task”. You could read that mapping and figure out what types are needed for activity inputs/results.
Markup-based DSLs like the one used in the example often define their data as JSON. The Serverless Workflow DSL in this example does that (the entire workflow data is JSON), and the inputs to its “actions” (Temporal activities) are JSON as well, so that’s why it’s used in the sample.

You don’t have to use JsonNode, though; you can use concrete types as well. In this example you could, for instance, define a "WorkflowData" class that looks like:

import com.fasterxml.jackson.core.JsonProcessingException;
import com.fasterxml.jackson.databind.ObjectMapper;
import com.fasterxml.jackson.databind.node.ObjectNode;

public class WorkflowData {
  private ObjectNode value;

  public WorkflowData() {
    value = new ObjectMapper().createObjectNode();
  }

  public WorkflowData(String data) {
    try {
      if (data == null || data.trim().length() < 1) {
        // No input provided, start with an empty JSON object
        value = new ObjectMapper().createObjectNode();
      } else {
        value = (ObjectNode) new ObjectMapper().readTree(data);
      }
    } catch (JsonProcessingException e) {
      throw new IllegalArgumentException("Invalid workflow data input: " + e.getMessage());
    }
  }
  // ...
}

and pass that around as inputs/results instead of JsonNode. There are other options as well.
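
For example (just a sketch, the interface name is made up), an activity interface could take and return WorkflowData directly:

import io.temporal.activity.ActivityInterface;
import io.temporal.activity.ActivityMethod;

// Hypothetical activity interface that uses the concrete WorkflowData type
// from above instead of JsonNode for both input and result.
@ActivityInterface
public interface DslActivities {
  @ActivityMethod
  WorkflowData invokeService(WorkflowData input);
}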

  3. Also, how can we handle the workflow output so it is a custom Java object based on the workflow type or definition, instead of a generic one?

In your workflow code you can get the workflow type via:

String type = Workflow.getInfo().getWorkflowType();

Based on that type and the answer to question 2, you can return any custom type you want.
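
For example, a sketch (the result classes here are made up) of picking the result based on the workflow type:

// Inside your DynamicWorkflow implementation; result classes are hypothetical.
private Object buildResult(WorkflowData data) {
  String type = Workflow.getInfo().getWorkflowType();
  switch (type) {
    case "customerApplication":
      return new CustomerApplicationResult(data); // custom result for this workflow type
    case "loanApproval":
      return new LoanApprovalResult(data); // custom result for this workflow type
    default:
      return data; // fall back to the generic workflow data
  }
}

Since DynamicWorkflow.execute returns Object, the data converter serializes whatever concrete type you return.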

  4. Can we have multiple activities as part of the dynamic workflow implementation? If yes, then how do we create a dynamic activity stub? It would be helpful if there is a sample that can be referred to.

In the sample we use a single activity impl, but you can have as many as you want (register them with your worker). Temporal also provides the DynamicActivity interface, which is shown in this sample.
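
For example, a sketch of the worker setup with several activity implementations (the activity class names are made up; DslInterpreterWorkflow is the interpreter sketched above):

// Assuming a WorkerFactory has already been created from your WorkflowClient:
Worker worker = factory.newWorker("dsl-task-queue");
worker.registerWorkflowImplementationTypes(DslInterpreterWorkflow.class);
worker.registerActivitiesImplementations(
    new PaymentActivitiesImpl(),
    new NotificationActivitiesImpl(),
    new AuditActivitiesImpl());
factory.start();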

  5. Is there any sample code having multiple JSON definitions and application logic to run all the workflows?

You can do it in the current DSL example; just register all your definitions here.
Are you planning to use the Serverless Workflow DSL as in the example or some other DSL?
If it’s the same one as used in sample, I can definitely help you with some code samples if you wish.

@tihomir Thanks for the explanation and quick response. Basically, I am trying to explore which type of DSL would be appropriate for our business use case. Yes, it would definitely be helpful if you share a few more examples based on the Serverless Workflow DSL. Also, if you have any samples of other DSLs as well, that would be great and helpful for our comparison.

Further questions:

  1. If my activity is a DynamicActivity, which needs to override execute(), how can we route the logic to specific activity methods when implementing a DSL? Since the call for every activity type goes to execute(), how do we differentiate and run logic specific to an activity type? I have the same query for the DynamicWorkflow implementation as well.

  2. How do we make a call to a subworkflow while writing the DSL interpreter code?

definitely it would be helpful if you share a few more examples based on Serverless Workflow DSL

you can see examples here

if you have any samples of other DSLs as well, it would be great and will be helpful for our comparison.

I don’t know which ones you are looking at currently, so it’s hard to provide info. Step Functions and Google Cloud Workflows have some examples on their websites. For things like BPMN, I am sure there are also many examples out there. Let us know what you are looking for specifically.

  1. If my activity is a DynamicActivity, which needs to override execute(), how can we route the logic to specific activity methods when implementing a DSL? Since the call for every activity type goes to execute(), how do we differentiate and run logic specific to an activity type? I have the same query for the DynamicWorkflow implementation as well.

If you use DynamicActivity, you can get the activity type via
Activity.getExecutionContext().getInfo().getActivityType();
Given that type, you can just call some method that knows how to handle it. There is, I think, no need to “route” to a different activity, as activities are just your code; you can use normal programming language constructs to call a method that knows how to invoke a service or whatever the instructions from the DSL are.
DynamicWorkflow is very well suited for DSL implementations. It allows you to define the workflow type inside your DSL itself (as an id, for example), and all invocations for any of these types are routed to the workflow impl that implements the DynamicWorkflow interface. You can then do multiple things: you can do it as we have in our sample, where you cache the parsed workflow DSL definitions up front, or you can decide to pass the whole workflow definition as input to your workflow when you start it via the client API, parse it inside the workflow itself, and then execute according to the DSL instructions.
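
A sketch of that routing inside a single DynamicActivity implementation (the handler methods and activity type names are made up; WorkflowData is the class from earlier in the thread):

import io.temporal.activity.Activity;
import io.temporal.activity.DynamicActivity;
import io.temporal.common.converter.EncodedValues;

public class DslDynamicActivity implements DynamicActivity {
  @Override
  public Object execute(EncodedValues args) {
    // The activity type comes from the "action"/"task" name in the DSL definition.
    String activityType = Activity.getExecutionContext().getInfo().getActivityType();
    WorkflowData input = args.get(0, WorkflowData.class);
    switch (activityType) {
      case "CheckCustomerInfo":
        return checkCustomerInfo(input);
      case "ApproveApplication":
        return approveApplication(input);
      default:
        throw new IllegalArgumentException("Unknown activity type: " + activityType);
    }
  }

  private WorkflowData checkCustomerInfo(WorkflowData input) { /* call your service */ return input; }

  private WorkflowData approveApplication(WorkflowData input) { /* call your service */ return input; }
}

// Registered like any other activity implementation:
// worker.registerActivitiesImplementations(new DslDynamicActivity());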

  2. How do we make a call to a subworkflow while writing the DSL interpreter code?

Subworkflows would translate into Temporal child workflows imo.

@tihomir Thanks for the reply. I still do not understand how to invoke child workflows while translating the DSL, since invoking a child workflow needs the WorkflowClient object to be initialized. I am facing an issue if I try to initialize the WorkflowClient from the interpreter logic.

Since invoking a child workflow needs the WorkflowClient

I don’t think this is correct, as you can invoke a child workflow from your workflow code as well. This sample shows how you can do that.

There are a number of ways you can invoke a subflow/child workflow, given the parameters of the Serverless Workflow DSL:

  1. Sync invoke and wait for results:

json sample:

"actions": [{
  "subFlowRef": "mySubflow"
}]

temporal:

ChildWorkflowStub cws = Workflow.newUntypedChildWorkflowStub("mySubflow", childWorkflowOptions);
cws.execute(MyReturnObj.class, workflowDataInput);

  2. Async child invoke and wait for results:

json sample:

"actions":[
    {
     "subFlowRef": {
      "invoke": "async",
      "workflowId": "mySubflow"
     }
    }
   ],

temporal:

ChildWorkflowStub cws = Workflow.newUntypedChildWorkflowStub("mySubflow", childWorkflowOptions);
Promise<MyReturnObj> promise = cws.executeAsync(MyReturnObj.class, workflowDataInput);
// ...
MyReturnObj result = promise.get();

  3. Async invoke with a parent close policy:

json sample:

"actions":[
    {
     "subFlowRef": {
      "invoke": "async",
      "onParentComplete": "continue",
      "workflowId": "mySubflow"
     }
    }
   ],

temporal code:

Same as in 2, but you would need to set this in your ChildWorkflowOptions:

.setParentClosePolicy(ParentClosePolicy.PARENT_CLOSE_POLICY_ABANDON)
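
Putting option 3 together, a sketch (the child type "mySubflow" comes from the DSL definition; MyReturnObj and workflowDataInput are placeholders as above):

// ParentClosePolicy is io.temporal.api.enums.v1.ParentClosePolicy
ChildWorkflowOptions options =
    ChildWorkflowOptions.newBuilder()
        .setParentClosePolicy(ParentClosePolicy.PARENT_CLOSE_POLICY_ABANDON)
        .build();
ChildWorkflowStub cws = Workflow.newUntypedChildWorkflowStub("mySubflow", options);
cws.executeAsync(MyReturnObj.class, workflowDataInput);
// Make sure the child has actually started before the parent completes:
cws.getExecution().get();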

Hope this helps.

Yes, this really helps. Will try this solution. Thank you @tihomir !!