Using Composition for Workflow implementations, without dependency injection

I recently learned that I shouldn’t inject dependencies into workflows by reading Timeouts From Configurations and the linked posts inside it. I’m trying to refactor an existing workflow so that it no longer has injected dependencies, but I’m struggling to do so. My project uses Spring Boot for DI, and the workflow implementation looks something like this:

@Component
public class MyWorkflowImpl implements MyWorkflow {
    // HelperA and HelperB call activities as well as perform other business logic.
    // HelperC does not call activities, but is used across multiple workflows to reduce code duplication.
    private HelperA helperA;
    private HelperB helperB;
    private HelperC helperC;

    @Autowired
    public MyWorkflowImpl(HelperA helperA, HelperB helperB, HelperC helperC) {
        //...
        this.helperA = helperA;
        this.helperB = helperB;
        this.helperC = helperC;
        //...
    }

    @WorkflowMethod
    public void execute() {
        helperA.doWork();
        helperB.doWork();
        helperC.doWork();
    }
}

I’m using helper classes to make the code more testable, and in some cases share utility methods across different Workflow classes. The helper classes may have their own dependencies, which makes it unpleasant and/or unwise to use new HelperA(......) in the workflow implementation.

I tried these two solutions, which failed:

  1. Create a LocalActivity which has getHelperA, getHelperB and getHelperC methods. This failed because activities can’t return non-POJO objects.
  2. Create an Activity that has the helper classes injected, and calls the doWork methods. This failed because Activities cannot call Activities.

I’d like to keep using these Helper* classes, because they make testing much easier: I can verify that each helper works correctly without having to test the entire workflow implementation.

How can I achieve this?

public class MyBusinessImpl implements MyBusinessInterface {
    // HelperA and HelperB call activities as well as perform other business logic.
    // HelperC does not call activities, but is used across multiple workflows to reduce code duplication.
    private HelperA helperA;
    private HelperB helperB;
    private HelperC helperC;

    public MyBusinessImpl(HelperA helperA, HelperB helperB, HelperC helperC) {
        //...
        this.helperA = helperA;
        this.helperB = helperB;
        this.helperC = helperC;
        //...
    }

    public void execute() {
        helperA.doWork();
        helperB.doWork();
        helperC.doWork();
    }
}

public class MyWorkflowImpl implements MyWorkflow {

    @WorkflowMethod
    public void execute() {
        MyBusinessInterface b = new MyBusinessImpl(new HelperAImpl(), new HelperBImpl(), new HelperCImpl());
        b.execute();
    }
}

Can you explain the difference between the two approaches from Temporal’s point of view? Why is the dependency injection approach problematic, while building the dependencies inside the @WorkflowMethod is okay?

Also in the second solution, can I use Spring’s context.getBean(MyBusinessInterface.class), instead of manually new-ing?

public class MyWorkflowImpl implements MyWorkflow {
    @Autowired
    public MyWorkflowImpl(HelperA ....) { ... }

    @WorkflowMethod
    public void execute() {
        helperA.doWork();
        helperB.doWork();
        helperC.doWork();
    }
}

public class MyWorkflowImpl implements MyWorkflow {

    @WorkflowMethod
    public void execute() {
        MyBusinessInterface b = new MyBusinessImpl(new HelperAImpl(), new HelperBImpl(), new HelperCImpl());
        b.execute();
    }
}

Temporal workflow code must be deterministic, and the easiest way to break determinism is to inject shared dependencies into the workflow class. The Temporal runtime needs to control the workflow instance lifecycle to perform its functions. As I understand it, context.getBean(MyBusinessInterface.class) by default returns a shared object, which is not going to work with Temporal.
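To illustrate the point, here is a minimal, Temporal-free sketch (all names are hypothetical) of why a shared bean breaks replay: the singleton keeps its mutated state, so re-running the same workflow code observes different values than the original execution did.

```java
// Hypothetical shared singleton, the kind of object Spring's
// context.getBean(...) returns by default
class SharedHelper {
    private int counter = 0;

    // Each call observes and mutates shared state
    int nextValue() {
        return ++counter;
    }
}

public class ReplayDemo {
    public static void main(String[] args) {
        SharedHelper shared = new SharedHelper();

        // Original execution of some workflow step
        int original = shared.nextValue();

        // During replay, Temporal re-runs the same workflow code, but the
        // shared bean has kept its state, so the "same" step now sees a
        // different value and the replayed history diverges
        int replayed = shared.nextValue();

        System.out.println(original == replayed); // false
    }
}
```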

3 questions:

  1. It’s correct that context.getBean(MyBusinessInterface.class) returns a shared object by default. If it’s configured to always return a new instance of MyBusinessInterface, then there is no shared dependency. Is that allowed by Temporal?

  2. If the dependencies are stateless, is there a problem? For example, say I have multiple workflows that want to do some transformations on MyObject, save it to Cassandra and send a Kafka record.

public class CassandraAndKafkaHelper {

  public void convertAndStoreAndSend(MyObject obj) {
    cassandraActivity.save(transformToCassandraDto(obj));
    ProducerRecord record = transformToRecord(obj);
    kafkaActivity.sendRecord(record);
  }

  // pure transformation helpers, no shared state
  private ProducerRecord transformToRecord(...) { ... }

  private Object transformToCassandraDto(...) { ... }
}
  3. What if the Workflow dependency uses a class like ObjectMapper, as in this example? Consumers of JsonUtility can’t change the ObjectMapper’s configuration, and calls to the public methods shouldn’t cause workflow replays to create different histories. Is the usage of ObjectMapper and JsonUtility below:
    a. Correct if @Scope("prototype"), so that every consumer of JsonUtility gets a unique instance
    b. Correct if @Scope("singleton") so that every consumer of JsonUtility shares the same instance
    c. Correct if every instance of JsonUtility has a unique instance of ObjectMapper
    d. Never correct – if so, what’s the fundamental issue here?
@Component
@Scope("prototype")
public class JsonUtility {
    private final ObjectMapper objectMapper;
    private static final Logger logger = Workflow.getLogger(JsonUtility.class);

    @Autowired
    public JsonUtility(final ObjectMapper objectMapper) {
        this.objectMapper = objectMapper;
    }

    public String serializeFromPOJOToJSON(Object request) {
        String json = "";
        try {
            json = objectMapper.writeValueAsString(request);
        } catch (JsonProcessingException e) {
            logger.error("Could not serialize {} object as JSON string: {}", request.toString(), e.toString());
        }
        return json;
    }

    public Object deserializeFromJSONtoPOJO(String json, Class<?> responseBodyType) {
        try {
            return objectMapper.readValue(json, responseBodyType);
        } catch (IOException e) {
            logger.error("Exception occurred while deserializing JSON string '{}' into POJO ClassType '{}'", json, responseBodyType);
        }
        return null;
    }
}
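To make question 3 concrete: whether a, b, or c is safe seems to come down to whether the shared ObjectMapper’s configuration can change between the original run and a replay. A self-contained stand-in (hypothetical names, no Jackson dependency) shows the failure mode for a shared, mutable mapper:

```java
// Stand-in for a shared, configurable serializer such as Jackson's
// ObjectMapper (hypothetical class; real ObjectMapper settings behave
// the same way once another consumer reconfigures the singleton)
class MutableMapper {
    boolean prettyPrint = false;

    String write(String value) {
        return prettyPrint ? "{\n  \"v\": \"" + value + "\"\n}"
                           : "{\"v\":\"" + value + "\"}";
    }
}

public class SharedMapperDemo {
    public static void main(String[] args) {
        MutableMapper shared = new MutableMapper();

        String first = shared.write("x");   // compact JSON

        // Some other consumer of the singleton flips a setting...
        shared.prettyPrint = true;

        // ...and a replayed workflow call now produces different output
        String second = shared.write("x");  // pretty-printed JSON

        System.out.println(first.equals(second)); // false
    }
}
```

So a singleton JsonUtility would only be safe as long as nothing ever reconfigures the shared ObjectMapper; giving every instance its own ObjectMapper (option c) removes that risk entirely.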

I suppose I’m wondering if there is anything fundamentally wrong with injecting dependencies into Temporal Workflows? Or is it more accurate to say that it’s easy to make mistakes when injecting dependencies, especially if you aren’t paying close attention to how Temporal and your DI framework function?

If it’s the latter, that it’s easy to make mistakes when using DI, but DI is not inherently incompatible with Temporal, is there some sort of “checklist” we can go through to verify that our code will function correctly?


Regardless of the answers to the above questions, it could be useful to have a dedicated page in the docs about the right and wrong ways to have a Workflow depend on other classes, be it manually calling new, using DI frameworks, etc.

  1. It’s correct that context.getBean(MyBusinessInterface.class) returns a shared object by default. If it’s configured to always return a new instance of MyBusinessInterface, then there is no shared dependency. Is that allowed by Temporal?

Yes, it is allowed, assuming that MyBusinessInterface doesn’t depend on some shared bean and that the same bean type is returned for the lifetime of the workflow. This makes IoC much less useful, as any change to the configuration can lead to non-deterministic workflow failures.
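As far as I know, the Java SDK also supports this pattern directly: Worker.addWorkflowImplementationFactory lets you pass a factory that is invoked once per workflow execution, so you can return a fresh, fully wired instance each time (for example, a prototype-scoped bean pulled from the Spring context) — check your SDK version for the exact signature. The key property is just that the factory returns a new instance on every call, which a dependency-free sketch (hypothetical class name) can demonstrate:

```java
import java.util.function.Supplier;

// Hypothetical workflow implementation; in Spring this could be a
// prototype-scoped bean looked up inside the factory lambda
class WiredWorkflowImpl {
}

public class FactoryDemo {
    public static void main(String[] args) {
        // Analogous to the factory you would hand to
        // Worker.addWorkflowImplementationFactory(...): invoked per
        // workflow execution, never reusing an instance
        Supplier<WiredWorkflowImpl> factory = WiredWorkflowImpl::new;

        WiredWorkflowImpl first = factory.get();
        WiredWorkflowImpl second = factory.get();

        // Each execution gets its own instance, so nothing is shared
        System.out.println(first == second); // false
    }
}
```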

  2. If the dependencies are stateless, is there a problem? For example, say I have multiple workflows that want to do some transformations on MyObject, save it to Cassandra and send a Kafka record.

It is not a problem unless a different implementation is injected during the workflow lifetime.

I suppose I’m wondering if there is anything fundamentally wrong with injecting dependencies into Temporal Workflows? Or is it more accurate to say that it’s easy to make mistakes when injecting dependencies, especially if you aren’t paying close attention to how Temporal and your DI framework function?

There is nothing fundamentally wrong. But in my experience, the majority of tricky production bugs with the Java SDK, at least at AWS, were caused by dependency injection mistakes.