Reusing httpx.AsyncClient in workflow

I have a workflow with several activities. Each activity uses httpx.AsyncClient for remote calls.

I need a way to reuse the httpx.AsyncClient connection pool across different activities. What is the best way to do this?

  1. Init AsyncClient as a workflow attribute and pass it to the activities?
  2. Combine the activities into a class and init AsyncClient as an attribute, so that I can use self.client in the activity methods?
  3. Some other solution?

Another question is when to close the client.

Does anyone have experience with this?
Thank you.


Use (2) as in this sample.
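
A minimal sketch of that pattern, assuming the Temporal Python SDK and httpx (the fetch_status activity and the constructor signature are illustrative, not taken from the linked sample):

    import httpx
    from temporalio import activity


    class Activities:
        def __init__(self, client: httpx.AsyncClient) -> None:
            # One shared client, so every activity invocation reuses the same connection pool
            self.client = client

        @activity.defn
        async def fetch_status(self, url: str) -> int:
            # Illustrative activity method; makes an HTTP call through the shared pool
            response = await self.client.get(url)
            return response.status_code

The bound methods of one Activities instance are then registered with the worker, so all of them share self.client.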

Great, but I can't close the client directly in the workflow code after all the activities because of the sandbox restrictions (all my imports are already under with workflow.unsafe.imports_passed_through():):

temporalio.worker.workflow_sandbox._restrictions.RestrictedWorkflowAccessError: Cannot access threading.local.__mro_entries__ from inside a workflow. If this is code from a module not used in a workflow or known to only be used deterministically from a workflow, mark the import as pass through.

So I need to add a cleanup activity:

    @activity.defn
    async def cleanup(self) -> None:
        await self.client.aclose()

I don't like this because it's not business logic at all.

Any suggestions on this?

I’m confused. Are you trying to create an HTTP client instance per workflow instance? Why is it not shared by all the workflows?

Right now I have a class of activities. Each activity is an instance method that uses self.client to make HTTP calls. The client is created in the class's __init__.

In my workflow.run I do this:

    activities = Activities()

    ...
    # All the activity calls
    ...

    # Here I want to close the shared client,
    # but I can't do that (error shown in the previous post):
    # await activities.client.aclose()

    # So I have to close the client in a special activity
    await execute_activity(activities.cleanup, ...)

Do you suggest creating one shared client per worker instance? I don't really like global objects like that.

Thank you.

You create one shared client per activities class instance, the same as the db object in the sample I linked above.

Yes, I do the same thing, thank you.

But the example is incomplete: the connection pool is never closed. Each Postgres connection, for example, is a server process that we never clean up. With an improper idle configuration this can lead to catastrophe (I know ;)).

My question is: what is the proper way to close shared clients owned by an activity class?

The activity class is expected to exist for the duration of the worker process, which can be months, so I'm not sure why you want to close the pool. I don't know much about Python pools. In Java you could configure all sorts of heartbeat and connection release policies at the pool level.
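
In Python, the equivalent would be to tie the httpx client's lifetime to the worker process: create it in the worker's main, pass it into the activities instance, and let it close when the worker shuts down. A sketch, assuming the Activities class from the earlier post (the task queue name is a placeholder):

    import asyncio

    import httpx
    from temporalio.client import Client
    from temporalio.worker import Worker


    async def main() -> None:
        temporal_client = await Client.connect("localhost:7233")
        # The async with block closes the httpx connection pool when the
        # worker stops, so no cleanup activity is needed in the workflow.
        async with httpx.AsyncClient() as http_client:
            activities = Activities(http_client)
            worker = Worker(
                temporal_client,
                task_queue="my-task-queue",
                activities=[activities.fetch_status],
            )
            await worker.run()


    if __name__ == "__main__":
        asyncio.run(main())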

Thanks, now I get the idea!