What would be the ideal PostgreSQL specification for 100 workflows per day? How many IOPS, and how much storage and RAM, would be required? I know a load test should be run to figure out the specs, but the deployment needs to be done in a hurry, which is why I'm asking.
I deployed Temporal v1.18 to an OpenShift cluster with each service in a different deployment. The backend was an external Postgres DB server (a 4 vCPU, 16GB VM on an all-flash SAN) with max_connections set to 100. The load testing against it was done using Maru (GitHub - temporalio/maru: Benchmarks for Temporal workflows), which I configured to run 5000 workflows, each containing 3 activities with small inputs/outputs of up to 256 bytes:
Workflows/sec = ~75
Activities/sec = ~230
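For scale, the 100-workflows-per-day target in the original question is a tiny fraction of this measured throughput. A quick back-of-the-envelope check (assuming workflows arrive evenly over 24 hours, which is of course an idealization):

```python
# Rough arrival-rate estimate for the original question's target load.
# Assumption: 100 workflows spread evenly across a 24-hour day.
workflows_per_day = 100
seconds_per_day = 24 * 3600
arrival_rate = workflows_per_day / seconds_per_day
print(f"{arrival_rate:.5f} workflows/sec")  # roughly 0.00116
```

Even with heavy bursting, that is several orders of magnitude below the ~75 workflows/sec this setup sustained, so a far more modest DB should cope.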
The DB only hit between 40% and 50% CPU usage, so there is some follow-up work to do on tuning the Temporal services and workers. I got better performance by scaling out the Temporal services, but found that max_connections on Postgres needed increasing.
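As an illustration only (these are not the exact values from my deployment; the numbers are placeholder assumptions you would tune for your own hardware), raising the connection limit in `postgresql.conf` for a scaled-out Temporal cluster might look like:

```
# postgresql.conf -- illustrative values, not a recommendation
max_connections = 200     # the default 100 was exhausted once Temporal services were scaled out
shared_buffers = 4GB      # common starting point: ~25% of RAM on a 16GB box
```

Note that a change to max_connections requires a Postgres restart to take effect.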
Thanks for the reply. What's the storage spec needed for Postgres in such a scenario?
Sorry, I cannot tell you that, as it is all obfuscated from me. I just know we had a VM → VMDK → VMFS → all-flash SAN. I assume it is capable of tens of thousands of IOPS.