What are best practices for making Temporal GDPR compliant?

Hi Temporal Team,

What are some best practices for making Temporal GDPR compliant, and how are existing Cadence / Temporal deployments dealing with such concerns?

Hey, I'm not an expert here, but these are some of the most common things I see:

  1. Never store PII directly within workflows. Instead, pass pointers that reference the actual user data, which lives in a secure, encrypted store that you control (see the sketch just below this list).
  2. Limit archival length to make it easier to comply with deletion requests. In the future we plan to make it very easy to surgically remove records from archived histories, but it’s difficult today.
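
To make #1 concrete, here is a minimal Go sketch of the pointer/reference pattern. Everything in it (the `OnboardingWorkflow`, the `SendWelcomeEmail` activity, and the store helpers) is hypothetical; the point is that the workflow, and therefore the event history on the Temporal server, only ever sees an opaque user ID, while the PII is resolved inside the activity against a store you control:

```go
package pii

import (
	"context"
	"time"

	"go.temporal.io/sdk/workflow"
)

// UserProfile holds the PII. It is only ever materialized inside the
// activity process and is never returned to the workflow, so it never
// lands in the event history on the Temporal server.
type UserProfile struct {
	Email string
	Name  string
}

// SendWelcomeEmail is a hypothetical activity: it resolves the opaque user
// ID against your own encrypted store and uses the PII locally.
func SendWelcomeEmail(ctx context.Context, userID string) error {
	profile, err := loadUserProfile(ctx, userID) // your secure store, not Temporal
	if err != nil {
		return err
	}
	return sendEmail(profile.Email, "Welcome, "+profile.Name)
}

// OnboardingWorkflow only ever handles the opaque userID (the "pointer"),
// so its inputs, results, and history contain no PII.
func OnboardingWorkflow(ctx workflow.Context, userID string) error {
	ao := workflow.ActivityOptions{StartToCloseTimeout: time.Minute}
	ctx = workflow.WithActivityOptions(ctx, ao)
	return workflow.ExecuteActivity(ctx, SendWelcomeEmail, userID).Get(ctx, nil)
}

// loadUserProfile and sendEmail stand in for your own systems.
func loadUserProfile(ctx context.Context, userID string) (*UserProfile, error) {
	return &UserProfile{}, nil
}

func sendEmail(to, body string) error { return nil }
```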

Now, if you’re considering #1, I would first look into our DataConverter API. It allows you to automatically encrypt all worker traffic to the core service, which means even the Temporal server only ever deals with encrypted data. Our architecture does not care whether the data is encrypted or not, so this works well as a black-box solution.
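
As a rough illustration, wiring such encryption into the Go SDK via its `PayloadCodec` / `converter.NewCodecDataConverter` hooks could look something like the sketch below. The AES-GCM helpers and key handling are simplified assumptions on my part, not an official recipe; the temporalio/samples-go repository has a more complete encryption sample.

```go
package crypt

import (
	"crypto/aes"
	"crypto/cipher"
	"crypto/rand"
	"io"

	commonpb "go.temporal.io/api/common/v1"
	"go.temporal.io/sdk/client"
	"go.temporal.io/sdk/converter"
)

// Codec encrypts payload data with AES-GCM before it leaves your process.
// For brevity it encrypts unconditionally; a production codec would tag
// encrypted payloads via metadata and pass others through, and real key
// management (KMS, rotation) is left out entirely.
type Codec struct {
	Key []byte // 32-byte key
}

// Encode runs on every payload sent to the Temporal server.
func (c *Codec) Encode(payloads []*commonpb.Payload) ([]*commonpb.Payload, error) {
	out := make([]*commonpb.Payload, len(payloads))
	for i, p := range payloads {
		ciphertext, err := c.seal(p.Data)
		if err != nil {
			return nil, err
		}
		out[i] = &commonpb.Payload{Metadata: p.Metadata, Data: ciphertext}
	}
	return out, nil
}

// Decode runs on every payload received back from the server.
func (c *Codec) Decode(payloads []*commonpb.Payload) ([]*commonpb.Payload, error) {
	out := make([]*commonpb.Payload, len(payloads))
	for i, p := range payloads {
		plaintext, err := c.open(p.Data)
		if err != nil {
			return nil, err
		}
		out[i] = &commonpb.Payload{Metadata: p.Metadata, Data: plaintext}
	}
	return out, nil
}

func (c *Codec) aead() (cipher.AEAD, error) {
	block, err := aes.NewCipher(c.Key)
	if err != nil {
		return nil, err
	}
	return cipher.NewGCM(block)
}

func (c *Codec) seal(plain []byte) ([]byte, error) {
	aead, err := c.aead()
	if err != nil {
		return nil, err
	}
	nonce := make([]byte, aead.NonceSize())
	if _, err := io.ReadFull(rand.Reader, nonce); err != nil {
		return nil, err
	}
	return aead.Seal(nonce, nonce, plain, nil), nil
}

func (c *Codec) open(sealed []byte) ([]byte, error) {
	aead, err := c.aead()
	if err != nil {
		return nil, err
	}
	nonce, ciphertext := sealed[:aead.NonceSize()], sealed[aead.NonceSize():]
	return aead.Open(nil, nonce, ciphertext, nil)
}

// NewClient wires the codec into the client, so every workflow/activity
// input and result the server stores in event history is ciphertext.
func NewClient(key []byte) (client.Client, error) {
	return client.Dial(client.Options{
		DataConverter: converter.NewCodecDataConverter(
			converter.GetDefaultDataConverter(),
			&Codec{Key: key},
		),
	})
}
```

With a codec in place your workflow and activity code stays unchanged; only the payloads that cross the wire, and therefore what the server persists in event histories, are ciphertext.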

Another relevant consideration is the length of your workflows. Many Temporal applications model each user as an indefinitely running workflow. If you use this pattern, some extra steps may be needed to fully comply with deletion requests (for example, cancelling all workflows associated with the user).
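
For example, a deletion-request handler might look roughly like this sketch, which assumes one long-running workflow per user with a workflow ID derived from the user ID (that naming scheme and the helper are assumptions on my part):

```go
package gdpr

import (
	"context"

	"go.temporal.io/sdk/client"
)

// HandleErasureRequest is a hypothetical "right to erasure" handler. It
// assumes each user is modeled as a long-running workflow whose workflow ID
// is derived from the user ID; adapt it to however you key your workflows.
func HandleErasureRequest(ctx context.Context, c client.Client, userID string) error {
	// Ask the per-user workflow to stop; cancellation gives it a chance to
	// run any cleanup logic before completing.
	if err := c.CancelWorkflow(ctx, "user-"+userID, ""); err != nil {
		return err
	}
	// The PII itself lives in your own store (the workflow only ever held
	// pointers/tokens), so erase it there; retention/archival settings then
	// bound how long the remaining, PII-free history sticks around.
	return deleteUserDataFromYourStore(ctx, userID)
}

// deleteUserDataFromYourStore stands in for your own systems.
func deleteUserDataFromYourStore(ctx context.Context, userID string) error { return nil }
```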

I know this is only a partial answer, so please feel free to push back with more questions.

Could you elaborate a bit on #1? I totally understand the importance of a custom data converter that performs encryption / decryption. However, I am not following what you mean by “pass pointers which reference the actual user data”. Are you saying that activity functions can’t return PII (even with a crypto data converter), but returning a pointer to PII is compliant? Same question about local application state within the workflow: I was assuming local state could hold PII as long as a crypto data converter was used.

I’m trying to understand what specifically I need to do (and be aware of) besides using a crypto data converter.

I think it’s more about your preference rather than being “compliant” or not. Temporal workflows and activities run on your code/apps/premises, and as such you have full control over what data you do or do not want stored on the Temporal server as part of the workflow event history.

Temporal provides the aforementioned pluggable data converters, which allow you to encrypt/decrypt data passed to the server and back using algorithms of your choosing. On the server side, Temporal server services communicate with each other over mTLS (see these samples: https://github.com/temporalio/samples-server/tree/master/tls and the docs: Temporal Platform security | Temporal Documentation).
You can also set up communication between your workers and the server to use mTLS (a sketch follows below).
You can also build custom CLI plugins for tctl to add further authorization (for example, https://github.com/temporalio/temporal/blob/master/cmd/tools/cli/plugins/authorization/main.go), as well as SSO support for the Web UI.
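
For the worker-to-server mTLS mentioned above, a minimal client-side sketch with the Go SDK could look like this (the certificate paths, host, and TLS details are placeholders for your own setup):

```go
package secure

import (
	"crypto/tls"

	"go.temporal.io/sdk/client"
)

// NewMTLSClient dials the Temporal frontend over mutual TLS. The cert/key
// paths, host, and server name below are placeholders for your own setup.
func NewMTLSClient() (client.Client, error) {
	cert, err := tls.LoadX509KeyPair(
		"/etc/temporal/certs/client.pem",
		"/etc/temporal/certs/client.key",
	)
	if err != nil {
		return nil, err
	}
	return client.Dial(client.Options{
		HostPort: "temporal.example.com:7233",
		ConnectionOptions: client.ConnectionOptions{
			TLS: &tls.Config{
				Certificates: []tls.Certificate{cert},
				// In production you would also set RootCAs / ServerName so
				// the worker verifies the server certificate against your CA.
			},
		},
	})
}
```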

To me, point #1 is just an extra layer of security on top of all the other things you can do. For highly sensitive data, such as credit card information, since your workflows/activities run inside your own applications, you can choose not to pass credit card numbers to (or even use them inside) your workflows, and instead pass only tokens or pointers, while the real credit card information is stored under your supervision, where you can apply any further security measures you want.

Hope this helps.

yes, that makes sense, thank you!