Data migration using Temporal workflow

Hi there,
We have a data migration to do and I'm wondering what would be the best way of doing it with Temporal. We want to migrate ~1000 GB of data from Postgres to Bigtable; could Temporal help us here? My idea was to use a workflow to save the cursor and an activity to migrate each row, but that's a lot of rows, so probably not a good idea. Should I go with some kind of batching, e.g. each activity handling one day of data? What do you think?

Record the progress in the activity heartbeat as described here.
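
Roughly, a batch activity could look like the Go sketch below. The batch ID, the string cursor, and the readPage/writeRows helpers are placeholders for your Postgres/Bigtable code, not part of the Temporal SDK; only the heartbeat calls are the Temporal API.

```go
package migration

import (
	"context"

	"go.temporal.io/sdk/activity"
)

// MigrateBatch copies one batch (e.g. one day of rows) from Postgres to
// Bigtable, recording the last processed cursor in the heartbeat so a
// retried attempt can resume where the previous one stopped.
func MigrateBatch(ctx context.Context, batchID string) error {
	var cursor string

	// On retry, resume from the cursor recorded by the previous attempt.
	if activity.HasHeartbeatDetails(ctx) {
		if err := activity.GetHeartbeatDetails(ctx, &cursor); err != nil {
			return err
		}
	}

	for {
		rows, nextCursor, err := readPage(ctx, batchID, cursor)
		if err != nil {
			return err
		}
		if len(rows) == 0 {
			return nil // batch finished
		}
		if err := writeRows(ctx, rows); err != nil {
			return err
		}
		cursor = nextCursor
		// The Temporal service persists the latest heartbeat details;
		// the heartbeat also resets the activity heartbeat timeout.
		activity.RecordHeartbeat(ctx, cursor)
	}
}

// readPage and writeRows stand in for the real Postgres read / Bigtable write.
func readPage(ctx context.Context, batchID, cursor string) (rows []string, nextCursor string, err error) {
	// ... SELECT the next page of rows after `cursor` for this batch ...
	return nil, "", nil
}

func writeRows(ctx context.Context, rows []string) error {
	// ... write the page to Bigtable ...
	return nil
}
```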

Yeah, that was another solution, but in the end there is no special advantage to using Temporal in that case vs. any other DB to store the cursor.

The features that Temporal gives you out of the box, which you would have to implement yourself when storing the cursor in a DB:

  1. Activity heartbeat timeout. If the cursor is not updated within the timeout, the activity is automatically retried.
  2. Ability to run multiple such scans in parallel, controlled from a single workflow (see the sketch after this list).
  3. UI/CLI to query progress.
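
A workflow that fans the batches out in parallel could look something like this sketch. The batch IDs, timeout values, and retry policy are illustrative assumptions, and MigrateBatch is the activity sketched above:

```go
package migration

import (
	"time"

	"go.temporal.io/sdk/temporal"
	"go.temporal.io/sdk/workflow"
)

// MigrationWorkflow runs one MigrateBatch activity per batch in parallel.
func MigrationWorkflow(ctx workflow.Context, batchIDs []string) error {
	ao := workflow.ActivityOptions{
		StartToCloseTimeout: 24 * time.Hour,
		// If no heartbeat (cursor update) arrives within this window,
		// the activity attempt times out and is retried automatically.
		HeartbeatTimeout: time.Minute,
		RetryPolicy: &temporal.RetryPolicy{
			InitialInterval: time.Second,
			MaximumAttempts: 0, // 0 means retry indefinitely
		},
	}
	ctx = workflow.WithActivityOptions(ctx, ao)

	// Start all batches in parallel. For a 1000 GB migration you would
	// likely cap concurrency with a sliding window; that is omitted here.
	var futures []workflow.Future
	for _, id := range batchIDs {
		futures = append(futures, workflow.ExecuteActivity(ctx, MigrateBatch, id))
	}
	for _, f := range futures {
		if err := f.Get(ctx, nil); err != nil {
			return err
		}
	}
	return nil
}
```

With the heartbeat timeout set, a stuck or crashed batch is detected and retried by the service, and the progress (the latest heartbeat details of each activity) is visible from the UI/CLI while the workflow runs.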