Can I handle a huge amount of data received over an API using Temporal?

My use case is as follows:

  1. I am capturing a huge amount of data (mostly relating to user activity) over an API, on the order of 100K requests per second.
  2. I need to process this data. Processing involves:
    a. validations
    b. some normalisations
    c. data manipulations
  3. Then I need to sink this data to a DB (which may additionally require schema updates, since we want to maintain only one table). See the workflow sketch after this list for how I imagine the pipeline.
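To make the question concrete, here is a minimal sketch of what I imagine the pipeline could look like with the Temporal Go SDK. Everything in it is my assumption, not code from our system: `Event`, `Validate`, `Normalise`, `Transform`, `SinkToDB`, and `RecordFailedEvent` are hypothetical names, and the timeout/retry numbers are placeholders.

```go
package pipeline

import (
	"context"
	"time"

	"go.temporal.io/sdk/temporal"
	"go.temporal.io/sdk/workflow"
)

// Event stands in for one captured user-activity record.
type Event struct {
	UserID  string
	Payload map[string]string
}

// Stub activities; the real implementations would run on Temporal workers.
func Validate(ctx context.Context, evt Event) error           { return nil }
func Normalise(ctx context.Context, evt Event) (Event, error) { return evt, nil }
func Transform(ctx context.Context, evt Event) (Event, error) { return evt, nil }
func SinkToDB(ctx context.Context, evt Event) error           { return nil }

// RecordFailedEvent would park an event in a dead-letter store for later reflow.
func RecordFailedEvent(ctx context.Context, evt Event, reason string) error { return nil }

// ProcessEventWorkflow runs one event through validate -> normalise ->
// transform -> sink. Each step is an activity, so Temporal retries it
// independently and keeps a history of every failure.
func ProcessEventWorkflow(ctx workflow.Context, evt Event) error {
	ctx = workflow.WithActivityOptions(ctx, workflow.ActivityOptions{
		StartToCloseTimeout: 30 * time.Second,
		RetryPolicy: &temporal.RetryPolicy{
			InitialInterval:    time.Second,
			BackoffCoefficient: 2.0,
			MaximumAttempts:    5, // placeholder
		},
	})

	err := workflow.ExecuteActivity(ctx, Validate, evt).Get(ctx, nil)
	if err == nil {
		err = workflow.ExecuteActivity(ctx, Normalise, evt).Get(ctx, &evt)
	}
	if err == nil {
		err = workflow.ExecuteActivity(ctx, Transform, evt).Get(ctx, &evt)
	}
	if err == nil {
		// SinkToDB would also apply any schema change the single table needs.
		err = workflow.ExecuteActivity(ctx, SinkToDB, evt).Get(ctx, nil)
	}
	if err != nil {
		// Retries exhausted: record the event so it can be reflowed later.
		_ = workflow.ExecuteActivity(ctx, RecordFailedEvent, evt, err.Error()).Get(ctx, nil)
	}
	return err
}
```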

Should I opt for Temporal to solve my use case? I need to keep track of events that fail at any stage so that I can reflow the failed data.
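For completeness, this is roughly how I imagine the API layer handing an event off to Temporal. Again just a sketch under assumptions: the task-queue name and workflow ID scheme are made up, `client.Dial` targets a local Temporal service, and at 100K req/s I assume one workflow per request is too heavy, so I'd probably group events into batches before starting a workflow.

```go
package main

import (
	"context"
	"log"

	"go.temporal.io/sdk/client"
)

// Event mirrors the struct the workflow expects; the SDK serializes it
// (JSON by default), so field names just have to line up.
type Event struct {
	UserID  string
	Payload map[string]string
}

func main() {
	// Assumes a Temporal service reachable at the default localhost:7233.
	c, err := client.Dial(client.Options{})
	if err != nil {
		log.Fatalln("unable to connect to Temporal", err)
	}
	defer c.Close()

	evt := Event{UserID: "u-123", Payload: map[string]string{"action": "click"}}
	we, err := c.ExecuteWorkflow(context.Background(), client.StartWorkflowOptions{
		ID:        "process-u-123", // made-up ID scheme
		TaskQueue: "user-activity-ingest",
	}, "ProcessEventWorkflow", evt)
	if err != nil {
		log.Fatalln("unable to start workflow", err)
	}
	log.Println("started workflow", we.GetID(), we.GetRunID())
}
```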