Incessant "context deadline exceeded" errors after upgrading server images to v1.19.1

  • Temporal server version: v1.19.1
  • Temporal Go SDK version: v1.20.0

We run Temporal in Minikube as part of our integration tests. After upgrading from v1.18.4 to v1.19.1, the deployment has become extremely unstable; we had no issues with v1.18.4.

My original hypothesis was that we were not allocating enough CPU to each of the Temporal services. With v1.18.4, we used the following allocation for each of the frontend, matching, history, and worker services:

        resources:
          requests:
            memory: 512Mi
            cpu: 500m
          limits:
            memory: 1024Mi
            cpu: 1000m

After upgrading to v1.19.1, those same allocations (above) produced a deployment so unstable that roughly 90% of our integration test runs ended with Temporal entirely unreachable. I’ve pasted error logs at the bottom of this post.

I have tried deploying our Minikube cluster on a VM with 64 CPUs (up from 16), and I have increased the CPU allocation more than tenfold, to the following for each of the four services:

        resources:
          requests:
            memory: 2400Mi
            cpu: 6500m
          limits:
            memory: 3600Mi
            cpu: 8000m

This reduced the failure rate to about 25% of test runs, which is still too high. It also seems like an enormous CPU allocation compared to what v1.18.4 required.

Questions for your team:

  • Was there a change between v1.18.4 and v1.19.1 that would result in a much higher CPU requirement? If so, is an increase this large expected?
  • From the logs below, can someone more familiar with the codebase point me in the right direction for debugging?

temporal-worker errors:

{"level":"info","ts":"2023-02-13T22:25:51.879Z","msg":"Started Worker","service":"worker","Namespace":"temporal-system","TaskQueue":"temporal-sys-per-ns-tq","WorkerID":"server-worker@1@temporal-svc-worker-66f7f58849-hrlpb@temporal-system","logging-call-at":"pernamespaceworker.go:410"}
{"level":"info","ts":"2023-02-13T22:25:55.682Z","msg":"Current reachable members","service":"worker","component":"service-resolver","service":"history","addresses":["10.244.0.17:11403"],"logging-call-at":"rpServiceResolver.go:283"}
{"level":"info","ts":"2023-02-13T22:26:01.670Z","msg":"Started Worker","service":"worker","Namespace":"default","TaskQueue":"temporal-sys-per-ns-tq","WorkerID":"server-worker@1@temporal-svc-worker-66f7f58849-hrlpb@default","logging-call-at":"pernamespaceworker.go:410"}
{"level":"error","ts":"2023-02-13T22:26:01.690Z","msg":"error starting temporal-sys-tq-scanner-workflow workflow","service":"worker","error":"context deadline exceeded","logging-call-at":"scanner.go:232","stacktrace":"go.temporal.io/server/common/log.(*zapLogger).Error\n\t/home/builder/temporal/common/log/zap_logger.go:144\ngo.temporal.io/server/service/worker/scanner.(*Scanner).startWorkflow\n\t/home/builder/temporal/service/worker/scanner/scanner.go:232\ngo.temporal.io/server/service/worker/scanner.(*Scanner).startWorkflowWithRetry.func1\n\t/home/builder/temporal/service/worker/scanner/scanner.go:209\ngo.temporal.io/server/common/backoff.ThrottleRetry.func1\n\t/home/builder/temporal/common/backoff/retry.go:170\ngo.temporal.io/server/common/backoff.ThrottleRetryContext\n\t/home/builder/temporal/common/backoff/retry.go:194\ngo.temporal.io/server/common/backoff.ThrottleRetry\n\t/home/builder/temporal/common/backoff/retry.go:171\ngo.temporal.io/server/service/worker/scanner.(*Scanner).startWorkflowWithRetry\n\t/home/builder/temporal/service/worker/scanner/scanner.go:208"}
{"level":"error","ts":"2023-02-13T22:26:01.691Z","msg":"error starting temporal-sys-history-scanner-workflow workflow","service":"worker","error":"context deadline exceeded","logging-call-at":"scanner.go:232","stacktrace":"go.temporal.io/server/common/log.(*zapLogger).Error\n\t/home/builder/temporal/common/log/zap_logger.go:144\ngo.temporal.io/server/service/worker/scanner.(*Scanner).startWorkflow\n\t/home/builder/temporal/service/worker/scanner/scanner.go:232\ngo.temporal.io/server/service/worker/scanner.(*Scanner).startWorkflowWithRetry.func1\n\t/home/builder/temporal/service/worker/scanner/scanner.go:209\ngo.temporal.io/server/common/backoff.ThrottleRetry.func1\n\t/home/builder/temporal/common/backoff/retry.go:170\ngo.temporal.io/server/common/backoff.ThrottleRetryContext\n\t/home/builder/temporal/common/backoff/retry.go:194\ngo.temporal.io/server/common/backoff.ThrottleRetry\n\t/home/builder/temporal/common/backoff/retry.go:171\ngo.temporal.io/server/service/worker/scanner.(*Scanner).startWorkflowWithRetry\n\t/home/builder/temporal/service/worker/scanner/scanner.go:208"}
{"level":"error","ts":"2023-02-13T22:26:12.621Z","msg":"error starting temporal-sys-tq-scanner-workflow workflow","service":"worker","error":"context deadline exceeded","logging-call-at":"scanner.go:232","stacktrace":"go.temporal.io/server/common/log.(*zapLogger).Error\n\t/home/builder/temporal/common/log/zap_logger.go:144\ngo.temporal.io/server/service/worker/scanner.(*Scanner).startWorkflow\n\t/home/builder/temporal/service/worker/scanner/scanner.go:232\ngo.temporal.io/server/service/worker/scanner.(*Scanner).startWorkflowWithRetry.func1\n\t/home/builder/temporal/service/worker/scanner/scanner.go:209\ngo.temporal.io/server/common/backoff.ThrottleRetry.func1\n\t/home/builder/temporal/common/backoff/retry.go:170\ngo.temporal.io/server/common/backoff.ThrottleRetryContext\n\t/home/builder/temporal/common/backoff/retry.go:194\ngo.temporal.io/server/common/backoff.ThrottleRetry\n\t/home/builder/temporal/common/backoff/retry.go:171\ngo.temporal.io/server/service/worker/scanner.(*Scanner).startWorkflowWithRetry\n\t/home/builder/temporal/service/worker/scanner/scanner.go:208"}
{"level":"error","ts":"2023-02-13T22:26:12.669Z","msg":"error starting temporal-sys-history-scanner-workflow workflow","service":"worker","error":"context deadline exceeded","logging-call-at":"scanner.go:232","stacktrace":"go.temporal.io/server/common/log.(*zapLogger).Error\n\t/home/builder/temporal/common/log/zap_logger.go:144\ngo.temporal.io/server/service/worker/scanner.(*Scanner).startWorkflow\n\t/home/builder/temporal/service/worker/scanner/scanner.go:232\ngo.temporal.io/server/service/worker/scanner.(*Scanner).startWorkflowWithRetry.func1\n\t/home/builder/temporal/service/worker/scanner/scanner.go:209\ngo.temporal.io/server/common/backoff.ThrottleRetry.func1\n\t/home/builder/temporal/common/backoff/retry.go:170\ngo.temporal.io/server/common/backoff.ThrottleRetryContext\n\t/home/builder/temporal/common/backoff/retry.go:194\ngo.temporal.io/server/common/backoff.ThrottleRetry\n\t/home/builder/temporal/common/backoff/retry.go:171\ngo.temporal.io/server/service/worker/scanner.(*Scanner).startWorkflowWithRetry\n\t/home/builder/temporal/service/worker/scanner/scanner.go:208"}
{"level":"error","ts":"2023-02-13T22:26:24.427Z","msg":"error starting temporal-sys-history-scanner-workflow workflow","service":"worker","error":"context deadline exceeded","logging-call-at":"scanner.go:232","stacktrace":"go.temporal.io/server/common/log.(*zapLogger).Error\n\t/home/builder/temporal/common/log/zap_logger.go:144\ngo.temporal.io/server/service/worker/scanner.(*Scanner).startWorkflow\n\t/home/builder/temporal/service/worker/scanner/scanner.go:232\ngo.temporal.io/server/service/worker/scanner.(*Scanner).startWorkflowWithRetry.func1\n\t/home/builder/temporal/service/worker/scanner/scanner.go:209\ngo.temporal.io/server/common/backoff.ThrottleRetry.func1\n\t/home/builder/temporal/common/backoff/retry.go:170\ngo.temporal.io/server/common/backoff.ThrottleRetryContext\n\t/home/builder/temporal/common/backoff/retry.go:194\ngo.temporal.io/server/common/backoff.ThrottleRetry\n\t/home/builder/temporal/common/backoff/retry.go:171\ngo.temporal.io/server/service/worker/scanner.(*Scanner).startWorkflowWithRetry\n\t/home/builder/temporal/service/worker/scanner/scanner.go:208"}
{"level":"error","ts":"2023-02-13T22:26:24.608Z","msg":"error starting temporal-sys-tq-scanner-workflow workflow","service":"worker","error":"context deadline exceeded","logging-call-at":"scanner.go:232","stacktrace":"go.temporal.io/server/common/log.(*zapLogger).Error\n\t/home/builder/temporal/common/log/zap_logger.go:144\ngo.temporal.io/server/service/worker/scanner.(*Scanner).startWorkflow\n\t/home/builder/temporal/service/worker/scanner/scanner.go:232\ngo.temporal.io/server/service/worker/scanner.(*Scanner).startWorkflowWithRetry.func1\n\t/home/builder/temporal/service/worker/scanner/scanner.go:209\ngo.temporal.io/server/common/backoff.ThrottleRetry.func1\n\t/home/builder/temporal/common/backoff/retry.go:170\ngo.temporal.io/server/common/backoff.ThrottleRetryContext\n\t/home/builder/temporal/common/backoff/retry.go:194\ngo.temporal.io/server/common/backoff.ThrottleRetry\n\t/home/builder/temporal/common/backoff/retry.go:171\ngo.temporal.io/server/service/worker/scanner.(*Scanner).startWorkflowWithRetry\n\t/home/builder/temporal/service/worker/scanner/scanner.go:208"}
{"level":"error","ts":"2023-02-13T22:26:37.628Z","msg":"error starting temporal-sys-history-scanner-workflow workflow","service":"worker","error":"context deadline exceeded","logging-call-at":"scanner.go:232","stacktrace":"go.temporal.io/server/common/log.(*zapLogger).Error\n\t/home/builder/temporal/common/log/zap_logger.go:144\ngo.temporal.io/server/service/worker/scanner.(*Scanner).startWorkflow\n\t/home/builder/temporal/service/worker/scanner/scanner.go:232\ngo.temporal.io/server/service/worker/scanner.(*Scanner).startWorkflowWithRetry.func1\n\t/home/builder/temporal/service/worker/scanner/scanner.go:209\ngo.temporal.io/server/common/backoff.ThrottleRetry.func1\n\t/home/builder/temporal/common/backoff/retry.go:170\ngo.temporal.io/server/common/backoff.ThrottleRetryContext\n\t/home/builder/temporal/common/backoff/retry.go:194\ngo.temporal.io/server/common/backoff.ThrottleRetry\n\t/home/builder/temporal/common/backoff/retry.go:171\ngo.temporal.io/server/service/worker/scanner.(*Scanner).startWorkflowWithRetry\n\t/home/builder/temporal/service/worker/scanner/scanner.go:208"}
{"level":"error","ts":"2023-02-13T22:26:38.296Z","msg":"error starting temporal-sys-tq-scanner-workflow workflow","service":"worker","error":"context deadline exceeded","logging-call-at":"scanner.go:232","stacktrace":"go.temporal.io/server/common/log.(*zapLogger).Error\n\t/home/builder/temporal/common/log/zap_logger.go:144\ngo.temporal.io/server/service/worker/scanner.(*Scanner).startWorkflow\n\t/home/builder/temporal/service/worker/scanner/scanner.go:232\ngo.temporal.io/server/service/worker/scanner.(*Scanner).startWorkflowWithRetry.func1\n\t/home/builder/temporal/service/worker/scanner/scanner.go:209\ngo.temporal.io/server/common/backoff.ThrottleRetry.func1\n\t/home/builder/temporal/common/backoff/retry.go:170\ngo.temporal.io/server/common/backoff.ThrottleRetryContext\n\t/home/builder/temporal/common/backoff/retry.go:194\ngo.temporal.io/server/common/backoff.ThrottleRetry\n\t/home/builder/temporal/common/backoff/retry.go:171\ngo.temporal.io/server/service/worker/scanner.(*Scanner).startWorkflowWithRetry\n\t/home/builder/temporal/service/worker/scanner/scanner.go:208"}

temporal-history logs:

{"level":"info","ts":"2023-02-13T22:25:55.633Z","msg":"Current reachable members","service":"history","component":"service-resolver","service":"frontend","addresses":["10.244.0.16:11401"],"logging-call-at":"rpServiceResolver.go:283"}
{"level":"error","ts":"2023-02-13T22:25:55.633Z","msg":"start failed, rolling back","component":"fx","error":"context deadline exceeded","logging-call-at":"fx.go:1030","stacktrace":"go.temporal.io/server/common/log.(*zapLogger).Error\n\t/home/builder/temporal/common/log/zap_logger.go:144\ngo.temporal.io/server/temporal.(*fxLogAdapter).LogEvent\n\t/home/builder/temporal/temporal/fx.go:1030\ngo.uber.org/fx.(*App).start\n\t/go/pkg/mod/go.uber.org/fx@v1.18.2/app.go:681\ngo.uber.org/fx.withTimeout.func1\n\t/go/pkg/mod/go.uber.org/fx@v1.18.2/app.go:784"}
{"level":"error","ts":"2023-02-13T22:25:55.634Z","msg":"rollback failed","component":"fx","error":"context deadline exceeded","logging-call-at":"fx.go:1033","stacktrace":"go.temporal.io/server/common/log.(*zapLogger).Error\n\t/home/builder/temporal/common/log/zap_logger.go:144\ngo.temporal.io/server/temporal.(*fxLogAdapter).LogEvent\n\t/home/builder/temporal/temporal/fx.go:1033\ngo.uber.org/fx.(*App).start\n\t/go/pkg/mod/go.uber.org/fx@v1.18.2/app.go:684\ngo.uber.org/fx.withTimeout.func1\n\t/go/pkg/mod/go.uber.org/fx@v1.18.2/app.go:784"}

temporal-frontend logs:

{"level":"info","ts":"2023-02-13T22:38:08.374Z","msg":"history client encountered error","service":"frontend","error":"last connection error: connection error: desc = \"transport: authentication handshake failed: context deadline exceeded\"","service-error-type":"serviceerror.Unavailable","logging-call-at":"metric_client.go:90"}
{"level":"error","ts":"2023-02-13T22:38:08.374Z","msg":"service failures","operation":"StartWorkflowExecution","wf-namespace":"default","error":"last connection error: connection error: desc = \"transport: authentication handshake failed: context deadline exceeded\"","logging-call-at":"telemetry.go:280","stacktrace":"go.temporal.io/server/common/log.(*zapLogger).Error\n\t/home/builder/temporal/common/log/zap_logger.go:144\ngo.temporal.io/server/common/rpc/interceptor.(*TelemetryInterceptor).handleError\n\t/home/builder/temporal/common/rpc/interceptor/telemetry.go:280\ngo.temporal.io/server/common/rpc/interceptor.(*TelemetryInterceptor).Intercept\n\t/home/builder/temporal/common/rpc/interceptor/telemetry.go:151\ngoogle.golang.org/grpc.chainUnaryInterceptors.func1.1\n\t/go/pkg/mod/google.golang.org/grpc@v1.50.1/server.go:1165\ngo.temporal.io/server/common/metrics.NewServerMetricsContextInjectorInterceptor.func1\n\t/home/builder/temporal/common/metrics/grpc.go:66\ngoogle.golang.org/grpc.chainUnaryInterceptors.func1.1\n\t/go/pkg/mod/google.golang.org/grpc@v1.50.1/server.go:1165\ngo.opentelemetry.io/contrib/instrumentation/google.golang.org/grpc/otelgrpc.UnaryServerInterceptor.func1\n\t/go/pkg/mod/go.opentelemetry.io/contrib/instrumentation/google.golang.org/grpc/otelgrpc@v0.36.1/interceptor.go:352\ngoogle.golang.org/grpc.chainUnaryInterceptors.func1.1\n\t/go/pkg/mod/google.golang.org/grpc@v1.50.1/server.go:1165\ngo.temporal.io/server/common/rpc/interceptor.(*NamespaceLogInterceptor).Intercept\n\t/home/builder/temporal/common/rpc/interceptor/namespace_logger.go:84\ngoogle.golang.org/grpc.chainUnaryInterceptors.func1.1\n\t/go/pkg/mod/google.golang.org/grpc@v1.50.1/server.go:1165\ngo.temporal.io/server/common/rpc/interceptor.(*NamespaceValidatorInterceptor).LengthValidationIntercept\n\t/home/builder/temporal/common/rpc/interceptor/namespace_validator.go:103\ngoogle.golang.org/grpc.chainUnaryInterceptors.func1.1\n\t/go/pkg/mod/google.golang.org/grpc@v1.50.1/server.go:1165\n
go.temporal.io/server/common/rpc.ServiceErrorInterceptor\n\t/home/builder/temporal/common/rpc/grpc.go:137\ngoogle.golang.org/grpc.chainUnaryInterceptors.func1.1\n\t/go/pkg/mod/google.golang.org/grpc@v1.50.1/server.go:1165\ngoogle.golang.org/grpc.chainUnaryInterceptors.func1\n\t/go/pkg/mod/google.golang.org/grpc@v1.50.1/server.go:1167\ngo.temporal.io/api/workflowservice/v1._WorkflowService_StartWorkflowExecution_Handler\n\t/go/pkg/mod/go.temporal.io/api@v1.13.1-0.20221110200459-6a3cb21a3415/workflowservice/v1/service.pb.go:1464\ngoogle.golang.org/grpc.(*Server).processUnaryRPC\n\t/go/pkg/mod/google.golang.org/grpc@v1.50.1/server.go:1340\ngoogle.golang.org/grpc.(*Server).handleStream\n\t/go/pkg/mod/google.golang.org/grpc@v1.50.1/server.go:1713\ngoogle.golang.org/grpc.(*Server).serveStreams.func1.2\n\t/go/pkg/mod/google.golang.org/grpc@v1.50.1/server.go:965"}
{"level":"info","ts":"2023-02-13T22:38:08.433Z","msg":"history client encountered error","service":"frontend","error":"last connection error: connection error: desc = \"transport: authentication handshake failed: context deadline exceeded\"","service-error-type":"serviceerror.Unavailable","logging-call-at":"metric_client.go:90"}
{"level":"error","ts":"2023-02-13T22:38:08.433Z","msg":"service failures","operation":"StartWorkflowExecution","wf-namespace":"default","error":"last connection error: connection error: desc = \"transport: authentication handshake failed: context deadline exceeded\"","logging-call-at":"telemetry.go:280","stacktrace":"go.temporal.io/server/common/log.(*zapLogger).Error\n\t/home/builder/temporal/common/log/zap_logger.go:144\ngo.temporal.io/server/common/rpc/interceptor.(*TelemetryInterceptor).handleError\n\t/home/builder/temporal/common/rpc/interceptor/telemetry.go:280\ngo.temporal.io/server/common/rpc/interceptor.(*TelemetryInterceptor).Intercept\n\t/home/builder/temporal/common/rpc/interceptor/telemetry.go:151\ngoogle.golang.org/grpc.chainUnaryInterceptors.func1.1\n\t/go/pkg/mod/google.golang.org/grpc@v1.50.1/server.go:1165\ngo.temporal.io/server/common/metrics.NewServerMetricsContextInjectorInterceptor.func1\n\t/home/builder/temporal/common/metrics/grpc.go:66\ngoogle.golang.org/grpc.chainUnaryInterceptors.func1.1\n\t/go/pkg/mod/google.golang.org/grpc@v1.50.1/server.go:1165\ngo.opentelemetry.io/contrib/instrumentation/google.golang.org/grpc/otelgrpc.UnaryServerInterceptor.func1\n\t/go/pkg/mod/go.opentelemetry.io/contrib/instrumentation/google.golang.org/grpc/otelgrpc@v0.36.1/interceptor.go:352\ngoogle.golang.org/grpc.chainUnaryInterceptors.func1.1\n\t/go/pkg/mod/google.golang.org/grpc@v1.50.1/server.go:1165\ngo.temporal.io/server/common/rpc/interceptor.(*NamespaceLogInterceptor).Intercept\n\t/home/builder/temporal/common/rpc/interceptor/namespace_logger.go:84\ngoogle.golang.org/grpc.chainUnaryInterceptors.func1.1\n\t/go/pkg/mod/google.golang.org/grpc@v1.50.1/server.go:1165\ngo.temporal.io/server/common/rpc/interceptor.(*NamespaceValidatorInterceptor).LengthValidationIntercept\n\t/home/builder/temporal/common/rpc/interceptor/namespace_validator.go:103\ngoogle.golang.org/grpc.chainUnaryInterceptors.func1.1\n\t/go/pkg/mod/google.golang.org/grpc@v1.50.1/server.go:1165\n
go.temporal.io/server/common/rpc.ServiceErrorInterceptor\n\t/home/builder/temporal/common/rpc/grpc.go:137\ngoogle.golang.org/grpc.chainUnaryInterceptors.func1.1\n\t/go/pkg/mod/google.golang.org/grpc@v1.50.1/server.go:1165\ngoogle.golang.org/grpc.chainUnaryInterceptors.func1\n\t/go/pkg/mod/google.golang.org/grpc@v1.50.1/server.go:1167\ngo.temporal.io/api/workflowservice/v1._WorkflowService_StartWorkflowExecution_Handler\n\t/go/pkg/mod/go.temporal.io/api@v1.13.1-0.20221110200459-6a3cb21a3415/workflowservice/v1/service.pb.go:1464\ngoogle.golang.org/grpc.(*Server).processUnaryRPC\n\t/go/pkg/mod/google.golang.org/grpc@v1.50.1/server.go:1340\ngoogle.golang.org/grpc.(*Server).handleStream\n\t/go/pkg/mod/google.golang.org/grpc@v1.50.1/server.go:1713\ngoogle.golang.org/grpc.(*Server).serveStreams.func1.2\n\t/go/pkg/mod/google.golang.org/grpc@v1.50.1/server.go:965"}
{"level":"info","ts":"2023-02-13T22:38:08.906Z","msg":"history client encountered error","service":"frontend","error":"last connection error: connection error: desc = \"transport: authentication handshake failed: context deadline exceeded\"","service-error-type":"serviceerror.Unavailable","logging-call-at":"metric_client.go:90"}
{"level":"info","ts":"2023-02-13T22:38:08.937Z","msg":"history client encountered error","service":"frontend","error":"last connection error: connection error: desc = \"transport: authentication handshake failed: context deadline exceeded\"","service-error-type":"serviceerror.Unavailable","logging-call-at":"metric_client.go:90"}

The temporal-matching logs consist only of INFO-level entries like this one:

{"level":"info","ts":"2023-02-13T22:34:46.087Z","msg":"none","service":"matching","component":"matching-engine","wf-task-queue-name":"/_sys/<omitted by author>/2","wf-task-queue-type":"Workflow","wf-namespace":"default","lifecycle":"Started","logging-call-at":"taskQueueManager.go:291"}

Our first-party application logs many errors indicating that workflows cannot be started:

c.client.ExecuteWorkflow: context deadline exceeded

Output from kubectl get pods:

NAME                                          READY   STATUS      RESTARTS       AGE
pod/temporal-svc-admintools-97fc86655-dj95w   1/1     Running     0              21m
pod/temporal-svc-frontend-786cff5884-dbdmj    1/1     Running     2 (20m ago)    21m
pod/temporal-svc-history-fb7fb6954-8gzxc      1/1     Running     3 (107s ago)   21m
pod/temporal-svc-matching-8849dbd9-6hk7t      1/1     Running     2 (20m ago)    21m
pod/temporal-svc-schema-setup-t5msw           0/2     Completed   0              21m
pod/temporal-svc-schema-update-qgssn          0/2     Completed   4              21m
pod/temporal-svc-web-56f6dcb8cc-h5s59         1/1     Running     0              21m
pod/temporal-svc-worker-66f7f58849-hrlpb      1/1     Running     3 (20m ago)    21m

You can see that the history service was recently restarted. When I inspect the pod with kubectl describe, the termination reason is:

  Warning  Unhealthy    89s (x3 over 109s)  kubelet            Liveness probe failed: dial tcp 10.244.0.16:11403: i/o timeout
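For reference, that probe is a TCP dial from the kubelet against the history membership port (10.244.0.16:11403 in the message above); when the dial times out repeatedly, the kubelet kills the container. While investigating a CPU-starved pod, the thresholds can be loosened so the pod is not restarted mid-startup. This is a sketch only, with illustrative values; the field names are standard Kubernetes, but your chart may template them differently:

```yaml
livenessProbe:
  tcpSocket:
    port: 11403          # history membership port from the probe message above
  initialDelaySeconds: 60 # illustrative values, not recommended defaults
  periodSeconds: 10
  timeoutSeconds: 5
  failureThreshold: 5
```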

Update: this appears to be fixed by temporalio/temporal PR #3911, “Use max join time as start timeout” by yux0.

I patched in that change, built a new image, and reran the tests; the issues ceased.

Does anyone know whether that PR will make it into the next release? Thanks!