I’ve deployed Temporal using the Helm charts from temporalio/helm-charts. Our clusters run Istio, and the Istio sidecar proxy is injected automatically during deployment.
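For context, there is nothing custom about the deployment itself. It is roughly the following (the namespace and release name here are just placeholders for illustration):

# Istio sidecar injection is enabled on the namespace, so the proxy is added automatically
kubectl label namespace temporal istio-injection=enabled

# Temporal installed straight from the cloned temporalio/helm-charts repo
helm install temporaltest . --namespace temporal --timeout 15m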
I have no issues bringing up the Temporal services, but after about 60 seconds, when the PollActivityTaskQueue long poll kicks in, the temporal-frontend service starts throwing errors like "server closed the stream without sending trailers".
Here is one such error event. Any idea what is wrong? Please advise. Thanks.
{ "level": "error", "ts": "2021-03-14T18:24:40.832Z", "msg": "PollActivityTaskQueue failed.", "service": "frontend", "wf-task-queue-name": "/_sys/temporal-sys-tq-scanner-taskqueue-0/3", "value": "1m9.999803152s", "error": "server closed the stream without sending trailers", "logging-call-at": "workflowHandler.go:1045", "stacktrace": "go.temporal.io/server/common/log/loggerimpl.(*loggerImpl).Error\n\t/temporal/common/log/loggerimpl/logger.go:138\ngo.temporal.io/server/service/frontend.(*WorkflowHandler).PollActivityTaskQueue\n\t/temporal/service/frontend/workflowHandler.go:1045\ngo.temporal.io/server/service/frontend.(*DCRedirectionHandlerImpl).PollActivityTaskQueue.func2\n\t/temporal/service/frontend/dcRedirectionHandler.go:502\ngo.temporal.io/server/service/frontend.(*NoopRedirectionPolicy).WithNamespaceRedirect\n\t/temporal/service/frontend/dcRedirectionPolicy.go:116\ngo.temporal.io/server/service/frontend.(*DCRedirectionHandlerImpl).PollActivityTaskQueue\n\t/temporal/service/frontend/dcRedirectionHandler.go:498\ngo.temporal.io/api/workflowservice/v1._WorkflowService_PollActivityTaskQueue_Handler.func1\n\t/go/pkg/mod/go.temporal.io/api@v1.4.0/workflowservice/v1/service.pb.go:1137\ngo.temporal.io/server/common/authorization.(*interceptor).Interceptor\n\t/temporal/common/authorization/interceptor.go:136\ngoogle.golang.org/grpc.getChainUnaryHandler.func1\n\t/go/pkg/mod/google.golang.org/grpc@v1.34.0/server.go:1051\ngo.temporal.io/server/common/rpc/interceptor.(*NamespaceCountLimitInterceptor).Intercept\n\t/temporal/common/rpc/interceptor/namespace_count_limit.go:84\ngoogle.golang.org/grpc.getChainUnaryHandler.func1\n\t/go/pkg/mod/google.golang.org/grpc@v1.34.0/server.go:1051\ngo.temporal.io/server/common/rpc/interceptor.(*NamespaceRateLimitInterceptor).Intercept\n\t/temporal/common/rpc/interceptor/namespace_rate_limit.go:85\ngoogle.golang.org/grpc.getChainUnaryHandler.func1\n\t/go/pkg/mod/google.golang.org/grpc@v1.34.0/server.go:1051\ngo.temporal.io/server/common/rpc/interceptor.(*RateLimitInterceptor).Intercept\n\t/temporal/common/rpc/interceptor/rate_limit.go:79\ngoogle.golang.org/grpc.getChainUnaryHandler.func1\n\t/go/pkg/mod/google.golang.org/grpc@v1.34.0/server.go:1051\ngo.temporal.io/server/common/rpc/interceptor.(*TelemetryInterceptor).Intercept\n\t/temporal/common/rpc/interceptor/telemetry.go:91\ngoogle.golang.org/grpc.getChainUnaryHandler.func1\n\t/go/pkg/mod/google.golang.org/grpc@v1.34.0/server.go:1051\ngo.temporal.io/server/common/rpc.ServiceErrorInterceptor\n\t/temporal/common/rpc/grpc.go:100\ngoogle.golang.org/grpc.chainUnaryServerInterceptors.func1\n\t/go/pkg/mod/google.golang.org/grpc@v1.34.0/server.go:1037\ngo.temporal.io/api/workflowservice/v1._WorkflowService_PollActivityTaskQueue_Handler\n\t/go/pkg/mod/go.temporal.io/api@v1.4.0/workflowservice/v1/service.pb.go:1139\ngoogle.golang.org/grpc.(*Server).processUnaryRPC\n\t/go/pkg/mod/google.golang.org/grpc@v1.34.0/server.go:1210\ngoogle.golang.org/grpc.(*Server).handleStream\n\t/go/pkg/mod/google.golang.org/grpc@v1.34.0/server.go:1533\ngoogle.golang.org/grpc.(*Server).serveStreams.func1.2\n\t/go/pkg/mod/google.golang.org/grpc@v1.34.0/server.go:871" }