Hi,
I have observed a couple of recurring errors with Temporal Server v1.22.0. The database is CockroachDB, the number of history shards is 512, and internal-frontend is enabled along with authorization.
May I know what is causing these errors?
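In case it helps, here is a trimmed-down sketch of the relevant parts of my static config. Hostnames, ports, the JWKS URI, and credentials below are placeholders rather than my real values; the shard count, internal-frontend service, and authorization blocks reflect what is enabled in the cluster.

```yaml
persistence:
  numHistoryShards: 512                    # fixed at cluster creation
  defaultStore: default
  datastores:
    default:
      sql:
        pluginName: "postgres12"           # CockroachDB is accessed via the Postgres plugin
        databaseName: "temporal"
        connectAddr: "cockroachdb:26257"   # placeholder host:port
        connectProtocol: "tcp"

services:
  frontend:
    rpc:
      grpcPort: 7233
      membershipPort: 6933
      bindOnIP: "0.0.0.0"
  internal-frontend:                       # internal-frontend enabled so system
    rpc:                                   # workers bypass external authorization
      grpcPort: 7236
      membershipPort: 6936
      bindOnIP: "0.0.0.0"
  history:
    rpc:
      grpcPort: 7234
      membershipPort: 6934
      bindOnIP: "0.0.0.0"
  matching:
    rpc:
      grpcPort: 7235
      membershipPort: 6935
      bindOnIP: "0.0.0.0"

global:
  authorization:                           # auth enabled on the external frontend
    jwtKeyProvider:
      keySourceURIs:
        - "https://issuer.example.com/.well-known/jwks.json"   # placeholder issuer
    permissionsClaimName: "permissions"
    authorizer: "default"
    claimMapper: "default"
```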
1. Matching
{"level":"error","ts":"2023-09-28T17:12:33.460Z","msg":"service failures","operation":"GetTaskQueueUserData","wf-namespace":"temporal-system","error":"task queue closed","logging-call-at":"telemetry.go:328","stacktrace":"go.temporal.io/server/common/log.(*zapLogger).Error\n\t/temporal/common/log/zap_logger.go:156\ngo.temporal.io/server/common/rpc/interceptor.(*TelemetryInterceptor).handleError\n\t/temporal/common/rpc/interceptor/telemetry.go:328\ngo.temporal.io/server/common/rpc/interceptor.(*TelemetryInterceptor).UnaryIntercept\n\t/temporal/common/rpc/interceptor/telemetry.go:169\ngoogle.golang.org/grpc.getChainUnaryHandler.func1\n\t/temporal/vendor/google.golang.org/grpc/server.go:1179\ngo.temporal.io/server/service.GrpcServerOptionsProvider.getUnaryInterceptors.NewServerMetricsTrailerPropagatorInterceptor.func4\n\t/temporal/common/metrics/grpc.go:113\ngoogle.golang.org/grpc.getChainUnaryHandler.func1\n\t/temporal/vendor/google.golang.org/grpc/server.go:1179\ngo.temporal.io/server/service.GrpcServerOptionsProvider.getUnaryInterceptors.NewServerMetricsContextInjectorInterceptor.func3\n\t/temporal/common/metrics/grpc.go:66\ngoogle.golang.org/grpc.getChainUnaryHandler.func1\n\t/temporal/vendor/google.golang.org/grpc/server.go:1179\ngo.opentelemetry.io/contrib/instrumentation/google.golang.org/grpc/otelgrpc.UnaryServerInterceptor.func1\n\t/temporal/vendor/go.opentelemetry.io/contrib/instrumentation/google.golang.org/grpc/otelgrpc/interceptor.go:344\ngoogle.golang.org/grpc.getChainUnaryHandler.func1\n\t/temporal/vendor/google.golang.org/grpc/server.go:1179\ngo.temporal.io/server/common/rpc.ServiceErrorInterceptor\n\t/temporal/common/rpc/grpc.go:145\ngoogle.golang.org/grpc.NewServer.chainUnaryServerInterceptors.chainUnaryInterceptors.func1\n\t/temporal/vendor/google.golang.org/grpc/server.go:1170\ngo.temporal.io/server/api/matchingservice/v1._MatchingService_GetTaskQueueUserData_Handler\n\t/temporal/api/matchingservice/v1/service.pb.go:660\ngoogle.golang.org/grpc.(*Server).processUnaryRPC\n\t/temporal/vendor/google.golang.org/grpc/server.go:1360\ngoogle.golang.org/grpc.(*Server).handleStream\n\t/temporal/vendor/google.golang.org/grpc/server.go:1737\ngoogle.golang.org/grpc.(*Server).serveStreams.func1.1\n\t/temporal/vendor/google.golang.org/grpc/server.go:982"}
2. History
{"level":"error","ts":"2023-09-28T17:07:43.031Z","msg":"Unable to process new range","shard-id":344,"address":"100.127.123.67:7234","component":"timer-queue-processor","error":"shard status unknown","logging-call-at":"queue_base.go:316","stacktrace":"go.temporal.io/server/common/log.(*zapLogger).Error\n\t/temporal/common/log/zap_logger.go:156\ngo.temporal.io/server/service/history/queues.(*queueBase).processNewRange\n\t/temporal/service/history/queues/queue_base.go:316\ngo.temporal.io/server/service/history/queues.(*scheduledQueue).processEventLoop\n\t/temporal/service/history/queues/queue_scheduled.go:218"}