Temporal version: 1.7.0
Elasticsearch version: 7.10.1

I have two nodes running the Temporal server as a cluster. The error below is emitted on only one node; the other node never shows it.

After changing the log level from "error" to "info" as shown below, there is still no useful information in the logs. Any clue?
es-visibility:
  elasticsearch:
    version: "v7"
    logLevel: "info"
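For context, that snippet sits under `persistence.datastores` in my server YAML. A sketch of the surrounding layout (the `url` host and `indices.visibility` index name below are placeholders, not my real values):

```yaml
# Sketch of where the es-visibility snippet lives in the Temporal server
# config. Host and index name are placeholders for my setup.
persistence:
  datastores:
    es-visibility:
      elasticsearch:
        version: "v7"
        logLevel: "info"
        url:
          scheme: "http"
          host: "127.0.0.1:9200"
        indices:
          visibility: temporal-visibility
```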
temporal-server[12542]: {"level":"error","ts":"2021-03-16T13:19:14.728+0800","msg":"Operation failed with internal error.","service":"history","error":"UpdateWorkflowExecution: failed to update current execution. Error: assertRunIDAndUpdateCurrentExecution failed. Current RunId was ea0c196d-64d3-4744-8acf-9d70631ba42c, expected 5c714c56-0ee7-4806-a5bd-eb8d1a6059f6","metric-scope":5,"shard-id":264,"logging-call-at":"persistenceMetricClients.go:676"}

Stack trace (unwrapped from the "stacktrace" field of the entry above):
go.temporal.io/server/common/log/loggerimpl.(*loggerImpl).Error
    /temporal/common/log/loggerimpl/logger.go:138
go.temporal.io/server/common/persistence.(*workflowExecutionPersistenceClient).updateErrorMetric
    /temporal/common/persistence/persistenceMetricClients.go:676
go.temporal.io/server/common/persistence.(*workflowExecutionPersistenceClient).UpdateWorkflowExecution
    /temporal/common/persistence/persistenceMetricClients.go:279
go.temporal.io/server/service/history/shard.(*ContextImpl).UpdateWorkflowExecution
    /temporal/service/history/shard/context_impl.go:532
go.temporal.io/server/service/history.(*workflowExecutionContextImpl).updateWorkflowExecutionWithRetry.func1
    /temporal/service/history/workflowExecutionContext.go:1043
go.temporal.io/server/common/backoff.Retry
    /temporal/common/backoff/retry.go:103
go.temporal.io/server/service/history.(*workflowExecutionContextImpl).updateWorkflowExecutionWithRetry
    /temporal/service/history/workflowExecutionContext.go:1047
go.temporal.io/server/service/history.(*workflowExecutionContextImpl).updateWorkflowExecutionWithNew
    /temporal/service/history/workflowExecutionContext.go:754
go.temporal.io/server/service/history.(*workflowExecutionContextImpl).updateWorkflowExecutionAsActive
    /temporal/service/history/workflowExecutionContext.go:605
go.temporal.io/server/service/history.(*timerQueueActiveTaskExecutor).updateWorkflowExecution
    /temporal/service/history/timerQueueActiveTaskExecutor.go:599
go.temporal.io/server/service/history.(*timerQueueActiveTaskExecutor).executeWorkflowBackoffTimerTask
    /temporal/service/history/timerQueueActiveTaskExecutor.go:370
go.temporal.io/server/service/history.(*timerQueueActiveTaskExecutor).execute
    /temporal/service/history/timerQueueActiveTaskExecutor.go:104
go.temporal.io/server/service/history.(*timerQueueActiveProcessorImpl).process
    /temporal/service/history/timerQueueActiveProcessor.go:303
go.temporal.io/server/service/history.(*taskProcessor).processTaskOnce
    /temporal/service/history/taskProcessor.go:258
go.temporal.io/server/service/history.(*taskProcessor).processTaskAndAck.func1
    /temporal/service/history/taskProcessor.go:211
go.temporal.io/server/common/backoff.Retry
    /temporal/common/backoff/retry.go:103
go.temporal.io/server/service/history.(*taskProcessor).processTaskAndAck
    /temporal/service/history/taskProcessor.go:238
go.temporal.io/server/service/history.(*taskProcessor).taskWorker
    /temporal/service/history/taskProcessor.go:161
temporal-server[12542]: {"level":"info","ts":"2021-03-16T13:19:14.731+0800","msg":"Range updated for shardID","service":"history","shard-id":264,"address":"192.168.44.10:7234","shard-item":"0xc001baef00","shard-range-id":26454,"previous-shard-range-id":26453,"number":27737980929,"next-number":27739029504,"logging-call-at":"context_impl.go:863"}
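To show what I understand the error message itself to mean (just my own sketch, not Temporal's actual persistence code), the update looks like a conditional swap of the "current run" pointer that fails when the stored run ID is not the one this node expected, which would fit two nodes racing on the same shard. All names below are illustrative:

```go
package main

import "fmt"

// Hypothetical sketch of the check behind "assertRunIDAndUpdateCurrentExecution
// failed": before pointing the current-execution record at a new run, the store
// asserts that the run ID it currently holds is the one the caller expects to
// replace. updateCurrentExecution is an illustrative name, not Temporal's API.
func updateCurrentExecution(storedRunID, expectedRunID, newRunID string) (string, error) {
	if storedRunID != expectedRunID {
		// Another writer already moved the current run; same shape as the log line.
		return storedRunID, fmt.Errorf(
			"assertRunIDAndUpdateCurrentExecution failed. Current RunId was %s, expected %s",
			storedRunID, expectedRunID)
	}
	return newRunID, nil
}

func main() {
	// Mismatch case: the stored run ID is not the one this node expected.
	if _, err := updateCurrentExecution("ea0c196d", "5c714c56", "new-run"); err != nil {
		fmt.Println(err)
	}
}
```

If that reading is right, my real question becomes why both nodes appear to be writing to shard 264 at the same time.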