Describe the bug
Running TraceQL queries like `{trace:id = "<some trace id>"}` does not return data for some trace ids.

I have this trace id in Tempo: `fbb99bfc59174ccefa0338d4886914`. I can run a TraceQL query for the bare trace id (`fbb99bfc59174ccefa0338d4886914`) and the data for the trace is returned. Additionally, if I run a query for all traces (`{}`), I see it in the returned table. But if I run `{trace:id = "fbb99bfc59174ccefa0338d4886914"}`, I get "0 series returned".

This only seems to affect some trace ids, though: for example, `7fbea980725df26097c73beeaa8ba8` returns the expected trace in the table view when running `{trace:id = "7fbea980725df26097c73beeaa8ba8"}`.
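For what it's worth, this is roughly how I can check the behavior against Tempo's HTTP API directly. This is only a sketch: the `localhost:3200` address assumes a port-forward to the query-frontend and the Python/`requests` usage is just for illustration; the `/api/search` parameters are the ones visible in the query-frontend logs below, and `/api/traces/<id>` is the normal trace-by-id lookup.

```python
import time

import requests  # assumption: plain HTTP calls against a port-forwarded query-frontend

TEMPO = "http://localhost:3200"  # assumed address, not something from my setup above
TRACE_ID = "fbb99bfc59174ccefa0338d4886914"

end = int(time.time())
start = end - 3 * 3600  # same 3-hour window as the Grafana queries in the logs below

# Looking the trace up by id returns the trace data
by_id = requests.get(f"{TEMPO}/api/traces/{TRACE_ID}")
print("trace by id:", by_id.status_code, len(by_id.content), "bytes")

# Searching for the same id with a trace:id filter comes back empty (the behavior described above)
search = requests.get(
    f"{TEMPO}/api/search",
    params={
        "q": f'{{trace:id = "{TRACE_ID}"}}',
        "limit": 20,
        "spss": 3,
        "start": start,
        "end": end,
    },
)
print("traceql search:", search.status_code, search.json().get("traces"))
```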
To Reproduce
Steps to reproduce the behavior:
I'm not exactly sure how to reproduce this, as only some trace ids appear to be affected. The details under the bug description (and the sketch above) are the best information I have.
Expected behavior
I expect a query for a known trace id, like `{trace:id = "fbb99bfc59174ccefa0338d4886914"}`, to return data.
Environment:
- Infrastructure: Kubernetes
- Deployment tool: helm
- Tempo version: 2.7.1
- Grafana version: 11.6.0
Additional Context
I looked at the query-frontend logs and didn't see anything obvious explaining why one query returns data and the other doesn't.
Running TraceQL query {trace:id = "7fbea980725df26097c73beeaa8ba8"}
:
level=info ts=2025-04-08T00:18:18.684153195Z caller=search_handlers.go:191 msg="search request" tenant=single-tenant query="{trace:id = \"7fbea980725df26097c73beeaa8ba8\"}" range_seconds=10800 limit=20 spans_per_spanset=3
level=info ts=2025-04-08T00:18:18.688416552Z caller=search_handlers.go:172 msg="search response" tenant=single-tenant query="{trace:id = \"7fbea980725df26097c73beeaa8ba8\"}" range_seconds=10800 duration_seconds=0.004263983 request_throughput=2.186875510526191e+07 total_requests=6 total_blockBytes=95262 total_blocks=3 completed_requests=6 inspected_bytes=93248 inspected_traces=0 inspected_spans=0 status_code=200 error=null
level=info ts=2025-04-08T00:18:18.688568056Z caller=handler.go:134 tenant=single-tenant method=GET traceID= url="/api/search?q=%7Btrace%3Aid%20%3D%20%227fbea980725df26097c73beeaa8ba8%22%7D&limit=20&spss=3&start=1744060698&end=1744071498" duration=4.310987ms response_size=1170 status=200
Running TraceQL query {trace:id = "fbb99bfc59174ccefa0338d4886914"}
:
level=info ts=2025-04-08T00:18:29.278219615Z caller=search_handlers.go:191 msg="search request" tenant=single-tenant query="{trace:id = \"fbb99bfc59174ccefa0338d4886914\"}" range_seconds=10800 limit=20 spans_per_spanset=3
level=info ts=2025-04-08T00:18:29.282487066Z caller=search_handlers.go:172 msg="search response" tenant=single-tenant query="{trace:id = \"fbb99bfc59174ccefa0338d4886914\"}" range_seconds=10800 duration_seconds=0.004268084 request_throughput=2.0962099152687717e+07 total_requests=6 total_blockBytes=95262 total_blocks=3 completed_requests=6 inspected_bytes=89468 inspected_traces=0 inspected_spans=0 status_code=200 error=null
level=info ts=2025-04-08T00:18:29.282527334Z caller=handler.go:134 tenant=single-tenant method=GET traceID= url="/api/search?q=%7Btrace%3Aid%20%3D%20%22fbb99bfc59174ccefa0338d4886914%22%7D&limit=20&spss=3&start=1744060708&end=1744071508" duration=4.309024ms response_size=124 status=200
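As a sanity check (throwaway snippet, nothing from my deployment), decoding the `q` parameter from the two logged URLs shows both queries reach the query-frontend exactly as typed, so it doesn't look like an encoding problem on the Grafana side:

```python
from urllib.parse import parse_qs, urlsplit

# URLs copied from the two handler.go log lines above
urls = [
    "/api/search?q=%7Btrace%3Aid%20%3D%20%227fbea980725df26097c73beeaa8ba8%22%7D&limit=20&spss=3&start=1744060698&end=1744071498",
    "/api/search?q=%7Btrace%3Aid%20%3D%20%22fbb99bfc59174ccefa0338d4886914%22%7D&limit=20&spss=3&start=1744060708&end=1744071508",
]

for url in urls:
    params = parse_qs(urlsplit(url).query)
    # q decodes back to the exact TraceQL typed in Grafana
    print(params["q"][0], "->", params["start"][0], params["end"][0])
```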
I also saw issue #4437 and was wondering if it could be related.