lib/promscrape: added label_limit global and per job scrape param #7795
Conversation
Force-pushed from e130c6d to 1e01be3
Force-pushed from 9a6cd6d to 642d074
@AndrewChubatiuk kindly pinging you on this
Force-pushed from 5250811 to a8d1d74
Force-pushed from 602cb6c to f213ad6
@hagen1778 please review it
Force-pushed from 36365b0 to 6b2a9f6
LGTM
lib/promscrape/scrapework.go (Outdated)
  dst = appendRow(dst, "scrape_timeout_seconds", sw.Config.ScrapeTimeout.Seconds(), timestamp)
  dst = appendRow(dst, "up", float64(am.up), timestamp)
- wc.addRows(sw.Config, dst, timestamp, false)
+ _ = wc.tryAddRows(sw.Config, dst, timestamp, false)
If this case is impossible, please add an assert: logger.Fatalf("BUG: tryAddRows cannot return error for autoMetrics").
updated
Looks like it's possible to breach this limit and get a panic.
There is a problem; we can either apply the label limit to auto-metrics as well and log it as an error, or ignore the label limit for them.
I think we should go with option 1. Then, hopefully, the storage can apply its own limits and drop auto metrics as well.
apply labels limit to auto-metrics as well and log it as error
Did this, see the update.
Looks good overall, please see minor comments
Force-pushed from cb24497 to 3eaebd0
Signed-off-by: hagen1778 <roman@victoriametrics.com>
* remove double update of `scrapesSkippedByLabelLimit` on L:1032
* make processing of label_limit and sample_limit alike
* process error on stale series send, as it would give at least some hints on what's going on. Before, the stale series were just silently dropped.
Signed-off-by: hagen1778 <roman@victoriametrics.com>
Signed-off-by: hagen1778 <roman@victoriametrics.com>
…for consistency with `scrape_samples_limit` Signed-off-by: hagen1778 <roman@victoriametrics.com>
…ted metrics Signed-off-by: hagen1778 <roman@victoriametrics.com>
Signed-off-by: hagen1778 <roman@victoriametrics.com>
Co-authored-by: Max Kotliar <kotlyar.maksim@gmail.com>
Signed-off-by: hagen1778 <roman@victoriametrics.com>
Signed-off-by: hagen1778 <roman@victoriametrics.com>
The label count limit can be exceeded when adding auto metrics, so it is a bad idea to panic here. Signed-off-by: hagen1778 <roman@victoriametrics.com>
Force-pushed from d999488 to 6771c84
LGTM
Co-authored-by: Andrii Chubatiuk <achubatiuk@victoriametrics.com>
Thanks for the contribution!
This commit introduces a new scrape config option, `label_limit`, which defines the label count limit for time series. The whole scrape is discarded in case of a limit breach. This option can also be defined with the `__label_limit__` label during scrape relabeling, as well as globally for all scrape targets. Fixes #7660 and #3233
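Based on the commit message above, a hypothetical scrape config sketch showing the three places the limit can be set (field names follow the PR description; values and exact placement are illustrative):

```yaml
global:
  # assumed: global default applied to all scrape targets
  label_limit: 40

scrape_configs:
  - job_name: app
    # per-job override of the global default
    label_limit: 60
    static_configs:
      - targets: ["app:8080"]

  - job_name: dynamic
    static_configs:
      - targets: ["svc:9090"]
    relabel_configs:
      # per-target override via the special __label_limit__ label
      - target_label: __label_limit__
        replacement: "30"
```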
Describe Your Changes
related issues:
* `label_limit` in scrape config on job level #7660
* with this change, the suggestion for #3233 is to set `label_limit` to a default value for the scrape configuration, i.e. a `label_limit` with a value that is up to maxLabelsPerTimeseries if it's defined; this will allow avoiding propagation of custom labelLimits into the agent's tryPush function
Checklist
The following checks are mandatory: