
Grid search with per_float_feature_quantization exports parameters incorrectly #1833

@acrofales

Description

Problem:
When using per_float_feature_quantization as one of the hyperparameters to optimize in grid_search, the parameter is exported incorrectly:
{'params': {'depth': 6, 'verbose': 200, 'l2_leaf_reg': 1, 'iterations': 500, 'learning_rate': 0.1, 'per_float_feature_quantization': [0.0]},
This causes a problem when running grid_search with refit=True, and it is similarly problematic if you extract the params and try to refit manually (or do anything else with the optimal hyperparameters).
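
Roughly, the setup looks like this (a sketch only; the data and the candidate quantization values here are placeholders, not my actual pipeline):

```python
# Rough reproduction sketch; data and candidate values are placeholders.
import numpy as np
from catboost import CatBoostRegressor

rng = np.random.RandomState(0)
X = rng.rand(200, 4)
y = rng.rand(200)

model = CatBoostRegressor(iterations=500, learning_rate=0.1, verbose=200)

param_grid = {
    'depth': [4, 6],
    'l2_leaf_reg': [1, 3],
    # each candidate is a list of 'featureIndex:border_count=N' strings
    'per_float_feature_quantization': [
        ['0:border_count=1024'],
        ['0:border_count=255'],
    ],
}

result = model.grid_search(param_grid, X, y, cv=3, refit=False)

# Expected: one of the candidate lists above.
# Observed: result['params']['per_float_feature_quantization'] == [0.0]
print(result['params'])
```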

A workaround is to inspect the catboost_training.json file and find the run with the best performance, then copy the parameters from there. However, this is quite tedious.
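
Something along these lines, assuming the default train_dir of catboost_info (the exact layout of the file may vary by version, so this only loads the log for manual inspection):

```python
# Sketch of the workaround: load the training log and inspect it by hand.
# Assumes the default train_dir ('catboost_info'); the file layout is not
# guaranteed, so this just loads the JSON and shows its top-level keys.
import json

with open('catboost_info/catboost_training.json') as f:
    training_log = json.load(f)

print(training_log.keys())  # then locate the best run and copy its parameters
```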

catboost version: 0.26
Operating System: linux
CPU: tried on a local laptop with an Intel Core i5, as well as a Google Compute Engine n1-standard-4

Edit: I switched to sklearn's GridSearchCV instead, and then I get an error during fitting (probably suppressed in the catboost grid_search) saying that I cannot use None for per_float_feature_quantization. Perhaps this is related?
In which case, I guess a related question is: how can I include None as a potential value for per_float_feature_quantization in a grid search? I tried using an empty string, but that isn't accepted either.
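
For what it's worth, this is roughly what I tried on the sklearn side (again a sketch; the candidate values are placeholders):

```python
# Sketch of the sklearn GridSearchCV attempt; candidate values are placeholders.
from catboost import CatBoostRegressor
from sklearn.model_selection import GridSearchCV

param_grid = {
    'depth': [4, 6],
    'per_float_feature_quantization': [
        None,                       # this candidate is rejected at fit time
        ['0:border_count=1024'],
    ],
}

search = GridSearchCV(
    CatBoostRegressor(iterations=500, verbose=0),
    param_grid,
    cv=3,
)
# search.fit(X, y)  # fails on the None candidate with the error described above
```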
