Description
pycaret version checks

- I have checked that this issue has not already been reported here.
- I have confirmed this bug exists on the latest version of pycaret.
- I have confirmed this bug exists on the master branch of pycaret (`pip install -U git+https://github.com/pycaret/pycaret.git@master`).
Issue Description
I've installed the LightGBM GPU build following this tutorial and set `use_gpu=True` in the regression module. When running `setup` and `compare_models`, many warnings fill the output. I wanted to know:
- Is there anything wrong with my configuration?
- If not, are these warnings the result of an internal bug?
- If everything is working as intended, how can I mute the warnings?
Reproducible Example
```python
# train_data and test_data are pandas DataFrames prepared beforehand
from pycaret.regression import setup, compare_models

setup(
    train_data,
    target='label',
    index=True,
    test_data=test_data,
    imputation_type='simple',
    numeric_imputation='knn',
    normalize=True,
    # normalize_method='minmax',
    html=True,
    profile=False,
    verbose=False,
    session_id=3964,
    feature_selection=False,
    feature_selection_method='sequential',
    n_features_to_select=20,
    polynomial_features=True,
    polynomial_degree=1,
    remove_multicollinearity=True,
    # fold_strategy='stratifiedkfold',
    # fold_shuffle=True,
    fold=10,
    remove_outliers=False,
    transformation=True,
    use_gpu=True,
)

optimize_metric = 'MAE'
bests = compare_models(turbo=False, n_select=4,
                       sort=optimize_metric, verbose=False)
```
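Regarding the muting question: LightGBM's `[Info]`/`[Warning]` lines (and the OpenCL "1 warning generated." messages) are printed by the native library straight to the process's stdout, so Python's `warnings` filters and `contextlib.redirect_stdout` do not catch them. A stdlib-only sketch that silences them at the file-descriptor level (this is my own workaround, not a pycaret API):

```python
import contextlib
import os
import sys

@contextlib.contextmanager
def suppress_native_stdout():
    """Temporarily redirect the process-level stdout (fd 1) to the null device.

    Python-level redirection only covers writes that go through sys.stdout;
    messages emitted by native code (such as LightGBM's C++ logger) hit
    file descriptor 1 directly, so the descriptor itself must be swapped.
    """
    sys.stdout.flush()
    saved_fd = os.dup(1)                       # keep a handle to the real stdout
    devnull = os.open(os.devnull, os.O_WRONLY)
    try:
        os.dup2(devnull, 1)                    # fd 1 now points at the null device
        yield
    finally:
        sys.stdout.flush()                     # drop anything still buffered
        os.dup2(saved_fd, 1)                   # restore the original stdout
        os.close(saved_fd)
        os.close(devnull)
```

Usage would be `with suppress_native_stdout(): bests = compare_models(...)`. Note this hides all stdout from the wrapped call, including pycaret's own progress output. A more targeted alternative, if pycaret exposes the estimator, may be to pass a LightGBM regressor configured with `verbose=-1` into `compare_models(include=[...])`, though I have not verified that this silences the GPU/OpenCL messages.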
Expected Behavior
No warnings
Actual Results
[LightGBM] [Warning] There are no meaningful features which satisfy the provided configuration. Decreasing Dataset parameters min_data_in_bin or min_data_in_leaf and re-constructing Dataset might resolve this warning.
[LightGBM] [Info] This is the GPU trainer!!
[LightGBM] [Info] Total Bins 0
[LightGBM] [Info] Number of data points in the train set: 2, number of used features: 0
[LightGBM] [Info] Using GPU Device: NVIDIA GeForce RTX 3070 Ti, Vendor: NVIDIA Corporation
[LightGBM] [Info] Compiling OpenCL Kernel with 16 bins...
1 warning generated.
... (previous line repeated many times)
[LightGBM] [Info] GPU programs have been built
[LightGBM] [Warning] GPU acceleration is disabled because no non-trivial dense features can be found
[LightGBM] [Info] Start training from score 0.500000
[LightGBM] [Warning] Stopped training because there are no more leaves that meet the split requirements
... (previous line repeated many times)
1 warning generated.
... (previous line repeated many times)
[LightGBM] [Warning] bagging_freq is set=6, subsample_freq=0 will be ignored. Current value: bagging_freq=6
[LightGBM] [Warning] feature_fraction is set=0.4035993095467378, colsample_bytree=1.0 will be ignored. Current value: feature_fraction=0.4035993095467378
[LightGBM] [Warning] bagging_fraction is set=0.6090381962870445, subsample=1.0 will be ignored. Current value: bagging_fraction=0.6090381962870445
[LightGBM] [Warning] bagging_freq is set=6, subsample_freq=0 will be ignored. Current value: bagging_freq=6
[LightGBM] [Warning] feature_fraction is set=0.4035993095467378, colsample_bytree=1.0 will be ignored. Current value: feature_fraction=0.4035993095467378
[LightGBM] [Warning] bagging_fraction is set=0.6090381962870445, subsample=1.0 will be ignored. Current value: bagging_fraction=0.6090381962870445
[LightGBM] [Info] This is the GPU trainer!!
[LightGBM] [Info] Total Bins 6613
[LightGBM] [Info] Number of data points in the train set: 407, number of used features: 51
[LightGBM] [Info] Using GPU Device: NVIDIA GeForce RTX 3070 Ti, Vendor: NVIDIA Corporation
[LightGBM] [Info] Compiling OpenCL Kernel with 256 bins...
[LightGBM] [Info] GPU programs have been built
[LightGBM] [Info] Size of histogram bin entry: 8
[LightGBM] [Info] 51 dense feature groups (0.02 MB) transferred to GPU in 0.000867 secs. 0 sparse feature groups
[LightGBM] [Info] Start training from score 23.228501
[LightGBM] [Info] Size of histogram bin entry: 8
[LightGBM] [Info] 51 dense feature groups (0.01 MB) transferred to GPU in 0.000790 secs. 0 sparse feature groups
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf
... (previous line repeated several times)
[LightGBM] [Info] Size of histogram bin entry: 8
[LightGBM] [Info] 51 dense feature groups (0.01 MB) transferred to GPU in 0.000821 secs. 0 sparse feature groups
...
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf
[LightGBM] [Warning] No further splits with positive gain, best gain: -inf
[LightGBM] [Info] Size of histogram bin entry: 8
[LightGBM] [Info] 48 dense feature groups (0.02 MB) transferred to GPU in 0.001392 secs. 0 sparse feature groups
... (log truncated)
Installed Versions
PyCaret required dependencies:
pip: 22.3.1
setuptools: 65.5.1
pycaret: 3.0.0rc5
IPython: 8.6.0
ipywidgets: 8.0.2
tqdm: 4.64.1
numpy: 1.22.4
pandas: 1.5.2
jinja2: 3.1.2
scipy: 1.8.1
joblib: 1.2.0
sklearn: 1.1.3
pyod: 1.0.6
imblearn: 0.9.1
category_encoders: 2.5.1.post0
lightgbm: 3.3.3.99
numba: 0.56.4
requests: 2.28.1
matplotlib: 3.6.2
scikitplot: 0.3.7
yellowbrick: 1.5
plotly: 5.11.0
kaleido: 0.2.1
statsmodels: 0.13.5
sktime: 0.14.0
tbats: 1.1.1
pmdarima: 2.0.1
psutil: 5.9.4
PyCaret optional dependencies:
shap: Not installed
interpret: Not installed
umap: Not installed
pandas_profiling: Not installed
explainerdashboard: Not installed
autoviz: Not installed
fairlearn: Not installed
xgboost: Not installed
catboost: Not installed
kmodes: Not installed
mlxtend: Not installed
statsforecast: Not installed
tune_sklearn: 0.4.5
ray: 2.1.0
hyperopt: 0.2.7
optuna: 3.0.3
skopt: 0.9.0
mlflow: 1.30.0
gradio: 3.11.0
fastapi: 0.87.0
uvicorn: 0.20.0
m2cgen: 0.10.0
evidently: 0.2.0
nltk: Not installed
pyLDAvis: Not installed
gensim: Not installed
spacy: Not installed
wordcloud: Not installed
textblob: Not installed
fugue: Not installed
streamlit: Not installed
prophet: Not installed