
added display for test metrics in predict model #1363


Merged: 4 commits merged into time_series from predict_model_verbose on Jun 17, 2021

Conversation

ngupta23 (Collaborator) commented Jun 16, 2021

Describe the changes you've made

  • predict_model now displays the test metrics when verbose is set to True (the default)
  • Note that this is slightly different from the regression/classification case. There are several corner cases, such as:
    • If the model has not been finalized, the predictions match y_test
    • If the model has been finalized, the predictions will not match the indices from y_test (the metrics display NaN in that case)
    • Also, the user can use a different fh in predict_model than was used in setup. In this case, only the indices common to the predictions and the test set are used to compute the metrics (see the sketch after this list).
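
To illustrate the common-index logic above, here is a minimal sketch; it is not pycaret's actual implementation, and the example series, index values, and the choice of MAE as the metric are assumptions:

    import numpy as np
    import pandas as pd
    from sklearn.metrics import mean_absolute_error

    # Hypothetical test set and predictions with partially overlapping indices,
    # e.g. when the fh passed to predict_model differs from the one in setup.
    y_test = pd.Series([10.0, 12.0, 11.0, 13.0], index=pd.RangeIndex(100, 104))
    y_pred = pd.Series([10.5, 11.5], index=pd.RangeIndex(100, 102))

    # Only the indices common to both series contribute to the metrics.
    common = y_test.index.intersection(y_pred.index)
    if len(common) == 0:
        mae = np.nan  # e.g. after finalize_model, predictions lie past y_test
    else:
        mae = mean_absolute_error(y_test.loc[common], y_pred.loc[common])
    print(mae)  # 0.5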

Type of change

  • Bug fix (non-breaking change which fixes an issue)
  • New feature (non-breaking change which adds functionality)
  • Code style update (formatting, local variables)
  • Breaking change (fix or feature that would cause existing functionality to not work as expected)
  • This change requires a documentation update

How Has This Been Tested?

A unit test has been added.

Describe if there is any unusual behaviour of your code

Warnings are issued for all of the corner cases identified above.

Checklist:

  • My code follows the style guidelines of this project.
  • I have performed a self-review of my own code.
  • I have commented my code, particularly in hard-to-understand areas.
  • I have made corresponding changes to the documentation.
  • My changes generate no new warnings.
  • I have added tests that prove my fix is effective or that my feature works.
  • New and existing unit tests pass locally with my changes.
  • Any dependent changes have been merged and published in downstream modules.

Yard1 (Member) commented Jun 16, 2021

@ngupta23 Thanks, I'll edit create_model and we can merge this

Yard1 (Member) commented Jun 16, 2021

@ngupta23 Displaying the frame returned by predict_model if cross_validation=False
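
For context, a hedged usage sketch of the behavior being discussed; the module path, function names, and parameters below are assumptions based on pycaret's general API, not confirmed by this thread:

    # Hypothetical usage: with cross_validation=False, create_model would show
    # the frame returned by predict_model, and predict_model itself now
    # displays the test metrics when verbose=True (the default).
    from pycaret.time_series import setup, create_model, predict_model

    setup(data=y, fh=12, session_id=42)    # y: a univariate pandas Series
    model = create_model("arima", cross_validation=False)
    preds = predict_model(model)           # prints test metrics when verbose=True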

Review comment on the following snippet from the PR diff:

verbose=verbose,
html_param=self.html_param,
)
except:
Contributor:

Can we catch the expected error here instead of catching all kinds of errors?

Collaborator (author):

I have not delved into this part; it was copied over from the regression module. @Yard1, any thoughts on this?

Member:

It would be a good idea, but I can't remember what sort of error can pop up there.
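
A minimal sketch of the reviewer's suggestion, assuming the expected failure is an ImportError in the display path; the thread never identifies the actual exception, and the Display class below is only a stand-in for pycaret's internal helper:

    # Hypothetical narrowing of the bare `except:` discussed above. The
    # exception type caught here is an assumption.
    class Display:
        """Stand-in for pycaret's internal Display helper (assumed interface)."""
        def __init__(self, verbose: bool, html_param: bool):
            if html_param:
                import IPython  # may raise ImportError outside a notebook setup
            self.verbose = verbose
            self.html_param = html_param

    def make_display(verbose: bool, html_param: bool) -> Display:
        try:
            return Display(verbose=verbose, html_param=html_param)
        except ImportError:
            # Narrow fallback: only the anticipated failure degrades gracefully;
            # unexpected errors still surface instead of being silently swallowed.
            return Display(verbose=False, html_param=False)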

Yard1 merged commit a3c68d4 into time_series on Jun 17, 2021
ngupta23 deleted the predict_model_verbose branch on Jun 18, 2021 at 01:51
ngupta23 added the time_series label (Topics related to the time series) and removed the time_series_dev label on Sep 19, 2021
Labels: time_series (Topics related to the time series)
3 participants