
ENH: Add serialization for incremental linear models #2211

Open · wants to merge 1 commit into main

Conversation

olegkkruglov
Contributor

@olegkkruglov olegkkruglov commented Dec 4, 2024

Description

  • Added a __getstate__ method to IncrementalLinearRegression and IncrementalRidge.
  • Removed a redundant check for the oneDAL table backend in onedal/datatypes/data_conversion.cpp. The removal is necessary because the partial result for linear models contains column-major tables on the C++ side; the serialization tests confirm that conversion works correctly for column-major tables as well.
  • Finalization is now triggered during serialization, inside the __getstate__ method.
  • Added a _need_to_finalize flag to avoid unnecessary calls to the finalization backend.
  • Added a test checking that serialization works correctly.
  • Updated deselected_tests.yaml.
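The finalize-on-serialize pattern described in the bullets above can be sketched as a minimal toy model. All class and attribute names below other than __getstate__ and _need_to_finalize are assumptions for illustration, not the actual sklearnex implementation:

```python
import pickle


class IncrementalEstimatorSketch:
    """Toy stand-in for an incremental estimator (hypothetical, simplified)."""

    def __init__(self):
        self._partial_result = []       # stands in for the C++-side partial result
        self._need_to_finalize = False  # guards against redundant finalization calls
        self.coef_ = None

    def partial_fit(self, batch):
        self._partial_result.append(sum(batch))
        self._need_to_finalize = True   # new data invalidates any finalized state
        return self

    def _finalize(self):
        # Only call the (expensive) finalization backend when there is new data.
        if self._need_to_finalize:
            self.coef_ = sum(self._partial_result)  # placeholder for the real solve
            self._need_to_finalize = False

    def __getstate__(self):
        # Finalize before pickling so the serialized object is complete and no
        # unfinalized native partial result has to cross the pickle boundary.
        self._finalize()
        return self.__dict__.copy()


est = IncrementalEstimatorSketch().partial_fit([1, 2]).partial_fit([3, 4])
restored = pickle.loads(pickle.dumps(est))
print(restored.coef_, restored._need_to_finalize)  # → 10 False
```

The flag means a pickle round trip after partial_fit yields a fully fitted object, while repeated serialization of an already-finalized estimator skips the backend call.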

A PR should start as a draft, then move to the ready-for-review state after CI passes and all applicable checkboxes are ticked.
This approach ensures that reviewers don't spend extra time asking for regular requirements.

You can remove a checkbox as not applicable only if it doesn't relate to this PR in any way.
For example, a docs-only PR doesn't require the performance checkboxes, while a PR with any change to actual code should keep them and justify how the change is expected to affect performance (or the justification should be self-evident).

Checklist to comply with before moving PR from draft:

PR completeness and readability

  • I have reviewed my changes thoroughly before submitting this pull request.
  • I have commented my code, particularly in hard-to-understand areas.
  • I have updated the documentation to reflect the changes or created a separate PR with update and provided its number in the description, if necessary.
  • Git commit message contains an appropriate signed-off-by string (see CONTRIBUTING.md for details).
  • I have added the respective label(s) to the PR if I have permission to do so.
  • I have resolved any merge conflicts that might occur with the base branch.

Testing

  • I have run it locally and tested the changes extensively.
  • All CI jobs are green or I have provided justification why they aren't.
  • I have extended the testing suite if new functionality was introduced in this PR.

Performance

  • I have measured performance for affected algorithms using scikit-learn_bench and provided at least a summary table with the measured data, if a performance change is expected.
  • I have provided justification for why performance has changed, or why changes are not expected.
  • I have provided justification for why quality metrics have changed, or why changes are not expected.
  • I have extended the benchmarking suite and provided a corresponding scikit-learn_bench PR if new measurable functionality was introduced in this PR.

@olegkkruglov olegkkruglov requested a review from DDJHB December 4, 2024 14:17
@olegkkruglov olegkkruglov marked this pull request as ready for review December 4, 2024 14:18
@icfaust
Contributor

icfaust commented Dec 5, 2024

/intelci: run

@@ -103,6 +103,13 @@ class IncrementalLinearRegression(
n_features_in_ : int
Number of features seen during ``fit`` or ``partial_fit``.

Note
Contributor

@david-cortes-intel david-cortes-intel Dec 9, 2024

Since this note will appear in multiple classes, perhaps it could be moved into a common variable and the docstrings modified programmatically for all the classes that will have the note.

Contributor Author


That's a good idea. I think it could be done in a separate PR that moves the pieces common to all incremental algorithms into shared classes/variables.
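As a rough illustration of the reviewer's suggestion (all names and the note text here are hypothetical, not taken from the actual diff), the shared note could live in one module-level variable and be appended to each class docstring by a small decorator:

```python
# Hypothetical sketch: keep the shared docstring note in one place and
# attach it programmatically to every incremental estimator class.
_SERIALIZATION_NOTE = """
Note
----
Serialization via pickle triggers finalization of the incremental state.
"""  # illustrative wording, not the actual note from the diff


def _add_serialization_note(cls):
    # Append the shared note to the class docstring (creating one if absent).
    cls.__doc__ = (cls.__doc__ or "") + _SERIALIZATION_NOTE
    return cls


@_add_serialization_note
class IncrementalLinearRegression:
    """Incremental linear regression (docstring body elided)."""


@_add_serialization_note
class IncrementalRidge:
    """Incremental ridge regression (docstring body elided)."""


print("Note" in IncrementalLinearRegression.__doc__)  # → True
```

With this approach, editing the note once updates every class that carries the decorator, which is the deduplication the review comment asks for.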
