
Release Process : Pre-release Benchmarks #194

Closed
antoinecarme opened this issue Apr 8, 2022 · 1 comment


antoinecarme (Owner) commented Apr 8, 2022

Following #176 and #188:

Need to update PyAF benchmarks with the results for the next release.

https://github.com/antoinecarme/PyAF_Benchmarks

Some refactoring is needed to automate the benchmark execution and integration.

Better reporting is also needed:

  1. Add a report on training times and their evolution across previous releases.
  2. Add a global quality report (MAPE + L1 error distribution/quantiles for all releases and all benchmark datasets).
  3. Add a summary report presenting the results (Jupyter notebook).
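
Item 2 could be sketched as follows. This is a minimal illustration only: the nested results layout, the function names, and the toy numbers are assumptions for the example, not code from PyAF_Benchmarks.

```python
# Sketch of a global quality report: mean MAPE and L1-error quantiles
# aggregated per release over all benchmark datasets.
# The {release: {dataset: (actual, forecast)}} layout is hypothetical.
import numpy as np

def mape(actual, forecast):
    """Mean absolute percentage error, in percent."""
    a, f = np.asarray(actual, float), np.asarray(forecast, float)
    return 100.0 * np.mean(np.abs((a - f) / a))

def quality_report(results, quantiles=(0.25, 0.5, 0.75)):
    """results: {release: {dataset: (actual, forecast)}} -> per-release summary."""
    report = {}
    for release, datasets in results.items():
        mapes, l1_errors = [], []
        for actual, forecast in datasets.values():
            a, f = np.asarray(actual, float), np.asarray(forecast, float)
            mapes.append(mape(a, f))
            l1_errors.extend(np.abs(a - f))  # pool L1 errors across datasets
        report[release] = {
            "mape_mean": float(np.mean(mapes)),
            "l1_quantiles": {q: float(np.quantile(l1_errors, q)) for q in quantiles},
        }
    return report

# Toy example with made-up forecasts for two releases:
results = {
    "2022-02-14": {"air_passengers": ([100.0, 120.0, 130.0], [98.0, 125.0, 128.0])},
    "2022-07-14": {"air_passengers": ([100.0, 120.0, 130.0], [100.5, 121.0, 129.0])},
}
print(quality_report(results))
```

Pooling the raw L1 errors before taking quantiles gives one distribution per release, which makes release-to-release comparisons straightforward.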

Target Release : 2022-07-14

@antoinecarme antoinecarme self-assigned this Apr 8, 2022
@antoinecarme antoinecarme changed the title Release Process : Pre-release Benchamrks Release Process : Pre-release Benchmarks Apr 8, 2022
antoinecarme (Owner, Author) commented:

FIXED
