
feat: saving and re-using fine-tuned models #562

Open · jmoralez wants to merge 9 commits into main
Conversation

@jmoralez (Member) commented Dec 9, 2024

Adds the following:

  • NixtlaClient.finetune to fine-tune and save a model.
  • NixtlaClient.finetuned_models to list the models a user has fine-tuned.
  • The finetuned_model_id argument to all forecasting methods, to compute forecasts with a saved fine-tuned model.
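A minimal usage sketch of the new workflow, assuming the interface described above (the API key placeholder and the toy dataframe are illustrative, not part of the PR):

```python
import pandas as pd
from nixtla import NixtlaClient

client = NixtlaClient(api_key="...")  # placeholder: your API key

# Toy series in the usual long format (unique_id, ds, y)
df = pd.DataFrame({
    "unique_id": ["series_1"] * 36,
    "ds": pd.date_range("2020-01-01", periods=36, freq="MS"),
    "y": range(36),
})

# Fine-tune and save a model; the returned id identifies the saved model
model_id = client.finetune(df=df)

# List the models you have fine-tuned so far
print(client.finetuned_models())

# Re-use the saved model when forecasting
fcst = client.forecast(df=df, h=12, finetuned_model_id=model_id)
```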

Check out this pull request on ReviewNB to see visual diffs & provide feedback on Jupyter Notebooks. (Powered by ReviewNB)

@github-actions bot (Contributor) commented Dec 9, 2024

Experiment Results

Experiment 1: air-passengers

Description:

| variable | experiment |
| --- | --- |
| h | 12 |
| season_length | 12 |
| freq | MS |
| level | None |
| n_windows | 1 |

Results:

| metric | timegpt-1 | timegpt-1-long-horizon | SeasonalNaive | Naive |
| --- | --- | --- | --- | --- |
| mae | 12.6793 | 11.0623 | 47.8333 | 76 |
| mape | 0.027 | 0.0232 | 0.0999 | 0.1425 |
| mse | 213.936 | 199.132 | 2571.33 | 10604.2 |
| total_time | 0.5996 | 0.6793 | 0.0043 | 0.0034 |

Experiment 2: air-passengers

Description:

| variable | experiment |
| --- | --- |
| h | 24 |
| season_length | 12 |
| freq | MS |
| level | None |
| n_windows | 1 |

Results:

| metric | timegpt-1 | timegpt-1-long-horizon | SeasonalNaive | Naive |
| --- | --- | --- | --- | --- |
| mae | 58.1031 | 58.4587 | 71.25 | 115.25 |
| mape | 0.1257 | 0.1267 | 0.1552 | 0.2358 |
| mse | 4040.21 | 4110.79 | 5928.17 | 18859.2 |
| total_time | 0.4592 | 0.4409 | 0.0038 | 0.0034 |

Experiment 3: electricity-multiple-series

Description:

| variable | experiment |
| --- | --- |
| h | 24 |
| season_length | 24 |
| freq | H |
| level | None |
| n_windows | 1 |

Results:

| metric | timegpt-1 | timegpt-1-long-horizon | SeasonalNaive | Naive |
| --- | --- | --- | --- | --- |
| mae | 178.293 | 268.13 | 269.23 | 1331.02 |
| mape | 0.0234 | 0.0311 | 0.0304 | 0.1692 |
| mse | 121589 | 219485 | 213677 | 4.68961e+06 |
| total_time | 0.9296 | 2.1388 | 0.0047 | 0.0041 |

Experiment 4: electricity-multiple-series

Description:

| variable | experiment |
| --- | --- |
| h | 168 |
| season_length | 24 |
| freq | H |
| level | None |
| n_windows | 1 |

Results:

| metric | timegpt-1 | timegpt-1-long-horizon | SeasonalNaive | Naive |
| --- | --- | --- | --- | --- |
| mae | 465.497 | 346.972 | 398.956 | 1119.26 |
| mape | 0.062 | 0.0436 | 0.0512 | 0.1583 |
| mse | 835021 | 403760 | 656723 | 3.17316e+06 |
| total_time | 1.3464 | 1.3336 | 0.0049 | 0.0043 |

Experiment 5: electricity-multiple-series

Description:

| variable | experiment |
| --- | --- |
| h | 336 |
| season_length | 24 |
| freq | H |
| level | None |
| n_windows | 1 |

Results:

| metric | timegpt-1 | timegpt-1-long-horizon | SeasonalNaive | Naive |
| --- | --- | --- | --- | --- |
| mae | 558.673 | 459.757 | 602.926 | 1340.95 |
| mape | 0.0697 | 0.0565 | 0.0787 | 0.17 |
| mse | 1.22723e+06 | 739114 | 1.61572e+06 | 6.04619e+06 |
| total_time | 0.7606 | 0.9921 | 0.0049 | 0.0044 |

@jmoralez jmoralez marked this pull request as ready for review December 10, 2024 00:04
@AzulGarza (Member) left a comment

very cool pr!

when using distributed frameworks, we would expect an error. should we raise a particular error in that case?

@jmoralez (Member, Author)

Trying to fine tune on a distributed df?
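One shape such a guard could take, purely as an illustration; the thread does not settle on a specific error type, and the function name here is an assumption:

```python
import pandas as pd

# Hypothetical early check for NixtlaClient.finetune (illustrative only):
def _ensure_pandas(df):
    if not isinstance(df, pd.DataFrame):
        # Distributed dataframes (e.g. Spark, Dask, Ray) would need one model
        # per partition, so fail fast with a clear error instead of a cryptic one.
        raise NotImplementedError(
            "finetune does not support distributed dataframes; "
            "convert to a pandas DataFrame first."
        )
```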

@jmoralez jmoralez requested a review from AzulGarza December 10, 2024 17:26
@marcopeix (Contributor)

Unsolicited comments/questions:

  • Why is num_partitions not supported for the finetune method? Would this limit fine-tuning on small-ish datasets?
  • Is it relevant to add a test that compares fine-tuning within the forecast method against fine-tuning separately and then forecasting with the saved model, to ensure the predictions are the same?

@jmoralez (Member, Author)

  • Supporting partitioning would fine-tune num_partitions models on unknown portions of the original data. I don't think that's a good use case.
  • We already have that test in the API.
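For reference, a sketch of what such an equivalence check could look like; the PR states an equivalent test already exists in the API, and the tolerance and fixture names here are illustrative:

```python
import numpy as np

def test_finetuned_model_matches_inline_finetuning(client, df):
    # Fine-tuning inline via forecast(...) should match fine-tuning once,
    # saving the model, and forecasting with its id.
    inline = client.forecast(df=df, h=12, finetune_steps=10)
    model_id = client.finetune(df=df, finetune_steps=10)
    saved = client.forecast(df=df, h=12, finetuned_model_id=model_id)
    np.testing.assert_allclose(
        inline["TimeGPT"].values, saved["TimeGPT"].values, rtol=1e-6
    )
```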

@AzulGarza (Member) left a comment

lgtm:)
