1. Import packages
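A minimal setup sketch for this step, assuming the `nixtla` package is installed and that the API key lives in a `NIXTLA_API_KEY` environment variable (the variable name is an assumption; you can also pass the key directly):

```python
import os

from nixtla import NixtlaClient

# Initialize the client. Reading the key from the environment is an
# assumption here; `api_key="..."` works as well.
nixtla_client = NixtlaClient(api_key=os.environ["NIXTLA_API_KEY"])
```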
First, we import the required packages and initialize the Nixtla client.

2. Load data
| | unique_id | ds | y |
|---|---|---|---|
| 0 | H1 | 1 | 605.0 |
| 1 | H1 | 2 | 586.0 |
| 2 | H1 | 3 | 586.0 |
| 3 | H1 | 4 | 559.0 |
| 4 | H1 | 5 | 511.0 |
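The head shown above can be rebuilt as a small sketch; in the full tutorial the whole dataset is read from a CSV with these same three columns, so the inline values here just mirror the table:

```python
import pandas as pd

# Rebuild the first rows shown above. The tutorial loads the complete
# dataset from a CSV file with the same columns: unique_id, ds, y.
df = pd.DataFrame(
    {
        "unique_id": ["H1"] * 5,
        "ds": [1, 2, 3, 4, 5],
        "y": [605.0, 586.0, 586.0, 559.0, 511.0],
    }
)
print(df.head())
```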
3. Zero-shot forecast
We can try forecasting without any fine-tuning to see how well TimeGPT does.

| | metric | TimeGPT |
|---|---|---|
| 0 | rmse | 1504.474342 |
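A sketch of this step: the forecast call is wrapped in a helper so the API key is only needed when it actually runs, and the RMSE metric from the table is written out explicitly (the train/horizon split is an assumption, not shown in this excerpt):

```python
import math


def rmse(y_true, y_pred) -> float:
    # Root mean squared error, the accuracy metric reported above.
    return math.sqrt(
        sum((a - b) ** 2 for a, b in zip(y_true, y_pred)) / len(y_true)
    )


def zero_shot_forecast(client, train, horizon: int):
    # Without a `finetuned_model_id`, TimeGPT produces a zero-shot forecast.
    # `client` is a NixtlaClient instance; this call requires a valid API key.
    return client.forecast(df=train, h=horizon)
```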
4. Fine-tune
We can now fine-tune TimeGPT a little and save our model for later use. We can define the ID that we want that model to have by providing it through the `output_model_id` argument.

We can then forecast with this model by providing its ID through the `finetuned_model_id` argument.
| | metric | TimeGPT_zero_shot | TimeGPT_first_finetune |
|---|---|---|---|
| 0 | rmse | 1504.474342 | 1472.024619 |
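The fine-tune-and-forecast step can be sketched like this (the model ID matches the one listed later in this tutorial; the horizon and the assumption that `finetune` returns the model ID are hedges, and both calls require an API key):

```python
def finetune_and_forecast(client, train, horizon: int):
    # Fine-tune TimeGPT and save the result under an explicit ID.
    model_id = client.finetune(df=train, output_model_id="my-first-finetuned-model")
    # Forecast with the saved model by passing its ID back.
    fcst = client.forecast(df=train, h=horizon, finetuned_model_id=model_id)
    return model_id, fcst
```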
5. Further fine-tune
We can now take this model and fine-tune it a bit further by using the `NixtlaClient.finetune` method, this time providing our already fine-tuned model as `finetuned_model_id`, which will take that model and fine-tune it a bit more. We can also change the fine-tuning settings, like using `finetune_depth=3`, for example.

Since we didn't provide an `output_model_id` this time, it got assigned a UUID.
We can now use this model to forecast.
| | metric | TimeGPT_first_finetune | TimeGPT_second_finetune |
|---|---|---|---|
| 0 | rmse | 1472.024619 | 1435.365211 |
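This further fine-tuning round can be sketched as follows (again a hedged helper: the horizon is an assumption and the calls hit the API at runtime):

```python
def finetune_further(client, train, horizon: int):
    # Start from the previously saved model and fine-tune it some more,
    # now with a deeper setting. No `output_model_id` is given, so the
    # new model gets a UUID assigned by the API.
    second_id = client.finetune(
        df=train,
        finetuned_model_id="my-first-finetuned-model",
        finetune_depth=3,
    )
    # Forecast with the twice-fine-tuned model.
    return client.forecast(df=train, h=horizon, finetuned_model_id=second_id)
```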
6. Listing fine-tuned models
We can list our fine-tuned models with the `NixtlaClient.finetuned_models` method. To get the result as a dataframe instead of a list of model objects, we can provide `as_df=True`.
| | id | created_at | created_by | base_model_id | steps | depth | loss | model | freq |
|---|---|---|---|---|---|---|---|---|---|
| 0 | 468b13fb-4b26-447a-bd87-87a64b50d913 | 2024-12-30 17:57:31.241455+00:00 | user | my-first-finetuned-model | 10 | 3 | default | timegpt-1-long-horizon | MS |
| 1 | my-first-finetuned-model | 2024-12-30 17:57:16.978907+00:00 | user | None | 10 | 1 | default | timegpt-1-long-horizon | MS |
We can see that the `base_model_id` of our second model is our first model, along with other metadata.
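As a sketch, the listing call wraps into a one-line helper (the `as_df=True` flag comes from the text above; running it requires an API key):

```python
def list_finetuned_models(client):
    # Return the fine-tuned models as a dataframe, one row per model,
    # matching the table shown above.
    return client.finetuned_models(as_df=True)
```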
7. Deleting fine-tuned models
In order to keep things organized, and since there's a limit of 50 fine-tuned models, you can delete models that weren't so promising to make room for more experiments. For example, we can delete our first fine-tuned model. Note that even though it was used as the base for our second model, they're saved independently, so removing it won't affect our second model, except for the dangling metadata.

| | id | created_at | created_by | base_model_id | steps | depth | loss | model | freq |
|---|---|---|---|---|---|---|---|---|---|
| 0 | 468b13fb-4b26-447a-bd87-87a64b50d913 | 2024-12-30 17:57:31.241455+00:00 | user | my-first-finetuned-model | 10 | 3 | default | timegpt-1-long-horizon | MS |
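The deletion step can be sketched with a helper as well; the method name `delete_finetuned_model` is an assumption based on the client's naming pattern, and the call requires an API key:

```python
def delete_first_model(client):
    # Remove the first fine-tuned model to free one of the 50 slots.
    # The second model keeps working independently; only its
    # `base_model_id` metadata now points at a deleted model.
    return client.delete_finetuned_model("my-first-finetuned-model")
```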