You can fine-tune TimeGPT by setting the `finetune_steps` argument of the `forecast` method.
👍 **Use an Azure AI endpoint**

To use an Azure AI endpoint, remember to also set the `base_url` argument:

```python
nixtla_client = NixtlaClient(base_url="your azure ai endpoint", api_key="your api_key")
```
| | timestamp | value |
|---|---|---|
| 0 | 1949-01-01 | 112 |
| 1 | 1949-02-01 | 118 |
| 2 | 1949-03-01 | 132 |
| 3 | 1949-04-01 | 129 |
| 4 | 1949-05-01 | 121 |
Setting `finetune_steps=10` means the model will go through 10 iterations of training on your time series data.
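As a minimal sketch, a fine-tuned forecast request could look like the following. The DataFrame recreates the sample rows shown above; the actual `forecast` call is commented out because it requires a valid API key, and the column names passed to `time_col`/`target_col` are assumptions matching this example's data.

```python
import pandas as pd

# Recreate the sample data shown in the table above
df = pd.DataFrame({
    "timestamp": pd.to_datetime(
        ["1949-01-01", "1949-02-01", "1949-03-01", "1949-04-01", "1949-05-01"]
    ),
    "value": [112, 118, 132, 129, 121],
})

# Hypothetical fine-tuned forecast (requires a valid API key):
# from nixtla import NixtlaClient
# nixtla_client = NixtlaClient(api_key="your api_key")
# forecast_df = nixtla_client.forecast(
#     df=df,
#     h=12,                  # forecast horizon
#     time_col="timestamp",
#     target_col="value",
#     finetune_steps=10,     # 10 training iterations on this series
# )

print(len(df))
```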
📘 **Available models in Azure AI**

If you are using an Azure AI endpoint, please be sure to set `model="azureai"`:

```python
nixtla_client.forecast(..., model="azureai")
```

For the public API, we support two models: `timegpt-1` and `timegpt-1-long-horizon`. By default, `timegpt-1` is used. Please see this tutorial on how and when to use `timegpt-1-long-horizon`.
You can adjust `finetune_steps` based on your specific needs and the complexity of your data. Usually, a larger value of `finetune_steps` works better for large datasets.
It’s recommended to monitor the model’s performance during fine-tuning and adjust as needed. Be aware that more `finetune_steps` may lead to longer training times and could lead to overfitting if not managed properly.
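One simple way to monitor fine-tuning is to hold out the last part of your series and compare the error for a few values of `finetune_steps`. This is a sketch under stated assumptions: the synthetic `df` stands in for a real series, and the commented sweep assumes an API key plus the `forecast` signature shown earlier (the `"TimeGPT"` output column name is an assumption).

```python
import pandas as pd

# Synthetic monthly series standing in for real data
df = pd.DataFrame({
    "timestamp": pd.date_range("1949-01-01", periods=24, freq="MS"),
    "value": list(range(112, 136)),
})

# Hold out the last h observations as a validation set
h = 6
train, valid = df.iloc[:-h], df.iloc[-h:]

# Hypothetical sweep (requires a valid API key); keep the setting
# with the lowest validation error:
# from nixtla import NixtlaClient
# client = NixtlaClient(api_key="your api_key")
# for steps in (0, 10, 50):
#     fcst = client.forecast(df=train, h=h, time_col="timestamp",
#                            target_col="value", finetune_steps=steps)
#     mae = (fcst["TimeGPT"].to_numpy() - valid["value"].to_numpy())
#     print(steps, abs(mae).mean())

print(len(train), len(valid))
```

A sweep like this makes the overfitting trade-off visible: if validation error starts rising as `finetune_steps` grows, you have gone past the useful amount of fine-tuning.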
Remember, fine-tuning is a powerful feature, but it should be used
thoughtfully and carefully.
For a detailed guide on using a specific loss function for fine-tuning, check out the Fine-tuning with a specific loss function tutorial.
Also read our detailed tutorial on controlling the level of fine-tuning using `finetune_depth`.