👍 **Use an Azure AI endpoint**

To use an Azure AI endpoint, set the `base_url` argument:

```python
nixtla_client = NixtlaClient(base_url="your Azure AI endpoint", api_key="your api_key")
```
With the `NixtlaClient` set up, let's explore an example using the Peyton Manning dataset.
| | unique_id | ds | y |
|---|---|---|---|
| 2764 | 0 | 2015-07-05 | 6.499787 |
| 2765 | 0 | 2015-07-06 | 6.859615 |
| 2766 | 0 | 2015-07-07 | 6.881411 |
| 2767 | 0 | 2015-07-08 | 6.997596 |
| 2768 | 0 | 2015-07-09 | 7.152269 |
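As a sanity check on the input format, the rows shown above can be reproduced as a small pandas DataFrame. The three columns (`unique_id`, `ds`, `y`) are the layout TimeGPT methods expect; the values below are copied directly from the table.

```python
import pandas as pd

# Minimal frame mirroring the rows shown above; the column layout
# (unique_id, ds, y) is the input format TimeGPT methods expect.
df_head = pd.DataFrame(
    {
        "unique_id": [0] * 5,
        "ds": pd.to_datetime(
            ["2015-07-05", "2015-07-06", "2015-07-07", "2015-07-08", "2015-07-09"]
        ),
        "y": [6.499787, 6.859615, 6.881411, 6.997596, 7.152269],
    },
    index=range(2764, 2769),
)
```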
- `finetune_steps`: Number of steps for fine-tuning TimeGPT on new data.
- `finetune_depth`: Level of fine-tuning controlling the quantity of parameters being fine-tuned (see our in-depth tutorial).
- `finetune_loss`: Loss function to be used during the fine-tuning process.
- `h`: Specifies how many steps into the future the forecast is made for each window.
- `step_size`: Determines the interval between the starting points of consecutive windows.

If `step_size` is smaller than `h`, then we get overlapping windows. This can make the detection process more robust, as TimeGPT will see the same time step more than once. However, this comes with a computational cost, since the same time step will be predicted more than once.
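The interaction between `h` and `step_size` can be sketched in plain Python (an illustrative sketch of the windowing logic, not Nixtla library code): when `step_size` is smaller than `h`, interior time steps fall inside more than one window and are therefore predicted more than once.

```python
# Illustrative sketch (plain Python, not Nixtla library code) of how
# h and step_size carve a series into forecast windows.

def window_starts(n_steps: int, h: int, step_size: int) -> list:
    """Starting index of each forecast window over a series of n_steps."""
    return list(range(0, n_steps - h + 1, step_size))

def coverage_counts(n_steps: int, h: int, step_size: int) -> list:
    """How many windows predict each time step."""
    counts = [0] * n_steps
    for start in window_starts(n_steps, h, step_size):
        for t in range(start, start + h):
            counts[t] += 1
    return counts

# step_size (2) < h (4): windows overlap, interior steps are predicted twice.
print(coverage_counts(10, h=4, step_size=2))  # [1, 1, 2, 2, 2, 2, 2, 2, 1, 1]
# step_size == h: no overlap, each covered step is predicted once.
print(coverage_counts(10, h=4, step_size=4))  # [1, 1, 1, 1, 1, 1, 1, 1, 0, 0]
```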
📘 **Balancing `h` and `step_size` depends on your data:** For frequent, short-lived anomalies, use a smaller `h` to focus on short-term predictions and a smaller `step_size` to increase overlap and sensitivity. For smooth trends or long-term patterns, use a larger `h` to capture broader anomalies and a larger `step_size` to reduce noise and computational cost.
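To make the cost side of this trade-off concrete, here is a back-of-the-envelope helper (an illustrative sketch, not part of the `nixtla` package) counting how many individual predictions a given `h`/`step_size` combination requires over a series.

```python
def n_predictions(n_steps: int, h: int, step_size: int) -> int:
    """Total forecasts made: number of windows times the horizon h."""
    n_windows = (n_steps - h) // step_size + 1
    return n_windows * h

# Small h and step_size: many overlapping windows, more sensitive, higher cost.
cost_sensitive = n_predictions(100, h=5, step_size=1)   # 96 windows * 5 = 480
# Larger h and step_size: fewer non-overlapping windows, lower cost.
cost_coarse = n_predictions(100, h=20, step_size=20)    # 5 windows * 20 = 100
```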