Steps | Description | MAE | MAE Improvement (%) | RMSE | RMSE Improvement (%) |
---|---|---|---|---|---|
0 | Zero-Shot TimeGPT | 18.5 | N/A | 20.0 | N/A |
1 | Add Fine-Tuning Steps | 11.5 | 38% | 12.6 | 37% |
2 | Adjust Fine-Tuning Loss | 9.6 | 48% | 11.0 | 45% |
3 | Fine-Tune More Parameters | 9.0 | 51% | 11.3 | 44% |
4 | Add Exogenous Variables | 4.6 | 75% | 6.4 | 68% |
5 | Switch to Long-Horizon Model | 6.4 | 65% | 7.7 | 62% |
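Each row in the table above corresponds to a single change to one `forecast` call with the `nixtla` client. The sketch below shows the zero-shot baseline (step 0) and the MAE/RMSE evaluation pattern that produces the result tables in the rest of this section; the file path, 24-hour horizon, train/test split, and API key are illustrative assumptions rather than code from the original walkthrough.

```python
import pandas as pd
from nixtla import NixtlaClient
from utilsforecast.evaluation import evaluate
from utilsforecast.losses import mae, rmse

# Illustrative setup: hourly German electricity prices ("DE") split into a
# training window and a 24-hour test window. Path, horizon, and API key are
# placeholders.
df = pd.read_csv("electricity.csv", parse_dates=["ds"])
train_df, test_df = df.iloc[:-24], df.iloc[-24:]

nixtla_client = NixtlaClient(api_key="YOUR_API_KEY")

# Step 0: zero-shot forecast, using only the target series
# (the exogenous columns come in at step 4).
fcst = nixtla_client.forecast(
    df=train_df[["unique_id", "ds", "y"]],
    h=24,              # illustrative horizon
    time_col="ds",
    target_col="y",
)

# Compare predictions with the held-out values to produce the
# MAE/RMSE tables shown throughout this section.
merged = test_df[["unique_id", "ds", "y"]].merge(fcst, on=["unique_id", "ds"])
print(evaluate(merged, metrics=[mae, rmse], models=["TimeGPT"]))
```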
unique_id | metric | TimeGPT |
---|---|---|
DE | mae | 18.519004 |
DE | rmse | 20.037751 |
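Step 1 adds `finetune_steps`, which runs that many extra training iterations on the input series before forecasting. A minimal sketch, reusing the client and data split from the previous snippet; the step count is an illustrative value.

```python
# Step 1: fine-tune TimeGPT on the target series before forecasting.
fcst_ft = nixtla_client.forecast(
    df=train_df[["unique_id", "ds", "y"]],
    h=24,
    time_col="ds",
    target_col="y",
    finetune_steps=30,  # illustrative number of fine-tuning iterations
)
```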
unique_id | metric | TimeGPT |
---|---|---|
DE | mae | 11.458185 |
DE | rmse | 12.642999 |
We can also change the loss function used during fine-tuning with the `finetune_loss` parameter. By modifying the loss function, we observe that the MAE decreases to 9.6 and the RMSE reduces to 11.0.
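A sketch of that call, again reusing the earlier setup. Optimizing the MAE here is an assumption for illustration; `finetune_loss` also accepts values such as `default`, `mse`, `rmse`, `mape`, and `smape`.

```python
# Step 2: keep the fine-tuning steps, but minimize the MAE during
# fine-tuning so the loss matches the evaluation metric.
fcst_loss = nixtla_client.forecast(
    df=train_df[["unique_id", "ds", "y"]],
    h=24,
    time_col="ds",
    target_col="y",
    finetune_steps=30,    # illustrative value
    finetune_loss="mae",
)
```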
unique_id | metric | TimeGPT |
---|---|---|
DE | mae | 9.640649 |
DE | rmse | 10.956003 |
With the `finetune_depth` parameter, we can control the number of parameters that get fine-tuned. By default, `finetune_depth=1`, meaning that only a few parameters are tuned. It can be set to any value from 1 to 5, where 5 means that all of the model's parameters are fine-tuned.
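A sketch of step 3; the depth of 2 is an illustrative choice, not necessarily the value used to produce the numbers below.

```python
# Step 3: fine-tune a larger share of the model's parameters.
fcst_depth = nixtla_client.forecast(
    df=train_df[["unique_id", "ds", "y"]],
    h=24,
    time_col="ds",
    target_col="y",
    finetune_steps=30,    # illustrative value
    finetune_loss="mae",
    finetune_depth=2,     # 1 = few parameters (default) ... 5 = all parameters
)
```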
unique_id | metric | TimeGPT |
---|---|---|
DE | mae | 9.002193 |
DE | rmse | 11.348207 |
unique_id | ds | y | Exogenous1 | Exogenous2 | day_0 | day_1 | day_2 | day_3 | day_4 | day_5 | day_6 |
---|---|---|---|---|---|---|---|---|---|---|---|
DE | 2017-10-22 00:00:00 | 19.10 | 16972.75 | 15778.92975 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 1.0 |
DE | 2017-10-22 01:00:00 | 19.03 | 16254.50 | 16664.20950 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 1.0 |
DE | 2017-10-22 02:00:00 | 16.90 | 15940.25 | 17728.74950 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 1.0 |
DE | 2017-10-22 03:00:00 | 12.98 | 15959.50 | 18578.13850 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 1.0 |
DE | 2017-10-22 04:00:00 | 9.24 | 16071.50 | 19389.16750 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 1.0 |
unique_id | ds | Exogenous1 | Exogenous2 | day_0 | day_1 | day_2 | day_3 | day_4 | day_5 | day_6 |
---|---|---|---|---|---|---|---|---|---|---|
DE | 2017-12-29 00:00:00 | 17347.00 | 24577.92650 | 0.0 | 0.0 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 |
DE | 2017-12-29 01:00:00 | 16587.25 | 24554.31950 | 0.0 | 0.0 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 |
DE | 2017-12-29 02:00:00 | 16396.00 | 24651.45475 | 0.0 | 0.0 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 |
DE | 2017-12-29 03:00:00 | 16481.25 | 24666.04300 | 0.0 | 0.0 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 |
DE | 2017-12-29 04:00:00 | 16827.75 | 24403.33350 | 0.0 | 0.0 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 |
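The first preview above is the training frame, which carries the target y alongside the exogenous columns; the second holds the future values of those columns over the forecast window. Historical exogenous values travel inside `df`, while future ones are passed through `X_df`. A sketch, assuming the future frame is named `futr_df`:

```python
# Step 4: pass exogenous features for the history (extra columns in df)
# and for the forecast horizon (X_df).
fcst_exog = nixtla_client.forecast(
    df=train_df,   # unique_id, ds, y, Exogenous1, Exogenous2, day_0..day_6
    X_df=futr_df,  # same columns without y, covering the forecast horizon
    h=24,
    time_col="ds",
    target_col="y",
    finetune_steps=30,    # illustrative value
    finetune_loss="mae",
    finetune_depth=2,
)
```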
unique_id | metric | TimeGPT |
---|---|---|
DE | mae | 4.602594 |
DE | rmse | 6.358831 |
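Step 5 switches to the long-horizon variant of TimeGPT through the `model` argument, leaving everything else unchanged. As the results below and the summary table show, this switch actually performs worse than step 4 on this series (MAE 6.4 vs. 4.6), so it is not an automatic win. A sketch of the call:

```python
# Step 5: use the long-horizon variant of TimeGPT instead of the default model.
fcst_lh = nixtla_client.forecast(
    df=train_df,
    X_df=futr_df,
    h=24,
    time_col="ds",
    target_col="y",
    model="timegpt-1-long-horizon",  # default is "timegpt-1"
    finetune_steps=30,               # illustrative value
    finetune_loss="mae",
    finetune_depth=2,
)
```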
unique_id | metric | TimeGPT |
---|---|---|
DE | mae | 6.365540 |
DE | rmse | 7.738188 |
Steps | Description | MAE | MAE Improvement (%) | RMSE | RMSE Improvement (%) |
---|---|---|---|---|---|
0 | Zero-Shot TimeGPT | 18.5 | N/A | 20.0 | N/A |
1 | Add Fine-Tuning Steps | 11.5 | 38% | 12.6 | 37% |
2 | Adjust Fine-Tuning Loss | 9.6 | 48% | 11.0 | 45% |
3 | Fine-Tune More Parameters | 9.0 | 51% | 11.3 | 44% |
4 | Add Exogenous Variables | 4.6 | 75% | 6.4 | 68% |
5 | Switch to Long-Horizon Model | 6.4 | 65% | 7.7 | 62% |