Customize the training procedure for your models

mlforecast abstracts away most of the training details, which is useful for iterating quickly. However, sometimes you want more control over the fit parameters, the data that goes into the model, etc. This guide shows how you can train a model in a specific way and then give it back to mlforecast to produce forecasts with it.
Data setup
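A minimal data-setup sketch, assuming the synthetic-data helper `generate_daily_series` from `mlforecast.utils`; it produces daily series with identifiers like the `id_0` … `id_4` seen in the tables below:

```python
from mlforecast.utils import generate_daily_series

# five synthetic daily series, named id_0 through id_4
series = generate_daily_series(5)
series.head()
```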
Creating forecast object
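A sketch of the forecast object, inferred from the features in the table below (`lag1`, `dayofweek`) and the `lr` column in the final forecasts; the model name and the daily frequency are assumptions:

```python
from mlforecast import MLForecast
from sklearn.linear_model import LinearRegression

fcst = MLForecast(
    models={'lr': LinearRegression()},  # 'lr' matches the forecast column shown later
    freq='D',                           # daily series (assumed)
    lags=[1],                           # creates the lag1 feature
    date_features=['dayofweek'],        # creates the dayofweek feature
)
```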
Generate training set
Use `MLForecast.preprocess` to generate the training data.
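For example:

```python
prep = fcst.preprocess(series)
prep.head()
```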
| | unique_id | ds | y | lag1 | dayofweek |
|---|---|---|---|---|---|
| 1 | id_0 | 2000-01-02 | 1.423626 | 0.428973 | 6 |
| 2 | id_0 | 2000-01-03 | 2.311782 | 1.423626 | 0 |
| 3 | id_0 | 2000-01-04 | 3.192191 | 2.311782 | 1 |
| 4 | id_0 | 2000-01-05 | 4.148767 | 3.192191 | 2 |
| 5 | id_0 | 2000-01-06 | 5.028356 | 4.148767 | 3 |
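To train models by hand (and to pass data to `MLForecast.fit_models`) we need the features and the target separately. A sketch, dropping the id, timestamp, and target columns from the preprocessed frame:

```python
# feature matrix and target extracted from the preprocess output
X = prep.drop(columns=['unique_id', 'ds', 'y'])
y = prep['y']
```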
Regular training
Since we don’t want to do anything special in our training process for the linear regression, we can just call `MLForecast.fit_models`, which trains the models defined in the constructor and saves them in the `MLForecast.models_` attribute.
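For example, using the `X` and `y` built above:

```python
# trains every model defined in the constructor and stores it in fcst.models_
fcst.fit_models(X, y)
fcst.models_
```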
Custom training
Now suppose you also want to train a LightGBM model on the same data, but treating the day of the week as a categorical feature and logging the train loss.
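A sketch of one way to do this with LightGBM's scikit-learn interface; the evaluation set and logging period are illustrative choices:

```python
import lightgbm as lgb

lgbm = lgb.LGBMRegressor(verbosity=-1)
lgbm.fit(
    X,
    y,
    eval_set=[(X, y)],                   # evaluate (and log) on the training data
    callbacks=[lgb.log_evaluation(20)],  # print the training loss every 20 iterations
    categorical_feature=['dayofweek'],   # treat the day of week as categorical
)
```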
Computing forecasts

Now we just assign this model to the `MLForecast.models_` dictionary.
Note that you can assign as many models as you want.
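For example, registering the model trained above under the `lgbm` key (the name used in the forecasts table):

```python
fcst.models_['lgbm'] = lgbm
fcst.models_
```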
Once you call `MLForecast.predict`, mlforecast will use those models to compute the forecasts.
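For instance, a one-step-ahead forecast:

```python
preds = fcst.predict(1)
preds.head()
```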
| | unique_id | ds | lr | lgbm |
|---|---|---|---|---|
| 0 | id_0 | 2000-08-10 | 3.549124 | 5.166797 |
| 1 | id_1 | 2000-04-07 | 3.154285 | 4.252490 |
| 2 | id_2 | 2000-06-16 | 2.880933 | 3.224506 |
| 3 | id_3 | 2000-08-30 | 4.061801 | 0.245443 |
| 4 | id_4 | 2001-01-08 | 2.904872 | 2.225106 |

