Tune your forecasting models
We provide default search spaces for some models, and we can define default features to use based on the length of the seasonal period of your data. For this example we'll use hourly data, so we'll set 24 (one day) as the season length.
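A minimal sketch of how the tuning could be set up, assuming `train` is a long-format dataframe with `unique_id`, `ds` and `y` columns and integer timestamps; the horizon, number of windows and number of trials below are illustrative values, not defaults.

```python
from mlforecast.auto import AutoLightGBM, AutoMLForecast, AutoRidge

# the dict keys ('lgb' and 'ridge') become the model names in the predictions
auto_mlf = AutoMLForecast(
    models={'lgb': AutoLightGBM(), 'ridge': AutoRidge()},
    freq=1,            # integer timestamps, so the frequency is 1
    season_length=24,  # hourly data, one day per season
)
auto_mlf.fit(
    train,          # training dataframe: unique_id, ds, y
    n_windows=2,    # cross-validation windows used during tuning
    h=48,           # forecasting horizon
    num_samples=2,  # number of optuna trials per model
)
```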
We can now use these models to predict:
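Something along these lines would produce the head shown below (reusing the `auto_mlf` object and the 48-step horizon from the sketch above):

```python
preds = auto_mlf.predict(h=48)  # forecast the next 48 hours with every tuned model
preds.head()
```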
| | unique_id | ds | lgb | ridge |
|---|---|---|---|---|
| 0 | H1 | 701 | 680.534943 | 604.140123 |
| 1 | H1 | 702 | 599.038307 | 523.364874 |
| 2 | H1 | 703 | 572.808421 | 479.174481 |
| 3 | H1 | 704 | 564.573783 | 444.540062 |
| 4 | H1 | 705 | 543.046026 | 419.987657 |
And evaluate them:
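One way to do this is to merge the predictions with the held-out values and compute the metrics yourself. The sketch below only computes SMAPE (the MASE and OWA figures in the table come from the standard M4 evaluation) and assumes `valid` is a dataframe with the actual values in `unique_id`, `ds` and `y` columns.

```python
import pandas as pd

def smape(df, models):
    """Symmetric MAPE (in %) of each model's column against the y column."""
    out = {}
    for model in models:
        abs_err = (df[model] - df['y']).abs()
        denom = (df[model].abs() + df['y'].abs()) / 2
        out[model] = 100 * (abs_err / denom).mean()
    return pd.Series(out).round(2)

eval_df = preds.merge(valid, on=['unique_id', 'ds'])
smape(eval_df, models=['lgb', 'ridge'])
```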
| | lgb | ridge |
|---|---|---|
| SMAPE | 18.78 | 20.00 |
| MASE | 5.07 | 1.29 |
| OWA | 1.57 | 0.81 |
You can provide your own model with its search space to perform the optimization. The search space should be a function that takes an optuna trial and returns the model parameters.
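As a sketch, this could look like the following, where `AutoModel` pairs a regressor with its search space; the hyperparameters and ranges here are illustrative, not a recommended space.

```python
import lightgbm as lgb
import optuna
from mlforecast.auto import AutoMLForecast, AutoModel

def my_lgb_config(trial: optuna.Trial):
    # the returned parameters are set on the model for each trial
    return {
        'learning_rate': 0.05,
        'verbosity': -1,
        'num_leaves': trial.suggest_int('num_leaves', 2, 128, log=True),
        'objective': trial.suggest_categorical('objective', ['l1', 'l2', 'mape']),
    }

my_lgb = AutoModel(model=lgb.LGBMRegressor(), config=my_lgb_config)

# the key ('my_lgb') becomes the model's name in the predictions and results
auto_mlf = AutoMLForecast(models={'my_lgb': my_lgb}, freq=1, season_length=24)
```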
| | my_lgb |
|---|---|
| SMAPE | 18.67 |
| MASE | 4.79 |
| OWA | 1.51 |
We internally use `BaseEstimator.set_params` for each configuration, so if you're using a scikit-learn pipeline you can tune its parameters as you normally would with scikit-learn's searches.
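For instance, a pipeline that one-hot encodes the series identifier before a ridge regression could be tuned like this; the `ridge__` prefix follows scikit-learn's usual `<step>__<param>` naming, and the alpha range is just an example. Keeping `unique_id` as an input column would also require passing it through `static_features` when fitting, as in the `fit_config` sketch further below.

```python
import optuna
from sklearn.compose import ColumnTransformer
from sklearn.linear_model import Ridge
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import OneHotEncoder
from mlforecast.auto import AutoModel

ridge_pipeline = make_pipeline(
    ColumnTransformer(
        [('encoder', OneHotEncoder(), ['unique_id'])],
        remainder='passthrough',
    ),
    Ridge(),
)

def ridge_pipeline_config(trial: optuna.Trial):
    # parameters are addressed as <step>__<param> because each configuration
    # is applied through BaseEstimator.set_params
    return {'ridge__alpha': trial.suggest_float('alpha', 1e-3, 10.0, log=True)}

auto_ridge = AutoModel(ridge_pipeline, ridge_pipeline_config)
```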
| | ridge |
|---|---|
| SMAPE | 18.50 |
| MASE | 1.24 |
| OWA | 0.76 |
The `MLForecast` class defines the features to build in its constructor. You can tune the features by providing a function through the `init_config` argument, which will take an optuna trial and produce a configuration to pass to the `MLForecast` constructor.
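A sketch of such a function for hourly data might look like this; the specific lags and transforms are illustrative choices, not the library's defaults.

```python
import optuna
from mlforecast.auto import AutoMLForecast, AutoRidge
from mlforecast.lag_transforms import ExponentiallyWeightedMean, RollingMean

def my_init_config(trial: optuna.Trial):
    # tune which lag gets the transforms while keeping a fixed set of daily lags
    lag_to_transform = trial.suggest_categorical('lag_to_transform', [24, 48])
    return {
        'lags': [24 * i for i in range(1, 7)],  # one to six days back
        'lag_transforms': {
            lag_to_transform: [
                ExponentiallyWeightedMean(alpha=0.3),
                RollingMean(window_size=24 * 7, min_samples=1),
            ],
        },
    }

auto_mlf = AutoMLForecast(
    models=[AutoRidge()],
    freq=1,
    season_length=24,
    init_config=my_init_config,
)
```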
| | AutoRidge |
|---|---|
| SMAPE | 13.31 |
| MASE | 1.67 |
| OWA | 0.71 |
The `MLForecast.fit` method takes some arguments that could improve the forecasting performance of your models, such as `dropna` and `static_features`. If you want to tune those you can provide a function to the `fit_config` argument.
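For example, the following sketch tunes whether the series identifier is kept as a static feature; the `'use_id'` flag is just an illustrative name.

```python
import optuna
from mlforecast.auto import AutoLightGBM, AutoMLForecast

def my_fit_config(trial: optuna.Trial):
    # try fitting both with and without unique_id as a static feature
    if trial.suggest_int('use_id', 0, 1):
        static_features = ['unique_id']
    else:
        static_features = None
    return {'static_features': static_features}

auto_mlf = AutoMLForecast(
    models=[AutoLightGBM()],
    freq=1,
    season_length=24,
    fit_config=my_fit_config,
)
```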
| | AutoLightGBM |
|---|---|
| SMAPE | 18.78 |
| MASE | 5.07 |
| OWA | 1.57 |
After the process has finished, the results are available under the `results_` attribute of the `AutoMLForecast` object. There will be one result per model, and the best configuration can be found under the `config` user attribute.
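Assuming `results_` maps each model's name to its optuna study, the winning configuration can be inspected like this:

```python
# the 'config' user attribute of the best trial holds the winning configuration
for name, study in auto_mlf.results_.items():
    print(name, study.best_trial.user_attrs['config'])
```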
There is one optimization process per model, because different models can make use of different features. Once the optimization process is done for each model, the best configuration is used to retrain the model on all of the data. These final models are `MLForecast` objects and are saved in the `models_` attribute.
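Assuming `models_` maps each model's name to its refit forecaster, each entry can be used like any other `MLForecast` object:

```python
for name, fcst in auto_mlf.models_.items():
    print(name, type(fcst))  # each value is a fitted MLForecast
```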
You can use the `AutoMLForecast.save` method to save the best models found. This produces one directory per model.
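Assuming `save` takes a destination directory, this could look like the following; the path name is just an example.

```python
auto_mlf.save('hourly_auto_models')  # creates one subdirectory per model
```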
Since each model is an `MLForecast` object, you can load it by itself.
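For example, loading a single saved model with `MLForecast.load`; the directory name assumes a model called `AutoRidge` and the example path used above.

```python
from mlforecast import MLForecast

fcst = MLForecast.load('hourly_auto_models/AutoRidge')
fcst.predict(h=48)
```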