Save and Load Models
Saving and loading trained Deep Learning models has multiple valuable uses. These models are often costly to train; storing a pre-trained model can help reduce costs, as it can be loaded and reused to forecast many times. Moreover, it enables transfer learning: pre-training a flexible model on a large dataset and later using it on other data with little to no additional training. Transfer learning is one of the most outstanding 🚀 achievements in Machine Learning 🧠 and has many practical applications.
In this notebook we show an example of how to save and load NeuralForecast models.
The two methods to consider are:

1. NeuralForecast.save: saves models to disk, and can also save the dataset and configuration.
2. NeuralForecast.load: loads models from a given path.
Important

This guide assumes basic knowledge of the NeuralForecast library. For a minimal example visit the Getting Started guide.
You can run these experiments using a GPU with Google Colab.
1. Installing NeuralForecast
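If NeuralForecast is not already available in your environment, a minimal install cell for a notebook environment such as Colab could look like the following:

```python
# Install NeuralForecast from PyPI (run once per environment).
!pip install neuralforecast
```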
2. Loading AirPassengers Data
For this example we will use the classical AirPassengers dataset. Import the pre-processed AirPassengers data from utils.
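A minimal loading sketch, assuming the AirPassengersDF helper shipped in neuralforecast.utils:

```python
from neuralforecast.utils import AirPassengersDF

# Pre-processed AirPassengers data in long format: unique_id, ds (timestamp), y (target).
Y_df = AirPassengersDF
Y_df.head()
```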
3. Model Training
Next, we instantiate and train three models: NBEATS, NHITS, and AutoMLP. The models, with their hyperparameters, are defined in the models list.
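A training sketch along these lines; the specific hyperparameters (horizon, input_size, max_steps, num_samples) are illustrative choices, not the exact configuration used to produce the outputs below:

```python
from ray import tune

from neuralforecast import NeuralForecast
from neuralforecast.auto import AutoMLP
from neuralforecast.models import NBEATS, NHITS

horizon = 12  # forecast 12 months ahead

models = [
    NBEATS(h=horizon, input_size=2 * horizon, max_steps=50),
    NHITS(h=horizon, input_size=2 * horizon, max_steps=50),
    AutoMLP(  # hyperparameter search over a small, illustrative search space
        h=horizon,
        config={'input_size': tune.choice([2 * horizon]),
                'max_steps': tune.choice([50])},
        num_samples=1,
    ),
]

nf = NeuralForecast(models=models, freq='M')  # monthly frequency
nf.fit(df=Y_df)
```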
Produce the forecasts with the predict method.
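For example:

```python
# Forecast the next `horizon` periods with every fitted model.
Y_hat_df = nf.predict()
Y_hat_df.head()
```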
| | unique_id | ds | NBEATS | NHITS | AutoMLP |
|---|---|---|---|---|---|
| 0 | 1.0 | 1961-01-31 | 428.410553 | 445.268158 | 452.550446 |
| 1 | 1.0 | 1961-02-28 | 425.958557 | 469.293945 | 442.683807 |
| 2 | 1.0 | 1961-03-31 | 477.748016 | 462.920807 | 474.043457 |
| 3 | 1.0 | 1961-04-30 | 477.548798 | 489.986633 | 503.836334 |
| 4 | 1.0 | 1961-05-31 | 495.973541 | 518.612610 | 531.347900 |
We plot the forecasts of each model. If several models of the same class were trained, their names would be differentiated with a numerical suffix.
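One way to plot them, assuming matplotlib and the Y_df / Y_hat_df DataFrames from the previous steps:

```python
import pandas as pd
import matplotlib.pyplot as plt

# Stack history and forecasts on the time index and plot the target against each model's predictions.
plot_df = pd.concat([Y_df, Y_hat_df]).set_index('ds')
plot_df[['y', 'NBEATS', 'NHITS', 'AutoMLP']].plot(linewidth=2)
plt.title('AirPassengers forecasts')
plt.grid()
plt.show()
```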
4. Save models
To save all the trained models, use the save method. This method will save both the hyperparameters and the learnable weights (parameters). The save method has the following inputs (see the example after this list):
- path: directory where models will be saved.
- model_index: optional list to specify which models to save. For example, to only save the NHITS model use model_index=[2].
- overwrite: boolean to overwrite existing files in path. When True, the method will only overwrite models with conflicting names.
- save_dataset: boolean to save the Dataset object with the dataset.
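A call sketch, with an illustrative checkpoint directory:

```python
# Save every trained model, together with the dataset, to the given directory.
nf.save(path='./checkpoints/test_run/',
        model_index=None,      # None saves all models
        overwrite=True,
        save_dataset=True)
```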
For each model, two files are created and stored:

- [model_name]_[suffix].ckpt: PyTorch Lightning checkpoint file with the model parameters and hyperparameters.
- [model_name]_[suffix].pkl: dictionary with configuration attributes.

Here model_name corresponds to the name of the model in lowercase (e.g. nhits). We use a numerical suffix to distinguish multiple models of each class. In this example the names will be automlp_0, nbeats_0, and nhits_0.
Important

The Auto models will be stored as their base model. For example, the AutoMLP trained above is stored as an MLP model, with the best hyperparameters found during tuning.
5. Load models
Load the saved models with the load method, specifying the path, and use the new nf2 object to produce forecasts.
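A loading sketch, reusing the illustrative path from the save step:

```python
from neuralforecast import NeuralForecast

# Restore the models (and the dataset, since save_dataset=True) from disk.
nf2 = NeuralForecast.load(path='./checkpoints/test_run/')
Y_hat_df2 = nf2.predict()
Y_hat_df2.head()
```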
| | unique_id | ds | MLP | NHITS | NBEATS |
|---|---|---|---|---|---|
| 0 | 1.0 | 1961-01-31 | 452.550446 | 445.268158 | 428.410553 |
| 1 | 1.0 | 1961-02-28 | 442.683807 | 469.293945 | 425.958557 |
| 2 | 1.0 | 1961-03-31 | 474.043457 | 462.920807 | 477.748016 |
| 3 | 1.0 | 1961-04-30 | 503.836334 | 489.986633 | 477.548798 |
| 4 | 1.0 | 1961-05-31 | 531.347900 | 518.612610 | 495.973541 |
Finally, plot the forecasts to confirm they are identical to the original forecasts.
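In addition to (or instead of) the plot, a quick numerical check can confirm the match, assuming the forecast DataFrames have unique_id and ds as columns as in the tables above; note that the AutoMLP forecasts are loaded under the base model name MLP:

```python
import numpy as np

# Align both forecast DataFrames and compare model columns value by value.
merged = Y_hat_df.merge(Y_hat_df2, on=['unique_id', 'ds'], suffixes=('', '_loaded'))
np.testing.assert_allclose(merged['NBEATS'], merged['NBEATS_loaded'])
np.testing.assert_allclose(merged['NHITS'], merged['NHITS_loaded'])
np.testing.assert_allclose(merged['AutoMLP'], merged['MLP'])
```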
References
https://pytorch-lightning.readthedocs.io/en/stable/common/checkpointing_basic.html