All NeuralForecast models are “global”: they are trained on all the series in the input pd.DataFrame Y_df. For now, however, the optimization objective is “univariate”, as it does not consider interactions between the predictions of different time series. Like the StatsForecast library, core.NeuralForecast allows you to explore collections of models efficiently, and it contains functions for convenient wrangling of the input data and of the output prediction pd.DataFrames.

First, we load the AirPassengers dataset so that you can run all the examples.

import numpy as np
import pandas as pd
import matplotlib.pyplot as plt

from neuralforecast.tsdataset import TimeSeriesDataset
from neuralforecast.utils import AirPassengersDF as Y_df
from neuralforecast.auto import (
    AutoRNN, AutoLSTM, AutoGRU, AutoTCN, AutoDeepAR, AutoDilatedRNN, AutoBiTCN,
    AutoMLP, AutoNBEATS, AutoNBEATSx, AutoNHITS, AutoDLinear, AutoNLinear,
    AutoTiDE, AutoDeepNPTS, AutoKAN, AutoTFT, AutoVanillaTransformer,
    AutoInformer, AutoAutoformer, AutoFEDformer, AutoPatchTST,
    AutoiTransformer, AutoTimesNet,
)

# Split train/test and declare the time series dataset
Y_train_df = Y_df[Y_df.ds <= '1959-12-31']  # 132 monthly observations for training
Y_test_df = Y_df[Y_df.ds > '1959-12-31']    # 12 observations for testing
dataset, *_ = TimeSeriesDataset.from_df(Y_train_df)
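
The same Auto models can also be driven through the core.NeuralForecast wrapper mentioned above, which handles the long-format pd.DataFrame (unique_id, ds, y) wrangling for you. A minimal sketch, assuming monthly frequency for AirPassengers and the model's default search space:

from neuralforecast import NeuralForecast

# A sketch: the wrapper trains on Y_train_df and returns a pd.DataFrame
# of forecasts with one column per model (here 'AutoMLP').
nf = NeuralForecast(models=[AutoMLP(h=12, num_samples=2, cpus=1)], freq='M')
nf.fit(df=Y_train_df)
Y_hat_df = nf.predict()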

1. Automatic Forecasting

A. RNN-Based



AutoRNN

 AutoRNN (h, loss=MAE(), valid_loss=None, config=None,
          search_alg=BasicVariantGenerator(), num_samples=10,
          refit_with_val=False, cpus=4, gpus=0, verbose=False,
          alias=None, backend='ray', callbacks=None)

*Class for Automatic Hyperparameter Optimization. It builds on top of ray to give access to a wide variety of hyperparameter optimization tools, ranging from classic grid search to Bayesian optimization and the HyperBand algorithm.

The validation loss to be optimized is defined by the config['loss'] dictionary value; the config also contains the rest of the hyperparameter search space.

It is important to note that the success of this hyperparameter optimization heavily relies on a strong correlation between the validation and test periods.*

|  | Type | Default | Details |
|---|---|---|---|
| h | int |  | Forecast horizon |
| loss | MAE | MAE() | Instantiated train loss class from losses collection. |
| valid_loss | NoneType | None | Instantiated valid loss class from losses collection. |
| config | NoneType | None | Dictionary with ray.tune defined search space or function that takes an optuna trial and returns a configuration dict. |
| search_alg | BasicVariantGenerator | BasicVariantGenerator() | For ray see https://docs.ray.io/en/latest/tune/api_docs/suggestion.html. For optuna see https://optuna.readthedocs.io/en/stable/reference/samplers/index.html |
| num_samples | int | 10 | Number of hyperparameter optimization steps/samples. |
| refit_with_val | bool | False | Refit of best model should preserve val_size. |
| cpus | int | 4 | Number of cpus to use during optimization. Only used with ray tune. |
| gpus | int | 0 | Number of gpus to use during optimization, default all available. Only used with ray tune. |
| verbose | bool | False | Track progress. |
| alias | NoneType | None | Custom name of the model. |
| backend | str | ray | Backend to use for searching the hyperparameter space, can be either 'ray' or 'optuna'. |
| callbacks | NoneType | None | List of functions to call during the optimization process. ray reference: https://docs.ray.io/en/latest/tune/tutorials/tune-metrics.html, optuna reference: https://optuna.readthedocs.io/en/stable/tutorial/20_recipes/007_optuna_callback.html |
# Use your own config or AutoRNN.default_config
config = dict(max_steps=2, val_check_steps=1, input_size=-1, encoder_hidden_size=8)
model = AutoRNN(h=12, config=config, num_samples=1, cpus=1)

# Fit and predict
model.fit(dataset=dataset)
y_hat = model.predict(dataset=dataset)

# Optuna
model = AutoRNN(h=12, config=None, num_samples=1, cpus=1, backend='optuna')
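
Beyond fixed values, the config can define an actual search space. A sketch, assuming the ray.tune samplers for the ray backend and the optuna trial API for the optuna backend (the hyperparameter names mirror the fixed config above):

from ray import tune

# Ray backend: tune samplers are drawn once per trial
config = dict(max_steps=2, val_check_steps=1, input_size=-1,
              encoder_hidden_size=tune.choice([8, 16]))
model = AutoRNN(h=12, config=config, num_samples=2, cpus=1)

# Optuna backend: a function of the trial that returns the configuration dict
def config_fn(trial):
    return dict(max_steps=2, val_check_steps=1, input_size=-1,
                encoder_hidden_size=trial.suggest_categorical('encoder_hidden_size', [8, 16]))
model = AutoRNN(h=12, config=config_fn, num_samples=2, backend='optuna')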


AutoLSTM

 AutoLSTM (h, loss=MAE(), valid_loss=None, config=None,
           search_alg=BasicVariantGenerator(), num_samples=10,
           refit_with_val=False, cpus=4, gpus=0, verbose=False,
           alias=None, backend='ray', callbacks=None)

*Class for Automatic Hyperparameter Optimization. It builds on top of ray to give access to a wide variety of hyperparameter optimization tools, ranging from classic grid search to Bayesian optimization and the HyperBand algorithm.

The validation loss to be optimized is defined by the config['loss'] dictionary value; the config also contains the rest of the hyperparameter search space.

It is important to note that the success of this hyperparameter optimization heavily relies on a strong correlation between the validation and test periods.*

|  | Type | Default | Details |
|---|---|---|---|
| h | int |  | Forecast horizon |
| loss | MAE | MAE() | Instantiated train loss class from losses collection. |
| valid_loss | NoneType | None | Instantiated valid loss class from losses collection. |
| config | NoneType | None | Dictionary with ray.tune defined search space or function that takes an optuna trial and returns a configuration dict. |
| search_alg | BasicVariantGenerator | BasicVariantGenerator() | For ray see https://docs.ray.io/en/latest/tune/api_docs/suggestion.html. For optuna see https://optuna.readthedocs.io/en/stable/reference/samplers/index.html |
| num_samples | int | 10 | Number of hyperparameter optimization steps/samples. |
| refit_with_val | bool | False | Refit of best model should preserve val_size. |
| cpus | int | 4 | Number of cpus to use during optimization. Only used with ray tune. |
| gpus | int | 0 | Number of gpus to use during optimization, default all available. Only used with ray tune. |
| verbose | bool | False | Track progress. |
| alias | NoneType | None | Custom name of the model. |
| backend | str | ray | Backend to use for searching the hyperparameter space, can be either 'ray' or 'optuna'. |
| callbacks | NoneType | None | List of functions to call during the optimization process. ray reference: https://docs.ray.io/en/latest/tune/tutorials/tune-metrics.html, optuna reference: https://optuna.readthedocs.io/en/stable/tutorial/20_recipes/007_optuna_callback.html |
# Use your own config or AutoLSTM.default_config
config = dict(max_steps=2, val_check_steps=1, input_size=-1, encoder_hidden_size=8)
model = AutoLSTM(h=12, config=config, num_samples=1, cpus=1)

# Fit and predict
model.fit(dataset=dataset)
y_hat = model.predict(dataset=dataset)

# Optuna
model = AutoLSTM(h=12, config=None, backend='optuna')
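
The callbacks argument is forwarded to the selected backend (see the references above). A sketch of an optuna-style callback, assuming backend='optuna'; optuna invokes it as callback(study, trial) after each finished trial:

# Hypothetical logging callback for the optuna backend
def log_trial(study, trial):
    print(f'trial {trial.number} finished with value {trial.value}')

model = AutoLSTM(h=12, config=None, backend='optuna', callbacks=[log_trial])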


AutoGRU

 AutoGRU (h, loss=MAE(), valid_loss=None, config=None,
          search_alg=BasicVariantGenerator(), num_samples=10,
          refit_with_val=False, cpus=4, gpus=0, verbose=False,
          alias=None, backend='ray', callbacks=None)

*Class for Automatic Hyperparameter Optimization. It builds on top of ray to give access to a wide variety of hyperparameter optimization tools, ranging from classic grid search to Bayesian optimization and the HyperBand algorithm.

The validation loss to be optimized is defined by the config['loss'] dictionary value; the config also contains the rest of the hyperparameter search space.

It is important to note that the success of this hyperparameter optimization heavily relies on a strong correlation between the validation and test periods.*

|  | Type | Default | Details |
|---|---|---|---|
| h | int |  | Forecast horizon |
| loss | MAE | MAE() | Instantiated train loss class from losses collection. |
| valid_loss | NoneType | None | Instantiated valid loss class from losses collection. |
| config | NoneType | None | Dictionary with ray.tune defined search space or function that takes an optuna trial and returns a configuration dict. |
| search_alg | BasicVariantGenerator | BasicVariantGenerator() | For ray see https://docs.ray.io/en/latest/tune/api_docs/suggestion.html. For optuna see https://optuna.readthedocs.io/en/stable/reference/samplers/index.html |
| num_samples | int | 10 | Number of hyperparameter optimization steps/samples. |
| refit_with_val | bool | False | Refit of best model should preserve val_size. |
| cpus | int | 4 | Number of cpus to use during optimization. Only used with ray tune. |
| gpus | int | 0 | Number of gpus to use during optimization, default all available. Only used with ray tune. |
| verbose | bool | False | Track progress. |
| alias | NoneType | None | Custom name of the model. |
| backend | str | ray | Backend to use for searching the hyperparameter space, can be either 'ray' or 'optuna'. |
| callbacks | NoneType | None | List of functions to call during the optimization process. ray reference: https://docs.ray.io/en/latest/tune/tutorials/tune-metrics.html, optuna reference: https://optuna.readthedocs.io/en/stable/tutorial/20_recipes/007_optuna_callback.html |
# Use your own config or AutoGRU.default_config
config = dict(max_steps=2, val_check_steps=1, input_size=-1, encoder_hidden_size=8)
model = AutoGRU(h=12, config=config, num_samples=1, cpus=1)

# Fit and predict
model.fit(dataset=dataset)
y_hat = model.predict(dataset=dataset)

# Optuna
model = AutoGRU(h=12, config=None, backend='optuna')


AutoTCN

 AutoTCN (h, loss=MAE(), valid_loss=None, config=None,
          search_alg=BasicVariantGenerator(), num_samples=10,
          refit_with_val=False, cpus=4, gpus=0, verbose=False,
          alias=None, backend='ray', callbacks=None)

*Class for Automatic Hyperparameter Optimization. It builds on top of ray to give access to a wide variety of hyperparameter optimization tools, ranging from classic grid search to Bayesian optimization and the HyperBand algorithm.

The validation loss to be optimized is defined by the config['loss'] dictionary value; the config also contains the rest of the hyperparameter search space.

It is important to note that the success of this hyperparameter optimization heavily relies on a strong correlation between the validation and test periods.*

|  | Type | Default | Details |
|---|---|---|---|
| h | int |  | Forecast horizon |
| loss | MAE | MAE() | Instantiated train loss class from losses collection. |
| valid_loss | NoneType | None | Instantiated valid loss class from losses collection. |
| config | NoneType | None | Dictionary with ray.tune defined search space or function that takes an optuna trial and returns a configuration dict. |
| search_alg | BasicVariantGenerator | BasicVariantGenerator() | For ray see https://docs.ray.io/en/latest/tune/api_docs/suggestion.html. For optuna see https://optuna.readthedocs.io/en/stable/reference/samplers/index.html |
| num_samples | int | 10 | Number of hyperparameter optimization steps/samples. |
| refit_with_val | bool | False | Refit of best model should preserve val_size. |
| cpus | int | 4 | Number of cpus to use during optimization. Only used with ray tune. |
| gpus | int | 0 | Number of gpus to use during optimization, default all available. Only used with ray tune. |
| verbose | bool | False | Track progress. |
| alias | NoneType | None | Custom name of the model. |
| backend | str | ray | Backend to use for searching the hyperparameter space, can be either 'ray' or 'optuna'. |
| callbacks | NoneType | None | List of functions to call during the optimization process. ray reference: https://docs.ray.io/en/latest/tune/tutorials/tune-metrics.html, optuna reference: https://optuna.readthedocs.io/en/stable/tutorial/20_recipes/007_optuna_callback.html |
# Use your own config or AutoTCN.default_config
config = dict(max_steps=2, val_check_steps=1, input_size=-1, encoder_hidden_size=8)
model = AutoTCN(h=12, config=config, num_samples=1, cpus=1)

# Fit and predict
model.fit(dataset=dataset)
y_hat = model.predict(dataset=dataset)

# Optuna
model = AutoTCN(h=12, config=None, backend='optuna')


AutoDeepAR

 AutoDeepAR (h, loss=DistributionLoss(), valid_loss=MQLoss(), config=None,
             search_alg=BasicVariantGenerator(), num_samples=10,
             refit_with_val=False, cpus=4, gpus=0, verbose=False,
             alias=None, backend='ray', callbacks=None)

*Class for Automatic Hyperparameter Optimization. It builds on top of ray to give access to a wide variety of hyperparameter optimization tools, ranging from classic grid search to Bayesian optimization and the HyperBand algorithm.

The validation loss to be optimized is defined by the config['loss'] dictionary value; the config also contains the rest of the hyperparameter search space.

It is important to note that the success of this hyperparameter optimization heavily relies on a strong correlation between the validation and test periods.*

|  | Type | Default | Details |
|---|---|---|---|
| h | int |  | Forecast horizon |
| loss | DistributionLoss | DistributionLoss() | Instantiated train loss class from losses collection. |
| valid_loss | MQLoss | MQLoss() | Instantiated valid loss class from losses collection. |
| config | NoneType | None | Dictionary with ray.tune defined search space or function that takes an optuna trial and returns a configuration dict. |
| search_alg | BasicVariantGenerator | BasicVariantGenerator() | For ray see https://docs.ray.io/en/latest/tune/api_docs/suggestion.html. For optuna see https://optuna.readthedocs.io/en/stable/reference/samplers/index.html |
| num_samples | int | 10 | Number of hyperparameter optimization steps/samples. |
| refit_with_val | bool | False | Refit of best model should preserve val_size. |
| cpus | int | 4 | Number of cpus to use during optimization. Only used with ray tune. |
| gpus | int | 0 | Number of gpus to use during optimization, default all available. Only used with ray tune. |
| verbose | bool | False | Track progress. |
| alias | NoneType | None | Custom name of the model. |
| backend | str | ray | Backend to use for searching the hyperparameter space, can be either 'ray' or 'optuna'. |
| callbacks | NoneType | None | List of functions to call during the optimization process. ray reference: https://docs.ray.io/en/latest/tune/tutorials/tune-metrics.html, optuna reference: https://optuna.readthedocs.io/en/stable/tutorial/20_recipes/007_optuna_callback.html |
# Use your own config or AutoDeepAR.default_config
config = dict(max_steps=1, val_check_steps=1, input_size=12, lstm_hidden_size=8)
model = AutoDeepAR(h=12, config=config, num_samples=1, cpus=1)

# Fit and predict
model.fit(dataset=dataset)
y_hat = model.predict(dataset=dataset)

# Optuna
model = AutoDeepAR(h=12, config=None, backend='optuna')
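
Unlike the point-forecast models above, AutoDeepAR defaults to a probabilistic train loss. A sketch of overriding the defaults, assuming the DistributionLoss and MQLoss classes from neuralforecast.losses.pytorch:

from neuralforecast.losses.pytorch import DistributionLoss, MQLoss

# Swap in a StudentT likelihood with 80% and 90% prediction intervals
# (one possible choice, not the only one)
model = AutoDeepAR(h=12,
                   loss=DistributionLoss(distribution='StudentT', level=[80, 90]),
                   valid_loss=MQLoss(level=[80, 90]),
                   config=config, num_samples=1, cpus=1)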


AutoDilatedRNN

 AutoDilatedRNN (h, loss=MAE(), valid_loss=None, config=None,
                 search_alg=BasicVariantGenerator(), num_samples=10,
                 refit_with_val=False, cpus=4, gpus=0, verbose=False,
                 alias=None, backend='ray', callbacks=None)

*Class for Automatic Hyperparameter Optimization. It builds on top of ray to give access to a wide variety of hyperparameter optimization tools, ranging from classic grid search to Bayesian optimization and the HyperBand algorithm.

The validation loss to be optimized is defined by the config['loss'] dictionary value; the config also contains the rest of the hyperparameter search space.

It is important to note that the success of this hyperparameter optimization heavily relies on a strong correlation between the validation and test periods.*

|  | Type | Default | Details |
|---|---|---|---|
| h | int |  | Forecast horizon |
| loss | MAE | MAE() | Instantiated train loss class from losses collection. |
| valid_loss | NoneType | None | Instantiated valid loss class from losses collection. |
| config | NoneType | None | Dictionary with ray.tune defined search space or function that takes an optuna trial and returns a configuration dict. |
| search_alg | BasicVariantGenerator | BasicVariantGenerator() | For ray see https://docs.ray.io/en/latest/tune/api_docs/suggestion.html. For optuna see https://optuna.readthedocs.io/en/stable/reference/samplers/index.html |
| num_samples | int | 10 | Number of hyperparameter optimization steps/samples. |
| refit_with_val | bool | False | Refit of best model should preserve val_size. |
| cpus | int | 4 | Number of cpus to use during optimization. Only used with ray tune. |
| gpus | int | 0 | Number of gpus to use during optimization, default all available. Only used with ray tune. |
| verbose | bool | False | Track progress. |
| alias | NoneType | None | Custom name of the model. |
| backend | str | ray | Backend to use for searching the hyperparameter space, can be either 'ray' or 'optuna'. |
| callbacks | NoneType | None | List of functions to call during the optimization process. ray reference: https://docs.ray.io/en/latest/tune/tutorials/tune-metrics.html, optuna reference: https://optuna.readthedocs.io/en/stable/tutorial/20_recipes/007_optuna_callback.html |
# Use your own config or AutoDilatedRNN.default_config
config = dict(max_steps=2, val_check_steps=1, input_size=-1, encoder_hidden_size=8)
model = AutoDilatedRNN(h=12, config=config, num_samples=1, cpus=1)

# Fit and predict
model.fit(dataset=dataset)
y_hat = model.predict(dataset=dataset)

# Optuna
model = AutoDilatedRNN(h=12, config=None, backend='optuna')


AutoBiTCN

 AutoBiTCN (h, loss=MAE(), valid_loss=None, config=None,
            search_alg=BasicVariantGenerator(), num_samples=10,
            refit_with_val=False, cpus=4, gpus=0, verbose=False,
            alias=None, backend='ray', callbacks=None)

*Class for Automatic Hyperparameter Optimization. It builds on top of ray to give access to a wide variety of hyperparameter optimization tools, ranging from classic grid search to Bayesian optimization and the HyperBand algorithm.

The validation loss to be optimized is defined by the config['loss'] dictionary value; the config also contains the rest of the hyperparameter search space.

It is important to note that the success of this hyperparameter optimization heavily relies on a strong correlation between the validation and test periods.*

|  | Type | Default | Details |
|---|---|---|---|
| h | int |  | Forecast horizon |
| loss | MAE | MAE() | Instantiated train loss class from losses collection. |
| valid_loss | NoneType | None | Instantiated valid loss class from losses collection. |
| config | NoneType | None | Dictionary with ray.tune defined search space or function that takes an optuna trial and returns a configuration dict. |
| search_alg | BasicVariantGenerator | BasicVariantGenerator() | For ray see https://docs.ray.io/en/latest/tune/api_docs/suggestion.html. For optuna see https://optuna.readthedocs.io/en/stable/reference/samplers/index.html |
| num_samples | int | 10 | Number of hyperparameter optimization steps/samples. |
| refit_with_val | bool | False | Refit of best model should preserve val_size. |
| cpus | int | 4 | Number of cpus to use during optimization. Only used with ray tune. |
| gpus | int | 0 | Number of gpus to use during optimization, default all available. Only used with ray tune. |
| verbose | bool | False | Track progress. |
| alias | NoneType | None | Custom name of the model. |
| backend | str | ray | Backend to use for searching the hyperparameter space, can be either 'ray' or 'optuna'. |
| callbacks | NoneType | None | List of functions to call during the optimization process. ray reference: https://docs.ray.io/en/latest/tune/tutorials/tune-metrics.html, optuna reference: https://optuna.readthedocs.io/en/stable/tutorial/20_recipes/007_optuna_callback.html |
# Use your own config or AutoBiTCN.default_config
config = dict(max_steps=1, val_check_steps=1, input_size=12, hidden_size=8)
model = AutoBiTCN(h=12, config=config, num_samples=1, cpus=1)

# Fit and predict
model.fit(dataset=dataset)
y_hat = model.predict(dataset=dataset)

# Optuna
model = AutoBiTCN(h=12, config=None, backend='optuna')

B. MLP-Based



AutoMLP

 AutoMLP (h, loss=MAE(), valid_loss=None, config=None,
          search_alg=BasicVariantGenerator(), num_samples=10,
          refit_with_val=False, cpus=4, gpus=0, verbose=False,
          alias=None, backend='ray', callbacks=None)

*Class for Automatic Hyperparameter Optimization. It builds on top of ray to give access to a wide variety of hyperparameter optimization tools, ranging from classic grid search to Bayesian optimization and the HyperBand algorithm.

The validation loss to be optimized is defined by the config['loss'] dictionary value; the config also contains the rest of the hyperparameter search space.

It is important to note that the success of this hyperparameter optimization heavily relies on a strong correlation between the validation and test periods.*

|  | Type | Default | Details |
|---|---|---|---|
| h | int |  | Forecast horizon |
| loss | MAE | MAE() | Instantiated train loss class from losses collection. |
| valid_loss | NoneType | None | Instantiated valid loss class from losses collection. |
| config | NoneType | None | Dictionary with ray.tune defined search space or function that takes an optuna trial and returns a configuration dict. |
| search_alg | BasicVariantGenerator | BasicVariantGenerator() | For ray see https://docs.ray.io/en/latest/tune/api_docs/suggestion.html. For optuna see https://optuna.readthedocs.io/en/stable/reference/samplers/index.html |
| num_samples | int | 10 | Number of hyperparameter optimization steps/samples. |
| refit_with_val | bool | False | Refit of best model should preserve val_size. |
| cpus | int | 4 | Number of cpus to use during optimization. Only used with ray tune. |
| gpus | int | 0 | Number of gpus to use during optimization, default all available. Only used with ray tune. |
| verbose | bool | False | Track progress. |
| alias | NoneType | None | Custom name of the model. |
| backend | str | ray | Backend to use for searching the hyperparameter space, can be either 'ray' or 'optuna'. |
| callbacks | NoneType | None | List of functions to call during the optimization process. ray reference: https://docs.ray.io/en/latest/tune/tutorials/tune-metrics.html, optuna reference: https://optuna.readthedocs.io/en/stable/tutorial/20_recipes/007_optuna_callback.html |
# Use your own config or AutoMLP.default_config
config = dict(max_steps=2, val_check_steps=1, input_size=12, hidden_size=8)
model = AutoMLP(h=12, config=config, num_samples=1, cpus=1)

# Fit and predict
model.fit(dataset=dataset)
y_hat = model.predict(dataset=dataset)

# Optuna
model = AutoMLP(h=12, config=None, backend='optuna')
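
refit_with_val only matters when a validation split is held out during tuning. A sketch, assuming fit accepts a val_size argument as in the other NeuralForecast models:

# Hold out the last 12 observations of each series for validation during tuning;
# with refit_with_val=True the best configuration is refit preserving that split.
model = AutoMLP(h=12, config=config, num_samples=1, cpus=1, refit_with_val=True)
model.fit(dataset=dataset, val_size=12)
y_hat = model.predict(dataset=dataset)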


AutoNBEATS

 AutoNBEATS (h, loss=MAE(), valid_loss=None, config=None,
             search_alg=BasicVariantGenerator(), num_samples=10,
             refit_with_val=False, cpus=4, gpus=0, verbose=False,
             alias=None, backend='ray', callbacks=None)

*Class for Automatic Hyperparameter Optimization. It builds on top of ray to give access to a wide variety of hyperparameter optimization tools, ranging from classic grid search to Bayesian optimization and the HyperBand algorithm.

The validation loss to be optimized is defined by the config['loss'] dictionary value; the config also contains the rest of the hyperparameter search space.

It is important to note that the success of this hyperparameter optimization heavily relies on a strong correlation between the validation and test periods.*

|  | Type | Default | Details |
|---|---|---|---|
| h | int |  | Forecast horizon |
| loss | MAE | MAE() | Instantiated train loss class from losses collection. |
| valid_loss | NoneType | None | Instantiated valid loss class from losses collection. |
| config | NoneType | None | Dictionary with ray.tune defined search space or function that takes an optuna trial and returns a configuration dict. |
| search_alg | BasicVariantGenerator | BasicVariantGenerator() | For ray see https://docs.ray.io/en/latest/tune/api_docs/suggestion.html. For optuna see https://optuna.readthedocs.io/en/stable/reference/samplers/index.html |
| num_samples | int | 10 | Number of hyperparameter optimization steps/samples. |
| refit_with_val | bool | False | Refit of best model should preserve val_size. |
| cpus | int | 4 | Number of cpus to use during optimization. Only used with ray tune. |
| gpus | int | 0 | Number of gpus to use during optimization, default all available. Only used with ray tune. |
| verbose | bool | False | Track progress. |
| alias | NoneType | None | Custom name of the model. |
| backend | str | ray | Backend to use for searching the hyperparameter space, can be either 'ray' or 'optuna'. |
| callbacks | NoneType | None | List of functions to call during the optimization process. ray reference: https://docs.ray.io/en/latest/tune/tutorials/tune-metrics.html, optuna reference: https://optuna.readthedocs.io/en/stable/tutorial/20_recipes/007_optuna_callback.html |
# Use your own config or AutoNBEATS.default_config
config = dict(max_steps=2, val_check_steps=1, input_size=12,
              mlp_units=3*[[8, 8]])
model = AutoNBEATS(h=12, config=config, num_samples=1, cpus=1)

# Fit and predict
model.fit(dataset=dataset)
y_hat = model.predict(dataset=dataset)

# Optuna
model = AutoNBEATS(h=12, config=None, backend='optuna')


AutoNBEATSx

 AutoNBEATSx (h, loss=MAE(), valid_loss=None, config=None,
              search_alg=BasicVariantGenerator(), num_samples=10,
              refit_with_val=False, cpus=4, gpus=0, verbose=False,
              alias=None, backend='ray', callbacks=None)

*Class for Automatic Hyperparameter Optimization. It builds on top of ray to give access to a wide variety of hyperparameter optimization tools, ranging from classic grid search to Bayesian optimization and the HyperBand algorithm.

The validation loss to be optimized is defined by the config['loss'] dictionary value; the config also contains the rest of the hyperparameter search space.

It is important to note that the success of this hyperparameter optimization heavily relies on a strong correlation between the validation and test periods.*

|  | Type | Default | Details |
|---|---|---|---|
| h | int |  | Forecast horizon |
| loss | MAE | MAE() | Instantiated train loss class from losses collection. |
| valid_loss | NoneType | None | Instantiated valid loss class from losses collection. |
| config | NoneType | None | Dictionary with ray.tune defined search space or function that takes an optuna trial and returns a configuration dict. |
| search_alg | BasicVariantGenerator | BasicVariantGenerator() | For ray see https://docs.ray.io/en/latest/tune/api_docs/suggestion.html. For optuna see https://optuna.readthedocs.io/en/stable/reference/samplers/index.html |
| num_samples | int | 10 | Number of hyperparameter optimization steps/samples. |
| refit_with_val | bool | False | Refit of best model should preserve val_size. |
| cpus | int | 4 | Number of cpus to use during optimization. Only used with ray tune. |
| gpus | int | 0 | Number of gpus to use during optimization, default all available. Only used with ray tune. |
| verbose | bool | False | Track progress. |
| alias | NoneType | None | Custom name of the model. |
| backend | str | ray | Backend to use for searching the hyperparameter space, can be either 'ray' or 'optuna'. |
| callbacks | NoneType | None | List of functions to call during the optimization process. ray reference: https://docs.ray.io/en/latest/tune/tutorials/tune-metrics.html, optuna reference: https://optuna.readthedocs.io/en/stable/tutorial/20_recipes/007_optuna_callback.html |
# Use your own config or AutoNBEATSx.default_config
config = dict(max_steps=2, val_check_steps=1, input_size=12,
              mlp_units=3*[[8, 8]])
model = AutoNBEATSx(h=12, config=config, num_samples=1, cpus=1)

# Fit and predict
model.fit(dataset=dataset)
y_hat = model.predict(dataset=dataset)

# Optuna
model = AutoNBEATSx(h=12, config=None, backend='optuna')


AutoNHITS

 AutoNHITS (h, loss=MAE(), valid_loss=None, config=None,
            search_alg=BasicVariantGenerator(), num_samples=10,
            refit_with_val=False, cpus=4, gpus=0, verbose=False,
            alias=None, backend='ray', callbacks=None)

*Class for Automatic Hyperparameter Optimization. It builds on top of ray to give access to a wide variety of hyperparameter optimization tools, ranging from classic grid search to Bayesian optimization and the HyperBand algorithm.

The validation loss to be optimized is defined by the config['loss'] dictionary value; the config also contains the rest of the hyperparameter search space.

It is important to note that the success of this hyperparameter optimization heavily relies on a strong correlation between the validation and test periods.*

|  | Type | Default | Details |
|---|---|---|---|
| h | int |  | Forecast horizon |
| loss | MAE | MAE() | Instantiated train loss class from losses collection. |
| valid_loss | NoneType | None | Instantiated valid loss class from losses collection. |
| config | NoneType | None | Dictionary with ray.tune defined search space or function that takes an optuna trial and returns a configuration dict. |
| search_alg | BasicVariantGenerator | BasicVariantGenerator() | For ray see https://docs.ray.io/en/latest/tune/api_docs/suggestion.html. For optuna see https://optuna.readthedocs.io/en/stable/reference/samplers/index.html |
| num_samples | int | 10 | Number of hyperparameter optimization steps/samples. |
| refit_with_val | bool | False | Refit of best model should preserve val_size. |
| cpus | int | 4 | Number of cpus to use during optimization. Only used with ray tune. |
| gpus | int | 0 | Number of gpus to use during optimization, default all available. Only used with ray tune. |
| verbose | bool | False | Track progress. |
| alias | NoneType | None | Custom name of the model. |
| backend | str | ray | Backend to use for searching the hyperparameter space, can be either 'ray' or 'optuna'. |
| callbacks | NoneType | None | List of functions to call during the optimization process. ray reference: https://docs.ray.io/en/latest/tune/tutorials/tune-metrics.html, optuna reference: https://optuna.readthedocs.io/en/stable/tutorial/20_recipes/007_optuna_callback.html |
# Use your own config or AutoNHITS.default_config
config = dict(max_steps=2, val_check_steps=1, input_size=12, 
              mlp_units=3 * [[8, 8]])
model = AutoNHITS(h=12, config=config, num_samples=1, cpus=1)

# Fit and predict
model.fit(dataset=dataset)
y_hat = model.predict(dataset=dataset)

# Optuna
model = AutoNHITS(h=12, config=None, backend='optuna')


AutoDLinear

 AutoDLinear (h, loss=MAE(), valid_loss=None, config=None,
              search_alg=BasicVariantGenerator(), num_samples=10,
              refit_with_val=False, cpus=4, gpus=0, verbose=False,
              alias=None, backend='ray', callbacks=None)

*Class for Automatic Hyperparameter Optimization. It builds on top of ray to give access to a wide variety of hyperparameter optimization tools, ranging from classic grid search to Bayesian optimization and the HyperBand algorithm.

The validation loss to be optimized is defined by the config['loss'] dictionary value; the config also contains the rest of the hyperparameter search space.

It is important to note that the success of this hyperparameter optimization heavily relies on a strong correlation between the validation and test periods.*

|  | Type | Default | Details |
|---|---|---|---|
| h | int |  | Forecast horizon |
| loss | MAE | MAE() | Instantiated train loss class from losses collection. |
| valid_loss | NoneType | None | Instantiated valid loss class from losses collection. |
| config | NoneType | None | Dictionary with ray.tune defined search space or function that takes an optuna trial and returns a configuration dict. |
| search_alg | BasicVariantGenerator | BasicVariantGenerator() | For ray see https://docs.ray.io/en/latest/tune/api_docs/suggestion.html. For optuna see https://optuna.readthedocs.io/en/stable/reference/samplers/index.html |
| num_samples | int | 10 | Number of hyperparameter optimization steps/samples. |
| refit_with_val | bool | False | Refit of best model should preserve val_size. |
| cpus | int | 4 | Number of cpus to use during optimization. Only used with ray tune. |
| gpus | int | 0 | Number of gpus to use during optimization, default all available. Only used with ray tune. |
| verbose | bool | False | Track progress. |
| alias | NoneType | None | Custom name of the model. |
| backend | str | ray | Backend to use for searching the hyperparameter space, can be either 'ray' or 'optuna'. |
| callbacks | NoneType | None | List of functions to call during the optimization process. ray reference: https://docs.ray.io/en/latest/tune/tutorials/tune-metrics.html, optuna reference: https://optuna.readthedocs.io/en/stable/tutorial/20_recipes/007_optuna_callback.html |
# Use your own config or AutoDLinear.default_config
config = dict(max_steps=2, val_check_steps=1, input_size=12)
model = AutoDLinear(h=12, config=config, num_samples=1, cpus=1)

# Fit and predict
model.fit(dataset=dataset)
y_hat = model.predict(dataset=dataset)

# Optuna
model = AutoDLinear(h=12, config=None, backend='optuna')


AutoNLinear

 AutoNLinear (h, loss=MAE(), valid_loss=None, config=None,
              search_alg=BasicVariantGenerator(), num_samples=10,
              refit_with_val=False, cpus=4, gpus=0, verbose=False,
              alias=None, backend='ray', callbacks=None)

*Class for Automatic Hyperparameter Optimization. It builds on top of ray to give access to a wide variety of hyperparameter optimization tools, ranging from classic grid search to Bayesian optimization and the HyperBand algorithm.

The validation loss to be optimized is defined by the config['loss'] dictionary value; the config also contains the rest of the hyperparameter search space.

It is important to note that the success of this hyperparameter optimization heavily relies on a strong correlation between the validation and test periods.*

|  | Type | Default | Details |
|---|---|---|---|
| h | int |  | Forecast horizon |
| loss | MAE | MAE() | Instantiated train loss class from losses collection. |
| valid_loss | NoneType | None | Instantiated valid loss class from losses collection. |
| config | NoneType | None | Dictionary with ray.tune defined search space or function that takes an optuna trial and returns a configuration dict. |
| search_alg | BasicVariantGenerator | BasicVariantGenerator() | For ray see https://docs.ray.io/en/latest/tune/api_docs/suggestion.html. For optuna see https://optuna.readthedocs.io/en/stable/reference/samplers/index.html |
| num_samples | int | 10 | Number of hyperparameter optimization steps/samples. |
| refit_with_val | bool | False | Refit of best model should preserve val_size. |
| cpus | int | 4 | Number of cpus to use during optimization. Only used with ray tune. |
| gpus | int | 0 | Number of gpus to use during optimization, default all available. Only used with ray tune. |
| verbose | bool | False | Track progress. |
| alias | NoneType | None | Custom name of the model. |
| backend | str | ray | Backend to use for searching the hyperparameter space, can be either 'ray' or 'optuna'. |
| callbacks | NoneType | None | List of functions to call during the optimization process. ray reference: https://docs.ray.io/en/latest/tune/tutorials/tune-metrics.html, optuna reference: https://optuna.readthedocs.io/en/stable/tutorial/20_recipes/007_optuna_callback.html |
# Use your own config or AutoNLinear.default_config
config = dict(max_steps=2, val_check_steps=1, input_size=12)
model = AutoNLinear(h=12, config=config, num_samples=1, cpus=1)

# Fit and predict
model.fit(dataset=dataset)
y_hat = model.predict(dataset=dataset)

# Optuna
model = AutoNLinear(h=12, config=None, backend='optuna')


AutoTiDE

 AutoTiDE (h, loss=MAE(), valid_loss=None, config=None,
           search_alg=BasicVariantGenerator(), num_samples=10,
           refit_with_val=False, cpus=4, gpus=0, verbose=False,
           alias=None, backend='ray', callbacks=None)

*Class for Automatic Hyperparameter Optimization. It builds on top of ray to give access to a wide variety of hyperparameter optimization tools, ranging from classic grid search to Bayesian optimization and the HyperBand algorithm.

The validation loss to be optimized is defined by the config['loss'] dictionary value; the config also contains the rest of the hyperparameter search space.

It is important to note that the success of this hyperparameter optimization heavily relies on a strong correlation between the validation and test periods.*

|  | Type | Default | Details |
|---|---|---|---|
| h | int |  | Forecast horizon |
| loss | MAE | MAE() | Instantiated train loss class from losses collection. |
| valid_loss | NoneType | None | Instantiated valid loss class from losses collection. |
| config | NoneType | None | Dictionary with ray.tune defined search space or function that takes an optuna trial and returns a configuration dict. |
| search_alg | BasicVariantGenerator | BasicVariantGenerator() | For ray see https://docs.ray.io/en/latest/tune/api_docs/suggestion.html. For optuna see https://optuna.readthedocs.io/en/stable/reference/samplers/index.html |
| num_samples | int | 10 | Number of hyperparameter optimization steps/samples. |
| refit_with_val | bool | False | Refit of best model should preserve val_size. |
| cpus | int | 4 | Number of cpus to use during optimization. Only used with ray tune. |
| gpus | int | 0 | Number of gpus to use during optimization, default all available. Only used with ray tune. |
| verbose | bool | False | Track progress. |
| alias | NoneType | None | Custom name of the model. |
| backend | str | ray | Backend to use for searching the hyperparameter space, can be either 'ray' or 'optuna'. |
| callbacks | NoneType | None | List of functions to call during the optimization process. ray reference: https://docs.ray.io/en/latest/tune/tutorials/tune-metrics.html, optuna reference: https://optuna.readthedocs.io/en/stable/tutorial/20_recipes/007_optuna_callback.html |
# Use your own config or AutoTiDE.default_config
config = dict(max_steps=2, val_check_steps=1, input_size=12)
model = AutoTiDE(h=12, config=config, num_samples=1, cpus=1)

# Fit and predict
model.fit(dataset=dataset)
y_hat = model.predict(dataset=dataset)

# Optuna
model = AutoTiDE(h=12, config=None, backend='optuna')


AutoDeepNPTS

 AutoDeepNPTS (h, loss=MAE(), valid_loss=None, config=None,
               search_alg=BasicVariantGenerator(), num_samples=10,
               refit_with_val=False, cpus=4, gpus=0, verbose=False,
               alias=None, backend='ray', callbacks=None)

*Class for Automatic Hyperparameter Optimization. It builds on top of ray to give access to a wide variety of hyperparameter optimization tools, ranging from classic grid search to Bayesian optimization and the HyperBand algorithm.

The validation loss to be optimized is defined by the config['loss'] dictionary value; the config also contains the rest of the hyperparameter search space.

It is important to note that the success of this hyperparameter optimization heavily relies on a strong correlation between the validation and test periods.*

|  | Type | Default | Details |
|---|---|---|---|
| h | int |  | Forecast horizon |
| loss | MAE | MAE() | Instantiated train loss class from losses collection. |
| valid_loss | NoneType | None | Instantiated valid loss class from losses collection. |
| config | NoneType | None | Dictionary with ray.tune defined search space or function that takes an optuna trial and returns a configuration dict. |
| search_alg | BasicVariantGenerator | BasicVariantGenerator() | For ray see https://docs.ray.io/en/latest/tune/api_docs/suggestion.html. For optuna see https://optuna.readthedocs.io/en/stable/reference/samplers/index.html |
| num_samples | int | 10 | Number of hyperparameter optimization steps/samples. |
| refit_with_val | bool | False | Refit of best model should preserve val_size. |
| cpus | int | 4 | Number of cpus to use during optimization. Only used with ray tune. |
| gpus | int | 0 | Number of gpus to use during optimization, default all available. Only used with ray tune. |
| verbose | bool | False | Track progress. |
| alias | NoneType | None | Custom name of the model. |
| backend | str | ray | Backend to use for searching the hyperparameter space, can be either 'ray' or 'optuna'. |
| callbacks | NoneType | None | List of functions to call during the optimization process. ray reference: https://docs.ray.io/en/latest/tune/tutorials/tune-metrics.html, optuna reference: https://optuna.readthedocs.io/en/stable/tutorial/20_recipes/007_optuna_callback.html |
# Use your own config or AutoDeepNPTS.default_config
config = dict(max_steps=2, val_check_steps=1, input_size=12)
model = AutoDeepNPTS(h=12, config=config, num_samples=1, cpus=1)

# Fit and predict
model.fit(dataset=dataset)
y_hat = model.predict(dataset=dataset)

# Optuna
model = AutoDeepNPTS(h=12, config=None, backend='optuna')

C. KAN-Based



AutoKAN

 AutoKAN (h, loss=MAE(), valid_loss=None, config=None,
          search_alg=BasicVariantGenerator(), num_samples=10,
          refit_with_val=False, cpus=4, gpus=0, verbose=False,
          alias=None, backend='ray', callbacks=None)

*Class for Automatic Hyperparameter Optimization. It builds on top of ray to give access to a wide variety of hyperparameter optimization tools, ranging from classic grid search to Bayesian optimization and the HyperBand algorithm.

The validation loss to be optimized is defined by the config['loss'] dictionary value; the config also contains the rest of the hyperparameter search space.

It is important to note that the success of this hyperparameter optimization heavily relies on a strong correlation between the validation and test periods.*

|  | Type | Default | Details |
|---|---|---|---|
| h | int |  | Forecast horizon |
| loss | MAE | MAE() | Instantiated train loss class from losses collection. |
| valid_loss | NoneType | None | Instantiated valid loss class from losses collection. |
| config | NoneType | None | Dictionary with ray.tune defined search space or function that takes an optuna trial and returns a configuration dict. |
| search_alg | BasicVariantGenerator | BasicVariantGenerator() | For ray see https://docs.ray.io/en/latest/tune/api_docs/suggestion.html. For optuna see https://optuna.readthedocs.io/en/stable/reference/samplers/index.html |
| num_samples | int | 10 | Number of hyperparameter optimization steps/samples. |
| refit_with_val | bool | False | Refit of best model should preserve val_size. |
| cpus | int | 4 | Number of cpus to use during optimization. Only used with ray tune. |
| gpus | int | 0 | Number of gpus to use during optimization, default all available. Only used with ray tune. |
| verbose | bool | False | Track progress. |
| alias | NoneType | None | Custom name of the model. |
| backend | str | ray | Backend to use for searching the hyperparameter space, can be either 'ray' or 'optuna'. |
| callbacks | NoneType | None | List of functions to call during the optimization process. ray reference: https://docs.ray.io/en/latest/tune/tutorials/tune-metrics.html, optuna reference: https://optuna.readthedocs.io/en/stable/tutorial/20_recipes/007_optuna_callback.html |
# Use your own config or AutoKAN.default_config
config = dict(max_steps=2, val_check_steps=1, input_size=12)
model = AutoKAN(h=12, config=config, num_samples=1, cpus=1)

# Fit and predict
model.fit(dataset=dataset)
y_hat = model.predict(dataset=dataset)

# Optuna
model = AutoKAN(h=12, config=None, backend='optuna')

D. Transformer-Based



AutoTFT

 AutoTFT (h, loss=MAE(), valid_loss=None, config=None,
          search_alg=BasicVariantGenerator(), num_samples=10,
          refit_with_val=False, cpus=4, gpus=0, verbose=False,
          alias=None, backend='ray', callbacks=None)

*Class for Automatic Hyperparameter Optimization. It builds on top of ray to give access to a wide variety of hyperparameter optimization tools, ranging from classic grid search to Bayesian optimization and the HyperBand algorithm.

The validation loss to be optimized is defined by the config['loss'] dictionary value; the config also contains the rest of the hyperparameter search space.

It is important to note that the success of this hyperparameter optimization heavily relies on a strong correlation between the validation and test periods.*

|  | Type | Default | Details |
|---|---|---|---|
| h | int |  | Forecast horizon |
| loss | MAE | MAE() | Instantiated train loss class from losses collection. |
| valid_loss | NoneType | None | Instantiated valid loss class from losses collection. |
| config | NoneType | None | Dictionary with ray.tune defined search space or function that takes an optuna trial and returns a configuration dict. |
| search_alg | BasicVariantGenerator | BasicVariantGenerator() | For ray see https://docs.ray.io/en/latest/tune/api_docs/suggestion.html. For optuna see https://optuna.readthedocs.io/en/stable/reference/samplers/index.html |
| num_samples | int | 10 | Number of hyperparameter optimization steps/samples. |
| refit_with_val | bool | False | Refit of best model should preserve val_size. |
| cpus | int | 4 | Number of cpus to use during optimization. Only used with ray tune. |
| gpus | int | 0 | Number of gpus to use during optimization, default all available. Only used with ray tune. |
| verbose | bool | False | Track progress. |
| alias | NoneType | None | Custom name of the model. |
| backend | str | ray | Backend to use for searching the hyperparameter space, can be either 'ray' or 'optuna'. |
| callbacks | NoneType | None | List of functions to call during the optimization process. ray reference: https://docs.ray.io/en/latest/tune/tutorials/tune-metrics.html, optuna reference: https://optuna.readthedocs.io/en/stable/tutorial/20_recipes/007_optuna_callback.html |
# Use your own config or AutoTFT.default_config
config = dict(max_steps=1, val_check_steps=1, input_size=12, hidden_size=8)
model = AutoTFT(h=12, config=config, num_samples=1, cpus=1)

# Fit and predict
model.fit(dataset=dataset)
y_hat = model.predict(dataset=dataset)

# Optuna
model = AutoTFT(h=12, config=None, backend='optuna')


AutoVanillaTransformer

 AutoVanillaTransformer (h, loss=MAE(), valid_loss=None, config=None,
                         search_alg=BasicVariantGenerator(), num_samples=10,
                         refit_with_val=False, cpus=4, gpus=0, verbose=False,
                         alias=None, backend='ray', callbacks=None)

*Class for Automatic Hyperparameter Optimization. It builds on top of ray to give access to a wide variety of hyperparameter optimization tools, ranging from classic grid search to Bayesian optimization and the HyperBand algorithm.

The validation loss to be optimized is defined by the config['loss'] dictionary value; the config also contains the rest of the hyperparameter search space.

It is important to note that the success of this hyperparameter optimization heavily relies on a strong correlation between the validation and test periods.*

|  | Type | Default | Details |
|---|---|---|---|
| h | int |  | Forecast horizon |
| loss | MAE | MAE() | Instantiated train loss class from losses collection. |
| valid_loss | NoneType | None | Instantiated valid loss class from losses collection. |
| config | NoneType | None | Dictionary with ray.tune defined search space or function that takes an optuna trial and returns a configuration dict. |
| search_alg | BasicVariantGenerator | BasicVariantGenerator() | For ray see https://docs.ray.io/en/latest/tune/api_docs/suggestion.html. For optuna see https://optuna.readthedocs.io/en/stable/reference/samplers/index.html |
| num_samples | int | 10 | Number of hyperparameter optimization steps/samples. |
| refit_with_val | bool | False | Refit of best model should preserve val_size. |
| cpus | int | 4 | Number of cpus to use during optimization. Only used with ray tune. |
| gpus | int | 0 | Number of gpus to use during optimization, default all available. Only used with ray tune. |
| verbose | bool | False | Track progress. |
| alias | NoneType | None | Custom name of the model. |
| backend | str | ray | Backend to use for searching the hyperparameter space, can be either 'ray' or 'optuna'. |
| callbacks | NoneType | None | List of functions to call during the optimization process. ray reference: https://docs.ray.io/en/latest/tune/tutorials/tune-metrics.html, optuna reference: https://optuna.readthedocs.io/en/stable/tutorial/20_recipes/007_optuna_callback.html |
# Use your own config or AutoVanillaTransformer.default_config
config = dict(max_steps=1, val_check_steps=1, input_size=12, hidden_size=8)
model = AutoVanillaTransformer(h=12, config=config, num_samples=1, cpus=1)

# Fit and predict
model.fit(dataset=dataset)
y_hat = model.predict(dataset=dataset)

# Optuna
model = AutoVanillaTransformer(h=12, config=None, backend='optuna')


AutoInformer

 AutoInformer (h, loss=MAE(), valid_loss=None, config=None,
               search_alg=BasicVariantGenerator(), num_samples=10,
               refit_with_val=False, cpus=4, gpus=0, verbose=False,
               alias=None, backend='ray', callbacks=None)

*Class for Automatic Hyperparameter Optimization. It builds on top of ray to give access to a wide variety of hyperparameter optimization tools, ranging from classic grid search to Bayesian optimization and the HyperBand algorithm.

The validation loss to be optimized is defined by the config['loss'] dictionary value; the config also contains the rest of the hyperparameter search space.

It is important to note that the success of this hyperparameter optimization heavily relies on a strong correlation between the validation and test periods.*

|  | Type | Default | Details |
|---|---|---|---|
| h | int |  | Forecast horizon |
| loss | MAE | MAE() | Instantiated train loss class from losses collection. |
| valid_loss | NoneType | None | Instantiated valid loss class from losses collection. |
| config | NoneType | None | Dictionary with ray.tune defined search space or function that takes an optuna trial and returns a configuration dict. |
| search_alg | BasicVariantGenerator | BasicVariantGenerator() | For ray see https://docs.ray.io/en/latest/tune/api_docs/suggestion.html. For optuna see https://optuna.readthedocs.io/en/stable/reference/samplers/index.html |
| num_samples | int | 10 | Number of hyperparameter optimization steps/samples. |
| refit_with_val | bool | False | Refit of best model should preserve val_size. |
| cpus | int | 4 | Number of cpus to use during optimization. Only used with ray tune. |
| gpus | int | 0 | Number of gpus to use during optimization, default all available. Only used with ray tune. |
| verbose | bool | False | Track progress. |
| alias | NoneType | None | Custom name of the model. |
| backend | str | ray | Backend to use for searching the hyperparameter space, can be either 'ray' or 'optuna'. |
| callbacks | NoneType | None | List of functions to call during the optimization process. ray reference: https://docs.ray.io/en/latest/tune/tutorials/tune-metrics.html, optuna reference: https://optuna.readthedocs.io/en/stable/tutorial/20_recipes/007_optuna_callback.html |
# Use your own config or AutoInformer.default_config
config = dict(max_steps=1, val_check_steps=1, input_size=12, hidden_size=8)
model = AutoInformer(h=12, config=config, num_samples=1, cpus=1)

# Fit and predict
model.fit(dataset=dataset)
y_hat = model.predict(dataset=dataset)

# Optuna
model = AutoInformer(h=12, config=None, backend='optuna')


AutoAutoformer

 AutoAutoformer (h, loss=MAE(), valid_loss=None, config=None,
                 search_alg=BasicVariantGenerator(), num_samples=10,
                 refit_with_val=False, cpus=4, gpus=0, verbose=False,
                 alias=None, backend='ray', callbacks=None)

*Class for Automatic Hyperparameter Optimization. It builds on top of ray to give access to a wide variety of hyperparameter optimization tools, ranging from classic grid search to Bayesian optimization and the HyperBand algorithm.

The validation loss to be optimized is defined by the config['loss'] dictionary value; the config also contains the rest of the hyperparameter search space.

It is important to note that the success of this hyperparameter optimization heavily relies on a strong correlation between the validation and test periods.*

|  | Type | Default | Details |
|---|---|---|---|
| h | int |  | Forecast horizon |
| loss | MAE | MAE() | Instantiated train loss class from losses collection. |
| valid_loss | NoneType | None | Instantiated valid loss class from losses collection. |
| config | NoneType | None | Dictionary with ray.tune defined search space or function that takes an optuna trial and returns a configuration dict. |
| search_alg | BasicVariantGenerator | BasicVariantGenerator() | For ray see https://docs.ray.io/en/latest/tune/api_docs/suggestion.html. For optuna see https://optuna.readthedocs.io/en/stable/reference/samplers/index.html |
| num_samples | int | 10 | Number of hyperparameter optimization steps/samples. |
| refit_with_val | bool | False | Refit of best model should preserve val_size. |
| cpus | int | 4 | Number of cpus to use during optimization. Only used with ray tune. |
| gpus | int | 0 | Number of gpus to use during optimization, default all available. Only used with ray tune. |
| verbose | bool | False | Track progress. |
| alias | NoneType | None | Custom name of the model. |
| backend | str | ray | Backend to use for searching the hyperparameter space, can be either 'ray' or 'optuna'. |
| callbacks | NoneType | None | List of functions to call during the optimization process. ray reference: https://docs.ray.io/en/latest/tune/tutorials/tune-metrics.html, optuna reference: https://optuna.readthedocs.io/en/stable/tutorial/20_recipes/007_optuna_callback.html |
# Use your own config or AutoAutoformer.default_config
config = dict(max_steps=1, val_check_steps=1, input_size=12, hidden_size=8)
model = AutoAutoformer(h=12, config=config, num_samples=1, cpus=1)

# Fit and predict
model.fit(dataset=dataset)
y_hat = model.predict(dataset=dataset)

# Optuna
model = AutoAutoformer(h=12, config=None, backend='optuna')


AutoFEDformer

 AutoFEDformer (h, loss=MAE(), valid_loss=None, config=None,
                search_alg=BasicVariantGenerator(), num_samples=10,
                refit_with_val=False, cpus=4, gpus=0, verbose=False,
                alias=None, backend='ray', callbacks=None)

*Class for Automatic Hyperparameter Optimization. It builds on top of ray to give access to a wide variety of hyperparameter optimization tools, ranging from classic grid search to Bayesian optimization and the HyperBand algorithm.

The validation loss to be optimized is defined by the config['loss'] dictionary value; the config also contains the rest of the hyperparameter search space.

It is important to note that the success of this hyperparameter optimization heavily relies on a strong correlation between the validation and test periods.*

|  | Type | Default | Details |
|---|---|---|---|
| h | int |  | Forecast horizon |
| loss | MAE | MAE() | Instantiated train loss class from losses collection. |
| valid_loss | NoneType | None | Instantiated valid loss class from losses collection. |
| config | NoneType | None | Dictionary with ray.tune defined search space or function that takes an optuna trial and returns a configuration dict. |
| search_alg | BasicVariantGenerator | BasicVariantGenerator() | For ray see https://docs.ray.io/en/latest/tune/api_docs/suggestion.html. For optuna see https://optuna.readthedocs.io/en/stable/reference/samplers/index.html |
| num_samples | int | 10 | Number of hyperparameter optimization steps/samples. |
| refit_with_val | bool | False | Refit of best model should preserve val_size. |
| cpus | int | 4 | Number of cpus to use during optimization. Only used with ray tune. |
| gpus | int | 0 | Number of gpus to use during optimization, default all available. Only used with ray tune. |
| verbose | bool | False | Track progress. |
| alias | NoneType | None | Custom name of the model. |
| backend | str | ray | Backend to use for searching the hyperparameter space, can be either 'ray' or 'optuna'. |
| callbacks | NoneType | None | List of functions to call during the optimization process. ray reference: https://docs.ray.io/en/latest/tune/tutorials/tune-metrics.html, optuna reference: https://optuna.readthedocs.io/en/stable/tutorial/20_recipes/007_optuna_callback.html |
# Use your own config or AutoFEDformer.default_config
config = dict(max_steps=1, val_check_steps=1, input_size=12, hidden_size=64)
model = AutoFEDformer(h=12, config=config, num_samples=1, cpus=1)

# Fit and predict
model.fit(dataset=dataset)
y_hat = model.predict(dataset=dataset)

# Optuna
model = AutoFEDformer(h=12, config=None, backend='optuna')


AutoPatchTST

 AutoPatchTST (h, loss=MAE(), valid_loss=None, config=None,
               search_alg=BasicVariantGenerator(), num_samples=10,
               refit_with_val=False, cpus=4, gpus=0, verbose=False,
               alias=None, backend='ray', callbacks=None)

*Class for Automatic Hyperparameter Optimization. It builds on top of ray to give access to a wide variety of hyperparameter optimization tools, ranging from classic grid search to Bayesian optimization and the HyperBand algorithm.

The validation loss to be optimized is defined by the config['loss'] dictionary value; the config also contains the rest of the hyperparameter search space.

It is important to note that the success of this hyperparameter optimization heavily relies on a strong correlation between the validation and test periods.*

|  | Type | Default | Details |
|---|---|---|---|
| h | int |  | Forecast horizon |
| loss | MAE | MAE() | Instantiated train loss class from losses collection. |
| valid_loss | NoneType | None | Instantiated valid loss class from losses collection. |
| config | NoneType | None | Dictionary with ray.tune defined search space or function that takes an optuna trial and returns a configuration dict. |
| search_alg | BasicVariantGenerator | BasicVariantGenerator() | For ray see https://docs.ray.io/en/latest/tune/api_docs/suggestion.html. For optuna see https://optuna.readthedocs.io/en/stable/reference/samplers/index.html |
| num_samples | int | 10 | Number of hyperparameter optimization steps/samples. |
| refit_with_val | bool | False | Refit of best model should preserve val_size. |
| cpus | int | 4 | Number of cpus to use during optimization. Only used with ray tune. |
| gpus | int | 0 | Number of gpus to use during optimization, default all available. Only used with ray tune. |
| verbose | bool | False | Track progress. |
| alias | NoneType | None | Custom name of the model. |
| backend | str | ray | Backend to use for searching the hyperparameter space, can be either 'ray' or 'optuna'. |
| callbacks | NoneType | None | List of functions to call during the optimization process. ray reference: https://docs.ray.io/en/latest/tune/tutorials/tune-metrics.html, optuna reference: https://optuna.readthedocs.io/en/stable/tutorial/20_recipes/007_optuna_callback.html |
# Use your own config or AutoPatchTST.default_config
config = dict(max_steps=1, val_check_steps=1, input_size=12, hidden_size=16)
model = AutoPatchTST(h=12, config=config, num_samples=1, cpus=1)

# Fit and predict
model.fit(dataset=dataset)
y_hat = model.predict(dataset=dataset)

# Optuna
model = AutoPatchTST(h=12, config=None, backend='optuna')
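
The train and validation losses need not coincide. A sketch, assuming MQLoss and MAE from the losses collection referenced above: optimize a quantile loss during training while selecting configurations by point-forecast MAE:

# Sketch: probabilistic train loss, point-forecast validation loss
# (MQLoss/MAE are taken from the losses collection referenced above)
from neuralforecast.losses.pytorch import MAE, MQLoss

model = AutoPatchTST(h=12, loss=MQLoss(level=[80, 90]), valid_loss=MAE(),
                     config=config, num_samples=1, cpus=1)
model.fit(dataset=dataset)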

source

AutoiTransformer

 AutoiTransformer (h, n_series, loss=MAE(), valid_loss=None, config=None,
                   search_alg=<ray.tune.search.basic_variant.BasicVariantG
                   enerator object at 0x7fc0001617e0>, num_samples=10,
                   refit_with_val=False, cpus=4, gpus=0, verbose=False,
                   alias=None, backend='ray', callbacks=None)

*Class for automatic hyperparameter optimization. It builds on top of ray to give access to a wide variety of hyperparameter optimization tools, ranging from classic grid search to Bayesian optimization and the HyperBand algorithm.

The validation loss to be optimized is defined by the config['loss'] dictionary value; the config also contains the rest of the hyperparameter search space.

Note that the success of this hyperparameter optimization heavily relies on a strong correlation between the validation and test periods.*

| Parameter | Type | Default | Details |
|---|---|---|---|
| h | int | | Forecast horizon. |
| n_series | int | | Number of time series. |
| loss | MAE | MAE() | Instantiated train loss class from losses collection. |
| valid_loss | NoneType | None | Instantiated valid loss class from losses collection. |
| config | NoneType | None | Dictionary with ray.tune defined search space or function that takes an optuna trial and returns a configuration dict. |
| search_alg | BasicVariantGenerator | BasicVariantGenerator() | For ray see https://docs.ray.io/en/latest/tune/api_docs/suggestion.html; for optuna see https://optuna.readthedocs.io/en/stable/reference/samplers/index.html. |
| num_samples | int | 10 | Number of hyperparameter optimization steps/samples. |
| refit_with_val | bool | False | Refit of best model should preserve val_size. |
| cpus | int | 4 | Number of cpus to use during optimization. Only used with ray tune. |
| gpus | int | 0 | Number of gpus to use during optimization, default all available. Only used with ray tune. |
| verbose | bool | False | Track progress. |
| alias | NoneType | None | Custom name of the model. |
| backend | str | ray | Backend to use for searching the hyperparameter space, can be either 'ray' or 'optuna'. |
| callbacks | NoneType | None | List of functions to call during the optimization process. |
ray reference: https://docs.ray.io/en/latest/tune/tutorials/tune-metrics.html
optuna reference: https://optuna.readthedocs.io/en/stable/tutorial/20_recipes/007_optuna_callback.html
# Use your own config or AutoiTransformer.default_config
config = dict(max_steps=1, val_check_steps=1, input_size=12, hidden_size=16)
model = AutoiTransformer(h=12, n_series=1, config=config, num_samples=1, cpus=1)

# Fit and predict
model.fit(dataset=dataset)
y_hat = model.predict(dataset=dataset)

# Optuna
model = AutoiTransformer(h=12, n_series=1, config=None, backend='optuna')
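
The refit_with_val flag only matters when a validation window is held out during fit. A hedged sketch (that fit accepts a val_size argument is an assumption, mirroring the core library's API):

# Sketch: hold out the last 12 observations for validation during the search
# and preserve that window when refitting the best model
# (`val_size` in fit is an assumption, mirroring the core library's API)
model = AutoiTransformer(h=12, n_series=1, config=config, num_samples=1,
                         cpus=1, refit_with_val=True)
model.fit(dataset=dataset, val_size=12)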

E. CNN-Based


source

AutoTimesNet

 AutoTimesNet (h, loss=MAE(), valid_loss=None, config=None,
               search_alg=<ray.tune.search.basic_variant.BasicVariantGener
               ator object at 0x7fc000184100>, num_samples=10,
               refit_with_val=False, cpus=4, gpus=0, verbose=False,
               alias=None, backend='ray', callbacks=None)

*Class for automatic hyperparameter optimization. It builds on top of ray to give access to a wide variety of hyperparameter optimization tools, ranging from classic grid search to Bayesian optimization and the HyperBand algorithm.

The validation loss to be optimized is defined by the config['loss'] dictionary value; the config also contains the rest of the hyperparameter search space.

Note that the success of this hyperparameter optimization heavily relies on a strong correlation between the validation and test periods.*

| Parameter | Type | Default | Details |
|---|---|---|---|
| h | int | | Forecast horizon. |
| loss | MAE | MAE() | Instantiated train loss class from losses collection. |
| valid_loss | NoneType | None | Instantiated valid loss class from losses collection. |
| config | NoneType | None | Dictionary with ray.tune defined search space or function that takes an optuna trial and returns a configuration dict. |
| search_alg | BasicVariantGenerator | BasicVariantGenerator() | For ray see https://docs.ray.io/en/latest/tune/api_docs/suggestion.html; for optuna see https://optuna.readthedocs.io/en/stable/reference/samplers/index.html. |
| num_samples | int | 10 | Number of hyperparameter optimization steps/samples. |
| refit_with_val | bool | False | Refit of best model should preserve val_size. |
| cpus | int | 4 | Number of cpus to use during optimization. Only used with ray tune. |
| gpus | int | 0 | Number of gpus to use during optimization, default all available. Only used with ray tune. |
| verbose | bool | False | Track progress. |
| alias | NoneType | None | Custom name of the model. |
| backend | str | ray | Backend to use for searching the hyperparameter space, can be either 'ray' or 'optuna'. |
| callbacks | NoneType | None | List of functions to call during the optimization process. |
ray reference: https://docs.ray.io/en/latest/tune/tutorials/tune-metrics.html
optuna reference: https://optuna.readthedocs.io/en/stable/tutorial/20_recipes/007_optuna_callback.html
# Use your own config or AutoTimesNet.default_config
config = dict(max_steps=1, val_check_steps=1, input_size=12, hidden_size=32)
model = AutoTimesNet(h=12, config=config, num_samples=1, cpus=1)

# Fit and predict
model.fit(dataset=dataset)
y_hat = model.predict(dataset=dataset)

# Optuna
model = AutoTimesNet(h=12, config=None, backend='optuna')
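
Any ray.tune suggestion algorithm can be passed through search_alg (see the link in the table above). A small sketch using the default BasicVariantGenerator with a fixed random state for reproducible searches (the random_state argument is an assumption about recent ray versions):

# Sketch: explicit ray.tune search algorithm with a fixed random state
from ray.tune.search.basic_variant import BasicVariantGenerator

model = AutoTimesNet(h=12, config=config, num_samples=1, cpus=1,
                     search_alg=BasicVariantGenerator(random_state=1))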

F. Multivariate


source

AutoStemGNN

 AutoStemGNN (h, n_series, loss=MAE(), valid_loss=None, config=None,
              search_alg=<ray.tune.search.basic_variant.BasicVariantGenera
              tor object at 0x7fc00016f850>, num_samples=10,
              refit_with_val=False, cpus=4, gpus=0, verbose=False,
              alias=None, backend='ray', callbacks=None)

*Class for automatic hyperparameter optimization. It builds on top of ray to give access to a wide variety of hyperparameter optimization tools, ranging from classic grid search to Bayesian optimization and the HyperBand algorithm.

The validation loss to be optimized is defined by the config['loss'] dictionary value; the config also contains the rest of the hyperparameter search space.

Note that the success of this hyperparameter optimization heavily relies on a strong correlation between the validation and test periods.*

| Parameter | Type | Default | Details |
|---|---|---|---|
| h | int | | Forecast horizon. |
| n_series | int | | Number of time series. |
| loss | MAE | MAE() | Instantiated train loss class from losses collection. |
| valid_loss | NoneType | None | Instantiated valid loss class from losses collection. |
| config | NoneType | None | Dictionary with ray.tune defined search space or function that takes an optuna trial and returns a configuration dict. |
| search_alg | BasicVariantGenerator | BasicVariantGenerator() | For ray see https://docs.ray.io/en/latest/tune/api_docs/suggestion.html; for optuna see https://optuna.readthedocs.io/en/stable/reference/samplers/index.html. |
| num_samples | int | 10 | Number of hyperparameter optimization steps/samples. |
| refit_with_val | bool | False | Refit of best model should preserve val_size. |
| cpus | int | 4 | Number of cpus to use during optimization. Only used with ray tune. |
| gpus | int | 0 | Number of gpus to use during optimization, default all available. Only used with ray tune. |
| verbose | bool | False | Track progress. |
| alias | NoneType | None | Custom name of the model. |
| backend | str | ray | Backend to use for searching the hyperparameter space, can be either 'ray' or 'optuna'. |
| callbacks | NoneType | None | List of functions to call during the optimization process. |
ray reference: https://docs.ray.io/en/latest/tune/tutorials/tune-metrics.html
optuna reference: https://optuna.readthedocs.io/en/stable/tutorial/20_recipes/007_optuna_callback.html
# Use your own config or AutoStemGNN.default_config
config = dict(max_steps=1, val_check_steps=1, input_size=12)
model = AutoStemGNN(h=12, n_series=1, config=config, num_samples=1, cpus=1)

# Fit and predict
model.fit(dataset=dataset)
y_hat = model.predict(dataset=dataset)

# Optuna
model = AutoStemGNN(h=12, n_series=1, config=None, backend='optuna')
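
Following the optuna recipe linked above, a callback is a callable that receives the study and the finished trial. A sketch (that callbacks are forwarded to the optuna study is an assumption):

# Sketch: log each finished optuna trial
# (assumes `callbacks` is forwarded to optuna's study.optimize)
def log_trial(study, trial):
    print(f'trial {trial.number} finished with value {trial.value}')

model = AutoStemGNN(h=12, n_series=1, config=None, num_samples=1,
                    backend='optuna', callbacks=[log_trial])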

source

AutoHINT

 AutoHINT (cls_model, h, loss, valid_loss, S, config,
           search_alg=<ray.tune.search.basic_variant.BasicVariantGenerator
           object at 0x7fc000f4d990>, num_samples=10, cpus=4, gpus=0,
           refit_with_val=False, verbose=False, alias=None, backend='ray',
           callbacks=None)

*Class for automatic hyperparameter optimization. It builds on top of ray to give access to a wide variety of hyperparameter optimization tools, ranging from classic grid search to Bayesian optimization and the HyperBand algorithm.

The validation loss to be optimized is defined by the config['loss'] dictionary value; the config also contains the rest of the hyperparameter search space.

Note that the success of this hyperparameter optimization heavily relies on a strong correlation between the validation and test periods.*

| Parameter | Type | Default | Details |
|---|---|---|---|
| cls_model | PyTorch/PyTorchLightning model | | See neuralforecast.models collection here. |
| h | int | | Forecast horizon. |
| loss | PyTorch module | | Instantiated train loss class from losses collection. |
| valid_loss | PyTorch module | | Instantiated valid loss class from losses collection. |
| S | np.ndarray | | Hierarchical aggregation constraints (summing) matrix. |
| config | dict or callable | | Dictionary with ray.tune defined search space or function that takes an optuna trial and returns a configuration dict. |
| search_alg | BasicVariantGenerator | BasicVariantGenerator() | For ray see https://docs.ray.io/en/latest/tune/api_docs/suggestion.html; for optuna see https://optuna.readthedocs.io/en/stable/reference/samplers/index.html. |
| num_samples | int | 10 | Number of hyperparameter optimization steps/samples. |
| cpus | int | 4 | Number of cpus to use during optimization. Only used with ray tune. |
| gpus | int | 0 | Number of gpus to use during optimization, default all available. Only used with ray tune. |
| refit_with_val | bool | False | Refit of best model should preserve val_size. |
| verbose | bool | False | Track progress. |
| alias | NoneType | None | Custom name of the model. |
| backend | str | ray | Backend to use for searching the hyperparameter space, can be either 'ray' or 'optuna'. |
| callbacks | NoneType | None | List of functions to call during the optimization process. |
ray reference: https://docs.ray.io/en/latest/tune/tutorials/tune-metrics.html
optuna reference: https://optuna.readthedocs.io/en/stable/tutorial/20_recipes/007_optuna_callback.html
# Perform a simple hyperparameter optimization with
# NHITS and then reconcile with HINT
from neuralforecast.losses.pytorch import GMM, sCRPS
from neuralforecast.models import HINT, NHITS

# S_df (hierarchical summing matrix) and hint_dataset (hierarchical dataset)
# are assumed to be defined beforehand
quantiles = [0.1, 0.5, 0.9]  # quantiles monitored by the GMM loss
base_config = dict(max_steps=1, val_check_steps=1, input_size=8)
base_model = AutoNHITS(h=4, loss=GMM(n_components=2, quantiles=quantiles),
                       config=base_config, num_samples=1, cpus=1)
model = HINT(h=4, S=S_df.values,
             model=base_model, reconciliation='MinTraceOLS')

model.fit(dataset=dataset)
y_hat = model.predict(dataset=hint_dataset)

# Perform a joint hyperparameter optimization over the
# NHITS + HINT reconciliation configurations
from ray import tune

nhits_config = {
       "learning_rate": tune.choice([1e-3]),                                     # Initial Learning rate
       "max_steps": tune.choice([1]),                                            # Number of SGD steps
       "val_check_steps": tune.choice([1]),                                      # Number of steps between validation
       "input_size": tune.choice([5 * 12]),                                      # input_size = multiplier * horizon
       "batch_size": tune.choice([7]),                                           # Number of series in windows
       "windows_batch_size": tune.choice([256]),                                 # Number of windows in batch
       "n_pool_kernel_size": tune.choice([[2, 2, 2], [16, 8, 1]]),               # MaxPool's Kernelsize
       "n_freq_downsample": tune.choice([[168, 24, 1], [24, 12, 1], [1, 1, 1]]), # Interpolation expressivity ratios
       "activation": tune.choice(['ReLU']),                                      # Type of non-linear activation
       "n_blocks":  tune.choice([[1, 1, 1]]),                                    # Blocks per each 3 stacks
       "mlp_units":  tune.choice([[[512, 512], [512, 512], [512, 512]]]),        # 2 512-Layers per block for each stack
       "interpolation_mode": tune.choice(['linear']),                            # Type of multi-step interpolation
       "random_seed": tune.randint(1, 10),
       "reconciliation": tune.choice(['BottomUp', 'MinTraceOLS', 'MinTraceWLS'])
    }
model = AutoHINT(h=4, S=S_df.values,
                 cls_model=NHITS,
                 config=nhits_config,
                 loss=GMM(n_components=2, level=[80, 90]),
                 valid_loss=sCRPS(level=[80, 90]),
                 num_samples=1, cpus=1)
model.fit(dataset=dataset)
y_hat = model.predict(dataset=hint_dataset)
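
After fitting, the search results can be inspected to see which configuration won. A hedged sketch (that the tuning results are exposed on a results attribute is an assumption about the Auto classes):

# Sketch: inspect the search results after fit
# (assumes results are stored on `model.results`; with the ray backend
# this is a ResultGrid, with optuna a Study)
best = model.results.get_best_result()  # ray backend
print(best.config)                      # winning hyperparameters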

source

AutoTSMixer

 AutoTSMixer (h, n_series, loss=MAE(), valid_loss=None, config=None,
              search_alg=<ray.tune.search.basic_variant.BasicVariantGenera
              tor object at 0x7fc000f4c640>, num_samples=10,
              refit_with_val=False, cpus=4, gpus=0, verbose=False,
              alias=None, backend='ray', callbacks=None)

*Class for automatic hyperparameter optimization. It builds on top of ray to give access to a wide variety of hyperparameter optimization tools, ranging from classic grid search to Bayesian optimization and the HyperBand algorithm.

The validation loss to be optimized is defined by the config['loss'] dictionary value; the config also contains the rest of the hyperparameter search space.

Note that the success of this hyperparameter optimization heavily relies on a strong correlation between the validation and test periods.*

| Parameter | Type | Default | Details |
|---|---|---|---|
| h | int | | Forecast horizon. |
| n_series | int | | Number of time series. |
| loss | MAE | MAE() | Instantiated train loss class from losses collection. |
| valid_loss | NoneType | None | Instantiated valid loss class from losses collection. |
| config | NoneType | None | Dictionary with ray.tune defined search space or function that takes an optuna trial and returns a configuration dict. |
| search_alg | BasicVariantGenerator | BasicVariantGenerator() | For ray see https://docs.ray.io/en/latest/tune/api_docs/suggestion.html; for optuna see https://optuna.readthedocs.io/en/stable/reference/samplers/index.html. |
| num_samples | int | 10 | Number of hyperparameter optimization steps/samples. |
| refit_with_val | bool | False | Refit of best model should preserve val_size. |
| cpus | int | 4 | Number of cpus to use during optimization. Only used with ray tune. |
| gpus | int | 0 | Number of gpus to use during optimization, default all available. Only used with ray tune. |
| verbose | bool | False | Track progress. |
| alias | NoneType | None | Custom name of the model. |
| backend | str | ray | Backend to use for searching the hyperparameter space, can be either 'ray' or 'optuna'. |
| callbacks | NoneType | None | List of functions to call during the optimization process. |
ray reference: https://docs.ray.io/en/latest/tune/tutorials/tune-metrics.html
optuna reference: https://optuna.readthedocs.io/en/stable/tutorial/20_recipes/007_optuna_callback.html
# Use your own config or AutoTSMixer.default_config
config = dict(max_steps=1, val_check_steps=1, input_size=12)
model = AutoTSMixer(h=12, n_series=1, config=config, num_samples=1, cpus=1)

# Fit and predict
model.fit(dataset=dataset)
y_hat = model.predict(dataset=dataset)

# Optuna
model = AutoTSMixer(h=12, n_series=1, config=None, backend='optuna')
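
Since predict returns a raw array, the forecasts can be scored directly against the Y_test_df split created at the top of this page. A sketch (it assumes y_hat flattens to the 12 forecast steps of the single series):

# Sketch: score the tuned model on the 12-step test split
import numpy as np

mae = np.mean(np.abs(Y_test_df['y'].to_numpy() - y_hat.flatten()))
print(f'test MAE: {mae:.2f}')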

source

AutoTSMixerx

 AutoTSMixerx (h, n_series, loss=MAE(), valid_loss=None, config=None,
               search_alg=<ray.tune.search.basic_variant.BasicVariantGener
               ator object at 0x7fc000f4c310>, num_samples=10,
               refit_with_val=False, cpus=4, gpus=0, verbose=False,
               alias=None, backend='ray', callbacks=None)

*Class for automatic hyperparameter optimization. It builds on top of ray to give access to a wide variety of hyperparameter optimization tools, ranging from classic grid search to Bayesian optimization and the HyperBand algorithm.

The validation loss to be optimized is defined by the config['loss'] dictionary value; the config also contains the rest of the hyperparameter search space.

Note that the success of this hyperparameter optimization heavily relies on a strong correlation between the validation and test periods.*

| Parameter | Type | Default | Details |
|---|---|---|---|
| h | int | | Forecast horizon. |
| n_series | int | | Number of time series. |
| loss | MAE | MAE() | Instantiated train loss class from losses collection. |
| valid_loss | NoneType | None | Instantiated valid loss class from losses collection. |
| config | NoneType | None | Dictionary with ray.tune defined search space or function that takes an optuna trial and returns a configuration dict. |
| search_alg | BasicVariantGenerator | BasicVariantGenerator() | For ray see https://docs.ray.io/en/latest/tune/api_docs/suggestion.html; for optuna see https://optuna.readthedocs.io/en/stable/reference/samplers/index.html. |
| num_samples | int | 10 | Number of hyperparameter optimization steps/samples. |
| refit_with_val | bool | False | Refit of best model should preserve val_size. |
| cpus | int | 4 | Number of cpus to use during optimization. Only used with ray tune. |
| gpus | int | 0 | Number of gpus to use during optimization, default all available. Only used with ray tune. |
| verbose | bool | False | Track progress. |
| alias | NoneType | None | Custom name of the model. |
| backend | str | ray | Backend to use for searching the hyperparameter space, can be either 'ray' or 'optuna'. |
| callbacks | NoneType | None | List of functions to call during the optimization process. |
ray reference: https://docs.ray.io/en/latest/tune/tutorials/tune-metrics.html
optuna reference: https://optuna.readthedocs.io/en/stable/tutorial/20_recipes/007_optuna_callback.html
# Use your own config or AutoTSMixerx.default_config
config = dict(max_steps=1, val_check_steps=1, input_size=12)
model = AutoTSMixerx(h=12, n_series=1, config=config, num_samples=1, cpus=1)

# Fit and predict
model.fit(dataset=dataset)
y_hat = model.predict(dataset=dataset)

# Optuna
model = AutoTSMixerx(h=12, n_series=1, config=None, backend='optuna')
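
When several tuned models are compared downstream, the alias argument gives each a distinguishable name. A minimal sketch (the alias value is illustrative):

# Sketch: name the tuned model via `alias`
model = AutoTSMixerx(h=12, n_series=1, config=config, num_samples=1,
                     cpus=1, alias='TSMixerx_tuned')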

source

AutoMLPMultivariate

 AutoMLPMultivariate (h, n_series, loss=MAE(), valid_loss=None,
                      config=None, search_alg=<ray.tune.search.basic_varia
                      nt.BasicVariantGenerator object at 0x7fc02a8259c0>,
                      num_samples=10, refit_with_val=False, cpus=4,
                      gpus=0, verbose=False, alias=None, backend='ray',
                      callbacks=None)

*Class for automatic hyperparameter optimization. It builds on top of ray to give access to a wide variety of hyperparameter optimization tools, ranging from classic grid search to Bayesian optimization and the HyperBand algorithm.

The validation loss to be optimized is defined by the config['loss'] dictionary value; the config also contains the rest of the hyperparameter search space.

Note that the success of this hyperparameter optimization heavily relies on a strong correlation between the validation and test periods.*

| Parameter | Type | Default | Details |
|---|---|---|---|
| h | int | | Forecast horizon. |
| n_series | int | | Number of time series. |
| loss | MAE | MAE() | Instantiated train loss class from losses collection. |
| valid_loss | NoneType | None | Instantiated valid loss class from losses collection. |
| config | NoneType | None | Dictionary with ray.tune defined search space or function that takes an optuna trial and returns a configuration dict. |
| search_alg | BasicVariantGenerator | BasicVariantGenerator() | For ray see https://docs.ray.io/en/latest/tune/api_docs/suggestion.html; for optuna see https://optuna.readthedocs.io/en/stable/reference/samplers/index.html. |
| num_samples | int | 10 | Number of hyperparameter optimization steps/samples. |
| refit_with_val | bool | False | Refit of best model should preserve val_size. |
| cpus | int | 4 | Number of cpus to use during optimization. Only used with ray tune. |
| gpus | int | 0 | Number of gpus to use during optimization, default all available. Only used with ray tune. |
| verbose | bool | False | Track progress. |
| alias | NoneType | None | Custom name of the model. |
| backend | str | ray | Backend to use for searching the hyperparameter space, can be either 'ray' or 'optuna'. |
| callbacks | NoneType | None | List of functions to call during the optimization process. |
ray reference: https://docs.ray.io/en/latest/tune/tutorials/tune-metrics.html
optuna reference: https://optuna.readthedocs.io/en/stable/tutorial/20_recipes/007_optuna_callback.html
# Use your own config or AutoMLPMultivariate.default_config
config = dict(max_steps=1, val_check_steps=1, input_size=12)
model = AutoMLPMultivariate(h=12, n_series=1, config=config, num_samples=1, cpus=1)

# Fit and predict
model.fit(dataset=dataset)
y_hat = model.predict(dataset=dataset)

# Optuna
model = AutoMLPMultivariate(h=12, n_series=1, config=None, backend='optuna')
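
The examples on this page use n_series=1 for brevity, but a multivariate model is more naturally fit on several series. A sketch that builds an illustrative two-series dataset from the training split (duplicating AirPassengers is purely for demonstration):

# Sketch: an illustrative two-series dataset so that n_series=2 is meaningful
import pandas as pd
from neuralforecast.tsdataset import TimeSeriesDataset

Y2_df = pd.concat([Y_train_df.assign(unique_id='A'),
                   Y_train_df.assign(unique_id='B')])
dataset2, *_ = TimeSeriesDataset.from_df(Y2_df)

model = AutoMLPMultivariate(h=12, n_series=2, config=config, num_samples=1, cpus=1)
model.fit(dataset=dataset2)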

source

AutoSOFTS

 AutoSOFTS (h, n_series, loss=MAE(), valid_loss=None, config=None,
            search_alg=<ray.tune.search.basic_variant.BasicVariantGenerato
            r object at 0x7fc000134c40>, num_samples=10,
            refit_with_val=False, cpus=4, gpus=0, verbose=False,
            alias=None, backend='ray', callbacks=None)

*Class for automatic hyperparameter optimization. It builds on top of ray to give access to a wide variety of hyperparameter optimization tools, ranging from classic grid search to Bayesian optimization and the HyperBand algorithm.

The validation loss to be optimized is defined by the config['loss'] dictionary value; the config also contains the rest of the hyperparameter search space.

Note that the success of this hyperparameter optimization heavily relies on a strong correlation between the validation and test periods.*

| Parameter | Type | Default | Details |
|---|---|---|---|
| h | int | | Forecast horizon. |
| n_series | int | | Number of time series. |
| loss | MAE | MAE() | Instantiated train loss class from losses collection. |
| valid_loss | NoneType | None | Instantiated valid loss class from losses collection. |
| config | NoneType | None | Dictionary with ray.tune defined search space or function that takes an optuna trial and returns a configuration dict. |
| search_alg | BasicVariantGenerator | BasicVariantGenerator() | For ray see https://docs.ray.io/en/latest/tune/api_docs/suggestion.html; for optuna see https://optuna.readthedocs.io/en/stable/reference/samplers/index.html. |
| num_samples | int | 10 | Number of hyperparameter optimization steps/samples. |
| refit_with_val | bool | False | Refit of best model should preserve val_size. |
| cpus | int | 4 | Number of cpus to use during optimization. Only used with ray tune. |
| gpus | int | 0 | Number of gpus to use during optimization, default all available. Only used with ray tune. |
| verbose | bool | False | Track progress. |
| alias | NoneType | None | Custom name of the model. |
| backend | str | ray | Backend to use for searching the hyperparameter space, can be either 'ray' or 'optuna'. |
| callbacks | NoneType | None | List of functions to call during the optimization process. |
ray reference: https://docs.ray.io/en/latest/tune/tutorials/tune-metrics.html
optuna reference: https://optuna.readthedocs.io/en/stable/tutorial/20_recipes/007_optuna_callback.html
# Use your own config or AutoSOFTS.default_config
config = dict(max_steps=1, val_check_steps=1, input_size=12, hidden_size=16)
model = AutoSOFTS(h=12, n_series=1, config=config, num_samples=1, cpus=1)

# Fit and predict
model.fit(dataset=dataset)
y_hat = model.predict(dataset=dataset)

# Optuna
model = AutoSOFTS(h=12, n_series=1, config=None, backend='optuna')

source

AutoTimeMixer

 AutoTimeMixer (h, n_series, loss=MAE(), valid_loss=None, config=None,
                search_alg=<ray.tune.search.basic_variant.BasicVariantGene
                rator object at 0x7fc0001372b0>, num_samples=10,
                refit_with_val=False, cpus=4, gpus=0, verbose=False,
                alias=None, backend='ray', callbacks=None)

*Class for automatic hyperparameter optimization. It builds on top of ray to give access to a wide variety of hyperparameter optimization tools, ranging from classic grid search to Bayesian optimization and the HyperBand algorithm.

The validation loss to be optimized is defined by the config['loss'] dictionary value; the config also contains the rest of the hyperparameter search space.

Note that the success of this hyperparameter optimization heavily relies on a strong correlation between the validation and test periods.*

| Parameter | Type | Default | Details |
|---|---|---|---|
| h | int | | Forecast horizon. |
| n_series | int | | Number of time series. |
| loss | MAE | MAE() | Instantiated train loss class from losses collection. |
| valid_loss | NoneType | None | Instantiated valid loss class from losses collection. |
| config | NoneType | None | Dictionary with ray.tune defined search space or function that takes an optuna trial and returns a configuration dict. |
| search_alg | BasicVariantGenerator | BasicVariantGenerator() | For ray see https://docs.ray.io/en/latest/tune/api_docs/suggestion.html; for optuna see https://optuna.readthedocs.io/en/stable/reference/samplers/index.html. |
| num_samples | int | 10 | Number of hyperparameter optimization steps/samples. |
| refit_with_val | bool | False | Refit of best model should preserve val_size. |
| cpus | int | 4 | Number of cpus to use during optimization. Only used with ray tune. |
| gpus | int | 0 | Number of gpus to use during optimization, default all available. Only used with ray tune. |
| verbose | bool | False | Track progress. |
| alias | NoneType | None | Custom name of the model. |
| backend | str | ray | Backend to use for searching the hyperparameter space, can be either 'ray' or 'optuna'. |
| callbacks | NoneType | None | List of functions to call during the optimization process. |
ray reference: https://docs.ray.io/en/latest/tune/tutorials/tune-metrics.html
optuna reference: https://optuna.readthedocs.io/en/stable/tutorial/20_recipes/007_optuna_callback.html
# Use your own config or AutoTimeMixer.default_config
config = dict(max_steps=1, val_check_steps=1, input_size=12, d_model=16)
model = AutoTimeMixer(h=12, n_series=1, config=config, num_samples=1, cpus=1)

# Fit and predict
model.fit(dataset=dataset)
y_hat = model.predict(dataset=dataset)

# Optuna
model = AutoTimeMixer(h=12, n_series=1, config=None, backend='optuna')

source

AutoRMoK

 AutoRMoK (h, n_series, loss=MAE(), valid_loss=None, config=None,
           search_alg=<ray.tune.search.basic_variant.BasicVariantGenerator
           object at 0x7fc00025d750>, num_samples=10,
           refit_with_val=False, cpus=4, gpus=0, verbose=False,
           alias=None, backend='ray', callbacks=None)

*Class for automatic hyperparameter optimization. It builds on top of ray to give access to a wide variety of hyperparameter optimization tools, ranging from classic grid search to Bayesian optimization and the HyperBand algorithm.

The validation loss to be optimized is defined by the config['loss'] dictionary value; the config also contains the rest of the hyperparameter search space.

Note that the success of this hyperparameter optimization heavily relies on a strong correlation between the validation and test periods.*

| Parameter | Type | Default | Details |
|---|---|---|---|
| h | int | | Forecast horizon. |
| n_series | int | | Number of time series. |
| loss | MAE | MAE() | Instantiated train loss class from losses collection. |
| valid_loss | NoneType | None | Instantiated valid loss class from losses collection. |
| config | NoneType | None | Dictionary with ray.tune defined search space or function that takes an optuna trial and returns a configuration dict. |
| search_alg | BasicVariantGenerator | BasicVariantGenerator() | For ray see https://docs.ray.io/en/latest/tune/api_docs/suggestion.html; for optuna see https://optuna.readthedocs.io/en/stable/reference/samplers/index.html. |
| num_samples | int | 10 | Number of hyperparameter optimization steps/samples. |
| refit_with_val | bool | False | Refit of best model should preserve val_size. |
| cpus | int | 4 | Number of cpus to use during optimization. Only used with ray tune. |
| gpus | int | 0 | Number of gpus to use during optimization, default all available. Only used with ray tune. |
| verbose | bool | False | Track progress. |
| alias | NoneType | None | Custom name of the model. |
| backend | str | ray | Backend to use for searching the hyperparameter space, can be either 'ray' or 'optuna'. |
| callbacks | NoneType | None | List of functions to call during the optimization process. |
ray reference: https://docs.ray.io/en/latest/tune/tutorials/tune-metrics.html
optuna reference: https://optuna.readthedocs.io/en/stable/tutorial/20_recipes/007_optuna_callback.html
# Use your own config or AutoRMoK.default_config
config = dict(max_steps=1, val_check_steps=1, input_size=12, learning_rate=1e-2)
model = AutoRMoK(h=12, n_series=1, config=config, num_samples=1, cpus=1)

# Fit and predict
model.fit(dataset=dataset)
y_hat = model.predict(dataset=dataset)

# Optuna
model = AutoRMoK(h=12, n_series=1, config=None, backend='optuna')
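
To line the raw predictions up with dates, they can be wrapped back into a DataFrame against the test split. A sketch (it assumes y_hat flattens to the 12 forecast steps of the single series):

# Sketch: align the raw forecast array with the test dates
fcst_df = Y_test_df[['unique_id', 'ds']].copy()
fcst_df['AutoRMoK'] = y_hat.flatten()
print(fcst_df.head())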

TESTS