*Multi-Layer Perceptron (MLP)*

Parameters:
- in_features : int, dimension of input.
- out_features : int, dimension of output.
- activation : str, activation function to use.
- hidden_size : int, dimension of hidden layers.
- num_layers : int, number of hidden layers.
- dropout : float, dropout rate.
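As a point of reference, here is a minimal sketch of how an MLP block with these parameters could be assembled. The layer arrangement, defaults, and class body are assumptions for illustration, not the library's exact implementation.

```python
import torch
import torch.nn as nn

class MLP(nn.Module):
    """Illustrative sketch: `num_layers` hidden Linear layers with the chosen
    activation and dropout, followed by a final output projection."""
    def __init__(self, in_features, out_features, activation="ReLU",
                 hidden_size=128, num_layers=2, dropout=0.1):
        super().__init__()
        act = getattr(nn, activation)()  # e.g. nn.ReLU()
        layers, dim = [], in_features
        for _ in range(num_layers):
            layers += [nn.Linear(dim, hidden_size), act, nn.Dropout(dropout)]
            dim = hidden_size
        layers.append(nn.Linear(dim, out_features))  # output projection
        self.net = nn.Sequential(*layers)

    def forward(self, x):
        return self.net(x)

# Usage: map a batch of 64 inputs with 10 features to 5 outputs
mlp = MLP(in_features=10, out_features=5, hidden_size=32, num_layers=2, dropout=0.1)
print(mlp(torch.randn(64, 10)).shape)  # torch.Size([64, 5])
```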
*Chomp1d*

Receives x input of dim [N,C,T] and trims it so that only ‘time available’ information is used. Used by one-dimensional causal convolutions (CausalConv1d).

Parameters:
- horizon : int, length of outsample values to skip.
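A minimal sketch of the trimming idea, assuming it amounts to slicing away the last `horizon` time steps (illustrative, not necessarily the library's exact code):

```python
import torch
import torch.nn as nn

class Chomp1d(nn.Module):
    """Sketch: drop the last `horizon` time steps so only 'time available'
    information is kept, e.g. after a padded causal convolution."""
    def __init__(self, horizon):
        super().__init__()
        self.horizon = horizon

    def forward(self, x):            # x: [N, C, T]
        return x[:, :, :-self.horizon].contiguous()

x = torch.randn(8, 3, 24)            # N=8, C=3, T=24
print(Chomp1d(horizon=12)(x).shape)  # torch.Size([8, 3, 12])
```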
*CausalConv1d*

Receives x input of dim [N,C_in,T] and computes a causal convolution in the time dimension, skipping the H steps of the forecast horizon through its dilation. Consider a batch of one element; the dilated convolution operation on time step $t$ is defined as:

$$\mathrm{Conv1D}(\mathbf{x}, \mathbf{w})(t) = \sum_{k=1}^{K} w_{k}\, x_{t - dk}$$

where $d$ is the dilation factor, $K$ is the kernel size, and $t - dk$ is the index of the considered past observation. The dilation effectively applies a filter with skip connections. If $d = 1$, one recovers a normal convolution.

Parameters:
- in_channels : int, dimension of x input's initial channels.
- out_channels : int, dimension of x output's channels.
- activation : str, identifying activations from PyTorch activations. Select from ‘ReLU’, ‘Softplus’, ‘Tanh’, ‘SELU’, ‘LeakyReLU’, ‘PReLU’, ‘Sigmoid’.
- padding : int, number of zero padding used to the left.
- kernel_size : int, convolution's kernel size.
- dilation : int, dilation of the skip connections.

Returns:
- x : tensor, torch tensor of dim [N,C_out,T], activation(conv1d(inputs, kernel) + bias).
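A hedged sketch of the causal convolution pattern described above, combining left zero padding with a Chomp1d-style trim; argument handling is simplified relative to the real class:

```python
import torch
import torch.nn as nn

class CausalConv1d(nn.Module):
    """Sketch: causal 1d convolution via asymmetric padding plus a trailing trim."""
    def __init__(self, in_channels, out_channels, kernel_size,
                 dilation=1, activation="ReLU"):
        super().__init__()
        self.padding = (kernel_size - 1) * dilation        # zeros needed for causality
        self.conv = nn.Conv1d(in_channels, out_channels, kernel_size,
                              padding=self.padding, dilation=dilation)
        self.activation = getattr(nn, activation)()

    def forward(self, x):                                   # x: [N, C_in, T]
        x = self.conv(x)                                    # [N, C_out, T + padding]
        x = x[:, :, :-self.padding] if self.padding else x  # trim back to [N, C_out, T]
        return self.activation(x)

x = torch.randn(4, 3, 48)                                   # N=4, C_in=3, T=48
out = CausalConv1d(3, 8, kernel_size=3, dilation=2)(x)
print(out.shape)                                            # torch.Size([4, 8, 48])
```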
*Temporal Convolution Encoder*

Receives x input of dim [N,T,C_in], permutes it to [N,C_in,T], and applies a deep stack of exponentially dilated causal convolutions. The exponentially increasing dilations of the convolutions allow for the creation of weighted averages of exponentially large long-term memory.

Parameters:
- in_channels : int, dimension of x input's initial channels.
- out_channels : int, dimension of x output's channels.
- kernel_size : int, size of the convolving kernel.
- dilations : int list, controls the temporal spacing between the kernel points.
- activation : str, identifying activations from PyTorch activations. Select from ‘ReLU’, ‘Softplus’, ‘Tanh’, ‘SELU’, ‘LeakyReLU’, ‘PReLU’, ‘Sigmoid’.

Returns:
- x : tensor, torch tensor of dim [N,T,C_out].
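A sketch of the encoder idea, assuming a plain stack of dilated causal convolutions with the permutations described above (the class name here mirrors the description; residual connections and other details of the real class are omitted):

```python
import torch
import torch.nn as nn

class TemporalConvolutionEncoder(nn.Module):
    """Sketch: stack of exponentially dilated causal convolutions; the exact
    channel and residual wiring of the real class is not reproduced."""
    def __init__(self, in_channels, out_channels, kernel_size,
                 dilations=(1, 2, 4, 8), activation="ReLU"):
        super().__init__()
        self.convs, self.paddings = nn.ModuleList(), []
        channels = in_channels
        for d in dilations:
            pad = (kernel_size - 1) * d                  # left padding for causality
            self.convs.append(nn.Conv1d(channels, out_channels, kernel_size,
                                        padding=pad, dilation=d))
            self.paddings.append(pad)
            channels = out_channels
        self.activation = getattr(nn, activation)()

    def forward(self, x):                                # x: [N, T, C_in]
        x = x.permute(0, 2, 1)                           # -> [N, C_in, T]
        for conv, pad in zip(self.convs, self.paddings):
            x = conv(x)
            if pad:                                      # chomp the trailing overhang
                x = x[:, :, :-pad]
            x = self.activation(x)
        return x.permute(0, 2, 1)                        # -> [N, T, C_out]

x = torch.randn(4, 96, 3)                                # N=4, T=96, C_in=3
print(TemporalConvolutionEncoder(3, 16, kernel_size=2)(x).shape)  # torch.Size([4, 96, 16])
```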
TriangularCausalMask
DataEmbedding_inverted
Series decomposition block
Moving average block to highlight the trend of time series
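These two blocks are typically paired: a moving average extracts the trend and the seasonal component is the residual. A sketch of that recipe follows; the class names, padding, and kernel handling are assumptions for illustration rather than the library's exact code.

```python
import torch
import torch.nn as nn

class MovingAvg(nn.Module):
    """Sketch: moving average over the time dimension to highlight the trend."""
    def __init__(self, kernel_size):
        super().__init__()
        self.kernel_size = kernel_size
        self.avg = nn.AvgPool1d(kernel_size=kernel_size, stride=1)

    def forward(self, x):                                  # x: [N, T, C]
        # repeat edge values so the averaged output keeps length T
        front = x[:, :1, :].repeat(1, (self.kernel_size - 1) // 2, 1)
        end = x[:, -1:, :].repeat(1, self.kernel_size // 2, 1)
        x = torch.cat([front, x, end], dim=1)
        return self.avg(x.permute(0, 2, 1)).permute(0, 2, 1)

class SeriesDecomp(nn.Module):
    """Sketch: trend = moving average of x, seasonal = x - trend."""
    def __init__(self, kernel_size):
        super().__init__()
        self.moving_avg = MovingAvg(kernel_size)

    def forward(self, x):
        trend = self.moving_avg(x)
        seasonal = x - trend
        return seasonal, trend

x = torch.randn(2, 48, 3)                                  # [N, T, C]
seasonal, trend = SeriesDecomp(kernel_size=25)(x)
print(seasonal.shape, trend.shape)                         # torch.Size([2, 48, 3]) twice
```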
RevIN (Reversible-Instance-Normalization)
ReversibleInstanceNorm1d for Multivariate models
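As a rough illustration of reversible instance normalization: each series is normalized by its own per-instance statistics before the model and de-normalized afterwards, so predictions return to the original scale. The interface below (mode argument, affine parameters) is an assumption for the sketch, not the library's exact API.

```python
import torch
import torch.nn as nn

class RevIN(nn.Module):
    """Sketch of reversible instance normalization for [N, T, C] inputs."""
    def __init__(self, num_features, eps=1e-5, affine=True):
        super().__init__()
        self.eps, self.affine = eps, affine
        if affine:
            self.weight = nn.Parameter(torch.ones(num_features))
            self.bias = nn.Parameter(torch.zeros(num_features))

    def forward(self, x, mode):                    # x: [N, T, C]
        if mode == "norm":
            # per-instance, per-channel statistics over the time dimension
            self.mean = x.mean(dim=1, keepdim=True).detach()
            self.std = torch.sqrt(x.var(dim=1, keepdim=True, unbiased=False) + self.eps).detach()
            x = (x - self.mean) / self.std
            if self.affine:
                x = x * self.weight + self.bias
        elif mode == "denorm":
            if self.affine:
                x = (x - self.bias) / (self.weight + self.eps)
            x = x * self.std + self.mean
        return x

revin = RevIN(num_features=7)
x = torch.randn(4, 96, 7)
x_norm = revin(x, "norm")          # normalize the model input
y = x_norm[:, -24:, :]             # stand-in for a model's forecast
y_denorm = revin(y, "denorm")      # map predictions back to the original scale
```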