These are some key concepts related to time series forecasting, designed to help you better understand and leverage the capabilities of TimeGPT.
- Time Series
- Forecasting
- Foundation Model
- TimeGPT
- Tokens
- Fine-tuning
- Historical Forecasts
- Anomaly Detection
- Time Series Cross-Validation
- Exogenous Variables
Time Series
A time series is a sequence of data points indexed by time, used to model phenomena that change over time, such as stock prices, temperature, or product sales. A time series can generally be thought of as comprising the following components (a short decomposition sketch follows the list):
- Trend: The consistent, long-term direction of the data, whether upward or downward. It reflects the persistent, overall movement in the series over time.
- Seasonality: A repeated cycle around a known and fixed period.
- Remainder: The residuals or random noise left in the data after the trend and seasonal effects have been accounted for.
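To make these three components concrete, here is a minimal sketch that decomposes a synthetic monthly series using statsmodels' classical decomposition. The data and parameters are illustrative assumptions, not part of TimeGPT.

```python
# Minimal sketch, not part of TimeGPT: classical decomposition of a synthetic
# monthly series into trend, seasonal, and remainder components.
import numpy as np
import pandas as pd
from statsmodels.tsa.seasonal import seasonal_decompose

rng = np.random.default_rng(0)
idx = pd.date_range("2020-01-01", periods=48, freq="MS")
y = pd.Series(
    0.5 * np.arange(48)                           # trend: steady upward drift
    + 5 * np.sin(2 * np.pi * np.arange(48) / 12)  # seasonality: 12-month cycle
    + rng.normal(scale=1.0, size=48),             # remainder: random noise
    index=idx,
)

result = seasonal_decompose(y, model="additive", period=12)
print(result.trend.dropna().head())   # long-term direction
print(result.seasonal.head(12))       # repeated yearly pattern
print(result.resid.dropna().head())   # what is left after trend and seasonality
```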
Forecasting
Forecasting is the process of predicting the future values of a time series based on historical data. It plays a crucial role in decision-making across fields such as finance, healthcare, retail, and economics. Forecasting can use a variety of approaches, from classical statistical methods to newer techniques such as machine learning, deep learning, and foundation models. These models can be further classified as univariate or multivariate, depending on the number of variables used to make the predictions, and as local or global, with local models estimating parameters independently for each series and global models estimating parameters jointly across multiple series. Forecasts themselves can be presented as point forecasts, which predict a single future value, or as probabilistic forecasts, which provide a full probability distribution of future values and hence a measure of uncertainty.
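As a simple illustration of the difference between point and probabilistic forecasts, the sketch below uses a seasonal-naive baseline rather than TimeGPT; the toy data and the residual-quantile interval are assumptions for illustration only.

```python
# Illustrative sketch only (seasonal-naive baseline, not TimeGPT): a point
# forecast plus a simple probabilistic forecast built from residual quantiles.
import numpy as np

season, horizon = 12, 12
rng = np.random.default_rng(1)
y = 10 + 3 * np.sin(2 * np.pi * np.arange(60) / season) + rng.normal(scale=0.8, size=60)

# Point forecast: repeat the last observed seasonal cycle.
point_forecast = y[-season:][:horizon]

# Probabilistic forecast: widen the point forecast with quantiles of the
# in-sample seasonal-naive errors to get a 90% prediction interval.
residuals = y[season:] - y[:-season]
lo, hi = np.quantile(residuals, [0.05, 0.95])
forecast_lo = point_forecast + lo
forecast_hi = point_forecast + hi

print("point forecast :", np.round(point_forecast[:3], 2))
print("90% interval lo:", np.round(forecast_lo[:3], 2))
print("90% interval hi:", np.round(forecast_hi[:3], 2))
```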
Foundation Model
A foundation model is a large, pre-trained model that can be adapted to a wide range of tasks, including time series forecasting. Originally developed for domains such as natural language processing and computer vision, foundation models are now increasingly applied to sequential data like time series. These models are typically trained on extensive datasets, capturing complex patterns and dependencies, and can be fine-tuned for specific tasks.
TimeGPT
Developed by Nixtla, TimeGPT is the first foundation model for time series forecasting. TimeGPT was trained on billions of observations from publicly available datasets across multiple domains and can produce accurate forecasts for new time series without additional training, using only historical values as inputs. The model ‘reads’ time series data similarly to how humans read a sentence: sequentially from left to right. It looks at windows of past data, which we can think of as ‘tokens’, and predicts what comes next. This prediction is based on patterns the model identifies in past data and extrapolates into the future.
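As a rough sketch of what zero-shot forecasting with TimeGPT looks like in code, the example below uses the Nixtla Python SDK. The placeholder API key, toy data, and exact parameter names are assumptions for illustration; consult the SDK documentation for the authoritative interface.

```python
# Rough sketch of zero-shot forecasting with TimeGPT via the Nixtla Python SDK.
# The API key, toy data, and exact parameter names are assumptions for
# illustration; check the Nixtla SDK docs for the authoritative interface.
import pandas as pd
from nixtla import NixtlaClient

client = NixtlaClient(api_key="YOUR_API_KEY")  # placeholder key

# Only historical values are needed: a timestamp column and a target column.
df = pd.DataFrame({
    "ds": pd.date_range("2023-01-01", periods=36, freq="MS"),
    "y": range(36),  # toy values; use your own series here
})

# Forecast the next 12 periods with no additional training (zero-shot).
forecast_df = client.forecast(df=df, h=12, freq="MS", time_col="ds", target_col="y")
print(forecast_df.head())
```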
Tokens
TimeGPT processes time series data in chunks. Each data point in a series can be thought of as a ‘token’, akin to how individual words or characters are treated in natural language processing (NLP).
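The windowing idea can be illustrated with plain NumPy. This is a generic sketch of chunking a series into fixed-size windows of past values, not a description of TimeGPT's actual internal tokenization.

```python
# Generic illustration of the windowing idea, not TimeGPT's actual internal
# tokenization: chunk a series into fixed-size windows of past values.
import numpy as np

y = np.arange(10, dtype=float)   # a toy series of 10 observations
window = 4                       # each 'token' covers 4 past values

tokens = np.lib.stride_tricks.sliding_window_view(y, window)
print(tokens)
# Each row is a window of past data; the forecasting task is to predict the
# value that follows each window.
```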
Fine-tuning
Fine-tuning is a process used in machine learning where a pre-trained model like TimeGPT undergoes additional training to adapt it to a specific dataset. Initially, TimeGPT can operate in a zero-shot manner, meaning it can generate forecasts as-is, without any additional training. While this zero-shot approach provides a solid baseline, the performance of TimeGPT can often be improved through fine-tuning. During this process, the TimeGPT model undergoes additional training on the specific dataset, starting from the pre-trained parameters. The updated model then produces the forecasts.
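A hedged sketch of what fine-tuning looks like with the Nixtla Python SDK follows; the `finetune_steps` argument and the other details are assumptions for illustration, so refer to the tutorial linked below for authoritative usage.

```python
# Hedged sketch of fine-tuning with the Nixtla Python SDK; `finetune_steps`
# and the other details are assumptions for illustration. See the tutorial
# linked below for authoritative usage.
import pandas as pd
from nixtla import NixtlaClient

client = NixtlaClient(api_key="YOUR_API_KEY")  # placeholder key

df = pd.DataFrame({
    "ds": pd.date_range("2023-01-01", periods=36, freq="MS"),
    "y": range(36),  # toy values; use your own series here
})

# Zero-shot baseline: no extra training.
zero_shot = client.forecast(df=df, h=12, freq="MS", time_col="ds", target_col="y")

# Fine-tuned forecast: a few extra training iterations on this dataset,
# starting from TimeGPT's pre-trained parameters.
fine_tuned = client.forecast(
    df=df, h=12, freq="MS", time_col="ds", target_col="y", finetune_steps=10
)
```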
Learn how to fine-tune TimeGPT.