Table of contents

TimeGPT

What is TimeGPT?

TimeGPT is the first foundation model for time series forecasting. It can produce accurate forecasts for new time series across a diverse array of domains using only historical values as inputs. The model “reads” time series data sequentially from left to right, similarly to how humans read a sentence. It looks at windows of past data, which we can think of as “tokens”, and predicts what comes next based on patterns it identifies and extrapolates into the future. Beyond forecasting, TimeGPT supports other time series tasks, such as what-if scenarios, anomaly detection, and more.

Is TimeGPT based on a Large Language Model (LLM)?

No, TimeGPT is not based on any large language model. While it follows the same principle of training a large transformer model on a vast dataset, its architecture is specifically designed to handle time series data and it has been trained to minimize forecasting errors.

How do I get started with TimeGPT?

To get started with TimeGPT, you need to register for an account here. You will receive an email asking you to confirm your signup. After confirming, you will be able to access your dashboard, which contains the details of your account.

How accessible is TimeGPT and what are the usage costs?

For a more in-depth understanding of TimeGPT, please refer to the research paper. While certain aspects of the model’s architecture remain confidential, registration for TimeGPT is open to all. New users receive $1,000 USD in free credits, and subsequent usage fees are based on token consumption. For more details, please refer to the Pricing and Billing section.

How can I use TimeGPT?

  • Through the Python SDK

  • Via the TimeGPT API. For instructions on how to call the API using different languages, please refer to the API documentation

Both methods require an API key, which is issued upon registration and can be found in your dashboard under API Keys.

TimeGPT API Key

What is an API key?

An API key is a unique string of characters that authenticates your requests when using the Nixtla SDK. It ensures that the person making the requests is authorized to do so.

Where can I get an API key?

Upon registration, you will receive an API key that can be found in your dashboard under API Keys. Remember that your API key is personal and should not be shared with anyone.

How do I use my API key?

To integrate your API key into your development workflow, please refer to the tutorial on Setting Up Your API Key.

How can I check the status of my API key?

If you want to check the status of your API key, you can use the validate_api_key method of the NixtlaClient class.
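As a minimal sketch (assuming the `nixtla` package is installed), the check can look like this; the network call is guarded by a hypothetical `NIXTLA_API_KEY` environment variable so it only runs when a key is actually configured:

```python
import os

api_key = os.getenv("NIXTLA_API_KEY")
if api_key:
    from nixtla import NixtlaClient

    client = NixtlaClient(api_key=api_key)
    # validate_api_key returns True when the key is valid, False otherwise.
    print(client.validate_api_key())
else:
    print("Set the NIXTLA_API_KEY environment variable first.")
```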

Features and Capabilities

What is the input to TimeGPT?

TimeGPT accepts pandas dataframes in long format with the following necessary columns:

  • ds (timestamp): timestamp in format YYYY-MM-DD or YYYY-MM-DD HH:MM:SS.
  • y (numeric): The target variable to forecast.

(Optionally, you can also pass a DataFrame without the ds column, as long as it has a DatetimeIndex.)

TimeGPT also works with distributed dataframes from frameworks such as Dask, Spark, and Ray.
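As an illustration, here is a minimal long-format DataFrame with the expected columns. The `unique_id` column, which identifies each series when you forecast several at once, is an SDK convention and is not required for a single series:

```python
import pandas as pd

# A minimal long-format input: one row per timestamp, per series.
df = pd.DataFrame(
    {
        "unique_id": ["series_1"] * 4,
        "ds": pd.to_datetime(
            ["2024-01-01", "2024-01-02", "2024-01-03", "2024-01-04"]
        ),
        "y": [10.0, 12.0, 11.5, 13.2],
    }
)
print(df)
```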

Can TimeGPT handle multiple time series?

Yes. For guidance on forecasting multiple time series at once, consult the Multiple Series tutorial.

Does TimeGPT support forecasting with exogenous variables?

Yes. For instructions on how to incorporate exogenous variables to TimeGPT, see the Exogenous Variables tutorial. For incorporating calendar dates specifically, you may find the Holidays and Special Dates tutorial useful. For categorical variables, refer to the Categorical Variables tutorial.

Can TimeGPT be used for anomaly detection?

Yes. To learn how to use TimeGPT for anomaly detection, refer to the Anomaly Detection tutorial.
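A rough sketch of an anomaly detection call, using toy data with one obvious spike. The API call itself is guarded behind a hypothetical `NIXTLA_API_KEY` environment variable, and the exact output columns may differ from what the tutorial shows:

```python
import os

import pandas as pd

# A toy daily series with one obvious spike (illustrative data only).
df = pd.DataFrame(
    {
        "unique_id": ["sensor_1"] * 30,
        "ds": pd.date_range("2024-01-01", periods=30, freq="D"),
        "y": [10.0] * 14 + [100.0] + [10.0] * 15,
    }
)

if os.getenv("NIXTLA_API_KEY"):
    from nixtla import NixtlaClient

    client = NixtlaClient(api_key=os.environ["NIXTLA_API_KEY"])
    # Flags observations that fall outside the model's prediction interval.
    anomalies = client.detect_anomalies(df, freq="D")
    print(anomalies.head())
```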

Does TimeGPT support cross-validation?

Yes. To learn how to use TimeGPT for cross-validation, refer to the Cross-Validation tutorial.

Can TimeGPT be used to forecast historical data?

Yes. To find out how to forecast historical data using TimeGPT, see the Historical Forecast tutorial.

Can TimeGPT be used for uncertainty quantification?

Yes. For more information, explore the Prediction Intervals and Quantile Forecasts tutorials.

Can TimeGPT handle large datasets?

Yes. When dealing with large datasets that contain hundreds of thousands or millions of time series, we recommend using a distributed backend. TimeGPT is compatible with several distributed computing frameworks, including Spark, Ray, and Dask. Neither the TimeGPT SDK nor the API imposes a limit on dataset size as long as a distributed backend is used.

Can TimeGPT be used with limited/short data?

TimeGPT supports any amount of data for generating point forecasts and is capable of producing results with just one observation per series. When using arguments such as level, finetune_steps, X_df (exogenous variables), or add_history, additional data points are necessary depending on the data frequency. For more details, please refer to the Data Requirements tutorial.

What is the maximum forecast horizon allowed by TimeGPT?

While TimeGPT does not have a maximum forecast horizon, its performance will decrease as the horizon increases. When the forecast horizon exceeds the season length of the data (for example, more than 12 months for monthly data), you will get this message: WARNING:nixtla.nixtla_client:The specified horizon "h" exceeds the model horizon. This may lead to less accurate forecasts. Please consider using a smaller horizon.

For details, refer to the tutorial on Long Horizon in Time Series.
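The warning condition can be approximated as a simple check. The season lengths per frequency below are illustrative assumptions for this sketch, not the SDK's internal values:

```python
import warnings

# Illustrative season lengths by data frequency (assumed, not official).
SEASON_LENGTH = {"H": 24, "D": 7, "W": 52, "M": 12}

def check_horizon(h: int, freq: str) -> bool:
    """Warn when the requested horizon exceeds one seasonal cycle."""
    season = SEASON_LENGTH.get(freq)
    if season is not None and h > season:
        warnings.warn(
            f'The specified horizon "{h}" exceeds the model horizon. '
            "This may lead to less accurate forecasts."
        )
        return False
    return True

check_horizon(6, "M")   # fine: within one year of monthly data
check_horizon(24, "M")  # triggers the warning: two years ahead
```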

Can TimeGPT handle missing values?

TimeGPT cannot handle missing values or series with irregular timestamps. For more information, see the Forecasting Time Series with Irregular Timestamps and Dealing with Missing Values tutorials.

How can I plot the TimeGPT forecast?

The NixtlaClient class has a plot method that can be used to visualize the forecast. This method only works in interactive environments such as Jupyter notebooks; it does not work in Python scripts.

Does TimeGPT support polars?

As of now, TimeGPT does not offer support for polars.

Does TimeGPT produce stable predictions?

TimeGPT is engineered for stability, ensuring consistent results for identical input data. This means that given the same dataset, the model will produce the same forecasts.

Fine-tuning

What is fine-tuning?

TimeGPT was trained on the largest publicly available time series dataset, covering a wide range of domains such as finance, retail, healthcare, and more. This comprehensive training enables TimeGPT to produce accurate forecasts for new time series without additional training, a capability known as zero-shot learning.

While the zero-shot model provides a solid baseline, the performance of TimeGPT can often be improved through fine-tuning. During this process, the TimeGPT model undergoes additional training on your specific dataset, starting from the pre-trained parameters. The updated model then produces the forecasts. You can control the number of training iterations and the loss function for fine-tuning with the finetune_steps and finetune_loss parameters of the forecast method of the NixtlaClient class, respectively.

For a comprehensive guide on how to apply fine-tuning, please refer to the fine-tuning and the fine-tuning with a specific loss function tutorials.
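A rough sketch of a fine-tuned forecast call on a toy monthly series. The API call is guarded behind a hypothetical `NIXTLA_API_KEY` environment variable, and the chosen values for `finetune_steps` and `finetune_loss` are illustrative, not recommendations:

```python
import os

import pandas as pd

# A toy monthly series with trend and a simple yearly pattern.
df = pd.DataFrame(
    {
        "unique_id": ["series_1"] * 36,
        "ds": pd.date_range("2021-01-01", periods=36, freq="MS"),
        "y": [float(100 + i * 2 + (i % 12) * 5) for i in range(36)],
    }
)

if os.getenv("NIXTLA_API_KEY"):
    from nixtla import NixtlaClient

    client = NixtlaClient(api_key=os.environ["NIXTLA_API_KEY"])
    # finetune_steps: extra training iterations on your data;
    # finetune_loss: objective used during those iterations (e.g. "mae").
    fcst = client.forecast(
        df, h=12, freq="MS", finetune_steps=10, finetune_loss="mae"
    )
    print(fcst.head())
```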

Do I have to fine-tune every series?

No, you do not need to fine-tune every series individually. When using the finetune_steps parameter, the model undergoes fine-tuning across all series in your dataset simultaneously. This method uses a cross-learning approach, allowing the model to learn from multiple series at once, which can improve individual forecasts.

Keep in mind that selecting the right number of fine-tuning steps may require some trial and error. As the number of fine-tuning steps increases, the model becomes more specialized to your dataset, but will take longer to train and may be more prone to overfitting.

Can I save fine-tuned parameters?

Currently, it is not possible to save the fine-tuned parameters for later use. This means you will need to perform fine-tuning each time you submit your data, whether using the Python SDK or the API.

Pricing and Billing

How does pricing work?

TimeGPT’s cost is based on usage. Every call you make has a certain number of input, output, and finetune tokens associated with it. Price tiers are based on the number of tokens used, which you can find in your dashboard under Billing. The cost per token decreases as you move up tiers, and you will pay the corresponding price for each segment of tokens used.
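The “pay the corresponding price for each segment” rule works like graduated tax brackets. The tier boundaries and per-token prices below are made up for this sketch; the real tiers are listed in your dashboard under Billing:

```python
# Illustrative tiered pricing: each segment of tokens is billed at its
# own rate. These boundaries and prices are hypothetical.
TIERS = [
    (1_000_000, 0.001),      # first 1M tokens at $0.001 each
    (9_000_000, 0.0005),     # next 9M tokens at $0.0005 each
    (float("inf"), 0.0001),  # everything beyond at $0.0001 each
]

def cost(tokens: int) -> float:
    """Bill each segment of tokens at that segment's rate."""
    total = 0.0
    for size, price in TIERS:
        used = min(tokens, size)
        total += used * price
        tokens -= used
        if tokens <= 0:
            break
    return total

print(cost(500_000))    # 500.0  -> entirely within the first tier
print(cost(2_000_000))  # 1500.0 -> spans the first two tiers
```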

Are there free credits or discounts?

Upon signing up, all users receive $1,000 USD in free credits. Once these credits are exhausted, you will be asked to add a payment method to continue using TimeGPT. Costs are then calculated based on the number of tokens used and the applicable price tiers.

The pricing model is designed for production settings where pipelines have already been tested and established. However, if you need additional free credits for testing, or if you are using TimeGPT for academic purposes, please contact us at ops@nixtla.io to ask about eligibility for additional free credits or discounts.

Free credits do not expire and can be used at any time.

How are tokens counted?

One token corresponds to one row of your dataset. Please refer to our tutorials to understand the data format required for each use case. For high-frequency data, we recommend trimming the input whenever possible. For example, when working with minute-level data, consider trimming to the most recent hour to reduce costs.

In the Usage section of your dashboard, you can find a detailed record of your token consumption history, helping you track and manage your usage effectively.
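Since one token corresponds to one row, trimming the input directly reduces cost. A small standard-library sketch with hypothetical minute-level timestamps:

```python
from datetime import datetime, timedelta

# Hypothetical minute-level timestamps spanning three hours.
start = datetime(2024, 1, 1, 0, 0)
rows = [start + timedelta(minutes=i) for i in range(180)]

tokens_full = len(rows)  # one token per row: 180 input tokens

# Keep only the most recent hour (60 rows) before calling the API.
cutoff = rows[-1] - timedelta(minutes=59)
trimmed = [ts for ts in rows if ts >= cutoff]
tokens_trimmed = len(trimmed)  # 60 input tokens

print(tokens_full, tokens_trimmed)
```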

How does billing work?

Billing is done within the first five days of each month. We have partnered with Stripe to handle all payment processing. You can view your invoices and payment history in your dashboard under Billing.

Privacy and Security

How do you ensure the privacy and security of my data?

At Nixtla, we take your privacy and security very seriously. To ensure you are fully informed about our policies regarding your data, please refer to the following documents:

In addition, we are currently developing a self-hosted version of TimeGPT, tailored for the unique security requirements of enterprise data. This version is currently in beta. If you are interested in exploring this option, please contact us at ops@nixtla.io.

Troubleshooting

The following section covers some common errors and warnings, along with suggested solutions.

ApiError: status_code: 401, body: {'data': None, 'message': 'Invalid API key', 'details': 'Key not found', 'code': 'A12', 'requestID': 'E7F2BBTB2P', 'support': 'If you have questions or need support, please email ops@nixtla.io'}

Solution: This error occurs when your TimeGPT API key is either invalid or has not been set up correctly. Please use the validate_api_key method to verify it or make sure it was copied correctly from the API Keys section of your dashboard.

ApiError: status_code: 429, body: {'data': None, 'message': 'Too many requests', 'details': 'You need to add a payment method to continue using the API, do so from https://dashboard.nixtla.io', 'code': 'A21', 'requestID': 'NCJDK7KSJ6', 'support': 'If you have questions or need support, please email ops@nixtla.io'}

Solution: This error occurs when you have exhausted your free credits and need to add a payment method to continue using TimeGPT. You can add a payment method in the Billing section of your dashboard.

WriteTimeout

Solution: If you encounter a WriteTimeout error, it means the request exceeded the allowable processing time. This is a common issue when working with large datasets. To fix it, consider increasing the num_partitions parameter in the forecast method of the NixtlaClient class, or use a distributed backend if you are not already doing so.

Additional Support

If you have any more questions or need support, please reach out by:

  • Opening an issue on GitHub for technical questions or bugs.
  • Sending an email to ops@nixtla.io for general inquiries or support.
  • Joining our Slack community to connect with our team and the forecasting community.