Get your `api_key` and give it a try yourself!
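If you want to follow along, here is a minimal setup sketch using Nixtla's `nixtla` Python package (the key below is a placeholder for your own):

```python
# Install with: pip install nixtla
from nixtla import NixtlaClient

# Authenticate with your api_key (placeholder value shown here).
nixtla_client = NixtlaClient(api_key="YOUR_API_KEY")

# Optional sanity check that the key is accepted.
nixtla_client.validate_api_key()
```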
| unique_id | ds min | ds max | count | y min | y mean | y median | y max |
|---|---|---|---|---|---|---|---|
| FOODS_1 | 2011-01-29 | 2016-05-22 | 1941 | 0.0 | 2674.085523 | 2665.0 | 5493.0 |
| FOODS_2 | 2011-01-29 | 2016-05-22 | 1941 | 0.0 | 4015.984029 | 3894.0 | 9069.0 |
| FOODS_3 | 2011-01-29 | 2016-05-22 | 1941 | 10.0 | 16969.089129 | 16548.0 | 28663.0 |
| HOBBIES_1 | 2011-01-29 | 2016-05-22 | 1941 | 0.0 | 2936.122617 | 2908.0 | 5009.0 |
| HOBBIES_2 | 2011-01-29 | 2016-05-22 | 1941 | 0.0 | 279.053065 | 248.0 | 871.0 |
| HOUSEHOLD_1 | 2011-01-29 | 2016-05-22 | 1941 | 0.0 | 6039.594539 | 5984.0 | 11106.0 |
| HOUSEHOLD_2 | 2011-01-29 | 2016-05-22 | 1941 | 0.0 | 1566.840289 | 1520.0 | 2926.0 |
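For reference, a per-series summary like the one above can be reproduced with a single pandas aggregation. The frame below is a hypothetical stand-in with the same `unique_id` / `ds` / `y` layout (and the same 1941-day span):

```python
import numpy as np
import pandas as pd

# Hypothetical stand-in data in long format: one row per (unique_id, ds),
# demand in `y`. The date span matches the 1941 observations above.
rng = pd.date_range("2011-01-29", "2016-05-22", freq="D")
gen = np.random.default_rng(0)
df = pd.concat(
    (
        pd.DataFrame({"unique_id": uid, "ds": rng, "y": gen.poisson(lam, len(rng))})
        for uid, lam in [("FOODS_1", 2674), ("HOBBIES_2", 279)]
    ),
    ignore_index=True,
)

# One groupby/agg produces the per-series summary table.
summary = df.groupby("unique_id").agg(
    ds_min=("ds", "min"),
    ds_max=("ds", "max"),
    count=("y", "count"),
    y_min=("y", "min"),
    y_mean=("y", "mean"),
    y_median=("y", "median"),
    y_max=("y", "max"),
)
print(summary)
```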
Call `nixtla_client.forecast` to produce accurate, high-performance forecasts tailored to your unique time series.
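Here is a minimal sketch of that call, assuming `nixtla_client` is authenticated as above and `df` holds the long-format history; the 28-day horizon is an illustrative choice, not prescribed by the text:

```python
fcst_df = nixtla_client.forecast(
    df=df,
    h=28,                # days ahead to predict (illustrative choice)
    freq="D",            # daily observations, as in the summary table
    id_col="unique_id",  # column identifying each series
    time_col="ds",       # timestamp column
    target_col="y",      # variable to forecast
)
print(fcst_df.head())
```

The returned frame has one row per series per future timestamp, with the point forecasts in a `TimeGPT` column.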
📘 Why Use TimeGPT Over Classical Models?
- Complex Patterns: TimeGPT captures non-linear trends classical models miss.
- Minimal Preprocessing: TimeGPT requires little to no data preparation.
- Scalability: TimeGPT scales efficiently across multiple series without retraining.
📘 Why Use TimeGPT Over Machine Learning Models?
- Automatic Pattern Recognition: Captures complex patterns from raw data, bypassing the need for feature engineering.
- Minimal Tuning: Works well without extensive tuning.
- Scalability: Forecasts across multiple series without retraining.
📘 Why Use TimeGPT Over Deep Learning Models?
- Faster Setup: Quick setup and forecasting, unlike the lengthy configuration and training times of neural networks.
- Less Tuning: Performs well with minimal tuning and preprocessing, while neural networks often need extensive adjustments.
- Ease of Use: Simple deployment with high accuracy, making it accessible without deep technical expertise.
| Model | RMSE | SMAPE |
|---|---|---|
| ARIMA | 724.9 | 5.50% |
| LightGBM | 687.8 | 5.14% |
| N-HiTS | 605.0 | 5.34% |
| TimeGPT | 592.6 | 4.94% |
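The exact evaluation code isn't shown here, but the two reported metrics are commonly defined as in this sketch (the SMAPE below is the usual 0–200% variant, expressed in percent):

```python
import numpy as np

def rmse(y, y_hat):
    """Root mean squared error."""
    y, y_hat = np.asarray(y, float), np.asarray(y_hat, float)
    return float(np.sqrt(np.mean((y - y_hat) ** 2)))

def smape(y, y_hat):
    """Symmetric MAPE in percent (common 0-200% variant)."""
    y, y_hat = np.asarray(y, float), np.asarray(y_hat, float)
    return float(100 * np.mean(2 * np.abs(y - y_hat) / (np.abs(y) + np.abs(y_hat))))
```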
Get your `api_key` and get started today! Happy forecasting, and enjoy the insights ahead!
| Scenario | TimeGPT | Classical Models (e.g., ARIMA) | Machine Learning Models (e.g., XGB, LGBM) | Deep Learning Models (e.g., N-HiTS) |
|---|---|---|---|---|
| Seasonal Patterns | ✅ Performs well with minimal setup | ✅ Handles seasonality with adjustments (e.g., SARIMA) | ✅ Performs well with feature engineering | ✅ Captures seasonal patterns effectively |
| Non-Linear Patterns | ✅ Excels, especially with complex non-linear patterns | ❌ Limited performance | ❌ Struggles without extensive feature engineering | ✅ Performs well with non-linear relationships |
| Large Datasets | ✅ Highly scalable across many series | ❌ Slow and resource-intensive | ✅ Scalable with optimized implementations | ❌ Requires significant resources for large datasets |
| Small Datasets | ✅ Performs well; requires only one data point to start | ✅ Performs well; may struggle with very sparse data | ✅ Performs adequately if enough features are extracted | ❌ May need a minimum data size to learn effectively |
| Preprocessing Required | ✅ Minimal preprocessing needed | ❌ Requires scaling, log-transform, etc., to meet model assumptions | ❌ Requires extensive feature engineering for complex patterns | ❌ Needs data normalization and preprocessing |
| Accuracy Requirement | ✅ Achieves high accuracy with minimal tuning | ❌ May struggle with complex accuracy requirements | ✅ Can achieve good accuracy with tuning | ✅ High accuracy possible, but with significant resource use |
| Scalability | ✅ Highly scalable with minimal task-specific configuration | ❌ Not easily scalable | ✅ Moderate scalability, with feature engineering and tuning per task | ❌ Limited scalability due to resource demands |
| Computational Resources | ✅ Highly efficient; runs on CPU, no GPU needed | ✅ Light to moderate, but scales poorly with large datasets | ❌ Moderate; depends on feature complexity | ❌ High resource consumption, often requires a GPU |
| Memory Requirement | ✅ Efficient memory usage for large datasets | ✅ Moderate memory requirements | ❌ High memory usage for larger datasets or many series | ❌ High memory consumption for larger datasets and multiple series |
| Technical Requirements & Domain Knowledge | ✅ Low; minimal technical setup and no domain expertise needed | ✅ Low to moderate; needs an understanding of stationarity | ❌ Moderate to high; requires feature engineering and tuning | ❌ High; complex architectures and tuning |