granite-timeseries-ttm-r2 is part of IBM's TinyTimeMixers (TTM) family of compact, pre-trained models for multivariate time series forecasting. Unlike massive foundation models, TTM models are designed to be lightweight yet powerful: with only ~805K parameters they deliver strong performance even on CPU or single-GPU machines. The r2 version is pre-trained on ~700M samples (r2.1 expands this to ~1B) and delivers up to 15% better accuracy than r1. TTM supports both zero-shot and fine-tuned forecasting at minutely, hourly, daily, and weekly resolutions. It can integrate exogenous/control variables and static categorical features, and it supports channel-mixing for richer multivariate forecasting. The get_model() utility makes it easy to auto-select the best TTM variant for a given context length and prediction length. These models significantly outperform foundation models such as Chronos, GPT4TS, and Moirai on public benchmarks while requiring a fraction of the compute.
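Below is a minimal zero-shot sketch, assuming the `tsfm_public` toolkit's `get_model()` helper and a 512-step context / 96-step horizon variant; the exact argument and output field names (e.g. `past_values`, `prediction_outputs`) are based on our reading of the toolkit and should be verified against the current `tsfm_public` documentation.

```python
import torch
from tsfm_public.toolkit.get_model import get_model

# Auto-select a TTM-r2 variant matching the requested context/forecast lengths.
# The 512/96 pairing here is only for illustration.
model = get_model(
    "ibm-granite/granite-timeseries-ttm-r2",
    context_length=512,
    prediction_length=96,
)

# Dummy batch: 4 series, 512 historical time points, 3 channels.
past_values = torch.randn(4, 512, 3)

# Zero-shot forecast: no fine-tuning, just a forward pass.
model.eval()
with torch.no_grad():
    output = model(past_values=past_values)

print(output.prediction_outputs.shape)  # expected: (4, 96, 3)
```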
Features
- Lightweight models (~805K parameters) optimized for speed and portability
- Pre-trained on massive, diverse time series datasets (~700M–1B samples)
- Supports zero-shot and fine-tuned forecasting with minimal data
- Forecasts time points at minutely, hourly, daily, and weekly resolutions
- Enables integration of exogenous/control variables and static categorical features
- Includes get_model() utility for automatic model selection
- Offers channel-independent and channel-mixing fine-tuning modes (see the sketch after this list)
- Outperforms state-of-the-art models like Chronos and GPT4TS on benchmarks
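As a rough sketch of the channel-mixing fine-tuning setup with exogenous channels: the keyword arguments below (`num_input_channels`, `decoder_mode`, `prediction_channel_indices`) follow the TTM getting-started notebooks as we recall them and are assumptions to check against the current `tsfm_public` release; the channel layout is hypothetical.

```python
from tsfm_public.toolkit.get_model import get_model

# Hypothetical multivariate setup: channels 0-1 are forecast targets,
# channels 2-3 are exogenous/control variables available in the history.
finetune_model = get_model(
    "ibm-granite/granite-timeseries-ttm-r2",
    context_length=512,
    prediction_length=96,
    num_input_channels=4,
    decoder_mode="mix_channel",         # enable channel-mixing in the decoder
    prediction_channel_indices=[0, 1],  # forecast only the target channels
)

# The returned model can then be fine-tuned on labeled windows,
# e.g. with the Hugging Face Trainer on data prepared via tsfm_public's
# time series preprocessing utilities.
```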