Tsinghua University: Inverting Transformers Significantly Improves Time Series Forecasting

Inverting the Transformer architecture improves time series forecasting performance

aimodels-fyi ∙ Oct 11, 2023 ∙ Paid

"Comparison between the vanilla Transformer (top) and the proposed iTransformer (bottom). Unlike Transformer, which embeds each time step to the temporal token, iTransformer embeds the whole series independently to the variate token, such that multivariate correlations can be depicted by the attention mechanism and series representations are encoded by …
