Publications & Blogs
Here you can find my publications and blog posts about AI, machine learning, and related topics.
Recent Publications
How to Apply Transformers to Time Series Models
This article explores how to adapt transformer architectures for time series forecasting, addressing the unique challenges of applying these models to sequential temporal data and introducing solutions like Informer and Spacetimeformer.
Key Insights
- Quadratic Complexity Challenge: Traditional transformers face computational bottlenecks on time series because self-attention cost grows quadratically (O(L²)) with sequence length L.
- Network Modifications: Two key improvements address this: learnable positional encoding to capture temporal patterns, and ProbSparse attention to cut the cost of attention to roughly O(L log L).
- Practical Solutions: Open-source models such as Informer and Spacetimeformer outperform LSTM baselines, especially on long-horizon forecasts in real-world applications.
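The ProbSparse idea mentioned above can be sketched in a few lines: instead of letting every query attend to every key, only the queries whose score distribution is far from uniform (measured as max score minus mean score) get full attention, while the rest fall back to the mean of the values. This is a simplified, single-head numpy illustration; the function name is mine, and the real Informer implementation also subsamples keys when computing the sparsity measurement, which this sketch omits.

```python
import numpy as np

def probsparse_attention(Q, K, V, u):
    """Simplified sketch of ProbSparse attention (single head, no key
    sampling). Only the u queries with the largest sparsity measurement
    M(q) = max(q.K/sqrt(d)) - mean(q.K/sqrt(d)) attend to the keys;
    the remaining queries are approximated by the mean of V."""
    d = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d)                   # (L_q, L_k) scaled dot-product scores
    M = scores.max(axis=1) - scores.mean(axis=1)    # sparsity measurement per query
    top = np.argsort(M)[-u:]                        # indices of the u "active" queries
    # Lazy queries: output is just the mean of the values.
    out = np.repeat(V.mean(axis=0, keepdims=True), Q.shape[0], axis=0)
    # Active queries: ordinary softmax attention over all keys.
    w = np.exp(scores[top] - scores[top].max(axis=1, keepdims=True))
    w /= w.sum(axis=1, keepdims=True)
    out[top] = w @ V
    return out
```

With u = L_q this reduces to full softmax attention; choosing u proportional to log(L_q) is what brings the cost down from quadratic.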