Modern Time Series Forecasting with Python

Explore industry-ready time series forecasting using modern machine learning and deep learning

Product type: Paperback
Published: Nov 2022
Publisher: Packt
ISBN-13: 9781803246802
Length: 552 pages
Edition: 1st Edition
Author: Manu Joseph
Table of Contents

Preface
Part 1 – Getting Familiar with Time Series
Chapter 1: Introducing Time Series
Chapter 2: Acquiring and Processing Time Series Data
Chapter 3: Analyzing and Visualizing Time Series Data
Chapter 4: Setting a Strong Baseline Forecast
Part 2 – Machine Learning for Time Series
Chapter 5: Time Series Forecasting as Regression
Chapter 6: Feature Engineering for Time Series Forecasting
Chapter 7: Target Transformations for Time Series Forecasting
Chapter 8: Forecasting Time Series with Machine Learning Models
Chapter 9: Ensembling and Stacking
Chapter 10: Global Forecasting Models
Part 3 – Deep Learning for Time Series
Chapter 11: Introduction to Deep Learning
Chapter 12: Building Blocks of Deep Learning for Time Series
Chapter 13: Common Modeling Patterns for Time Series
Chapter 14: Attention and Transformers for Time Series
Chapter 15: Strategies for Global Deep Learning Forecasting Models
Chapter 16: Specialized Deep Learning Architectures for Forecasting
Part 4 – Mechanics of Forecasting
Chapter 17: Multi-Step Forecasting
Chapter 18: Evaluating Forecasts – Forecast Metrics
Chapter 19: Evaluating Forecasts – Validation Strategies
Index
Other Books You May Enjoy

References

Following is the list of references used in this chapter:

  1. Dzmitry Bahdanau, KyungHyun Cho, and Yoshua Bengio (2015). Neural Machine Translation by Jointly Learning to Align and Translate. In 3rd International Conference on Learning Representations. https://arxiv.org/pdf/1409.0473.pdf
  2. Thang Luong, Hieu Pham, and Christopher D. Manning (2015). Effective Approaches to Attention-based Neural Machine Translation. In Proceedings of the 2015 Conference on Empirical Methods in Natural Language Processing. https://aclanthology.org/D15-1166/
  3. André F. T. Martins and Ramón Fernandez Astudillo (2016). From Softmax to Sparsemax: A Sparse Model of Attention and Multi-Label Classification. In Proceedings of the 33rd International Conference on Machine Learning. http://proceedings.mlr.press/v48/martins16.html
  4. Ben Peters, Vlad Niculae, and André F. T. Martins (2019). Sparse Sequence-to-Sequence Models. In Proceedings of the 57th Annual Meeting of the Association...
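Reference 1 introduces the additive (Bahdanau-style) attention scoring that underpins much of this chapter. As a purely illustrative aid, and not code from the book, the following is a minimal NumPy sketch of that scoring function; the function name additive_attention, the dimensions, and the random weights are all made-up toy values chosen only to show the score-softmax-context pattern.

import numpy as np

def softmax(x):
    # Numerically stable softmax over the last axis.
    e = np.exp(x - x.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

def additive_attention(query, keys, W_q, W_k, v):
    """Toy sketch of additive attention scoring (reference 1).

    query : (d_q,)      decoder state at the current step
    keys  : (T, d_k)    encoder states for T source positions
    W_q   : (d_a, d_q)  query projection (illustrative weights)
    W_k   : (d_a, d_k)  key projection (illustrative weights)
    v     : (d_a,)      scoring vector
    Returns the attention weights over the T positions and the context vector.
    """
    # score_t = v^T tanh(W_q q + W_k k_t) for each source position t
    scores = np.tanh(W_q @ query + keys @ W_k.T) @ v   # shape (T,)
    weights = softmax(scores)                          # shape (T,)
    context = weights @ keys                           # shape (d_k,)
    return weights, context

# Toy usage with random states and weights.
rng = np.random.default_rng(0)
T, d_q, d_k, d_a = 5, 8, 8, 16
q = rng.normal(size=d_q)
K = rng.normal(size=(T, d_k))
W_q = rng.normal(size=(d_a, d_q))
W_k = rng.normal(size=(d_a, d_k))
v = rng.normal(size=d_a)
w, c = additive_attention(q, K, W_q, W_k, v)
print(w.round(3), c.shape)

Swapping the tanh-based score for a dot product gives the multiplicative variant discussed in reference 2, and replacing softmax with sparsemax gives the sparse attention of reference 3.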