Time Series, Sequences, and Prediction with TensorFlow
Welcome to the final chapter of our journey with TensorFlow. In the last chapter, we closed on a high note by applying neural networks such as DNNs to forecast time series data effectively. In this chapter, we will explore an array of advanced ideas, such as integrating learning rate schedulers into our workflow to adapt the learning rate dynamically and accelerate model training. In previous chapters, we emphasized the importance of finding an optimal learning rate; with learning rate schedulers, we can achieve this dynamically, either by using the built-in schedulers in TensorFlow or by crafting our own custom scheduler.
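To make the idea concrete, here is a minimal sketch of a custom schedule function. In Keras, a function with this signature can be passed to `tf.keras.callbacks.LearningRateScheduler`, which calls it at the start of every epoch; the decay factor (0.5) and interval (10 epochs) below are illustrative assumptions, not values from this chapter. The TensorFlow wiring is shown in comments so the schedule logic itself stays runnable on its own.

```python
def step_decay(epoch, lr):
    """Step-decay schedule: halve the learning rate every 10 epochs.

    The factor and interval are assumptions for illustration. In Keras,
    this function would be hooked up as:
        callback = tf.keras.callbacks.LearningRateScheduler(step_decay)
        model.fit(..., callbacks=[callback])
    """
    if epoch > 0 and epoch % 10 == 0:
        return lr * 0.5
    return lr

# Simulate 25 epochs from an assumed initial rate of 1e-3.
lr = 1e-3
for epoch in range(25):
    lr = step_decay(epoch, lr)
print(lr)  # halved at epochs 10 and 20 -> 0.00025
```

Because the schedule is just a Python function, it is easy to experiment with alternatives (exponential decay, warm-up, cyclical rates) before committing to one for a full training run.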
Next, we will discuss Lambda layers and how these arbitrary-function layers can be embedded into our model architecture, enabling quick experimentation by letting us apply custom functions seamlessly within the model itself...
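As a preview of the idea, a Lambda layer simply wraps an arbitrary function so it can sit inside a model like any other layer; in Keras this is `tf.keras.layers.Lambda(lambda x: x * 100.0)`, a pattern often used to rescale a network's outputs. The sketch below mimics that rescaling in plain Python to show the effect; the scaling factor of 100 is an assumption for illustration.

```python
def scale_outputs(values, factor=100.0):
    """Mimic a hypothetical Lambda layer that rescales model outputs.

    In Keras, the equivalent layer would be:
        tf.keras.layers.Lambda(lambda x: x * factor)
    placed at the end of the model so small activations map onto the
    range of the target series.
    """
    return [v * factor for v in values]

# Example: raw outputs in [0, 1] rescaled to the target range.
print(scale_outputs([0.5, 0.25]))  # -> [50.0, 25.0]
```

Because the wrapped function is ordinary code, swapping in a different transformation is a one-line change, which is what makes Lambda layers convenient for rapid experimentation.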