Backpropagation
Now that we have computed the derivatives of the activation functions, we can describe the backpropagation algorithm, the mathematical core of deep learning. Backpropagation is often shortened to backprop.
Remember that a neural network can have multiple hidden layers, as well as one input layer and one output layer.
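As a quick refresher, here is a minimal Keras sketch of such an architecture, with one input layer (declared via the first layer's input_shape), two hidden layers, and one output layer. The layer sizes and activations below are illustrative placeholders, not values from the book:

import tensorflow as tf

# A small fully connected network: two hidden layers between the
# (implicit) input layer and the output layer.
model = tf.keras.Sequential([
    tf.keras.layers.Dense(8, activation="relu", input_shape=(4,)),  # hidden layer 1
    tf.keras.layers.Dense(8, activation="relu"),                    # hidden layer 2
    tf.keras.layers.Dense(1, activation="sigmoid")                  # output layer
])
model.summary()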
In addition, recall from Chapter 1, Neural Network Foundations with TensorFlow 2.0, that backpropagation can be described as a way of progressively correcting mistakes as soon as they are detected. To reduce the errors a neural network makes, we must train it. Training requires a dataset of input values paired with the corresponding true output values, and the goal is for the network's predictions to be as close as possible to those true values. The key intuition of the backpropagation algorithm is to update the weights of the connections based on the error measured at the output neuron(s), as the sketch after this paragraph illustrates. In the remainder...
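To make this intuition concrete, here is a minimal sketch of a single training step in TensorFlow 2.x, continuing with the model defined above. The toy data and learning rate are hypothetical placeholders; the point is only to show the error being measured at the output and the weights being corrected from its gradient:

# Hypothetical toy data: 32 samples with 4 features each, binary targets.
x = tf.random.normal((32, 4))
y_true = tf.cast(tf.random.uniform((32, 1)) > 0.5, tf.float32)

loss_fn = tf.keras.losses.BinaryCrossentropy()
optimizer = tf.keras.optimizers.SGD(learning_rate=0.1)

# One step of error correction: run a forward pass, measure the error at
# the output, compute the gradient of that error with respect to every
# weight, and adjust the weights in the direction that reduces the error.
with tf.GradientTape() as tape:
    y_pred = model(x, training=True)   # forward pass
    loss = loss_fn(y_true, y_pred)     # error at the output neuron(s)
grads = tape.gradient(loss, model.trainable_variables)
optimizer.apply_gradients(zip(grads, model.trainable_variables))

Keras's model.fit() runs this kind of loop internally; the explicit GradientTape version is shown only to expose the gradient computation that backpropagation carries out.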