Chapter 13 – Visualizing Networks with TensorFlow 2.x and TensorBoard
- A CNN always has the same number of layers. (Yes | No)
No. A CNN does not necessarily have the same number of layers, or even the same types of layers. Choosing the number and type of layers is part of the work of optimizing an artificial neural network.
- ReLU is the best activation function. (Yes | No)
No. ReLU is an efficient activation function, but there is no single best choice; alternatives include leaky ReLU, softmax, sigmoid, and tanh.
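To make the differences between these activation functions concrete, here is a minimal NumPy sketch of each (these are standalone illustrative definitions, not the chapter's Keras code; the `alpha` value for leaky ReLU is an assumed default):

```python
import numpy as np

def relu(x):
    # Zeroes out negative inputs, passes positive inputs through unchanged
    return np.maximum(0.0, x)

def leaky_relu(x, alpha=0.01):
    # Like ReLU, but lets a small, scaled signal through for negative inputs
    return np.where(x > 0, x, alpha * x)

def sigmoid(x):
    # Squashes inputs into the range (0, 1)
    return 1.0 / (1.0 + np.exp(-x))

def softmax(x):
    # Converts a vector of scores into probabilities that sum to 1
    e = np.exp(x - np.max(x))
    return e / e.sum()

x = np.array([-2.0, 0.0, 3.0])
print(relu(x))         # negatives clipped to 0
print(leaky_relu(x))   # negatives scaled by alpha
print(np.tanh(x))      # squashed into (-1, 1)
print(softmax(x))      # probabilities summing to 1
```

Softmax is typically reserved for the output layer of a classifier, while ReLU and its variants are common in hidden layers.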
- It is not necessary to compile a sequential classifier. (Yes | No)
No. It is necessary to compile a sequential classifier; the compile step configures the model with a loss function, an optimizer, and metrics before training.
- The output of a layer is best viewed without running a prediction. (Yes | No)
No. Viewing the output of a layer requires running data through the network, that is, running a prediction. The output of a layer can then be the transformation that layer applies (convolutional, pooling, dropout, flattening, or other) or the final prediction itself.
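The point above can be illustrated without Keras: a layer's output only exists once input data flows forward through the network. A minimal NumPy sketch of a toy two-layer network (the weights and sizes here are illustrative assumptions, not the chapter's model):

```python
import numpy as np

rng = np.random.default_rng(0)

# A toy two-layer dense network with arbitrary, illustrative weights
W1 = rng.normal(size=(4, 3))   # hidden layer weights
W2 = rng.normal(size=(3, 2))   # output layer weights

def predict(x):
    # Each layer's output only comes into existence during this forward
    # pass: to inspect the hidden activation, a prediction must run.
    hidden = np.maximum(0.0, x @ W1)   # ReLU transformation of layer 1
    output = hidden @ W2               # final prediction
    return hidden, output

x = rng.normal(size=(1, 4))
hidden, output = predict(x)
print(hidden.shape)   # (1, 3) -- the layer's output, viewable after the pass
print(output.shape)   # (1, 2) -- the prediction
```

In Keras, the analogous approach is to build a model whose output is an intermediate layer and call its prediction method on sample data.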
- The names of the layers mean nothing when viewing...