Building an FNN from scratch
Let's perform a thought experiment. Imagine it is 1969. We have today's knowledge but no way to prove it. We know that a single perceptron cannot implement the exclusive OR (XOR) function.
We have an advantage because we now know a solution exists. To start our experiment, we have only a pad, a pencil, a sharpener, and an eraser waiting for us. We're ready to solve the XOR problem from scratch on paper before programming it. We have to find a way to classify the XOR data points with a neural network.
Step 1 – defining an FNN
We have to be unconventional to solve this problem. We must forget the complicated words and theories of the twenty-first century.
We can write a neural network layer in high-school math format. A hidden layer will be:
h1 = x * w
OK. Now we have one layer. A layer is merely a function. This function can be expressed as:
f(x, w)
In which x is the input value, and w is some value to multiply...
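The "layer as a function" idea above can be sketched in a few lines of Python. The input and weight values here are illustrative assumptions, not values from the text:

```python
def f(x, w):
    """A one-neuron layer in high-school format: multiply the input by a weight."""
    return x * w

# Illustrative values: an input x = 1 with a weight w = 0.5
h1 = f(1, 0.5)
print(h1)  # 0.5
```

Nothing more than multiplication is happening yet; the point is that a layer is simply a function of its input and its weight.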