Borges' Library of Babel greatly improved my intuitive understanding of how neural networks learn.
The Library of Babel is a library containing every book imaginable: every possible combination of characters, and therefore every possible sentence. It holds books with the answers to life's questions and books with every theory mankind hasn't discovered yet, but also an overwhelming number of books full of gibberish. If you want to find the answer to a question in the Library of Babel, you will almost certainly never find it by browsing at random. You need a smart search algorithm that can locate the right page with the answer to your question.
There is a direct parallel between neural networks and the Library of Babel. Neural networks are universal function approximators, meaning they can approximate virtually any function, be it one that answers your question or one that produces gibberish. Just as with the Library of Babel, you need a smart search algorithm, this time not to find the right page, but to find the right neural network configuration (the weights).
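To make the analogy concrete, here is a minimal sketch (plain numpy; the one-hidden-layer architecture and sin as the target are just my illustrative choices). Sampling random weights is like pulling a random book off the shelf: the network computes *some* function, but almost certainly not the one you want.

```python
import numpy as np

rng = np.random.default_rng(0)

def random_network(x, hidden=32):
    """One hidden tanh layer with randomly drawn weights: a random
    point in the space of all functions this architecture can express."""
    W1 = rng.normal(size=(1, hidden))
    b1 = rng.normal(size=hidden)
    W2 = rng.normal(size=(hidden, 1))
    return np.tanh(x[:, None] @ W1 + b1) @ W2

x = np.linspace(-np.pi, np.pi, 100)
target = np.sin(x)                    # the "answer" we are looking for
prediction = random_network(x)[:, 0]  # a random "book" off the shelf

# Almost surely large: a random book is almost surely gibberish.
print("MSE of a random network:", np.mean((prediction - target) ** 2))
```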
The learning problem is thus really a search problem: gradient descent and backpropagation are the search algorithms, while the loss (or, in reinforcement learning, reward) function defines what we are searching for.
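Here is the same toy setup with the search switched on, as a minimal PyTorch sketch (again, just an illustration of the idea; nothing is special about this architecture or optimizer):

```python
import math
import torch
import torch.nn as nn
import torch.nn.functional as F

torch.manual_seed(0)
x = torch.linspace(-math.pi, math.pi, 100).unsqueeze(1)
target = torch.sin(x)

# Start at a random point in weight space: a random book off the shelf.
net = nn.Sequential(nn.Linear(1, 32), nn.Tanh(), nn.Linear(32, 1))
optimizer = torch.optim.SGD(net.parameters(), lr=0.05)

for step in range(2001):
    loss = F.mse_loss(net(x), target)  # defines WHAT we are searching for
    optimizer.zero_grad()
    loss.backward()                    # backprop: computes the search direction
    optimizer.step()                   # gradient descent: one step through the library
    if step % 500 == 0:
        print(f"step {step:4d}  loss {loss.item():.4f}")
```

Each optimizer step swaps the current "book" for a slightly better neighbouring one, and the falling loss shows the search homing in on the shelf that contains sin.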
I found this way of thinking about neural networks very enlightening; it definitely helped me understand learning more intuitively. I just wrote a more elaborate post on this!