Let’s go on! If you just arrived, you can check out the first part here. The goal of this series is to demonstrate how compactly an MLP can be implemented in a functional programming paradigm, and how easy it then becomes to extend and play around with.
Start of a small series

The GIF below shows the evolution of the weights of a neural network trained on the MNIST dataset. MNIST is a dataset of handwritten digits and is kind of the hello world/FizzBuzz of machine learning.