tl;dr: emplace_back is often mistaken for a faster push_back, when it is in fact a different tool. Do not blindly replace push_back with emplace_back; be careful how you use emplace_back, since it can have unexpected consequences.
Let’s go on! If you just arrived, you can check out the first part here. The goal of this series is to demonstrate how compactly we can implement an MLP in a functional programming paradigm and how easy it becomes to extend/play around with it.
Start of a small series

The GIF below shows the evolution of the weights of a neural network trained on the MNIST dataset. MNIST is a dataset of handwritten digits, and is something of the hello world/FizzBuzz of machine learning.
What is a strange attractor?

The Wikipedia article on attractors gives the following definition/explanation:
An attractor is called strange if it has a fractal structure. This is often the case when the dynamics on it are chaotic, but strange nonchaotic attractors also exist.
I have been waiting to read these for a while now!
Let the reading commence #rstats #MachineLearning pic.twitter.com/Lqx2IcTENI
— Gudmundur Einarsson (@gumgumeo) November 8, 2017