Dissecting ReLU: A deceptively simple activation function
What is this post about? The Rectified Linear Unit (ReLU) will be beautifully dissected in this post: https://www.youtube.com/watch?v=Un9A90mfO54 This is what you will be able to generate and understand by the end of this post: the evolution of a shallow Artificial Neural Network (ANN) with relu() activation functions during training. The goal is …
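As a quick orientation before the dissection, here is a minimal NumPy sketch of the relu() function itself (an illustrative snippet, not code taken from the post): ReLU simply passes positive inputs through and clamps negative inputs to zero.

```python
import numpy as np

def relu(x):
    # ReLU: relu(x) = max(0, x), applied element-wise.
    # Positive pre-activations pass through unchanged; negatives become 0.
    return np.maximum(0.0, x)

# Example: a small batch of pre-activations.
z = np.array([-2.0, -0.5, 0.0, 1.5, 3.0])
print(relu(z))  # [0.  0.  0.  1.5 3. ]
```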