Algorithms

This section covers supervised learning algorithms: theory, implementations, and visualisations.

Dissecting Relu: A deceptively simple activation function

What is this post about? The Rectified Linear Unit (Relu) will be beautifully dissected in this post: https://www.youtube.com/watch?v=Un9A90mfO54 This is what you will be able to generate and understand by the end of this post: the evolution of a shallow Artificial Neural Network (ANN) with relu() activation functions during training. The goal is …

Dissecting Relu: A deceptively simple activation function Read More »
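Before diving into the full post, the function itself fits in a few lines. A minimal numpy sketch (not the post's code) of relu() and its derivative:

```python
import numpy as np

def relu(x):
    # Element-wise max(0, x): negatives are zeroed, positives pass through
    return np.maximum(0.0, x)

def relu_grad(x):
    # Piecewise-constant derivative: 1 where x > 0, else 0
    # (ReLU is not differentiable at 0; returning 0 there is the usual convention)
    return (x > 0).astype(float)

x = np.array([-2.0, -0.5, 0.0, 1.5, 3.0])
print(relu(x))       # zeros for the negatives, identity elsewhere
print(relu_grad(x))  # 0 below the kink, 1 above it
```

The kink at zero is exactly what makes this "deceptively simple" function worth dissecting: it is what lets stacked linear layers represent non-linear mappings.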

Stochastic Approximation to Gradient Descent

What will you learn? The video below delivers the main points of this blog post on Stochastic Gradient Descent (SGD) (GitHub code available here):  https://youtu.be/gE9HzJ_GaRM In our two previous posts about gradients, post number 1 and post number 2, we covered gradient descent in all its glory. We even went through …

Stochastic Approximation to Gradient Descent Read More »
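The core idea of SGD can be sketched in a few lines. Here is a minimal, hypothetical example (synthetic data, not the post's dataset) that fits a line by updating the parameters after every single sample instead of averaging the gradient over the whole batch:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical 1-D data: y = 3x + 1 plus a little noise
X = rng.uniform(-1, 1, 200)
y = 3.0 * X + 1.0 + rng.normal(0, 0.1, 200)

w, b, lr = 0.0, 0.0, 0.1
for epoch in range(20):
    for i in rng.permutation(len(X)):   # visit samples in random order each epoch
        err = (w * X[i] + b) - y[i]     # residual on a SINGLE sample
        w -= lr * err * X[i]            # stochastic gradient step for the slope
        b -= lr * err                   # ... and for the intercept
```

Batch gradient descent would average `err * X[i]` over all 200 samples before taking one step; SGD takes 200 noisy steps per epoch instead, which is the "stochastic approximation" of the title.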

COVID-19 Analysis

Studying and Predicting the Progress of COVID-19 using Pandas and ARIMA COVID-19 has been around for nearly 4 months since the outbreak. In this notebook, we will study some useful statistics regarding the number of confirmed, death, and recovered cases as a function of time for each country/region. We will use the dataset that has been …

COVID-19 Analysis Read More »
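The "cases as a function of time per country" reshaping is a one-liner in pandas. A tiny sketch with hypothetical column names and made-up numbers (the post's actual dataset and schema may differ):

```python
import pandas as pd

# Hypothetical mini-frame mimicking the shape of such a dataset
df = pd.DataFrame({
    "date": pd.to_datetime(["2020-03-01", "2020-03-01", "2020-03-02", "2020-03-02"]),
    "country": ["Italy", "Iran", "Italy", "Iran"],
    "confirmed": [1694, 978, 2036, 1501],
})

# Cumulative confirmed cases over time, one column per country
cases = df.pivot(index="date", columns="country", values="confirmed")
print(cases)

# Day-over-day new cases (first row is NaN: no previous day to diff against)
daily_new = cases.diff()
```

A series like one column of `cases` is exactly the kind of univariate time series an ARIMA model would then be fitted to.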

Train a Perceptron to Learn the AND Gate from Scratch in Python

What will you Learn in this Post? Neural Networks are function approximators. For example, in a supervised learning setting, given loads of inputs and loads of outputs in our training data, a neural network is capable of finding the hidden mapping between those inputs and outputs. It can also, hopefully, generalize its acquired knowledge …

Train a Perceptron to Learn the AND Gate from Scratch in Python Read More »
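The whole exercise fits in a screenful of numpy. A minimal sketch of the classic perceptron learning rule on the AND truth table (a sketch in the spirit of the post, not its exact code):

```python
import numpy as np

# The AND gate's full truth table is the training set
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]])
y = np.array([0, 0, 0, 1])

w, b, lr = np.zeros(2), 0.0, 0.1
for epoch in range(20):
    for xi, target in zip(X, y):
        pred = int(w @ xi + b > 0)       # step activation
        w += lr * (target - pred) * xi   # classic perceptron update rule
        b += lr * (target - pred)

print([int(w @ xi + b > 0) for xi in X])  # [0, 0, 0, 1]
```

Because AND is linearly separable, the perceptron convergence theorem guarantees this loop settles on a perfect separating line after finitely many mistakes.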

Linear Regression from Scratch using Numpy

Hello friends and welcome to MLDawn! What sometimes concerns me is the fact that magnificent packages such as PyTorch (indeed amazing), TensorFlow (indeed a nightmare), and Keras (indeed great) have prevented machine learning enthusiasts from really learning the science behind how the machine learns. To be more accurate, there is nothing wrong with having …

Linear Regression from Scratch using Numpy Read More »
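In the "no frameworks" spirit of the post, here is a minimal sketch of linear regression trained by gradient descent on mean squared error, using hypothetical synthetic data (not the post's example):

```python
import numpy as np

rng = np.random.default_rng(42)
# Hypothetical data: y = 2x + 0.5 plus a little noise
X = rng.uniform(0, 1, 100)
y = 2.0 * X + 0.5 + rng.normal(0, 0.05, 100)

w, b, lr = 0.0, 0.0, 0.5
for _ in range(1000):
    y_hat = w * X + b
    dw = 2 * np.mean((y_hat - y) * X)   # dMSE/dw
    db = 2 * np.mean(y_hat - y)         # dMSE/db
    w -= lr * dw
    b -= lr * db
```

Every line here is something a framework would hide: the forward pass, the two gradients, and the update. That transparency is the whole point of doing it from scratch.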

Binary Classification from Scratch using Numpy

Hello friends and welcome to MLDawn! What sometimes concerns me is the fact that magnificent packages such as PyTorch (indeed amazing), TensorFlow (indeed a nightmare), and Keras (indeed great) have prevented machine learning enthusiasts from really learning the science behind how the machine learns. To be more accurate, there is nothing wrong with having …

Binary Classification from Scratch using Numpy Read More »
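For binary classification, the from-scratch recipe is logistic regression: a sigmoid output trained with the gradient of binary cross-entropy. A minimal sketch on hypothetical, well-separated 2-D blobs (not the post's data):

```python
import numpy as np

rng = np.random.default_rng(0)
# Two hypothetical Gaussian blobs in 2-D, one per class
X = np.vstack([rng.normal(-2, 1.0, (50, 2)), rng.normal(2, 1.0, (50, 2))])
y = np.array([0] * 50 + [1] * 50)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

w, b, lr = np.zeros(2), 0.0, 0.1
for _ in range(500):
    p = sigmoid(X @ w + b)              # predicted probability of class 1
    w -= lr * X.T @ (p - y) / len(y)    # gradient of binary cross-entropy w.r.t. w
    b -= lr * np.mean(p - y)

acc = np.mean((sigmoid(X @ w + b) > 0.5) == y)
```

Note how the gradient `(p - y)` has exactly the same shape as in linear regression; only the sigmoid and the loss changed.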

The Decision Tree Algorithm: Fighting Over-Fitting Issue – Part(2): Reduced Error Pruning

What is this post about? In the previous post, we learned all about what over-fitting is, and why over-fitting to our training data could easily hinder the generalization power of the final trained model on future unseen test data. In particular, we said that there are two cases where over-fitting can harm us: If …

The Decision Tree Algorithm: Fighting Over-Fitting Issue – Part(2): Reduced Error Pruning Read More »
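The core loop of reduced error pruning can be sketched compactly: walk the tree bottom-up and collapse any subtree into a leaf (predicting its training-majority class) whenever a held-out validation set says accuracy does not drop. Below is a minimal, hypothetical sketch on a hand-built tree of dict nodes; it is an illustration of the idea, not the post's implementation:

```python
def predict(node, x):
    # A node is a class label (leaf) or a dict with a binary feature test
    while isinstance(node, dict):
        node = node["left"] if x[node["feature"]] == 0 else node["right"]
    return node

def accuracy(node, X, y):
    return sum(predict(node, xi) == yi for xi, yi in zip(X, y)) / len(y)

def prune(node, X_val, y_val):
    # Bottom-up: prune the children first, then consider collapsing this node
    # into a leaf predicting its majority class. Keep the collapse if
    # validation accuracy (on the examples reaching this node) does not drop.
    if not isinstance(node, dict):
        return node
    left = [(x, t) for x, t in zip(X_val, y_val) if x[node["feature"]] == 0]
    right = [(x, t) for x, t in zip(X_val, y_val) if x[node["feature"]] != 0]
    node = dict(node,
                left=prune(node["left"], [x for x, _ in left], [t for _, t in left]),
                right=prune(node["right"], [x for x, _ in right], [t for _, t in right]))
    if not y_val:
        return node  # no validation evidence at this node: leave it alone
    leaf_acc = sum(t == node["majority"] for t in y_val) / len(y_val)
    return node["majority"] if leaf_acc >= accuracy(node, X_val, y_val) else node

# Toy tree whose right-hand split fits noise; validation says collapse it
tree = {"feature": 0, "majority": 0, "left": 0,
        "right": {"feature": 1, "majority": 1, "left": 1, "right": 0}}
X_val = [[0, 0], [1, 0], [1, 1]]
y_val = [0, 1, 1]
pruned = prune(tree, X_val, y_val)
print(pruned)  # the right subtree has been collapsed to the leaf 1
```

The noisy split under the root hurts validation accuracy, so it is replaced by its majority leaf, while the root split genuinely helps and survives.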

The Decision Tree Algorithm: Fighting Over-Fitting Issue – Part(1): What is over-fitting?

What is this post about, and what is next? Hi there! Let’s cut to the chase, shall we?! You have this decision tree algorithm, coded beautifully through scikit-learn (I applaud you!), or from scratch using numpy (I am proud of you!). You let the bloody thing train, and train, and you see the training error …

The Decision Tree Algorithm: Fighting Over-Fitting Issue – Part(1): What is over-fitting? Read More »
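Over-fitting is not specific to trees, and the train/test gap the post describes can be demonstrated in a few lines. A quick numpy sketch using polynomial fitting as a stand-in model (hypothetical toy data, not the post's example): a low-degree fit generalizes, while a degree high enough to memorize the 15 noisy training points drives training error to nearly zero and test error through the roof.

```python
import numpy as np

rng = np.random.default_rng(1)
# Hypothetical toy data: a noisy sine, 15 training points
x_train = np.linspace(0, 1, 15)
y_train = np.sin(2 * np.pi * x_train) + rng.normal(0, 0.2, 15)
x_test = np.linspace(0, 1, 100)
y_test = np.sin(2 * np.pi * x_test)

errs = {}
for degree in (3, 14):
    # numpy may warn that the degree-14 fit is poorly conditioned -- that is the point
    coeffs = np.polyfit(x_train, y_train, degree)
    train_err = np.mean((np.polyval(coeffs, x_train) - y_train) ** 2)
    test_err = np.mean((np.polyval(coeffs, x_test) - y_test) ** 2)
    errs[degree] = (train_err, test_err)
```

The degree-14 polynomial threads through every noisy training point (near-zero training error) yet oscillates wildly between them: exactly the "training error keeps falling while generalization collapses" pattern an unpruned decision tree exhibits.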