# Algorithms

This section covers supervised learning algorithms: theory, implementations, and visualisations.

## Dissecting ReLU: A Deceptively Simple Activation Function

What is this post about? By the end of this post, you will be able to generate and understand the evolution of a shallow Artificial Neural Network (ANN) with ReLU activation functions during training. The goal is to fit the black curve, which means that the ANN is a regressor! …
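Before diving into the full post, it helps to remember that ReLU itself is nothing more than `max(0, x)`. A minimal pure-Python sketch (the plotting and training code lives in the post itself):

```python
def relu(x):
    """Rectified Linear Unit: passes positives through, zeroes out negatives."""
    return max(0.0, x)

# A shallow regressor's hidden unit computes relu(w * x + b); stacking several
# such units lets the network fit a curve in a piecewise-linear fashion.
print([relu(v) for v in (-2.0, -0.5, 0.0, 1.5)])  # → [0.0, 0.0, 0.0, 1.5]
```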

## Stochastic Approximation to Gradient Descent

What will you learn? The video below delivers the main points of this blog post on Stochastic Gradient Descent (SGD) (GitHub code available here):  https://youtu.be/gE9HzJ_GaRM In our previous two posts on the gradient, post number 1 and post number 2, we covered gradient descent in all its glory. We even went through …
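To preview the core idea: SGD updates the parameters using the gradient of the loss on a single randomly chosen sample, rather than on the whole dataset. A minimal pure-Python sketch on a one-parameter least-squares problem (the names and data here are illustrative, not taken from the post's code):

```python
import random

random.seed(0)
# Toy data generated from y = 3x; we hope SGD recovers w ≈ 3.
data = [(x, 3.0 * x) for x in [0.5, 1.0, 1.5, 2.0]]

w = 0.0     # the parameter to learn
lr = 0.05   # learning rate
for step in range(2000):
    x, y = random.choice(data)     # stochastic: one random sample per update
    grad = 2.0 * (w * x - y) * x   # d/dw of the squared error (w*x - y)^2
    w -= lr * grad

print(round(w, 3))  # → 3.0
```

Because each update sees only one sample, the trajectory is noisy, but on average it still descends the full-batch loss surface.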

## COVID-19 Analysis

Studying and Predicting the Progress of COVID-19 using Pandas and ARIMA It has been nearly 4 months since the COVID-19 outbreak. In this notebook, we will study some useful statistics regarding the number of confirmed/death/recovered cases as a function of time for each country/region. We will use the dataset that has been …
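The kind of per-country time-series view the notebook builds can be sketched with pandas on a tiny made-up frame (the column names and numbers here are illustrative stand-ins, not necessarily those of the actual dataset):

```python
import pandas as pd

# Toy stand-in for the real dataset: daily confirmed counts per country.
df = pd.DataFrame({
    "country": ["Italy", "Italy", "Spain", "Spain"],
    "date": pd.to_datetime(["2020-03-01", "2020-03-02",
                            "2020-03-01", "2020-03-02"]),
    "confirmed": [1694, 2036, 84, 120],
})

# Confirmed cases as a function of time, one column per country.
series = df.pivot(index="date", columns="country", values="confirmed")
print(series)
```

Once each country is a column indexed by date, per-country statistics and forecasting models (such as ARIMA) can be applied column by column.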

## Reproducibility in PyTorch

What is Reproducibility All About? As a computer scientist, or as an academic, you run experiments with a bunch of algorithms. Let’s say you have coded a machine learning algorithm, such as an Artificial Neural Network, and after running your experiments on different datasets, you have found that the best type of neural network has …
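The core trick the post builds up to is fixing every source of randomness before an experiment, so that a rerun produces identical results. A minimal sketch with Python's stdlib `random` module (in PyTorch itself you would additionally call `torch.manual_seed`, and `numpy.random.seed` for NumPy):

```python
import random

def run_experiment(seed):
    random.seed(seed)  # fix the RNG state before any stochastic work
    # Stand-in for weight initialisation / data shuffling inside a training run.
    return [random.random() for _ in range(3)]

# Same seed → identical "experiment", which is exactly what reproducibility means.
print(run_experiment(42) == run_experiment(42))  # → True
print(run_experiment(42) == run_experiment(43))  # → False (different seed)
```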

## Train a Perceptron to Learn the AND Gate from Scratch in Python

What will you Learn in this Post? Neural Networks are function approximators. For example, in a supervised learning setting, given loads of inputs and loads of outputs in our training data, a neural network is capable of finding the hidden mapping between those inputs and outputs. It can also, hopefully, generalize its acquired knowledge …
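As a taste of what the post builds, the classic perceptron learning rule can fit the AND gate in a few lines of pure Python (this is a generic sketch of the rule, not the post's exact code):

```python
# Training data for the AND gate: inputs and target outputs.
data = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]

w = [0.0, 0.0]   # weights
b = 0.0          # bias
lr = 0.1         # learning rate

def predict(x):
    # Fire (output 1) only if the weighted sum crosses the threshold.
    return 1 if w[0] * x[0] + w[1] * x[1] + b > 0 else 0

# Perceptron rule: nudge the weights by the error on each example.
for epoch in range(20):
    for x, target in data:
        err = target - predict(x)
        w = [wi + lr * err * xi for wi, xi in zip(w, x)]
        b += lr * err

print([predict(x) for x, _ in data])  # → [0, 0, 0, 1]
```

Because AND is linearly separable, the perceptron convergence theorem guarantees this loop settles on a separating line after finitely many updates.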

## Linear Regression from Scratch using Numpy

Hello friends and welcome to MLDawn! So what sometimes concerns me is the fact that magnificent packages such as PyTorch (indeed amazing), TensorFlow (indeed a nightmare), and Keras (indeed great), have prevented machine learning enthusiasts from really learning the science behind how the machine learns. To be more accurate, there is nothing wrong with having …
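To preview the mechanics: fitting a line y = w·x + b by batch gradient descent takes only a few lines. Here is a pure-Python sketch of the idea (the post itself builds it with NumPy; the data is a made-up toy example):

```python
# Toy data generated from the line y = 2x + 1.
xs = [0.0, 1.0, 2.0, 3.0]
ys = [1.0, 3.0, 5.0, 7.0]

w, b = 0.0, 0.0
lr = 0.05
n = len(xs)

for epoch in range(3000):
    # Gradients of the mean squared error with respect to w and b.
    dw = sum(2 * (w * x + b - y) * x for x, y in zip(xs, ys)) / n
    db = sum(2 * (w * x + b - y) for x, y in zip(xs, ys)) / n
    w -= lr * dw
    b -= lr * db

print(round(w, 2), round(b, 2))  # → 2.0 1.0
```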

## Binary Classification from Scratch using Numpy

Hello friends and welcome to MLDawn! So what sometimes concerns me is the fact that magnificent packages such as PyTorch (indeed amazing), TensorFlow (indeed a nightmare), and Keras (indeed great), have prevented machine learning enthusiasts from really learning the science behind how the machine learns. To be more accurate, there is nothing wrong with having …
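As a preview of the mechanics, a binary classifier with a sigmoid output can be trained with plain gradient descent; here is a pure-Python sketch on a 1-D toy problem (the post itself builds this with NumPy, and the data here is illustrative):

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

# 1-D toy data: class 0 for negative x, class 1 for positive x.
xs = [-2.0, -1.0, 1.0, 2.0]
ys = [0, 0, 1, 1]

w, b = 0.0, 0.0
lr = 0.5
for epoch in range(500):
    for x, y in zip(xs, ys):
        p = sigmoid(w * x + b)
        # The gradient of the cross-entropy loss w.r.t. the logit is (p - y).
        w -= lr * (p - y) * x
        b -= lr * (p - y)

preds = [1 if sigmoid(w * x + b) > 0.5 else 0 for x in xs]
print(preds)  # → [0, 0, 1, 1]
```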

## The Decision Tree Algorithm: Fighting the Over-Fitting Issue – Part 2: Reduced Error Pruning

What is this post about? In the previous post, we learned all about what over-fitting is, and why over-fitting to our training data can easily hinder the generalization power of the final trained model on future unseen test data. In particular, we said that there are 2 cases where over-fitting can harm us: If …
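To give a flavour of where this part lands: reduced error pruning walks the tree bottom-up and replaces a subtree with a majority-class leaf whenever that does not hurt accuracy on a held-out validation set. A minimal sketch with a dict-based tree (purely illustrative, not the post's code):

```python
# A tree node is either a class label (leaf) or a dict:
# {"feature": index, "branches": {value: subtree}}.
def classify(tree, x):
    while isinstance(tree, dict):
        tree = tree["branches"][x[tree["feature"]]]
    return tree

def accuracy(tree, data):
    return sum(classify(tree, x) == y for x, y in data) / len(data)

def majority_label(data):
    labels = [y for _, y in data]
    return max(set(labels), key=labels.count)

def prune(tree, val_data):
    """Reduced error pruning: bottom-up, replace a subtree with its
    majority-class leaf if validation accuracy does not drop."""
    if not isinstance(tree, dict) or not val_data:
        return tree
    for v, sub in tree["branches"].items():
        subset = [(x, y) for x, y in val_data if x[tree["feature"]] == v]
        tree["branches"][v] = prune(sub, subset)
    leaf = majority_label(val_data)
    if accuracy(leaf, val_data) >= accuracy(tree, val_data):
        return leaf
    return tree

# An over-fitted tree: the split on feature 1 only fits training noise.
tree = {"feature": 0, "branches": {
    0: 0,
    1: {"feature": 1, "branches": {0: 1, 1: 0}},
}}
val = [((1, 0), 1), ((1, 1), 1), ((0, 0), 0)]
pruned = prune(tree, val)
print(pruned)  # the noisy subtree under branch 1 collapses to the leaf 1
```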

## The Decision Tree Algorithm: Fighting the Over-Fitting Issue – Part 1: What is Over-Fitting?

What is this post about, and what is next? Hi there!!! Let’s cut to the chase, shall we?! You have this decision tree algorithm, coded beautifully through scikit-learn (I applaud you!), or from scratch using numpy (I am proud of you!). You let the bloody thing train and train, and you see the training error …

## The Decision Tree Algorithm: Information Gain

Which attribute to choose? (Part 2) Today we are going to touch on something quite exciting. In our previous post, we talked about the Entropy of a set, E(S), and told you that entropy is nothing but a quantitative measure of how mixed up our set is! I also showed you that regardless of how the decision …
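As a quick recap of the two quantities this post revolves around — entropy, E(S) = -Σ pᵢ log₂ pᵢ, and information gain, the drop in weighted entropy after a split — here is a small pure-Python sketch (illustrative helpers, not the post's code):

```python
import math

def entropy(labels):
    """E(S) = -sum(p * log2(p)) over the class proportions in S."""
    n = len(labels)
    total = 0.0
    for c in set(labels):
        p = labels.count(c) / n
        total -= p * math.log2(p)
    return total

def info_gain(parent, subsets):
    """Gain = E(parent) minus the weighted average entropy of the children."""
    n = len(parent)
    return entropy(parent) - sum(len(s) / n * entropy(s) for s in subsets)

print(entropy([1, 1, 0, 0]))  # → 1.0 (a maximally mixed-up set)
print(entropy([1, 1, 1, 1]))  # → 0.0 (a pure set)

# Splitting [1, 1, 0, 0] perfectly into [1, 1] and [0, 0] gains the full bit.
print(info_gain([1, 1, 0, 0], [[1, 1], [0, 0]]))  # → 1.0
```

The attribute to choose at each node is simply the one whose split maximises this gain.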