
ECML-PKDD-2019: Elliptical Basis Function Data Descriptor (EBFDD) for Anomaly Detection

This paper introduces the Elliptical Basis Function Data Descriptor (EBFDD) network, a one-class classification approach to anomaly detection based on Radial Basis Function (RBF) neural networks. The EBFDD network uses elliptical basis functions, which allow it to learn sophisticated decision boundaries while retaining the advantages of a shallow …

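The excerpt above is truncated, but the core idea of an elliptical (rather than radial) basis function can be illustrated with a Gaussian kernel that uses a full covariance matrix, so its contours are ellipses instead of circles. The sketch below is a minimal numpy illustration; the names mu and sigma and the toy values are assumptions for illustration, not the paper's actual EBFDD implementation or training procedure.

```python
# A minimal sketch of an elliptical basis function: a Gaussian kernel with a
# full covariance matrix, so its contours are ellipses rather than circles.
# (Illustrative only; not the EBFDD network's actual formulation or training.)
import numpy as np

def elliptical_basis(x, mu, sigma):
    """Activation of one elliptical basis unit for input vector x."""
    diff = x - mu
    mahalanobis_sq = diff @ np.linalg.inv(sigma) @ diff
    return np.exp(-0.5 * mahalanobis_sq)

mu = np.array([0.0, 0.0])                      # centre of the unit (assumed toy value)
sigma = np.array([[2.0, 0.8],                  # full covariance -> elliptical contours
                  [0.8, 1.0]])

print(elliptical_basis(np.array([0.5, -0.2]), mu, sigma))   # close to the centre -> near 1
print(elliptical_basis(np.array([4.0,  3.0]), mu, sigma))   # far from the centre -> near 0
```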

Train a Perceptron to Learn the AND Gate from Scratch in Python

What will you Learn in this Post? Neural networks are function approximators. For example, in a supervised learning setting, given loads of inputs and loads of outputs in our training data, a neural network is capable of finding the hidden mapping between those inputs and outputs. It can also, hopefully, generalize its acquired knowledge …

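As a taste of what the post builds, here is a minimal sketch of a single perceptron learning the AND gate with the classic perceptron update rule. The learning rate, number of epochs, and variable names are illustrative choices, not necessarily those used in the post.

```python
import numpy as np

# Truth table for the AND gate: inputs and their target outputs.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]])
y = np.array([0, 0, 0, 1])

w = np.zeros(2)      # weights
b = 0.0              # bias
lr = 0.1             # learning rate (illustrative value)

for epoch in range(20):
    for xi, target in zip(X, y):
        pred = 1 if xi @ w + b > 0 else 0     # step activation
        error = target - pred
        w += lr * error * xi                  # perceptron update rule
        b += lr * error

print("learned weights:", w, "bias:", b)
print("predictions:", [(1 if xi @ w + b > 0 else 0) for xi in X])   # should match y
```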

Linear Regression from Scratch using Numpy

Hello friends and welcome to MLDawn! So what sometimes concerns me is the fact that magnificent packages such as PyTorch (indeed amazing), TensorFlow (indeed a nightmare), and Keras (indeed great), have prevented machine learning enthusiasts from really learning the science behind how the machine learns. To be more accurate, there is nothing wrong with having …

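The excerpt is the post's motivation, but the "from scratch" flavour can be sketched briefly: fit a line to noisy data with batch gradient descent on the mean squared error, using nothing but numpy. The synthetic data, learning rate, and iteration count below are illustrative assumptions, not the post's actual code.

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.uniform(0, 10, size=100)                 # synthetic inputs
y = 3.0 * X + 2.0 + rng.normal(0, 1, size=100)   # true line y = 3x + 2, plus noise

w, b = 0.0, 0.0
lr = 0.01

for _ in range(2000):
    y_hat = w * X + b
    # Gradients of the mean squared error with respect to w and b.
    dw = 2 * np.mean((y_hat - y) * X)
    db = 2 * np.mean(y_hat - y)
    w -= lr * dw
    b -= lr * db

print(f"learned w = {w:.2f}, b = {b:.2f}")       # should be close to 3 and 2
```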

Binary Classification from Scratch using Numpy

Hello friends and welcome to MLDawn! So what sometimes concerns me is the fact that magnificent packages such as PyTorch (indeed amazing), TensorFlow (indeed a nightmare), and Keras (indeed great), have prevented machine learning enthusiasts from really learning the science behind how the machine learns. To be more accurate, there is nothing wrong with having …

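In the same spirit, a minimal sketch of binary classification from scratch is shown below: logistic regression trained with gradient descent on the binary cross-entropy loss, in plain numpy. The toy data and hyperparameters are assumptions for illustration, not the post's actual code.

```python
import numpy as np

rng = np.random.default_rng(1)
# Two Gaussian blobs as a toy binary classification problem.
X = np.vstack([rng.normal(-2, 1, size=(100, 2)), rng.normal(2, 1, size=(100, 2))])
y = np.concatenate([np.zeros(100), np.ones(100)])

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

w = np.zeros(2)
b = 0.0
lr = 0.1

for _ in range(1000):
    p = sigmoid(X @ w + b)            # predicted probability of class 1
    # Gradient of the binary cross-entropy loss.
    dw = X.T @ (p - y) / len(y)
    db = np.mean(p - y)
    w -= lr * dw
    b -= lr * db

acc = np.mean((sigmoid(X @ w + b) > 0.5) == y)
print(f"training accuracy: {acc:.2f}")
```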

The Decision Tree Algorithm: Fighting Over-Fitting Issue – Part(2): Reduced Error Pruning

What is this post about? In the previous post, we learned all about what over-fitting is, and why over-fitting to our training data can easily hinder the generalization power of the final trained model on future unseen test data. In particular, we said that there are two cases where over-fitting can harm us: If …

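The excerpt is truncated, but the idea of reduced error pruning (tentatively collapse an internal node into a leaf, and keep the change only if accuracy on a held-out validation set does not drop) can be sketched on a fitted scikit-learn tree. The sketch below relies on the unofficial trick of writing into clf.tree_.children_left / children_right, which may break across scikit-learn versions; the dataset, split, and helper name are illustrative assumptions, not the post's actual implementation.

```python
# A minimal sketch of reduced error pruning on a fitted scikit-learn tree,
# assuming a held-out validation set. This writes into the tree's internal
# child arrays, which is an illustrative hack, not an official pruning API.
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

TREE_LEAF = -1  # sklearn's sentinel for 'no child'

def reduced_error_prune(clf, X_val, y_val):
    tree = clf.tree_
    left, right = tree.children_left, tree.children_right

    def accuracy():
        return (clf.predict(X_val) == y_val).mean()

    # Visit nodes bottom-up (children always have larger indices than their parent).
    for node in range(tree.node_count - 1, -1, -1):
        if left[node] == TREE_LEAF:               # already a leaf
            continue
        before = accuracy()
        saved_l, saved_r = left[node], right[node]
        left[node] = right[node] = TREE_LEAF      # tentatively turn the node into a leaf
        if accuracy() < before:                   # pruning hurt validation accuracy
            left[node], right[node] = saved_l, saved_r  # undo the prune
    return clf

X, y = make_classification(n_samples=600, n_features=10, random_state=0)
X_tr, X_val, y_tr, y_val = train_test_split(X, y, test_size=0.3, random_state=0)
clf = DecisionTreeClassifier(random_state=0).fit(X_tr, y_tr)

print("val accuracy before pruning:", (clf.predict(X_val) == y_val).mean())
reduced_error_prune(clf, X_val, y_val)
print("val accuracy after pruning :", (clf.predict(X_val) == y_val).mean())
```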

The Decision Tree Algorithm: Fighting Over-Fitting Issue – Part(1): What is over-fitting?

What is this post about, and what is next? Hi there!!! Let's cut to the chase, shall we?! You have this decision tree algorithm, coded beautifully through scikit-learn (I applaud you!), or from scratch using numpy (I am proud of you!). You let the bloody thing train, and train, and you see the training error …

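To make the symptom concrete, here is a minimal sketch using scikit-learn: a decision tree grown without any depth limit on a noisy synthetic dataset, its training accuracy compared against accuracy on held-out test data. The dataset and split below are illustrative assumptions, not taken from the post.

```python
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

# A noisy synthetic problem, so a fully grown tree can memorise the training set.
X, y = make_classification(n_samples=500, n_features=20, n_informative=5,
                           flip_y=0.1, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=42)

tree = DecisionTreeClassifier(random_state=42).fit(X_train, y_train)   # no depth limit

print("training accuracy:", tree.score(X_train, y_train))   # typically 1.0 (memorised)
print("test accuracy    :", tree.score(X_test, y_test))     # noticeably lower: over-fitting
```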