Gradient Descent is one of the basic iterative optimization algorithms used in Machine Learning, and it is deeply rooted in calculus and linear algebra.
It is the first algorithm explained in Andrew Ng's Machine Learning course.
If you, like me, got lost in the math when Andrew explained it, you will find this post useful.
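Before moving on, here is a minimal sketch of the idea behind gradient descent: repeatedly step opposite the gradient of a function until you settle near a minimum. The function, starting point, and learning rate below are toy values chosen for illustration, not anything from the course.

```python
# Minimal gradient descent on f(w) = (w - 3)^2, whose minimum is at w = 3.
# Update rule: w := w - alpha * f'(w), where alpha is the learning rate.

def gradient_descent(grad, w0, alpha=0.1, steps=100):
    """Iteratively step opposite the gradient, starting from w0."""
    w = w0
    for _ in range(steps):
        w -= alpha * grad(w)
    return w

# f(w) = (w - 3)^2  =>  f'(w) = 2 * (w - 3)
w_min = gradient_descent(lambda w: 2 * (w - 3), w0=0.0)
print(round(w_min, 4))  # converges toward 3.0
```

With a learning rate that is too large the iterates can overshoot and diverge; too small and convergence is slow. That trade-off is exactly what the math in the course formalizes.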
We will write Python code that detects spam using a Naive Bayes classifier.
The Naive Bayes classifier is one of the best-known classifiers in supervised learning. I am going to show how it is computed from training data.
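As a first sketch of how the classifier is computed from training data, the toy example below counts word frequencies per class, then scores a new message with log class priors plus log word likelihoods (with add-one smoothing). The training messages are invented purely for illustration.

```python
from collections import Counter
import math

# Hypothetical toy training data: (message, label) pairs, invented for illustration.
train = [
    ("win money now", "spam"),
    ("free money offer", "spam"),
    ("meeting at noon", "ham"),
    ("project meeting notes", "ham"),
]

def fit(data):
    """Count per-class word frequencies, class frequencies, and the vocabulary."""
    word_counts = {"spam": Counter(), "ham": Counter()}
    class_counts = Counter()
    vocab = set()
    for text, label in data:
        class_counts[label] += 1
        for word in text.split():
            word_counts[label][word] += 1
            vocab.add(word)
    return word_counts, class_counts, vocab

def predict(text, word_counts, class_counts, vocab):
    """Pick the class maximizing log P(class) + sum of log P(word | class),
    using Laplace (add-one) smoothing so unseen words don't zero out a class."""
    total = sum(class_counts.values())
    best_label, best_score = None, float("-inf")
    for label in class_counts:
        score = math.log(class_counts[label] / total)
        denom = sum(word_counts[label].values()) + len(vocab)
        for word in text.split():
            score += math.log((word_counts[label][word] + 1) / denom)
        if score > best_score:
            best_label, best_score = label, score
    return best_label

model = fit(train)
print(predict("free money", *model))  # → spam
```

Working in log space avoids numerical underflow when multiplying many small probabilities, which is the standard trick in every practical Naive Bayes implementation.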
Scrubbing natural language text is a widely used process with well-defined steps, and you will find it in many places: from Lucene, the full-text search engine behind Elasticsearch and Azure Search, to any data science project that processes natural language, including various ML projects and general search projects.
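The usual scrubbing steps can be sketched as follows; the stop-word list here is a tiny illustrative sample, not the one Lucene or any particular library ships with.

```python
import re

# A tiny illustrative stop-word list; real pipelines use much larger ones.
STOP_WORDS = {"the", "a", "an", "is", "in", "of", "and", "to"}

def scrub(text):
    """Typical scrubbing: lowercase, strip punctuation,
    tokenize on whitespace, drop stop words."""
    text = text.lower()
    text = re.sub(r"[^a-z0-9\s]", " ", text)  # replace punctuation/symbols with spaces
    tokens = text.split()
    return [t for t in tokens if t not in STOP_WORDS]

print(scrub("The QUICK, brown fox is in the garden!"))
# → ['quick', 'brown', 'fox', 'garden']
```

Full-text engines typically add further steps such as stemming or lemmatization, but the lowercase/tokenize/stop-word core above is common to nearly all of them.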