The naive Bayes classifier

In this case, what is the probability that Y belongs to a particular class k given an observation x? Bayes' theorem gives the answer:

P(Y = k | X = x) = πₖ fₖ(x) / Σₗ πₗ fₗ(x)

Here πₖ is the prior probability that an observation comes from class k, and fₖ(x) is the likelihood (or density function) of the observation x given that it comes from class k. Thus, when fₖ(x) is large, there is a high probability that observation x belongs to class k; the converse is true when fₖ(x) is small.

The numerator πₖ fₖ(x) is the engine of the classifier: it determines the posterior probability for each class k, and Y is assigned to the class with the highest posterior probability. The denominator Σₗ πₗ fₗ(x) is simply the likelihood of observing x over all classes, and serves as the normalization constant.

Estimating the likelihood function is difficult, so we make the naive assumption that each feature is conditionally independent of every other feature. This allows us to go from

P(xᵢ | k, x₁, …, xₙ) → P(xᵢ | k)

so that fₖ(x) can be estimated as the product of the individual P(xᵢ | k) terms. One of the perks of this assumption is that it allows the NB classifier to perform well on high-dimensional data, since each P(xᵢ | k) can be estimated separately as a one-dimensional distribution.

The different flavors of NB classifiers

NB classifiers differ in the assumptions they make about the distribution P(xᵢ | k).

Gaussian NB assumes that the features within each class are drawn from a Gaussian distribution.

Multinomial NB assumes that the features follow a multinomial distribution (for example, word counts in text classification).

There are several more adaptations of the NB classifier that you can check out in the scikit-learn documentation.

How to use it

Implementing an NB classifier is swift and easy (we are assuming Gaussian features in this example):

from sklearn.naive_bayes import GaussianNB

# fit the model
model = GaussianNB()
model.fit(X_train, y_train)

# make a prediction
y_pred = model.predict(X_test)

A couple of fuller, runnable sketches are included at the end of this post.

Conclusion

NB classifiers are often good baseline models that you can build off of to explore more sophisticated models. They are extremely fast and easily interpretable, with the added benefit of performing especially well on high-dimensional datasets. Check out this paper for a more in-depth discussion of why NB classifiers work so well.

Thanks for reading! Stay tuned for more as I continue on my path to become a data scientist! ✌️
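As a postscript, here is a minimal end-to-end sketch of the Gaussian NB workflow above. The built-in iris dataset and the train/test split are stand-ins of my own choosing, not part of the original example; predict_proba exposes the per-class posterior probabilities P(Y = k | X = x) that the formula at the top describes.

from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from sklearn.naive_bayes import GaussianNB
from sklearn.metrics import accuracy_score

# load a small dataset with continuous features (a stand-in for your own data)
X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=42
)

# fit the model
model = GaussianNB()
model.fit(X_train, y_train)

# predict class labels and check accuracy on the held-out set
y_pred = model.predict(X_test)
print("accuracy:", accuracy_score(y_test, y_pred))

# per-class posterior probabilities P(Y = k | X = x) for the first few test rows;
# each row sums to 1 thanks to the normalization constant in the denominator
print(model.predict_proba(X_test[:5]))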
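Multinomial NB, the flavor typically used for count data such as word counts, can be sketched just as briefly. The tiny spam/not-spam corpus below is made up purely for illustration.

from sklearn.feature_extraction.text import CountVectorizer
from sklearn.naive_bayes import MultinomialNB

# a toy corpus (illustrative only)
docs = [
    "free prize click now",       # spam
    "win money free offer",       # spam
    "meeting agenda for monday",  # not spam
    "project update and notes",   # not spam
]
labels = [1, 1, 0, 0]  # 1 = spam, 0 = not spam

# turn each document into a vector of word counts
vectorizer = CountVectorizer()
X_counts = vectorizer.fit_transform(docs)

# fit the model on the count features
model = MultinomialNB()
model.fit(X_counts, labels)

# classify a new document
new_doc = vectorizer.transform(["free money offer"])
print(model.predict(new_doc))        # predicted class label
print(model.predict_proba(new_doc))  # posterior probability for each class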
