So, in the previous tutorial of our Machine Learning Course we studied the Support Vector Machine, where we create hyperplanes and, with the help of those hyperplanes, perform classification.
So, let's study our next algorithm which is Naive Bayes Algorithm.
This algorithm is also a very popular and widely used classifier.
Now, I will tell you what it actually is.
Naive Bayes (pronounced "naa-eev") is based on Bayes' theorem. What this theorem is, I will tell you later; for now you should understand that this algorithm works on a probabilistic approach, meaning it works on probability.
So, next we should understand that in probability there is a concept called conditional probability. You must have come across situations like this: I want to go for a movie, but it is raining; now what is the probability that you will go for the movie?
Given that it is raining, one condition is already fulfilled, that event is already happening; then what are the chances of the second event happening?
So, the second event also depends on the first one.
This is called conditional probability.
So, here you can see "the probability of one event occurring with some relationship to one or more events".
So, here I told you that going to a movie depends upon the rain; if you are going on a bike, the event depends on that one condition.
So, in the same way you can take some more examples.
Will you be able to score more than 90 percent in the exams? That depends upon more than one factor. For instance, how much did you study? You must have completed your studying beforehand, so this becomes one factor.
How is the exam paper set, is it very easy, or very strict and difficult?
So, this way we have two factors.
So, in this way, the happening of one event depends upon one or more other events that have already occurred,
So, this is known as conditional probability.
Let's understand this through one more example, which is also related to the example I just gave.
So, one event is that it is raining outside and the other event is that you want to go outside.
So, here what is the probability for going outside?
So, if independently one probability is 30 percent and the other is 50 percent, then conditional probability looks at both things together: it is raining, and in that rain you have to go outside; it tries to find the probability of that combined situation.
Now, there is a formula for this,
“The probability of B given that A has already occurred is equal to the probability of A and B (meaning the probability of both things happening together), divided by the probability of A.”
Similarly, you can rewrite this same formula: “the probability of B given that A has already occurred is equal to the probability of A intersection B, divided by the probability of A”; so the “and” here can also be represented as an intersection.
So, this is the formula to compute conditional probability: P(B|A) = P(A ∩ B) / P(A).
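To make this concrete, here is a minimal Python sketch of the conditional probability formula. The numbers (a 30 percent chance of rain, a 12 percent chance of rain and going out together) are assumed purely for illustration; they are not figures from the lesson.

```python
# Conditional probability: P(B|A) = P(A and B) / P(A)
# Assumed numbers for illustration only:
p_a = 0.30          # P(A): probability that it is raining
p_a_and_b = 0.12    # P(A and B): it rains AND you go outside

# Probability of going outside given that it is raining
p_b_given_a = p_a_and_b / p_a

print(round(p_b_given_a, 2))  # -> 0.4
```

So even though going outside might have some probability on its own, once we condition on the rain, the relevant probability changes.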
And it is from this that Bayes' theorem is derived,
So, what does Bayes theorem tell us?
So, here we have the “probability of B given A”; what is its counterpart? The probability of A given B. In Bayes' theorem, by using both of these, the probability of A given B and the probability of B given A, and equating the two formulas, we derive Bayes' theorem.
So, Bayes' theorem takes the form: “the probability of A given B is equal to the probability of B given A multiplied by the probability of A, divided by the probability of B.”
So, if you have these three quantities given, then you can easily calculate the probability of A given B as well.
So, this is the formula of Bayes' theorem: P(A|B) = P(B|A) × P(A) / P(B).
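The theorem can be sketched in a few lines of Python. The function name and the numbers passed in below are assumed purely for illustration, not part of the lesson.

```python
# Bayes' theorem: P(A|B) = P(B|A) * P(A) / P(B)
def bayes(p_b_given_a, p_a, p_b):
    """Return P(A|B) given P(B|A), P(A) and P(B)."""
    return p_b_given_a * p_a / p_b

# Assumed example values: P(B|A) = 0.8, P(A) = 0.3, P(B) = 0.5
print(bayes(0.8, 0.3, 0.5))  # -> 0.48
```

As the lesson says: if you know the three quantities on the right-hand side, the fourth follows immediately.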
From this, the Naive Bayes Classifier is also formed, where we include multiple features.
So, this way we understood about Bayes theorem.
And here, I'll give a description of each term.
The probability of A given B: "how often A happens given that B happens."
In the same way,
the probability of B given A: "how often B happens given that A happens," meaning A has already occurred, so whenever A happens, what is the probability that B will also happen.
Then the “probability of A” and the “probability of B”, which you already know: “how likely A is to happen on its own, and how likely B is to happen on its own”.
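As a sketch of how these four terms combine for classification, here is a tiny from-scratch Python example. The toy dataset (weather outlook versus whether we go out to play) and the function name are assumed for illustration; they are not from the course.

```python
# Toy dataset (assumed for illustration): pairs of (outlook, play)
data = [
    ("sunny", "yes"), ("sunny", "no"), ("rainy", "no"),
    ("sunny", "yes"), ("rainy", "yes"), ("rainy", "no"),
]

def p_class_given_feature(feature, cls):
    """P(cls | feature) = P(feature | cls) * P(cls) / P(feature)."""
    n = len(data)
    p_cls = sum(1 for f, c in data if c == cls) / n           # P(A)
    p_feat = sum(1 for f, c in data if f == feature) / n      # P(B)
    p_feat_given_cls = (                                      # P(B|A)
        sum(1 for f, c in data if f == feature and c == cls)
        / sum(1 for f, c in data if c == cls)
    )
    return p_feat_given_cls * p_cls / p_feat                  # P(A|B)

# P(play=yes | outlook=sunny) = (2/3) * (1/2) / (1/2) = 2/3
print(round(p_class_given_feature("sunny", "yes"), 2))  # -> 0.67
```

With multiple features, Naive Bayes multiplies one such likelihood term per feature, under the "naive" assumption that the features are independent given the class.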
So, friends, let's conclude for today; we will stop this session here.
We will cover its next part in our upcoming session.
Till then, keep learning and remain motivated.
If you have any queries or comments, click the discussion button below the video and post them there. This way you will be able to connect with fellow learners and discuss the course, and our team will also try to resolve your query.