This tutorial is a continuation of our previous session, so let's move ahead.
Now we are going to learn about the support vector machine (SVM). This is an important concept, but it can sometimes look very complex.
In today's session we will learn it in the easiest and simplest way.
In "support vector machine", every word is important, so let us understand what it means.
Support vector machines also belong to the family of supervised learning algorithms.
With their help we can do classification and regression, and we can also detect outliers, that is, the odd ones out.
They also have one important capability: they can solve both linear and nonlinear problems.
So SVM can deal with linearly separable data and can also handle nonlinear problems.
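As a minimal sketch (assuming scikit-learn is installed), these capabilities map directly onto library classes: SVC for classification, SVR for regression, and OneClassSVM for outlier detection. The nonlinear capability comes from the kernel choice; here an RBF kernel separates XOR-style data that no single straight line can separate:

```python
import numpy as np
from sklearn.svm import SVC  # SVR and OneClassSVM cover regression and outliers

# XOR-style data: no single straight line can separate these two classes
X = np.array([[0, 0], [1, 1], [0, 1], [1, 0]])
y = np.array([0, 0, 1, 1])

# A linear kernel cannot fit this pattern, but the RBF kernel can;
# this is the "nonlinear" capability mentioned above
clf = SVC(kernel="rbf")
clf.fit(X, y)

print(clf.predict(X))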
Ok so let us move ahead.
Now, this diagram will clear up many SVM concepts.
In it you can see that we have two types of data items.
The first type is the green circles, and the second type is the red squares.
So these are the two classes; now, how will SVM work on them?
To begin, SVM will create a boundary between them.
This line here is the boundary, and we can consider the upper side as positive and the lower side as negative.
So can we separate these positive and negative classes easily? And if a new data point arrives, will this line let us classify it into the positive or the negative class?
We can easily see that, yes, we can.
Now, real-world situations can be more complex, but since we are just learning, we should keep things simple and clear.
We can see that this line in the middle, also known as the hyperplane, easily separates the green circles from the red squares.
Now, if we draw a line parallel to this hyperplane, which data point will be the nearest one to touch it?
You can see that this red square touches the line on one side, and this green circle touches it on the other side.
These points are known as support vectors.
This red square and this green circle are the nearest points; measured perpendicular to the hyperplane, they have the least distance from it.
The lines drawn through these support vectors are called the positive hyperplane and the negative hyperplane.
So this is the positive hyperplane and this is the negative hyperplane.
Meaning, one line lies on the positive side of the hyperplane and the other lies on the negative side.
These are what we call the decision boundaries.
So we created boundaries for decision making, and we try to get the maximum margin between them, because with a maximum margin we will be able to make more correct classifications.
That means the bigger the distance between the boundaries, the better our chance of predicting the correct class.
This distance is called the maximum margin.
Once again: the central line is the hyperplane, and it decides which points go into the red-square class and which into the green-circle class.
The red square and green circle closest to it are called support vectors; the positive and negative hyperplanes pass through them, and we call these the decision boundaries.
We want the distance between the decision boundaries to be as large as possible, which is why we call it the maximum margin.
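The ideas above can be seen directly in code. Below is a small sketch (assuming scikit-learn; the toy points are made up for illustration): a linear SVM is fitted on two separable clusters, the support vectors are inspected, and the margin width between the two decision boundaries is computed as 2 / ||w||:

```python
import numpy as np
from sklearn.svm import SVC

# Two linearly separable toy clusters
X = np.array([[0, 0], [1, 0], [0, 1],
              [3, 3], [4, 3], [3, 4]], dtype=float)
y = np.array([-1, -1, -1, 1, 1, 1])

# A large C approximates a hard margin on separable data
clf = SVC(kernel="linear", C=1000.0)
clf.fit(X, y)

# The points lying on the positive/negative hyperplanes
print(clf.support_vectors_)

# Width of the maximum margin between the two decision boundaries
w = clf.coef_[0]
margin = 2.0 / np.linalg.norm(w)
print(round(margin, 2))
```

Only the few points stored in `support_vectors_` determine the hyperplane; moving any of the other points (without crossing a boundary) would not change the fit.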
So, let's move ahead.
So, the hyperplane helps create the decision boundaries, then performs the classification, and we try to achieve the maximum margin.
Sometimes, with a small margin, the chances of wrong classification increase.
That is the reason for choosing the maximum-margin hyperplane.
Now, let's see its applications.
These include face detection, text and hypertext categorisation, and bioinformatics.
In these areas, SVM is very widely used.
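To make the text-categorisation application concrete, here is a small sketch, assuming scikit-learn; the tiny corpus and the "sport"/"tech" labels are invented for illustration. Tf-idf features combined with a linear SVM is a common setup for this task:

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.pipeline import make_pipeline
from sklearn.svm import LinearSVC

# A tiny made-up corpus with two topics (labels are assumptions)
docs = [
    "the team won the football match",
    "great goal scored in the league game",
    "players trained hard for the tournament",
    "new laptop released with faster processor",
    "software update fixes security bug",
    "cloud servers run the application code",
]
labels = ["sport", "sport", "sport", "tech", "tech", "tech"]

# tf-idf turns each document into a vector; LinearSVC separates the classes
model = make_pipeline(TfidfVectorizer(), LinearSVC())
model.fit(docs, labels)

print(model.predict(["the processor in this laptop is slow"]))
print(model.predict(["the team scored a goal"]))
```

The same pipeline shape scales from this toy corpus to real categorisation problems; only the documents and labels change.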
So, friends, let us conclude here for today.
Today's session ends here, and we will continue with the further parts in the next session.
Till then, keep learning and stay motivated.
If you have any queries or comments, click the discussion button below the video and post them there. This way you will be able to connect with fellow learners and discuss the course, and our team will also try to resolve your queries.