Support Vector Regression (SVR) is a supervised learning algorithm used to predict continuous values. SVMs and Support Vector Regression are based on the same premise. SVR's main idea is to find the best-fitting line: in SVR, the best-fit line is the hyperplane that contains the greatest number of training points within its epsilon-insensitive margin.
The Support Vector Machine can also be used as a regression approach while retaining all of the algorithm's fundamental characteristics (maximal margin). With a few minor differences, Support Vector Regression (SVR) uses the same principles as the SVM for classification.
One of the key advantages of SVR is that its computational complexity does not depend on the dimensionality of the input space. It also offers high prediction accuracy and excellent generalisation capability. The purpose of this chapter is to give an overview of SVR and Bayesian regression.
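As a rough illustration of these two techniques, here is a minimal scikit-learn sketch. It is not part of the course materials: the dataset is synthetic, and the kernel, C, and epsilon values are illustrative assumptions rather than the settings used in the lectures.

    import numpy as np
    from sklearn.svm import SVR
    from sklearn.linear_model import BayesianRidge

    rng = np.random.RandomState(0)
    X = np.sort(5 * rng.rand(80, 1), axis=0)       # single input feature
    y = np.sin(X).ravel() + 0.1 * rng.randn(80)    # noisy continuous target

    # epsilon sets the tube around the fit inside which errors are ignored;
    # C trades off flatness of the function against tolerance for larger errors.
    svr = SVR(kernel="rbf", C=1.0, epsilon=0.1).fit(X, y)

    # Bayesian ridge regression, the second topic of this chapter, fits a
    # linear model with priors on the coefficients and the noise precision.
    bayes = BayesianRidge().fit(X, y)

    # Both models predict continuous values for new inputs.
    print(svr.predict([[2.5]]), bayes.predict([[2.5]]))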
Learner's Ratings
Overall Rating: 4.4 / 5
5 stars: 69% | 4 stars: 10% | 3 stars: 13% | 2 stars: 5% | 1 star: 3%
Reviews
Muhammad Qasim (5/5)
Hi Kushal! Your way of teaching is extremely helpful and you are one of the best teachers in the world. Extremely helpful, and I recommend this course to my peers as well.
Shafi Akhtar (5/5)
Aniket Kumar Prasad (5/5)
Very helpful and easy to understand all the concepts; the best teacher for learning ML.
Rishu Shrivastav (5/5)
Explained everything in detail. I have a question: does LearnVern provide the dataset and PPT, or not?
Vikas Choubey (5/5)
Very nicely explained.
Vrushali Kandesar (5/5)
Awesome and very nicely explained! One important thing to notify the team: by mistake, the Naive Bayes practical has been added under the SVM lecture and vice versa (Learning Practical 1).
Mohd Mushraf (5/5)
Amazing teaching.
Juboraj Juboraj (5/5)
Easy to understand, and explains the details well.
Joydeb (5/5)
Awesome course, sir, and your teaching style is very good.
Shaga Chandrakanth Goud (5/5)
Hi Kushal ji, thanks a lot for a very good explanation. I have a doubt about where we can get the dataset that you explained in the video. Can you make it available in the resources, so that we can download it?