Agglomerative Hierarchical Clustering (AHC) is a straightforward iterative clustering approach. The first step is to compute the dissimilarities between the N objects. At each subsequent step, the two objects or clusters whose merger minimizes the agglomeration criterion are grouped together.
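The two steps above (compute pairwise dissimilarities, then repeatedly merge the pair that minimizes the criterion) can be sketched in pure Python. This is a minimal illustration, not course code: the single-linkage criterion, helper names, and 1-D toy data are my own assumptions.

```python
# A minimal sketch of AHC on 1-D points. The agglomeration criterion
# here is single linkage (distance of the closest pair); the helper
# names and toy data are illustrative only.

def dissimilarity(c1, c2):
    """Single-linkage dissimilarity: closest pair across two clusters."""
    return min(abs(a - b) for a in c1 for b in c2)

def agglomerative(points, n_clusters):
    """Repeatedly merge the two closest clusters until n_clusters remain."""
    clusters = [[p] for p in points]  # step 1: every object is its own cluster
    while len(clusters) > n_clusters:
        # step 2: find the pair whose merger minimizes the criterion
        i, j = min(
            ((i, j) for i in range(len(clusters))
             for j in range(i + 1, len(clusters))),
            key=lambda ij: dissimilarity(clusters[ij[0]], clusters[ij[1]]),
        )
        clusters[i].extend(clusters.pop(j))  # merge cluster j into cluster i
    return clusters

print(agglomerative([1.0, 1.2, 1.4, 8.0, 8.3], 2))
# → [[1.0, 1.2, 1.4], [8.0, 8.3]]
```

In practice a library routine such as `scipy.cluster.hierarchy.linkage` would be used instead of this O(N³) loop, but the merge logic is the same.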
The divisive clustering algorithm is the opposite, top-down approach: all points in the dataset start in a single cluster, which is then split iteratively as one moves down the hierarchy.
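The top-down idea can be sketched the same way. This is an illustrative toy, assuming 1-D data and a "split at the widest gap" rule; real divisive methods (e.g. bisecting k-means) choose splits with a quality criterion instead.

```python
# A minimal 1-D sketch of divisive (top-down) clustering: start with
# every point in one cluster and repeatedly split a cluster at its
# widest internal gap until the desired number of clusters is reached.
# The split rule and data are illustrative assumptions, not course code.

def widest_gap_split(pts):
    """Split a sorted cluster at its largest gap between neighbours."""
    cut = max(range(1, len(pts)), key=lambda i: pts[i] - pts[i - 1])
    return pts[:cut], pts[cut:]

def divisive(points, n_clusters):
    clusters = [sorted(points)]  # start with everything in one cluster
    while len(clusters) < n_clusters:
        # split the (multi-point) cluster with the largest internal gap
        target = max((c for c in clusters if len(c) > 1),
                     key=lambda c: max(c[i] - c[i - 1] for i in range(1, len(c))))
        clusters.remove(target)
        clusters.extend(widest_gap_split(target))
    return clusters

print(divisive([1.0, 1.2, 1.4, 8.0, 8.3], 2))
# → [[1.0, 1.2, 1.4], [8.0, 8.3]]
```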
Benefits: The agglomerative technique is simple to implement. It can produce an ordering of the objects (a dendrogram) that may be useful for display. There is no need to specify the number of clusters in advance in agglomerative clustering.
It depends on your definition of noise and outliers. Single linkage is particularly sensitive to noisy points: because it merges clusters based on the closest pair of points, a few noisy points lying between two clusters can chain them together.
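The chaining effect described above can be shown with a small sketch. The greedy clustering loop, helper names, and toy data below are illustrative assumptions, not course material:

```python
# Illustrative sketch: single linkage measures cluster distance by the
# CLOSEST pair of points, so a thin "bridge" of noisy points can chain
# two well-separated groups into one cluster.

def single_linkage(c1, c2):
    """Dissimilarity between clusters: distance of the closest pair."""
    return min(abs(a - b) for a in c1 for b in c2)

def cluster(points, n_clusters, dist):
    """Greedy agglomerative clustering down to n_clusters."""
    clusters = [[p] for p in points]
    while len(clusters) > n_clusters:
        i, j = min(
            ((i, j) for i in range(len(clusters))
             for j in range(i + 1, len(clusters))),
            key=lambda ij: dist(clusters[ij[0]], clusters[ij[1]]),
        )
        clusters[i].extend(clusters.pop(j))
    return clusters

data = [0.0, 0.5, 1.0,            # tight group A
        2.5, 4.0, 5.5, 7.0, 8.5,  # noisy bridge between the groups
        10.0, 10.5, 11.0,         # tight group B
        14.0]                     # one stray point

# Groups A and B are 9 units apart, yet single linkage chains them
# together through the bridge and spends its second cluster on the
# stray point instead of separating A from B.
print(cluster(data, 2, single_linkage))
```

Complete or average linkage, which look at the farthest or average pairwise distance, are much less prone to this chaining behaviour.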