Agglomerative Hierarchical Clustering (AHC) is a straightforward iterative, bottom-up classification approach. The first step is to compute the dissimilarities between the N objects. At each subsequent step, the two objects or clusters whose merger minimizes the agglomeration criterion are grouped together.
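The bottom-up merging loop can be sketched with a minimal, stdlib-only implementation. This is an illustrative sketch, not a production routine: it assumes 2-D points, uses single linkage (minimum pairwise distance) as the agglomeration criterion, and stops when a requested number of clusters remains.

```python
# Minimal sketch of agglomerative clustering with single linkage.
# Assumptions: 2-D points, Euclidean distance, stop at n_clusters.
from itertools import combinations
import math

def single_linkage(points, n_clusters):
    """Repeatedly merge the two closest clusters until n_clusters remain."""
    clusters = [[p] for p in points]          # start: each object is its own cluster
    while len(clusters) > n_clusters:
        # find the pair of clusters with the smallest minimum pairwise distance
        i, j = min(
            combinations(range(len(clusters)), 2),
            key=lambda ij: min(
                math.dist(a, b)
                for a in clusters[ij[0]] for b in clusters[ij[1]]
            ),
        )
        clusters[i].extend(clusters.pop(j))   # merge cluster j into cluster i
    return clusters

pts = [(0, 0), (0, 1), (5, 5), (5, 6)]
print(single_linkage(pts, 2))  # two tight groups are recovered
```

In practice a library routine such as `scipy.cluster.hierarchy.linkage` would be used instead; the point here is only to show the merge-the-closest-pair iteration.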
The divisive clustering algorithm is a top-down approach: all points in the dataset are initially assigned to a single cluster, which is then split iteratively as one moves down the hierarchy.
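The top-down direction can be sketched in the same spirit. This is a hedged toy version under simple assumptions: the cluster with the largest diameter is split first, and each split assigns points to the nearer of the two farthest-apart "seed" points. Real divisive methods (e.g. bisecting k-means) use more principled splitting rules.

```python
# Hedged sketch of divisive (top-down) clustering.
# Assumptions: 2-D points, split the widest cluster using its two
# farthest-apart points as seeds; one simple rule among many.
from itertools import combinations
import math

def bisect(cluster):
    """Split a cluster around its two farthest-apart points."""
    s1, s2 = max(combinations(cluster, 2), key=lambda pair: math.dist(*pair))
    left = [p for p in cluster if math.dist(p, s1) <= math.dist(p, s2)]
    right = [p for p in cluster if math.dist(p, s1) > math.dist(p, s2)]
    return left, right

def divisive(points, n_clusters):
    clusters = [list(points)]                 # start: everything in one cluster
    while len(clusters) < n_clusters:
        # pick the cluster with the largest diameter (farthest internal pair)
        widest = max(
            (c for c in clusters if len(c) > 1),
            key=lambda c: max(math.dist(a, b) for a, b in combinations(c, 2)),
        )
        clusters.remove(widest)
        clusters.extend(bisect(widest))
    return clusters

pts = [(0, 0), (0, 1), (5, 5), (5, 6)]
print(divisive(pts, 2))  # the single cluster is split into the two groups
```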
Benefits: The agglomerative technique is simple to implement. It produces an ordering of the objects that can be useful for visualization, and there is no need to specify the number of clusters in advance.
It depends on how you define noise and outliers. Single linkage is sensitive to noisy points: because it merges clusters based on their single closest pair of points, noisy points lying between clusters can pull otherwise distinct clusters together (the chaining effect).
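The sensitivity of single linkage to a stray point can be shown directly. In this toy illustration (hypothetical data), one "bridge" point halfway between two groups halves the single-linkage distance between them, while complete linkage (maximum pairwise distance) is barely affected.

```python
# Toy illustration: a single noisy "bridge" point collapses the
# single-linkage distance between two groups.
import math

group_a = [(0, 0), (0, 1)]
group_b = [(10, 0), (10, 1)]
bridge = (5, 0)  # one noisy point roughly halfway between the groups

def single(c1, c2):
    """Single linkage: distance between the closest pair of points."""
    return min(math.dist(a, b) for a in c1 for b in c2)

def complete(c1, c2):
    """Complete linkage: distance between the farthest pair of points."""
    return max(math.dist(a, b) for a in c1 for b in c2)

print(single(group_a, group_b))             # 10.0 without the bridge
print(single(group_a + [bridge], group_b))  # 5.0: the bridge halves the gap
print(complete(group_a + [bridge], group_b))  # essentially unchanged
```

Because the merge decision depends only on the closest pair, a chain of such bridge points can cause single linkage to fuse clusters that other criteria would keep apart.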