
k-means is one of the simplest unsupervised learning algorithms for the well-known clustering problem. The procedure partitions a given data set into a chosen number of clusters, k. The main idea is to define k centers, one for each cluster. These centers should be placed carefully, because different locations lead to different results; a good choice is to place them as far away from each other as possible. The next step is to take each point in the data set and associate it with the nearest center. When no point is pending, the first step is complete and an initial grouping is done. We then re-calculate k new centroids as the barycenters of the clusters resulting from the previous step. After we have these k new centroids, a new assignment is made between the same data set points and the nearest new center. This generates a loop: the k centers change their locations step by step until no more changes occur, in other words, until the centers stop moving. The algorithm aims at minimizing an objective function known as the squared error function:

J = \sum_{j=1}^{k} \sum_{x_i \in S_j} \lVert x_i - c_j \rVert^2

where S_j is the set of points assigned to cluster j and c_j is the centroid (barycenter) of that cluster.
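As a rough illustration of the loop described above, here is a minimal NumPy sketch; the function name `kmeans` and its arguments are ours, not taken from any particular library:

```python
import numpy as np

def kmeans(X, k, n_iters=100, seed=0):
    """Minimal k-means on an (n, d) array X with k clusters."""
    rng = np.random.default_rng(seed)
    # Start with k centers chosen from the data points themselves.
    centers = X[rng.choice(len(X), size=k, replace=False)]
    for _ in range(n_iters):
        # Assignment step: associate every point with its nearest center.
        dists = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2)
        labels = dists.argmin(axis=1)
        # Update step: re-calculate each center as the barycenter of its cluster.
        new_centers = np.array([
            X[labels == j].mean(axis=0) if np.any(labels == j) else centers[j]
            for j in range(k)
        ])
        # Stop when the centers do not move anymore.
        if np.allclose(new_centers, centers):
            break
        centers = new_centers
    return centers, labels
```

The squared error objective above can then be computed as `((X - centers[labels]) ** 2).sum()`.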


Advantages
- Fast, robust, and easy to understand.
- Relatively efficient: O(t*k*n*d), where n is # objects, k is # clusters, d is # dimension of each object, and t is # iterations. Normally, k, t, d << n.
- Gives good results when the clusters are distinct and well separated from each other.
Disadvantages
- Exclusive assignment: every point belongs to exactly one cluster, so if two clusters overlap heavily, k-means cannot resolve that there are two clusters.
- The algorithm is not invariant to non-linear transformations, i.e., different representations of the data give different results (the same data represented in Cartesian coordinates and in polar coordinates will be clustered differently).
- Euclidean distance can weight the underlying factors unequally: features measured on larger scales dominate the clustering.
- The algorithm only finds a local optimum of the squared error function, so different initializations can give different clusterings (see the sketch after this list).
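
Two common mitigations for the last two points are to standardize the features and to run the algorithm with several random initializations, keeping the best run. A short sketch using scikit-learn; the synthetic data here is made up purely for illustration:

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
X = np.vstack([rng.normal([0, 0], 1, size=(100, 2)),
               rng.normal([5, 5], 1, size=(100, 2))])

X_scaled = StandardScaler().fit_transform(X)          # put all features on a comparable scale
km = KMeans(n_clusters=2, n_init=10, random_state=0)  # keep the best of 10 random initializations
labels = km.fit_predict(X_scaled)
print(km.inertia_)  # value of the squared error objective for the best run
```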
References
- https://towardsdatascience.com/k-means-clustering-algorithm-applications-evaluation-methods-and-drawbacks-aa03e644b48a
- https://sites.google.com/site/dataclusteringalgorithms/k-means-clustering-algorithm
- https://en.wikipedia.org/wiki/K-means_clustering
- https://www.geeksforgeeks.org/k-means-clustering-introduction/
- https://www.analyticsvidhya.com/blog/2020/10/a-simple-explanation-of-k-means-clustering/