A comparative study of several cluster number selection criteria
Publication in refereed journal


Abstract: The selection of the number of clusters is an important and challenging issue in cluster analysis. In this paper we perform an experimental comparison of several criteria for determining the number of clusters based on the Gaussian mixture model. The criteria that we consider include Akaike's information criterion (AIC), the consistent Akaike's information criterion (CAIC), the minimum description length (MDL) criterion, which formally coincides with the Bayesian inference criterion (BIC), and two model selection methods derived from Bayesian Ying-Yang (BYY) harmony learning: the harmony empirical learning criterion (BYY-HEC) and the harmony data smoothing criterion (BYY-HDS). We investigate these methods on synthetic data sets of different sample sizes and on the iris data set. The experimental results show that BYY-HDS has the best overall success rate and clearly outperforms the other methods for small sample sizes. CAIC and MDL tend to underestimate the number of clusters, while AIC and BYY-HEC tend to overestimate it, especially for small sample sizes. © Springer-Verlag 2003.
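The abstract describes selecting the number of Gaussian mixture components by minimizing penalized-likelihood criteria such as AIC and BIC (the latter coinciding with MDL). As a rough illustration only, and not the authors' implementation, the following sketch fits mixtures with increasing component counts to synthetic data and picks the count minimizing each criterion; the data, the candidate range of component counts, and the use of scikit-learn's GaussianMixture are assumptions made purely for illustration.

# Hypothetical sketch, not from the paper: choose the number of clusters for a
# Gaussian mixture model by minimizing AIC and BIC (BIC coincides with MDL).
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)
# Illustrative synthetic data: three well-separated 2-D Gaussian clusters.
X = np.vstack([
    rng.normal(loc=[0.0, 0.0], scale=0.5, size=(100, 2)),
    rng.normal(loc=[4.0, 0.0], scale=0.5, size=(100, 2)),
    rng.normal(loc=[2.0, 3.0], scale=0.5, size=(100, 2)),
])

scores = {}
for k in range(1, 7):
    gmm = GaussianMixture(n_components=k, random_state=0).fit(X)
    # Both criteria trade data fit against model complexity; they differ
    # only in how strongly extra parameters are penalized.
    scores[k] = (gmm.aic(X), gmm.bic(X))

best_aic = min(scores, key=lambda k: scores[k][0])
best_bic = min(scores, key=lambda k: scores[k][1])
print(f"AIC selects k = {best_aic}; BIC selects k = {best_bic}")

On data like this, both criteria usually agree on three components; the paper's point is that their behaviour diverges as the sample size shrinks, which a sketch of this kind can reproduce by reducing the per-cluster sample count.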
All Author(s) List: Hu X., Xu L.
Journal name: Lecture Notes in Artificial Intelligence
Detailed description: organized by Springer-Verlag
Year: 2004
Month: 12
Day: 1
Volume Number: 2690
Publisher: Springer Verlag
Place of Publication: Germany
Pages: 195-202
ISSN: 0302-9743
Languages: English-United Kingdom
