A Comparative Study on Data Smoothing Regularization for Local Factor Analysis
Refereed conference paper presented and published in conference proceedings



Times Cited (Web of Science): 0 (as at 20/05/2020)

Other information
Abstract: Selecting the cluster number and the hidden factor numbers of the Local Factor Analysis (LFA) model is a typical model selection problem, which is difficult when the sample size is finite or small. Data smoothing is one of the three regularization techniques integrated in the statistical learning framework Bayesian Ying-Yang (BYY) harmony learning theory to improve parameter learning and model selection. In this paper, we comparatively investigate the performance of five existing formulas for determining the hyper-parameter, namely the smoothing parameter, that controls the strength of data smoothing regularization. BYY learning algorithms on LFA using these formulas are evaluated by model selection accuracy on simulated data and by classification accuracy on real-world data. Two observations are obtained. First, learning with data smoothing works better than learning without it, especially when the sample size is small. Second, the gradient method derived from imposing a sample-set-based improper prior on the smoothing parameter generally outperforms other methods, such as the one derived from a Gamma or Chi-square prior and the one under the equal covariance principle.
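For orientation, a minimal sketch of the data smoothing idea mentioned above, using the generic Parzen-window form found in the BYY harmony learning literature (the notation below is assumed for illustration and is not copied from this paper): the empirical density over N samples,

p_0(x) = (1/N) \sum_{t=1}^{N} \delta(x - x_t),

is replaced by the Gaussian-kernel smoothed estimate

p_h(x) = (1/N) \sum_{t=1}^{N} G(x | x_t, h^2 I),

where G(x | m, \Sigma) denotes a Gaussian density and h > 0 is the smoothing parameter. As h → 0 the unsmoothed empirical density is recovered, while a larger h imposes stronger regularization, which is most useful when the sample size is small; the five formulas compared in the paper are alternative ways of determining h.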
All Author(s) List: Tu SK, Shi L, Xu L
Name of Conference: 18th International Conference on Artificial Neural Networks (ICANN 2008)
Start Date of Conference: 03/09/2008
End Date of Conference: 06/09/2008
Place of Conference: Prague
Country/Region of Conference: Czech Republic
Journal name: Lecture Notes in Artificial Intelligence
Detailed description: ed. by V. Kurková et al.
Year: 2008
Month: 1
Day: 1
Volume Number: 5163
Publisher: SPRINGER-VERLAG BERLIN
Pages: 265 - 274
ISBN: 978-3-540-87535-2
ISSN: 0302-9743
Languages: English-United Kingdom
Web of Science Subject Categories: Computer Science; Computer Science, Theory & Methods

Last updated on 2020-05-21 at 01:57