Dimensionality reduction with generalized linear models
Refereed conference paper presented and published in conference proceedings


Other information
Abstract: In this paper, we propose a general dimensionality reduction method, called Generalized Linear Principal Component Analysis (GLPCA), for data generated from a very broad family of distributions and nonlinear functions based on the generalized linear model. Data from different domains often have very different structures and can be modeled by different distributions and reconstruction functions. For example, real-valued data can be modeled by the Gaussian distribution with a linear reconstruction function, whereas binary-valued data may be more appropriately modeled by the Bernoulli distribution with a logit or probit function. Based on generalized linear models, we propose a unified framework for extracting features from data of different domains. A general optimization algorithm based on natural gradient ascent on the distribution manifold is proposed for obtaining maximum likelihood solutions. We also present specific algorithms derived from this framework for particular data modeling problems such as document modeling. Experimental results on several data sets validate GLPCA.
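To make the modeling idea in the abstract concrete, the following is a minimal sketch of GLM-style dimensionality reduction for binary data with a Bernoulli likelihood and logit link. It is not the authors' GLPCA implementation: it uses plain gradient ascent on the log-likelihood rather than the paper's natural gradient ascent, and the function name logistic_pca and all parameters (latent_dim, learning_rate, n_iters) are illustrative assumptions.

    # Sketch: Bernoulli/logit low-rank model fitted by plain gradient ascent.
    # NOT the authors' GLPCA algorithm; for illustration of the abstract only.
    import numpy as np

    def logistic_pca(X, latent_dim=2, learning_rate=0.05, n_iters=500):
        """Fit low-rank natural parameters Theta = Z @ W.T for binary X (n x d)."""
        rng = np.random.default_rng(0)
        n, d = X.shape
        Z = rng.normal(scale=0.1, size=(n, latent_dim))   # latent scores
        W = rng.normal(scale=0.1, size=(d, latent_dim))   # loadings
        for _ in range(n_iters):
            Theta = Z @ W.T                                # natural parameters
            P = 1.0 / (1.0 + np.exp(-Theta))               # Bernoulli means (logit link)
            G = X - P                                      # gradient of log-likelihood w.r.t. Theta
            Z += learning_rate * (G @ W) / n               # ascent step for scores
            W += learning_rate * (G.T @ Z) / n             # ascent step for loadings
        return Z, W

    # Usage: binary data, e.g. a small document-term presence matrix.
    X = (np.random.default_rng(1).random((100, 20)) > 0.7).astype(float)
    Z, W = logistic_pca(X, latent_dim=3)
    print(Z.shape, W.shape)  # (100, 3) (20, 3)

Swapping the Bernoulli likelihood and logit link for a Gaussian likelihood with an identity link would recover ordinary linear PCA-style reconstruction, which is the unification the abstract describes.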
All Author(s) List: Chen M., Li W., Zhang W., Wang X.
Name of Conference: 23rd International Joint Conference on Artificial Intelligence, IJCAI 2013
Start Date of Conference: 03/08/2013
End Date of Conference: 09/08/2013
Place of Conference: Beijing
Country/Region of Conference: China
Year: 2013
Month: 12
Day: 1
Pages: 1267 - 1272
ISBN: 9781577356332
ISSN: 1045-0823
Languages: English-United Kingdom
