Robust Multiview Subspace Learning With Nonindependently and Nonidentically Distributed Complex Noise
Publication in refereed journal


Other information
Abstract: Multiview subspace learning (MSL), which aims at obtaining a low-dimensional latent subspace from multiview data, has been widely used in practical applications. Most recent MSL approaches, however, assume only a simple independent and identically distributed (i.i.d.) Gaussian or Laplacian noise for all views of the data, which largely underestimates the noise complexity of practical multiview data. In real cases, noise across different views generally exhibits three characteristics. First, within each view, the noise has a complex configuration beyond a simple Gaussian or Laplacian distribution. Second, the noise distributions of different views are generally nonidentical, with evident distinctiveness. Third, the noise across views is not independent but clearly correlated. Based on these observations, we construct a new MSL model that more faithfully and comprehensively accounts for all of these noise characteristics. First, the noise in each view is modeled as a Dirichlet process (DP) Gaussian mixture model (DPGMM), which can fit a wider range of complex noise types than a conventional Gaussian or Laplacian. Second, the DPGMM parameters differ from view to view, which encodes the "nonidentical" noise property. Third, the DPGMMs of all views share the same high-level priors via a hierarchical DP, which encodes the "nonindependent" noise property. All of these ideas are incorporated into an integrated graphical model that can be appropriately solved by the variational Bayes algorithm. The superiority of the proposed method is verified by experiments on 3-D reconstruction simulations, multiview face modeling, and background subtraction, in comparison with current state-of-the-art MSL methods.
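As a rough illustration of the per-view DPGMM noise modeling described in the abstract, the sketch below fits a truncated Dirichlet-process Gaussian mixture to one-dimensional residual "noise" using scikit-learn's `BayesianGaussianMixture` (which performs variational inference, as in the paper's solver). This is not the authors' model: the data, truncation level, and threshold are illustrative assumptions, and the hierarchical tying of priors across views is not shown.

```python
# Minimal sketch (not the paper's implementation): fit a DP Gaussian mixture
# to simulated non-Gaussian noise via variational inference.
import numpy as np
from sklearn.mixture import BayesianGaussianMixture

rng = np.random.default_rng(0)
# Simulated complex noise: mixture of a narrow and a wide Gaussian component,
# i.e., beyond a single Gaussian or Laplacian.
noise = np.concatenate([rng.normal(0.0, 0.1, size=800),
                        rng.normal(0.0, 1.0, size=200)]).reshape(-1, 1)

dpgmm = BayesianGaussianMixture(
    n_components=10,  # truncation level of the DP (an assumed choice)
    weight_concentration_prior_type="dirichlet_process",
    max_iter=500,
    random_state=0,
)
dpgmm.fit(noise)

# The DP prior shrinks unneeded components toward zero weight, so the number
# of "active" components is inferred rather than fixed in advance.
active = int(np.sum(dpgmm.weights_ > 0.01))
print("active components:", active)
```

In the paper's setting one such mixture would be fitted per view, with distinct parameters per view ("nonidentical") but shared top-level priors across views ("nonindependent"); the shared-prior mechanism requires the hierarchical DP machinery not exposed by this library call.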
Acceptance Date: 19/06/2019
All Author(s) List: Yue Zongsheng, Yong Hongwei, Meng Deyu, Zhao Qian, Leung Yee, Zhang Lei
Journal Name: IEEE Transactions on Neural Networks and Learning Systems
Volume Number: 31
Issue Number: 4
Pages: 1070–1083
Languages: English (United States)
Keywords: Data models, Laplace equations, Adaptation models, Distributed databases, Feature extraction, Correlation, Robustness, Dirichlet process (DP) mixture model, hierarchical Dirichlet process (HDP), multiview, subspace learning, variational Bayes