Group sparse optimization via ℓp,q regularization
Publication in refereed journal



Other information
Abstract: In this paper, we investigate a group sparse optimization problem via ℓp,q regularization in three aspects: theory, algorithm and application. In the theoretical aspect, by introducing a notion of group restricted eigenvalue condition, we establish an oracle property and a global recovery bound of order O(λ^{2/(2−q)}) for any point in a level set of the ℓp,q regularization problem, and by virtue of modern variational analysis techniques, we also provide a local analysis of a recovery bound of order O(λ^2) for a path of local minima. In the algorithmic aspect, we apply the well-known proximal gradient method to solve the ℓp,q regularization problems, either by analytically solving some specific ℓp,q regularization subproblems, or by using the Newton method to solve general ℓp,q regularization subproblems. In particular, we establish a local linear convergence rate of the proximal gradient method for solving the ℓ1,q regularization problem under some mild conditions and by first proving a second-order growth condition. As a consequence, the local linear convergence rate of the proximal gradient method for solving the usual ℓq regularization problem (0 < q < 1) is obtained. Finally, in the aspect of application, we present some numerical results on both the simulated data and the real data in gene transcriptional regulation.
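To illustrate the proximal gradient framework described in the abstract, the following is a minimal sketch for the convex q = 1 case, where the proximal map of the group ℓ2,1 penalty is group soft-thresholding. The paper's main interest is the nonconvex case 0 < q < 1, whose proximal subproblems are solved analytically for specific q or by a Newton method; the function names, step-size choice, and simulated data below are illustrative assumptions, not the authors' implementation.

import numpy as np

def prox_group_l21(z, groups, tau):
    """Group soft-thresholding: proximal map of tau * sum_g ||z_g||_2
    (the convex q = 1 instance of the group penalty)."""
    x = np.zeros_like(z)
    for g in groups:
        norm_g = np.linalg.norm(z[g])
        if norm_g > tau:
            x[g] = (1.0 - tau / norm_g) * z[g]
    return x

def proximal_gradient(A, b, groups, lam, step=None, n_iter=500):
    """Minimize 0.5*||Ax - b||^2 + lam * sum_g ||x_g||_2 by proximal gradient."""
    if step is None:
        step = 1.0 / np.linalg.norm(A, 2) ** 2  # 1 / Lipschitz constant of the gradient
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        grad = A.T @ (A @ x - b)                                  # gradient of the smooth term
        x = prox_group_l21(x - step * grad, groups, step * lam)   # proximal (thresholding) step
    return x

# Usage on simulated group-sparse data (hypothetical example)
rng = np.random.default_rng(0)
A = rng.standard_normal((100, 60))
groups = [np.arange(i, i + 10) for i in range(0, 60, 10)]  # six groups of size 10
x_true = np.zeros(60)
x_true[groups[0]] = rng.standard_normal(10)                # one active group
b = A @ x_true + 0.01 * rng.standard_normal(100)
x_hat = proximal_gradient(A, b, groups, lam=0.5)

For nonconvex q (e.g. q = 1/2), the same iteration applies with prox_group_l21 replaced by the proximal map of the ℓp,q penalty, which the paper computes analytically for specific q or via a Newton step.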
Accepted by publisher: 17.04.2017
Authors: Yaohua Hu, Chong Li, Kaiwen Meng, Jing Qin, Xiaoqi Yang
Journal: Journal of Machine Learning Research
Year of publication: 2017
Month: 4
Volume: 18
Pages: 1 - 52
ISSN: 1532-4435
Language: American English

Last updated on 2021-08-10 at 23:37