A Generic Approach for Accelerating Stochastic Zeroth-Order Convex Optimization
Refereed conference paper presented and published in conference proceedings



Other information
Abstract: In this paper, we propose a generic approach for accelerating the convergence of existing algorithms for solving the problem of stochastic zeroth-order convex optimization (SZCO). Standard techniques for accelerating the convergence of stochastic zeroth-order algorithms rely on exploring multiple function evaluations (e.g., two-point evaluations) or on exploiting global conditions of the problem (e.g., smoothness and strong convexity). These classic acceleration techniques, however, necessarily restrict the applicability of newly developed algorithms. The key to our proposed generic approach is to exploit a local growth condition (also called a local error bound condition) of the objective function in SZCO. The benefits of the proposed acceleration technique are: (i) it is applicable to settings with both one-point and two-point evaluations; (ii) it does not necessarily require strong convexity or smoothness of the objective function; (iii) it yields improved convergence for a broad family of problems. Empirical studies in various settings demonstrate the effectiveness of the proposed acceleration approach.
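The abstract does not reproduce the paper's algorithmic details. For context only, below is a minimal sketch of the standard two-point zeroth-order gradient estimator that "two-point evaluations" refers to, wrapped in a plain (unaccelerated) gradient-free descent loop; it is not the authors' accelerated method, and the names `two_point_grad` and `f`, the smoothing radius `mu`, and the step-size schedule are illustrative assumptions.

```python
import numpy as np

def two_point_grad(f, x, mu=1e-4, rng=None):
    """Two-point zeroth-order gradient estimate of f at x.

    Draws a random unit direction u and forms the finite difference
    d * (f(x + mu*u) - f(x - mu*u)) / (2*mu) * u, whose expectation
    approximates the gradient of f using only function values.
    """
    rng = np.random.default_rng() if rng is None else rng
    u = rng.standard_normal(x.shape)
    u /= np.linalg.norm(u)           # uniform direction on the unit sphere
    d = x.size                       # dimension-dependent scaling
    return d * (f(x + mu * u) - f(x - mu * u)) / (2.0 * mu) * u

# Illustrative baseline: zeroth-order SGD on a convex quadratic.
f = lambda x: 0.5 * float(np.dot(x, x))
x = np.ones(10)
for t in range(1, 2001):
    x -= (1.0 / t) * two_point_grad(f, x)
print(f(x))  # should be close to 0
```

Per the abstract, the paper's contribution is a generic scheme that accelerates base algorithms of this kind (in both one-point and two-point settings) by exploiting a local growth condition; the loop above is only the unaccelerated baseline such a scheme would wrap.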
Authors: Xiaotian Yu, Irwin King, Michael R. Lyu, Tianbao Yang
Conference name: 27th International Joint Conference on Artificial Intelligence (IJCAI 2018)
Conference start date: 13.07.2018
Conference end date: 19.07.2018
Conference location: Stockholm, Sweden
Conference country/region: Sweden
Proceedings title: Proceedings of the 27th International Joint Conference on Artificial Intelligence (IJCAI 2018)
Year of publication: 2018
Month: 7
Pages: 3040-3046
ISBN: 978-099924112-7
ISSN: 1045-0823
Language: American English

Last updated on 2020-06-07 at 01:55