A Generic Approach for Accelerating Stochastic Zeroth-Order Convex Optimization
Refereed conference paper presented and published in conference proceedings


Abstract: In this paper, we propose a generic approach for accelerating the convergence of existing algorithms for stochastic zeroth-order convex optimization (SZCO). Standard techniques for accelerating stochastic zeroth-order algorithms either exploit multiple function evaluations (e.g., two-point evaluations) or rely on global conditions of the problem (e.g., smoothness and strong convexity). These classic acceleration techniques, however, restrict the applicability of newly developed algorithms. The key to our generic approach is to exploit a local growth condition (also called a local error bound condition) of the objective function in SZCO. The benefits of the proposed acceleration technique are: (i) it is applicable to settings with both one-point and two-point evaluations; (ii) it does not require strong convexity or smoothness of the objective function; (iii) it yields improved convergence for a broad family of problems. Empirical studies in various settings demonstrate the effectiveness of the proposed acceleration approach.
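
To make the setting concrete: zeroth-order methods only query function values, and a two-point evaluation builds a gradient estimate from two such queries along a random direction; a local growth condition of the kind referenced above is commonly stated as lambda * dist(x, Omega*)^theta <= f(x) - f* for x in a level set around the optimal set Omega*. Below is a minimal sketch, assuming NumPy, of a standard two-point estimator driving plain zeroth-order SGD; it illustrates the kind of base algorithm such acceleration schemes build on, not the paper's specific method, and all names (f, delta, the step size) are illustrative.

    import numpy as np

    def two_point_grad_estimate(f, x, delta=1e-4, rng=np.random.default_rng(0)):
        # Standard two-point zeroth-order gradient estimate: sample a random
        # unit direction u and return
        #   d * (f(x + delta*u) - f(x - delta*u)) / (2*delta) * u,
        # an estimate of the gradient of a smoothed version of f.
        d = x.size
        u = rng.standard_normal(d)
        u /= np.linalg.norm(u)
        return (d * (f(x + delta * u) - f(x - delta * u)) / (2.0 * delta)) * u

    # Plain zeroth-order SGD on a non-smooth convex test function; schemes
    # that exploit a local growth condition typically rerun such a base
    # routine in stages rather than in one long pass.
    f = lambda x: np.sum(np.abs(x))       # convex, non-smooth, minimized at 0
    x = np.ones(10)
    for t in range(1, 2001):
        g = two_point_grad_estimate(f, x)
        x -= (0.1 / np.sqrt(t)) * g       # O(1/sqrt(t)) step size
    print("final objective:", f(x))
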
All Author(s) List: Xiaotian Yu, Irwin King, Michael R. Lyu, Tianbao Yang
Name of Conference: 27th International Joint Conference on Artificial Intelligence (IJCAI 2018)
Start Date of Conference: 13/07/2018
End Date of Conference: 19/07/2018
Place of Conference: Stockholm, Sweden
Country/Region of Conference: Sweden
Proceedings Title: Proceedings of the 27th International Joint Conference on Artificial Intelligence (IJCAI 2018)
Year: 2018
Month: 7
Pages: 3040-3046
ISBN: 978-0-9992411-2-7
ISSN: 1045-0823
Languages: English (United States)

Last updated on 2020-06-28 at 02:13