An Accelerated DFO Algorithm for Finite-sum Convex Functions

Part of Proceedings of the International Conference on Machine Learning pre-proceedings (ICML 2020)



Authors

Yuwen Chen, Antonio Orvieto, Aurelien Lucchi

Abstract

Derivative-free optimization (DFO) has recently gained a lot of momentum in machine learning, spurring the community's interest in designing faster methods for problems where gradients are not accessible. While some attention has been given to the concept of acceleration in the DFO literature, there exists no algorithm with a provably accelerated rate of convergence for objective functions with a finite-sum structure. Stochastic algorithms that use acceleration in such a setting are prone to instabilities, making it difficult to reach convergence. In this work, we exploit the finite-sum structure of the objective to design a variance-reduced DFO algorithm that provably yields an accelerated rate of convergence. We prove rates of convergence for both smooth convex and strongly convex finite-sum objective functions. Finally, we validate our theoretical results empirically on several datasets.
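To make the setting concrete, below is a minimal sketch of the basic building block that finite-sum DFO methods rest on: a two-point finite-difference gradient surrogate applied to one randomly sampled component of the sum. This illustrates only the standard stochastic DFO step (no acceleration or variance reduction); it is not the paper's algorithm, and all names (`fd_component_gradient`, the toy least-squares objective, the smoothing parameter `mu`, and the step size) are illustrative assumptions.

```python
import numpy as np

def fd_component_gradient(f_i, x, mu=1e-5, rng=None):
    """Two-point finite-difference gradient surrogate for one component f_i:
        g = (f_i(x + mu * u) - f_i(x)) / mu * u,  u ~ N(0, I).
    E[g] approximates the gradient of a smoothed version of f_i.
    This is the generic DFO building block, not the paper's method."""
    rng = np.random.default_rng() if rng is None else rng
    u = rng.standard_normal(x.shape)
    return (f_i(x + mu * u) - f_i(x)) / mu * u

# Toy finite-sum objective: f(x) = (1/n) * sum_i 0.5 * (A_i @ x - b_i)^2
n, d = 10, 5
rng = np.random.default_rng(0)
A = rng.standard_normal((n, d))
b = rng.standard_normal(n)
components = [lambda x, i=i: 0.5 * (A[i] @ x - b[i]) ** 2 for i in range(n)]

x = np.zeros(d)
for step in range(200):
    i = rng.integers(n)  # exploit the finite-sum structure: query one component
    g = fd_component_gradient(components[i], x, rng=rng)
    x -= 0.05 * g        # plain stochastic DFO step (no acceleration / variance reduction)
```

The variance of this one-component estimator is exactly what makes naively accelerated schemes unstable in the stochastic setting; the paper's contribution is a variance-reduced, accelerated variant of such queries with provable convergence rates.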