Local Convergence Properties of SAGA/Prox-SVRG and Acceleration

Clarice Poon, Jingwei Liang, Carola Schoenlieb

Research output: Chapter in a published conference proceeding

Abstract

In this paper, we present a local convergence analysis for a class of stochastic optimisation methods: the proximal variance-reduced stochastic gradient methods, focusing mainly on SAGA (Defazio et al., 2014) and Prox-SVRG (Xiao & Zhang, 2014). Under the assumption that the non-smooth component of the optimisation problem is partly smooth relative to a smooth manifold, we present a unified framework for the local convergence analysis of SAGA/Prox-SVRG: (i) the sequences generated by the methods are able to identify the smooth manifold in a finite number of iterations; (ii) then the sequence enters a local linear convergence regime. Furthermore, we discuss various possibilities for accelerating these algorithms, including adapting to better local parameters, and applying higher-order deterministic/stochastic optimisation methods which can achieve super-linear convergence. Several concrete examples arising from machine learning are considered to demonstrate the obtained results.
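To make the setting concrete, the following is a minimal Python sketch of proximal SAGA applied to the LASSO problem, where the l1 norm is partly smooth and manifold identification amounts to finite-time identification of the solution's support. The problem instance, step-size rule, and function names below are illustrative assumptions, not taken from the paper.

import numpy as np

def prox_l1(x, t):
    # Soft-thresholding: the proximal operator of t * ||.||_1.
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def saga_lasso(A, b, lam, step, n_iters, seed=0):
    # Proximal SAGA (Defazio et al., 2014) on the LASSO problem
    #   min_x (1/(2n)) ||A x - b||^2 + lam ||x||_1,
    # with f_i(x) = (1/2)(a_i^T x - b_i)^2, grad f_i(x) = a_i (a_i^T x - b_i).
    rng = np.random.default_rng(seed)
    n, d = A.shape
    x = np.zeros(d)
    grads = A * (A @ x - b)[:, None]   # stored gradient table, one row per sample
    grad_avg = grads.mean(axis=0)
    for _ in range(n_iters):
        i = rng.integers(n)
        g_new = A[i] * (A[i] @ x - b[i])
        v = g_new - grads[i] + grad_avg          # variance-reduced gradient estimate
        x = prox_l1(x - step * v, step * lam)    # proximal step shrinks small entries to 0
        grad_avg += (g_new - grads[i]) / n       # keep the table average in sync
        grads[i] = g_new
    return x

# Illustrative run with a sparse ground truth; step = 1/(3 L_max), where
# L_max is the largest per-sample Lipschitz constant, is a standard SAGA choice.
rng = np.random.default_rng(0)
A = rng.standard_normal((200, 50))
x_true = rng.standard_normal(50) * (rng.random(50) < 0.1)
b = A @ x_true + 0.01 * rng.standard_normal(200)
L_max = np.max(np.sum(A ** 2, axis=1))
x_hat = saga_lasso(A, b, lam=0.1, step=1.0 / (3.0 * L_max), n_iters=20000)
print("recovered support:", np.flatnonzero(x_hat))

With the l1 regulariser, the proximal step is soft-thresholding, so once the iterates identify the support of the solution the method behaves like a smooth algorithm restricted to that support; this is the local linear regime described in the abstract.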
Original language: English
Title of host publication: Proceedings of the 35th International Conference on Machine Learning
Editors: Jennifer Dy, Andreas Krause
Place of Publication: Stockholmsmässan, Stockholm, Sweden
Publisher: PMLR
Pages: 4124-4132
Number of pages: 9
Volume: 80
Publication status: Published - 1 Oct 2018

Publication series

Name: Proceedings of Machine Learning Research
Publisher: PMLR
Volume: 80
ISSN (Print): 1938-7228
ISSN (Electronic): 2640-3498
