In this paper, we present a local convergence analysis for a class of stochastic optimisation methods: the proximal variance-reduced stochastic gradient methods, focusing mainly on SAGA (Defazio et al., 2014) and Prox-SVRG (Xiao & Zhang, 2014). Under the assumption that the non-smooth component of the optimisation problem is partly smooth relative to a smooth manifold, we present a unified framework for the local convergence analysis of SAGA/Prox-SVRG: (i) the sequences generated by the methods identify the smooth manifold in a finite number of iterations; (ii) the sequence then enters a local linear convergence regime. Furthermore, we discuss various possibilities for accelerating these algorithms, including adapting to better local parameters and applying higher-order deterministic/stochastic optimisation methods which can achieve super-linear convergence. Several concrete examples arising from machine learning are considered to demonstrate the obtained results.
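To make the setting concrete, below is a minimal Python sketch of a proximal SAGA iteration applied to the lasso, a standard instance of the problem class studied in the paper: the ℓ1 norm is partly smooth relative to the manifold of vectors sharing the support of the solution, so the support of the iterates is expected to stabilise after finitely many iterations. The function names, step size, and problem choice are illustrative assumptions, not the authors' code.

```python
import numpy as np

def soft_threshold(x, tau):
    """Proximal operator of tau * ||.||_1 (soft-thresholding)."""
    return np.sign(x) * np.maximum(np.abs(x) - tau, 0.0)

def prox_saga_lasso(A, b, lam, gamma, n_iters, seed=0):
    """Illustrative proximal SAGA for min_x (1/2n)||Ax - b||^2 + lam * ||x||_1.

    The smooth part is an average of f_i(x) = 0.5 * (a_i^T x - b_i)^2;
    the non-smooth part lam * ||.||_1 is partly smooth, and the support
    of x is the quantity whose finite-time identification the paper studies.
    """
    rng = np.random.default_rng(seed)
    n, d = A.shape
    x = np.zeros(d)
    grads = np.zeros((n, d))        # table of stored gradients alpha_i
    grad_avg = grads.mean(axis=0)   # running mean of the gradient table
    for _ in range(n_iters):
        i = rng.integers(n)
        g_new = (A[i] @ x - b[i]) * A[i]                # nabla f_i(x_k)
        v = g_new - grads[i] + grad_avg                 # variance-reduced gradient
        x = soft_threshold(x - gamma * v, gamma * lam)  # proximal step
        grad_avg += (g_new - grads[i]) / n              # update running mean
        grads[i] = g_new                                # refresh table entry
    return x
```

Monitoring `np.nonzero(x)` across iterations would, under the paper's assumptions, show the support freezing after finitely many steps, after which the error decreases linearly.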
| Title of host publication | Proceedings of the 35th International Conference on Machine Learning |
| Editors | Jennifer Dy, Andreas Krause |
| Place of publication | Stockholmsmässan, Stockholm, Sweden |
| Number of pages | 9 |
| Publication status | Published - 1 Oct 2018 |
| Series | Proceedings of Machine Learning Research |
Poon, C., Liang, J., & Schoenlieb, C. (2018). Local Convergence Properties of SAGA/Prox-SVRG and Acceleration. In J. Dy, & A. Krause (Eds.), Proceedings of the 35th International Conference on Machine Learning (Vol. 80, pp. 4124-4132). (Proceedings of Machine Learning Research; Vol. 80). Stockholmsmässan, Stockholm, Sweden: PMLR.