Structure-Adaptive, Variance-Reduced, and Accelerated Stochastic Optimization

Junqi Tang, Francis Bach, Mohammad Golbabaee, Mike Davies

Research output: Contribution to journal › Article



In this work we explore the fundamental structure-adaptiveness of state-of-the-art randomized first-order algorithms on regularized empirical risk minimization tasks where the solution has intrinsic low-dimensional structure (such as sparsity or low rank), often enforced by non-smooth regularization or constraints. We start by establishing a fast linear convergence rate for the SAGA algorithm on non-strongly-convex objectives with convex constraints, via an argument based on cone-restricted strong convexity. Then, for composite minimization tasks with a coordinate-wise separable convex regularization term, we propose and analyse a two-stage accelerated coordinate descent algorithm (Two-Stage APCG). Our convergence analysis shows that the proposed method converges globally and enjoys a local accelerated linear convergence rate with respect to the low-dimensional structure of the solution. Based on this result, we propose an adaptive variant of Two-Stage APCG which does not require the restricted strong convexity constant to be known in advance, but instead estimates it on the fly. In numerical experiments we compare adaptive Two-Stage APCG with state-of-the-art variance-reduced stochastic gradient methods on sparse regression tasks and demonstrate the effectiveness of our approach.
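To make the composite setting concrete, below is a minimal sketch of randomized *proximal coordinate descent* on the lasso (least squares plus a coordinate-wise separable l1 regularizer, a canonical sparse regression task). This is a plain, non-accelerated baseline for illustration only — it is not the authors' Two-Stage APCG, and all names and parameter values here are illustrative assumptions.

```python
import numpy as np

def soft_threshold(z, t):
    # Proximal operator of t*|.|: shrink z toward zero by t.
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def prox_coordinate_descent(A, b, lam, n_iters=2000, seed=0):
    """Randomized proximal coordinate descent for the lasso:
    min_x 0.5*||Ax - b||^2 + lam*||x||_1 (separable l1 term)."""
    rng = np.random.default_rng(seed)
    n, d = A.shape
    x = np.zeros(d)
    residual = A @ x - b            # maintain residual A x - b
    L = (A ** 2).sum(axis=0)        # coordinate-wise Lipschitz constants
    for _ in range(n_iters):
        j = rng.integers(d)
        g = A[:, j] @ residual      # partial gradient w.r.t. x_j
        x_j_new = soft_threshold(x[j] - g / L[j], lam / L[j])
        residual += A[:, j] * (x_j_new - x[j])  # cheap residual update
        x[j] = x_j_new
    return x

# Tiny sparse-regression demo: recover a 3-sparse signal (noiseless).
rng = np.random.default_rng(1)
A = rng.standard_normal((100, 20))
x_true = np.zeros(20)
x_true[[2, 7, 11]] = [1.5, -2.0, 0.8]
b = A @ x_true
x_hat = prox_coordinate_descent(A, b, lam=0.1)
print(np.nonzero(np.abs(x_hat) > 0.05)[0])  # support of the estimate
```

Because the solution is sparse, the iterates eventually enter a neighborhood where only a few coordinates are active — the kind of low-dimensional structure the abstract's local accelerated rates exploit.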
Original language: English
Journal: Preprint on arXiv
Publication status: Published - 8 Dec 2017

Bibliographical note

29 Pages


  • math.OC


