Rest-Katyusha

Exploiting the solution's structure via scheduled restart schemes

Junqi Tang, Mohammad Golbabaee, Francis Bach, Mike Davies

Research output: Contribution to journal › Conference article

Abstract

We propose a structure-adaptive variant of Katyusha, a state-of-the-art stochastic variance-reduced gradient algorithm, for regularized empirical risk minimization. The proposed method exploits the intrinsic low-dimensional structure of the solution, such as the sparsity or low rank enforced by a non-smooth regularizer, to achieve an even faster convergence rate. This provable algorithmic improvement is obtained by restarting the Katyusha algorithm according to restricted strong-convexity (RSC) constants. We also propose an adaptive-restart variant that estimates the RSC constant on the fly and adjusts the restart period automatically. We demonstrate the effectiveness of our approach via numerical experiments.
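
To make the restart idea concrete, the following is a minimal sketch, not the authors' implementation: plain proximal gradient descent stands in for the Katyusha inner solver, the problem is a Lasso-type least-squares objective with an l1 regularizer, and the restart period is set from a generic L/mu rule in which mu_rsc is an assumed RSC estimate rather than the quantity derived in the paper. The names inner_solver, rest_solver, and mu_rsc are illustrative placeholders.

import numpy as np

def soft_threshold(x, t):
    # Proximal operator of t*||.||_1, i.e. the shrinkage step for the l1 term.
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def inner_solver(A, b, lam, x0, n_iters):
    # Stand-in for one Katyusha run: proximal gradient descent on
    # 0.5*||Ax - b||^2 + lam*||x||_1, warm-started at x0.
    L = np.linalg.norm(A, 2) ** 2  # smoothness constant of the quadratic loss
    x = x0.copy()
    for _ in range(n_iters):
        x = soft_threshold(x - A.T @ (A @ x - b) / L, lam / L)
    return x

def rest_solver(A, b, lam, mu_rsc, n_restarts=5):
    # Scheduled restarts: run the inner solver for a fixed period whose
    # length scales with the effective condition number L / mu_rsc, then
    # restart (warm-start) from the last iterate. mu_rsc is an assumed
    # RSC estimate, not a value from the paper.
    L = np.linalg.norm(A, 2) ** 2
    period = int(np.ceil(L / mu_rsc))  # illustrative restart-period rule
    x = np.zeros(A.shape[1])
    for _ in range(n_restarts):
        x = inner_solver(A, b, lam, x, period)
    return x

# Tiny synthetic sparse-recovery example.
rng = np.random.default_rng(0)
A = rng.standard_normal((200, 500)) / np.sqrt(200)
x_true = np.zeros(500)
x_true[:10] = 1.0
b = A @ x_true + 0.01 * rng.standard_normal(200)
x_hat = rest_solver(A, b, lam=0.05, mu_rsc=0.1)
print("nonzeros in estimate:", np.count_nonzero(np.abs(x_hat) > 1e-3))

In this toy version each restart simply warm-starts the next run; the adaptive variant described in the abstract would additionally re-estimate mu_rsc from the observed progress between restarts and adjust the period accordingly.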

Original language: English
Pages (from-to): 429-440
Number of pages: 12
Journal: Advances in Neural Information Processing Systems
Publication status: Published - 1 Jan 2018
Event: 32nd Conference on Neural Information Processing Systems, NeurIPS 2018 - Montreal, Canada
Duration: 2 Dec 2018 - 8 Dec 2018

ASJC Scopus subject areas

  • Computer Networks and Communications
  • Information Systems
  • Signal Processing

Cite this

Rest-Katyusha: Exploiting the solution's structure via scheduled restart schemes. / Tang, Junqi; Golbabaee, Mohammad; Bach, Francis; Davies, Mike.

In: Advances in Neural Information Processing Systems, 01.01.2018, p. 429-440.

Research output: Contribution to journal › Conference article

@article{a096dc5823b64165b1d53d5d7d7d2bdf,
title = "Rest-Katyusha: Exploiting the solution's structure via scheduled restart schemes",
abstract = "We propose a structure-adaptive variant of Katyusha, a state-of-the-art stochastic variance-reduced gradient algorithm, for regularized empirical risk minimization. The proposed method exploits the intrinsic low-dimensional structure of the solution, such as the sparsity or low rank enforced by a non-smooth regularizer, to achieve an even faster convergence rate. This provable algorithmic improvement is obtained by restarting the Katyusha algorithm according to restricted strong-convexity (RSC) constants. We also propose an adaptive-restart variant that estimates the RSC constant on the fly and adjusts the restart period automatically. We demonstrate the effectiveness of our approach via numerical experiments.",
author = "Junqi Tang and Mohammad Golbabaee and Francis Bach and Mike Davies",
year = "2018",
month = "1",
day = "1",
language = "English",
pages = "429--440",
journal = "Advances in Neural Information Processing Systems",
issn = "1049-5258",
publisher = "MIT Press",
url = "http://www.scopus.com/inward/record.url?scp=85064836500&partnerID=8YFLogxK",
}
