Convergence Properties of a Randomized Primal-Dual Algorithm with Applications to Parallel MRI

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution


Abstract

The Stochastic Primal-Dual Hybrid Gradient (SPDHG) method, proposed by Chambolle et al. (2018), is an efficient algorithm for solving certain nonsmooth large-scale optimization problems. In this paper we prove its almost sure convergence for convex, but not necessarily strongly convex, functionals. We also apply it to parallel Magnetic Resonance Imaging (MRI) reconstruction to test its performance. Our numerical results show that, for a range of settings, SPDHG converges significantly faster than its deterministic counterpart.
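The SPDHG iteration the abstract refers to can be sketched on a toy problem. The ridge-regression instance below, the row-block splitting (a stand-in for the per-coil operators in parallel MRI), the step-size choices, and all variable names are illustrative assumptions for a minimal sketch, not the paper's actual MRI setup:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy instance of min_x sum_i f_i(A_i x) + g(x):
# f_i(v) = 0.5*||v - b_i||^2 over row blocks A_i, g(x) = 0.5*lam*||x||^2
# (i.e. ridge regression, which has a closed-form solution to check against).
m, d, n, lam = 20, 5, 4, 0.1
A = rng.standard_normal((m, d))
b = rng.standard_normal(m)
blocks = np.array_split(np.arange(m), n)          # hypothetical block split

# Per-block operator norms, uniform sampling, and step sizes chosen to
# satisfy tau * sigma_i * ||A_i||^2 < p_i (a standard SPDHG condition).
norms = [np.linalg.norm(A[blk], 2) for blk in blocks]
p = np.full(n, 1.0 / n)
sigma = [1.0 / L for L in norms]
tau = 0.9 * min(p[i] / (sigma[i] * norms[i] ** 2) for i in range(n))

x = np.zeros(d)
y = [np.zeros(len(blk)) for blk in blocks]        # dual variables y_i
z = sum(A[blk].T @ yi for blk, yi in zip(blocks, y))  # z = A^T y
zbar = z.copy()

for _ in range(50000):
    # Primal step: prox of tau*g, here a simple scaling.
    x = (x - tau * zbar) / (1.0 + tau * lam)
    # Sample one block; dual step: prox of sigma_i * f_i^*,
    # where f_i^*(y) = 0.5*||y||^2 + <y, b_i>.
    i = rng.integers(n)
    blk = blocks[i]
    v = y[i] + sigma[i] * (A[blk] @ x)
    y_new = (v - sigma[i] * b[blk]) / (1.0 + sigma[i])
    dz = A[blk].T @ (y_new - y[i])
    y[i] = y_new
    # Update z = A^T y and the extrapolated sum zbar = z + (1/p_i) * dz.
    z = z + dz
    zbar = z + dz / p[i]

# Closed-form ridge solution for comparison.
x_star = np.linalg.solve(A.T @ A + lam * np.eye(d), A.T @ b)
print(np.linalg.norm(x - x_star))
```

Note that only one block `A_i` is touched per iteration, which is where the speedup over the deterministic PDHG (which applies the full `A` every step) comes from.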

Original language: English
Title of host publication: Scale Space and Variational Methods in Computer Vision - 8th International Conference, SSVM 2021, Proceedings
Editors: Abderrahim Elmoataz, Jalal Fadili, Yvain Quéau, Julien Rabin, Loïc Simon
Publisher: Springer Science and Business Media Deutschland GmbH
Pages: 254-266
Number of pages: 13
ISBN (Print): 9783030755485
DOIs
Publication status: Published - 2021
Event: 8th International Conference on Scale Space and Variational Methods in Computer Vision, SSVM 2021 - Virtual, Online
Duration: 16 May 2021 - 20 May 2021

Publication series

Name: Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
Volume: 12679 LNCS
ISSN (Print): 0302-9743
ISSN (Electronic): 1611-3349

Conference

Conference: 8th International Conference on Scale Space and Variational Methods in Computer Vision, SSVM 2021
City: Virtual, Online
Period: 16/05/21 - 20/05/21

Keywords

  • Convex optimization
  • Inverse problems
  • Parallel magnetic resonance imaging
  • Primal-dual algorithm
  • Stochastic optimization

ASJC Scopus subject areas

  • Theoretical Computer Science
  • Computer Science (all)
