On the convergence and sampling of randomized primal-dual algorithms and their application to parallel MRI reconstruction

Eric B Gutierrez, Claire Delplancke, Matthias J Ehrhardt

Research output: Working paper / Preprint

Abstract

The Stochastic Primal-Dual Hybrid Gradient (SPDHG) algorithm, proposed by Chambolle et al., efficiently solves a wide class of nonsmooth large-scale optimization problems. In this paper we contribute to its theoretical foundations and prove its almost sure convergence for functionals that are convex but neither necessarily strongly convex nor smooth, defined on Hilbert spaces of arbitrary dimension. We also prove its convergence under arbitrary sampling, and for some specific samplings we propose theoretically optimal step-size parameters that yield faster convergence. In addition, we propose using SPDHG for parallel Magnetic Resonance Imaging (MRI) reconstruction, where data from different coils are randomly selected at each iteration. We apply SPDHG with a wide range of random sampling methods and compare its performance across settings including mini-batch size, step-size parameters, and both convex and strongly convex objective functionals. We show that the sampling can significantly affect the convergence speed of SPDHG, and we conclude that in many cases an optimal sampling method can be identified.
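To make the iteration concrete, below is a minimal NumPy sketch of SPDHG for a least-squares fit across n dual blocks, as a stand-in for the multi-coil MRI setting described above. Everything here is an illustrative assumption rather than the paper's exact configuration: the Gaussian operators A and data b are hypothetical placeholders for coil-wise forward operators, the sampling is uniform serial sampling, and the step sizes follow the standard SPDHG-style rule sigma_i = gamma/||A_i||, tau = gamma * min_i p_i/||A_i||.

    import numpy as np

    rng = np.random.default_rng(0)

    # Hypothetical stand-ins for n coil operators A_i and measurements b_i;
    # in parallel MRI these would be sub-sampled Fourier transforms composed
    # with coil sensitivity maps.
    n, d, m = 4, 64, 48
    A = [rng.standard_normal((m, d)) / np.sqrt(m) for _ in range(n)]
    x_true = rng.standard_normal(d)
    b = [Ai @ x_true for Ai in A]

    p = np.full(n, 1.0 / n)                  # uniform serial sampling (assumed)
    L = [np.linalg.norm(Ai, 2) for Ai in A]  # per-block operator norms
    gamma = 0.99                             # safety factor, gamma < 1
    sigma = [gamma / Li for Li in L]
    tau = gamma * min(p[i] / L[i] for i in range(n))

    x = np.zeros(d)
    y = [np.zeros(m) for _ in range(n)]
    z = np.zeros(d)       # z tracks sum_i A_i^T y_i (zero at start)
    zbar = z.copy()

    for k in range(3000):
        # Primal update: here the regularizer g = 0, so prox is the identity.
        x = x - tau * zbar
        # Randomly select one block (one "coil") and update only its dual variable.
        i = rng.choice(n, p=p)
        v = y[i] + sigma[i] * (A[i] @ x)
        # Closed-form prox of sigma_i * f_i^* for f_i(u) = ||u - b_i||^2 / 2.
        y_new = (v - sigma[i] * b[i]) / (1.0 + sigma[i])
        dz = A[i].T @ (y_new - y[i])
        y[i] = y_new
        z = z + dz
        zbar = z + dz / p[i]   # extrapolation step with theta = 1

    print("relative error:", np.linalg.norm(x - x_true) / np.linalg.norm(x_true))

Only one block's dual variable is touched per iteration, which is what makes the method attractive when each coil's forward operator is expensive to apply; the probabilities p and step sizes sigma, tau are exactly the quantities whose choice the paper studies.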
Original language: English
Publication status: Published - 25 Jul 2022

Keywords

  • math.OC
  • cs.NA
  • math.NA
