Analysis of Primal Dual Algorithms for Nonsmooth Convex Optimization

Student thesis: Doctoral Thesis (PhD)

Abstract

First-order iterative methods are effective low-cost tools for solving a wide range of optimization problems. These methods have become increasingly relevant as growing data availability dramatically increases the size and complexity of modern problems. As these challenges grow, so does the interest in algorithms with low computational cost and memory requirements. Stochastic optimization methods further reduce the per-iteration cost by randomly sampling over the data.
Furthermore, in many applications there is a growing need to find sparse solutions to high-dimensional problems, as they are easier to interpret and require less storage. Nonsmooth optimization methods are especially useful for finding sparse solutions because they naturally accommodate sparsity-inducing regularization terms.
Within this context we look into first-order primal-dual methods: they have low computational cost, do not require differentiability, and can therefore handle large-scale nonsmooth problems. In this thesis we further investigate two well-known first-order primal-dual methods and their applications to real-life examples.
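To fix ideas, primal-dual methods of this kind are typically stated for the composite problem below; the notation here is generic, not quoted from the thesis:

\[
\min_{x \in X} \; f(Kx) + g(x),
\]

where \(K : X \to Y\) is a linear operator and \(f, g\) are convex, possibly nonsmooth, functions. Such methods operate on the equivalent saddle-point formulation

\[
\min_{x \in X} \max_{y \in Y} \; \langle Kx, y \rangle - f^*(y) + g(x),
\]

with \(f^*\) denoting the convex conjugate of \(f\).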
The first method is the celebrated Primal-Dual Hybrid Gradient (PDHG). We study its convergence for a particular class of convex functionals in order to establish faster theoretical convergence rates. In contrast with previous convergence results, our new convergence rates exactly predict the observed performance of the algorithm.
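For reference, the standard PDHG iteration of Chambolle and Pock for the saddle-point problem above reads as follows, with step sizes \(\tau, \sigma > 0\) satisfying \(\tau\sigma\|K\|^2 \le 1\) and extrapolation parameter \(\theta \in [0,1]\); this is the textbook form of the algorithm, not the specific variant or function class analyzed in the thesis:

\[
\begin{aligned}
y^{n+1} &= \operatorname{prox}_{\sigma f^*}\!\big(y^n + \sigma K \bar{x}^n\big),\\
x^{n+1} &= \operatorname{prox}_{\tau g}\!\big(x^n - \tau K^* y^{n+1}\big),\\
\bar{x}^{n+1} &= x^{n+1} + \theta\big(x^{n+1} - x^n\big).
\end{aligned}
\]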
The second is the Stochastic Primal-Dual Hybrid Gradient (SPDHG), a randomized version of PDHG designed to reduce its computational cost. We contribute to its theoretical foundations by closing a gap in its convergence theory. In particular, we prove its convergence for arbitrary random samplings, and we discuss optimal step-size parameters for different samplings.
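For context, SPDHG splits the dual variable into blocks \(y = (y_1, \dots, y_m)\) corresponding to \(f(Kx) = \sum_{i=1}^m f_i(K_i x)\) and, at each iteration, updates only one randomly sampled block \(i\), drawn with probability \(p_i\). A sketch of one iteration, following the published form of the algorithm, is given below; step-size conditions are omitted, and \(\theta\) denotes the extrapolation parameter, typically set to 1:

\[
\begin{aligned}
x^{n+1} &= \operatorname{prox}_{\tau g}\!\big(x^n - \tau \bar{z}^n\big),\\
y_i^{n+1} &= \operatorname{prox}_{\sigma_i f_i^*}\!\big(y_i^n + \sigma_i K_i x^{n+1}\big), \qquad y_j^{n+1} = y_j^n \ \ (j \neq i),\\
z^{n+1} &= z^n + K_i^*\big(y_i^{n+1} - y_i^n\big),\\
\bar{z}^{n+1} &= z^{n+1} + \tfrac{\theta}{p_i}\, K_i^*\big(y_i^{n+1} - y_i^n\big).
\end{aligned}
\]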
In addition, as a novel application we investigate Magnetic Resonance Imaging (MRI) reconstruction using SPDHG. Parallel MRI models naturally partition the data into subsets, which the algorithm samples randomly at every iteration. We use real MRI data to compare the performance of SPDHG across different settings and different sampling techniques. Our numerical examples show that the type of sampling directly affects the performance of the algorithm, and that information on the physical properties of the MRI acquisition can help determine an optimal sampling.
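To illustrate how parallel MRI induces this block structure, a common reconstruction model, written here in generic SENSE-type notation assumed for illustration rather than quoted from the thesis, sums one data-fidelity term per receiver coil:

\[
\min_x \; \sum_{i=1}^{m} \tfrac{1}{2}\,\big\| S F C_i x - b_i \big\|_2^2 + \lambda\, \mathcal{R}(x),
\]

where \(C_i\) applies the \(i\)-th coil sensitivity, \(F\) is the Fourier transform, \(S\) is the k-space sampling mask, \(b_i\) is the data measured by coil \(i\), and \(\mathcal{R}\) is a nonsmooth regularizer such as total variation. Each coil term then plays the role of one block \(K_i\) that SPDHG samples.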
Date of Award: 11 Oct 2023
Original language: English
Awarding Institution: University of Bath
Supervisors: Matthias Ehrhardt & Euan Spence
