Flexible Krylov Methods for Lp regularization

Julianne Chung, Silvia Gazzola

Research output: Contribution to journal › Article


Abstract

In this paper we develop flexible Krylov methods for efficiently computing regularized solutions to large-scale linear inverse problems with an ℓ2 fit-to-data term and an ℓp penalization term, for p ≥ 1. First we approximate the p-norm penalization term as a sequence of 2-norm penalization terms using adaptive regularization matrices in an iterative reweighted norm fashion, and then we exploit flexible preconditioning techniques to efficiently incorporate the weight updates. To handle general (nonsquare) ℓp-regularized least-squares problems, we introduce a flexible Golub–Kahan approach and exploit it within a Krylov–Tikhonov hybrid framework. Furthermore, we show that both the flexible Golub–Kahan and the flexible Arnoldi approaches for p = 1 can be used to efficiently compute solutions that are sparse with respect to some transformations. The key benefits of our approach compared to existing optimization methods for ℓp regularization are that inner-outer iteration schemes are replaced by efficient projection methods on linear subspaces of increasing dimension and that expensive regularization parameter selection techniques can be avoided. Theoretical insights are provided, and numerical results from image deblurring and tomographic reconstruction illustrate the benefits of this approach, compared to well-established methods.
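As a rough illustration of the iterative reweighted norm idea the abstract describes, the sketch below approximates the ℓp penalty by a sequence of adaptively weighted 2-norm penalties. It is a minimal dense-solver sketch only: the function name `irn_lp`, its parameters, and the direct normal-equations solve are assumptions for illustration, whereas the paper replaces such inner-outer solves with flexible Krylov projections on subspaces of increasing dimension.

```python
import numpy as np

def irn_lp(A, b, lam=1e-3, p=1.0, n_outer=20, eps=1e-8):
    """Iterative reweighted norm (IRN) sketch: approximate the l_p
    penalty ||x||_p^p by weighted 2-norm penalties ||W_k x||_2^2 with
    W_k = diag((|x_k| + eps)^((p-2)/2)), since then
    ||W_k x||_2^2 = sum_i (|x_{k,i}| + eps)^(p-2) x_i^2 ~ ||x||_p^p near x_k.
    (Hypothetical helper; dense direct solve for illustration only.)"""
    # start from the unregularized least-squares solution
    x = np.linalg.lstsq(A, b, rcond=None)[0]
    for _ in range(n_outer):
        # adaptive weights; eps guards against division by zero for p < 2
        w = (np.abs(x) + eps) ** ((p - 2) / 2)
        # solve the quadratic surrogate
        #   min_x ||A x - b||_2^2 + lam * ||diag(w) x||_2^2
        # via its normal equations
        x = np.linalg.solve(A.T @ A + lam * np.diag(w**2), A.T @ b)
    return x
```

For p = 1 the growing weights on small entries drive them toward zero, which is how the weighted 2-norm surrogates promote sparsity.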
Original language: English
Pages (from-to): S149-S171
Number of pages: 23
Journal: SIAM Journal on Scientific Computing
Volume: 41
Issue number: 5
Publication status: Published - 29 Oct 2019

Keywords

  • Flexible Golub–Kahan
  • Hybrid regularization
  • Image deblurring
  • Iterative reweighted norm
  • Sparsity reconstruction
  • ℓp regularization

ASJC Scopus subject areas

  • Computational Mathematics
  • Applied Mathematics
