Abstract
Regularization plays a pivotal role when facing the challenge of solving ill-posed inverse problems, where the number of observations is smaller than the ambient dimension of the object to be estimated. A line of recent work has studied regularization models with various types of low-dimensional structures. In such settings, the general approach is to solve a regularized optimization problem, which combines a data fidelity term and some regularization penalty that promotes the assumed low-dimensional/simple structure. This paper provides a general framework to capture this low-dimensional structure using what we call partly smooth functions relative to a linear manifold. These are convex, non-negative, closed and finite-valued functions that promote objects living on low-dimensional subspaces. This class of regularizers encompasses many popular examples such as the ℓ1-norm and the ℓ1−ℓ2-norm (group sparsity), as well as several others including the ℓ∞-norm. We also show that the set of partly smooth functions relative to a linear manifold is closed under addition and pre-composition by a linear operator, which allows us to cover mixed regularization and the so-called analysis-type priors (e.g. total variation, fused Lasso, finite-valued polyhedral gauges). Our main result presents a unified sharp analysis of exact and robust recovery of the low-dimensional subspace model associated with the object to recover from partial measurements. This analysis is illustrated on a number of special and previously studied cases, and on the performance of ℓ∞ regularization in a compressed sensing scenario.
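To make the recovery setup concrete, the sketch below solves a noiseless ℓ∞-regularized compressed sensing problem of the kind the abstract alludes to. It is a minimal illustration, not the authors' code: the use of cvxpy, the Gaussian sensing operator `Phi`, and the problem sizes `n` and `m` are all assumptions chosen for the demo.

```python
import numpy as np
import cvxpy as cp

# Illustrative sketch (not from the paper): recover an "anti-sparse"
# signal -- one whose entries saturate at +/-1, the low-complexity
# model promoted by the l_inf norm -- from random Gaussian measurements.
rng = np.random.default_rng(0)
n, m = 128, 96                                   # ambient dim, measurements (m < n)
Phi = rng.standard_normal((m, n)) / np.sqrt(m)   # assumed sensing operator

x0 = np.sign(rng.standard_normal(n))             # fully saturated signal
y = Phi @ x0                                     # noiseless observations

# Noiseless recovery: minimize the l_inf norm subject to data fidelity.
x = cp.Variable(n)
problem = cp.Problem(cp.Minimize(cp.norm(x, "inf")), [Phi @ x == y])
problem.solve()

print("relative recovery error:",
      np.linalg.norm(x.value - x0) / np.linalg.norm(x0))
```

For a fully saturated signal the model subspace associated with the ℓ∞-norm reduces to the span of the sign vector, which is why recovery from m < n measurements can succeed; the value m = 96 here is an arbitrary choice comfortably above the empirical threshold for this regime, not a figure taken from the paper.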
| Original language | English |
| --- | --- |
| Pages (from-to) | 230–287 |
| Number of pages | 58 |
| Journal | Information and Inference |
| Volume | 4 |
| Issue number | 3 |
| DOIs | |
| Publication status | Published - 13 Apr 2015 |