TY - JOUR
T1 - A sparse optimization approach to infinite infimal convolution regularization
AU - Bredies, Kristian
AU - Carioni, Marcello
AU - Holler, Martin
AU - Korolev, Yury
AU - Schönlieb, Carola-Bibiane
PY - 2025/02/28
Y1 - 2025/02/28
AB - In this paper we introduce the class of infinite infimal convolution functionals and apply these functionals to the regularization of ill-posed inverse problems. The proposed regularization involves an infimal convolution of a continuously parametrized family of convex, positively one-homogeneous functionals defined on a common Banach space $X$. We show that, under mild assumptions, this functional admits an equivalent convex lifting in the space of measures with values in $X$. This reformulation allows us to prove well-posedness of a Tikhonov regularized inverse problem and opens the door to a sparse analysis of the solutions. In the case of finite-dimensional measurements we prove a representer theorem, showing that there exists a solution of the inverse problem that is sparse, in the sense that it can be represented as a linear combination of the extremal points of the ball of the lifted infinite infimal convolution functional. Then, we design a generalized conditional gradient method for computing solutions of the inverse problem without relying on an a priori discretization of the parameter space and of the Banach space $X$. The iterates are constructed as linear combinations of the extremal points of the lifted infinite infimal convolution functional. We prove a sublinear rate of convergence for our algorithm and apply it to denoising of signals and images using, as regularizer, infinite infimal convolutions of fractional-Laplacian-type operators with adaptive orders of smoothness and anisotropies.
KW - 35R11
KW - 49J45
KW - 65J20
KW - 65K10
UR - http://www.scopus.com/inward/record.url?scp=85209714558&partnerID=8YFLogxK
U2 - 10.1007/s00211-024-01439-2
DO - 10.1007/s00211-024-01439-2
M3 - Article
SN - 0029-599X
VL - 157
SP - 41
EP - 96
JO - Numerische Mathematik
JF - Numerische Mathematik
IS - 1
ER -