Abstract

We consider a bilevel learning framework for learning linear operators. In this framework, the learnable parameters are optimized via a loss function that also depends on the minimizer of a convex optimization problem (the lower-level problem). We utilize an iterative algorithm, called 'piggyback', to compute the minimizer of the lower-level problem and the gradient of the loss. Since the lower-level problem is solved numerically, the loss function, and hence its gradient, can only be computed inexactly. To estimate the accuracy of the computed hypergradient, we derive an a-posteriori error bound, which provides guidance for setting the tolerances of both the lower-level problem and the piggyback algorithm. To solve the upper-level optimization efficiently, we also propose an adaptive method for choosing a suitable step size. We illustrate the proposed method on several learned-regularizer problems, such as training an input-convex neural network.
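To make the bilevel structure concrete, the sketch below illustrates inexact hypergradient computation on a toy problem. It is not the paper's method: the piggyback primal-dual recursion is replaced by plain implicit differentiation for a quadratic lower-level problem (learning a single regularization weight theta), and a simple backtracking rule stands in for the adaptive step-size scheme; all names, tolerances, and parameters are hypothetical.

    import numpy as np

    rng = np.random.default_rng(0)
    n, m = 20, 30
    A = rng.standard_normal((m, n))
    x_true = rng.standard_normal(n)
    b = A @ x_true + 0.1 * rng.standard_normal(m)

    def solve_lower(theta, tol):
        # Gradient descent on the strongly convex lower-level problem
        #   min_x 0.5*||A x - b||^2 + 0.5*theta*||x||^2,
        # stopped once the gradient norm drops below `tol` (inexact solve).
        H = A.T @ A + theta * np.eye(n)
        step = 1.0 / np.linalg.norm(H, 2)
        x = np.zeros(n)
        grad = H @ x - A.T @ b
        while np.linalg.norm(grad) >= tol:
            x = x - step * grad
            grad = H @ x - A.T @ b
        return x

    def hypergradient(theta, tol_lower):
        # Inexact hypergradient dL/dtheta for L(theta) = 0.5*||x_hat(theta) - x_true||^2,
        # obtained from the adjoint (implicit-function) system H p = x_hat - x_true.
        # In practice the adjoint solve would also be truncated at a tolerance.
        x = solve_lower(theta, tol_lower)
        H = A.T @ A + theta * np.eye(n)
        p = np.linalg.solve(H, x - x_true)
        # Differentiating the optimality condition H(theta) x = A^T b w.r.t. theta
        # gives dx/dtheta = -H^{-1} x, hence dL/dtheta = -p^T x.
        return -p @ x, x

    # Upper-level loop with a simple backtracking step size, a stand-in for the
    # adaptive step-size rule mentioned in the abstract.
    theta, step = 1.0, 1.0
    for it in range(30):
        g, x = hypergradient(theta, tol_lower=1e-6)
        loss = 0.5 * np.linalg.norm(x - x_true) ** 2
        while True:
            theta_new = max(theta - step * g, 1e-8)
            _, x_new = hypergradient(theta_new, tol_lower=1e-6)
            if 0.5 * np.linalg.norm(x_new - x_true) ** 2 <= loss:
                theta, step = theta_new, 1.5 * step
                break
            step *= 0.5
        print(f"iter {it:2d}  theta = {theta:.4f}  loss = {loss:.4f}")

In the paper's setting, where the lower-level problem is nonsmooth and solved with a primal-dual method, the adjoint solve above would be replaced by the piggyback iterations run alongside the lower-level solver, with both tolerances chosen according to the a-posteriori error bound.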
Original language: English
Article number: 49
Journal: Journal of Mathematical Imaging and Vision
Volume: 67
Issue number: 5
Early online date: 23 Aug 2025
DOIs
Publication status: Published - Oct 2025

Bibliographical note

Publisher Copyright:
© The Author(s) 2025.

Data Availability Statement

No datasets were generated or analyzed during the current study.

Funding

The work of Mohammad Sadegh Salehi was supported by a scholarship from the EPSRC Centre for Doctoral Training in Statistical Applied Mathematics at Bath (SAMBa), under project EP/S022945/1. Matthias J. Ehrhardt acknowledges support from the EPSRC (EP/S026045/1, EP/T026693/1, EP/V026259/1). Hok Shing Wong acknowledges support from project EP/V026259/1.

Funders and funder numbers:

  • Centre for Doctoral Training in Statistical Applied Mathematics, University of Bath: EP/S022945/1
  • Engineering and Physical Sciences Research Council: EP/V026259/1, EP/S026045/1, EP/T026693/1

Keywords

  • Bilevel learning
  • Input-convex neural networks
  • Machine learning
  • Piggyback algorithm
  • Saddle-point problems

ASJC Scopus subject areas

  • Statistics and Probability
  • Modelling and Simulation
  • Condensed Matter Physics
  • Computer Vision and Pattern Recognition
  • Geometry and Topology
  • Applied Mathematics
