Abstract
The optimisation of nonsmooth, nonconvex functions without access to gradients is a particularly challenging problem that is frequently encountered, for example in model parameter optimisation problems. Bilevel optimisation of parameters is a standard setting in areas such as variational regularisation problems and supervised machine learning. We present efficient and robust derivative-free methods called randomised Itoh–Abe methods. These are generalisations of the Itoh–Abe discrete gradient method, a well-known scheme from geometric integration, which has previously only been considered in the smooth setting. We demonstrate that the method and its favourable energy dissipation properties are well defined in the nonsmooth setting. Furthermore, we prove that whenever the objective function is locally Lipschitz continuous, the iterates almost surely converge to a connected set of Clarke stationary points. We present an implementation of the methods, and apply it to various test problems. The numerical results indicate that the randomised Itoh–Abe methods can be superior to state-of-the-art derivative-free optimisation methods in solving nonsmooth problems while still remaining competitive in terms of efficiency.
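The abstract gives no pseudocode, so the following is a minimal illustrative sketch of the randomised Itoh–Abe idea it describes: draw a random unit direction d, solve the scalar discrete-gradient equation (f(x + αd) − f(x))/α = −α/τ for the step length α without using derivatives, and repeat. The secant root-finder, the decrease-acceptance guard, and all function names here are assumptions for illustration, not the authors' implementation.

```python
import math
import random

def directional_psi(f, x, d, tau, a):
    """psi(a) = (f(x + a*d) - f(x))/a + a/tau. A nonzero root a* gives the
    Itoh-Abe step, since then f(x + a*d) - f(x) = -a*a/tau <= 0 (dissipation)."""
    shifted = [xi + a * di for xi, di in zip(x, d)]
    return (f(shifted) - f(x)) / a + a / tau

def solve_step(f, x, d, tau, a0=1e-4, a1=1.0, tol=1e-10, max_iter=50):
    """Secant iteration for a nonzero root of psi; derivative-free throughout."""
    p0 = directional_psi(f, x, d, tau, a0)
    p1 = directional_psi(f, x, d, tau, a1)
    for _ in range(max_iter):
        if p1 == p0:
            break
        a2 = a1 - p1 * (a1 - a0) / (p1 - p0)
        if a2 == 0.0:
            a2 = tol  # steer away from the trivial root a = 0
        a0, p0 = a1, p1
        a1 = a2
        p1 = directional_psi(f, x, d, tau, a1)
        if abs(p1) < tol:
            break
    return a1

def randomised_itoh_abe(f, x0, tau=0.5, iters=500, seed=0):
    """Randomised Itoh-Abe sketch: random unit direction, scalar solve,
    and (as a safeguard for inexact solves) accept only decreasing steps."""
    rng = random.Random(seed)
    x = list(x0)
    for _ in range(iters):
        d = [rng.gauss(0.0, 1.0) for _ in x]
        norm = math.sqrt(sum(di * di for di in d))
        if norm == 0.0:
            continue
        d = [di / norm for di in d]
        a = solve_step(f, x, d, tau)
        candidate = [xi + a * di for xi, di in zip(x, d)]
        if f(candidate) < f(x):  # energy-dissipation guard
            x = candidate
    return x

# Example on a smooth quadratic, where each step can be checked by hand.
f = lambda x: sum(xi * xi for xi in x)
x_min = randomised_itoh_abe(f, [3.0, -2.0])
```

On a quadratic, psi is affine in the step length, so the secant solve is essentially exact; for genuinely nonsmooth objectives the paper's analysis (Clarke stationarity of limit points) is what justifies the scheme, not this toy solver.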
| Original language | English |
|---|---|
| Pages (from-to) | 1351–1394 |
| Journal | Foundations of Computational Mathematics |
| Volume | 22 |
| Early online date | 29 Jul 2021 |
| Publication status | Published - 31 Oct 2022 |
Bibliographical note
Funding Information: All authors acknowledge support from the European Union Horizon 2020 research and innovation programme under the Marie Skłodowska-Curie Grant Agreement No. 691070. ESR, MJE, and CBS acknowledge support from the Cantab Capital Institute for the Mathematics of Information. ESR acknowledges support from the London Mathematical Society. MJE and CBS acknowledge support from the Leverhulme Trust project “Breaking the non-convexity barrier”, EPSRC Grant EP/M00483X/1, and EPSRC centre EP/N014588/1. GRWQ acknowledges support from the Australian Research Council and is grateful to the Mittag–Leffler Institute for a productive stay. Moreover, CBS acknowledges support from the RISE project NoMADS and the Alan Turing Institute. MJE acknowledges support from the EPSRC (EP/S026045/1) and the Faraday Institution (EP/T007745/1).
Keywords
- Bilevel optimisation
- Clarke subdifferential
- Derivative-free optimisation
- Discrete gradient methods
- Geometric numerical integration
- Nonconvex optimisation
- Nonsmooth optimisation
ASJC Scopus subject areas
- Analysis
- Computational Mathematics
- Computational Theory and Mathematics
- Applied Mathematics
Profiles
- Matthias Ehrhardt
- Centre for Mathematics and Algorithms for Data (MAD)
- Department of Mathematical Sciences - Reader
- Centre for Nonlinear Mechanics
- Bath Institute for the Augmented Human
- Centre for Bioengineering & Biomedical Technologies (CBio)
Person: Research & Teaching, Researcher, Affiliate staff