Abstract
Optimal transport (OT) distances, particularly entropic-regularised OT distances, have gained popularity in machine learning and data science, driven by efficient algorithms such as the Sinkhorn algorithm. Recent research has introduced stochastic variants that handle continuous data streams directly. This thesis makes three key contributions.
First, we revisit the recently introduced online Sinkhorn algorithm and improve its convergence analysis, establishing a faster convergence rate than previously known under specific parameter settings. We also present numerical results that verify our theoretical findings, and we further establish theoretical convergence of the marginal constraints and of the optimal transport distance.
Second, we introduce compressed online Sinkhorn algorithms, which integrate measure compression techniques with the online Sinkhorn algorithm. We develop two versions of the compressed algorithm: one that compresses the discrete form of the potentials and another that compresses the newly drawn samples. Through numerical experiments, we demonstrate practical performance improvements and provide theoretical guarantees for the efficiency of our approach.
Finally, we revisit the Gromov-Wasserstein distance and the co-optimal transport framework, providing a proof of convergence for the block coordinate descent algorithm used in the co-optimal transport setting.
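As background for the contributions above, the classical (batch) Sinkhorn iteration for entropic-regularised OT between two discrete measures can be sketched as follows. This is a minimal illustration only, not the thesis's implementation; the regularisation parameter `eps`, the iteration count, and all variable names are assumptions.

```python
import numpy as np

def sinkhorn(a, b, C, eps=0.1, n_iters=200):
    """Entropic-OT coupling between discrete measures with weights a, b and cost C.

    Alternately rescales the Gibbs kernel so the coupling's marginals
    match a and b (illustrative sketch; eps and n_iters are assumptions).
    """
    K = np.exp(-C / eps)                 # Gibbs kernel
    u = np.ones_like(a)
    for _ in range(n_iters):
        v = b / (K.T @ u)                # enforce the second marginal
        u = a / (K @ v)                  # enforce the first marginal
    return u[:, None] * K * v[None, :]   # coupling P = diag(u) K diag(v)

# Example: transport between two small uniform measures on [0, 1].
x = np.linspace(0.0, 1.0, 5)
y = np.linspace(0.0, 1.0, 4)
a = np.full(5, 1 / 5)
b = np.full(4, 1 / 4)
C = (x[:, None] - y[None, :]) ** 2       # squared-distance cost matrix
P = sinkhorn(a, b, C)
# After convergence, P.sum(axis=1) ≈ a and P.sum(axis=0) ≈ b.
```

The online Sinkhorn algorithm studied in the thesis replaces these full-matrix updates with stochastic updates driven by freshly drawn samples, which is what makes compression of the potentials or of the samples relevant.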
| Date of Award | 26 Mar 2025 |
| --- | --- |
| Original language | English |
| Awarding Institution | |
| Sponsors | China Scholarship Council & Engineering and Physical Sciences Research Council |
| Supervisor | Clarice Poon (Supervisor) & Tony Shardlow (Supervisor) |