Title: Learning and Optimization on Geodesic Metric Spaces
Date: Tuesday, Nov 21st, 2023
Time: 12:00pm-2:00pm
Location: Coda C1215 Midtown
Virtual: teams.microsoft.com
Zihao Hu
Machine Learning Ph.D. Candidate
School of Computer Science
Georgia Institute of Technology
Committee
Dr. Jacob Abernethy (Advisor), School of Computer Science, Georgia Institute of Technology
Dr. Santosh Vempala, School of Computer Science, Georgia Institute of Technology
Dr. Molei Tao, School of Mathematics, Georgia Institute of Technology
Dr. Vidya Muthukumar, School of Electrical and Computer Engineering, Georgia Institute of Technology
Dr. Andre Wibisono, Department of Computer Science, Yale University
Abstract
There has been growing interest in designing optimization algorithms with convergence guarantees when the parameter space is not a Euclidean set but a Riemannian manifold. This framework is appealing because it encodes constraints directly into the geometry and, in many cases, removes the non-convexity of both the feasible set and the objective function. Thus far, however, this line of work has been mainly limited to vanilla convex optimization; the online setting and minimax problems have received far less attention, and they are the focus of the present work.
In the first part, we explore minimizing dynamic regret on Riemannian manifolds. We introduce optimistic mirror descent on manifolds in the online improper learning setting and apply the technique to establish adaptive dynamic regret bounds.
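To fix ideas, here is a schematic of one optimistic update on a manifold M, written with the exponential map Exp and a gradient prediction m_t; this is a sketch of the standard optimistic template, and the exact algorithm and improper-learning construction in the thesis may differ:

\[
x_t = \mathrm{Exp}_{\hat{x}_t}(-\eta\, m_t), \qquad
\hat{x}_{t+1} = \mathrm{Exp}_{\hat{x}_t}\!\bigl(-\eta\, \mathrm{grad} f_t(x_t)\bigr),
\]

where \hat{x}_t is an auxiliary iterate, \mathrm{grad} f_t(x_t) is the Riemannian gradient of the t-th loss, and m_t is a prediction of it (for instance, the previous gradient parallel-transported into the tangent space T_{\hat{x}_t} M). When the predictions are accurate, the regret against a moving comparator shrinks, which is the mechanism behind adaptive dynamic regret bounds.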
In the second part, we consider projection-free (online) optimization on Riemannian manifolds. We show how to design algorithms that rely solely on a linear optimization oracle or a separation oracle, and how these algorithms achieve sublinear regret on manifolds.
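As an illustration of the oracle model (a sketch under standard assumptions, not necessarily the thesis's algorithm), a Riemannian Frank-Wolfe-style step over a geodesically convex set K queries the linear optimization oracle in the current tangent space and then moves along a geodesic:

\[
v_t \in \operatorname*{arg\,min}_{v \in K} \bigl\langle \mathrm{grad} f(x_t),\, \mathrm{Exp}_{x_t}^{-1}(v) \bigr\rangle_{x_t}, \qquad
x_{t+1} = \mathrm{Exp}_{x_t}\!\bigl(\gamma_t\, \mathrm{Exp}_{x_t}^{-1}(v_t)\bigr), \quad \gamma_t \in [0,1],
\]

so the iterate stays inside K without ever computing a (metric) projection.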
Lastly, we consider the last-iterate convergence of the Riemannian extragradient method. In Euclidean space, the extragradient method achieves a last-iterate convergence rate of O(1/sqrt{T}); in this study, we demonstrate analogous behavior in the manifold setting.
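Concretely, for a vector field F on the manifold (e.g., F = (grad_x f, -grad_y f) for a geodesically convex-concave saddle function f), one Riemannian extragradient step can be sketched as

\[
x_{t+1/2} = \mathrm{Exp}_{x_t}\!\bigl(-\eta\, F(x_t)\bigr), \qquad
x_{t+1} = \mathrm{Exp}_{x_t}\!\bigl(-\eta\, \Gamma_{x_{t+1/2}}^{x_t} F(x_{t+1/2})\bigr),
\]

where \Gamma_{x_{t+1/2}}^{x_t} denotes parallel transport back to the tangent space T_{x_t} M. The last-iterate claim is that a suitable gap measure evaluated at the final iterate x_T decays at rate O(1/sqrt{T}); this sketch follows the standard extragradient template, and the precise assumptions (geodesic convexity-concavity, curvature bounds) are those of the thesis.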