Optimization, Sampling, and Generative Modeling in Non-Euclidean Spaces

Molei Tao, Georgia Institute of Technology
April 10, 2024, 11:10AM-12:00PM, in 939 Evans (in person) and via https://berkeley.zoom.us/j/98667278310

Machine learning in non-Euclidean spaces has been rapidly attracting attention in recent years, and this talk will give some examples of progress on its mathematical and algorithmic foundations. A sequence of developments that eventually leads to non-Euclidean generative modeling will be reported.

More precisely, I will begin with variational optimization, which, together with delicate interplays between continuous- and discrete-time dynamics, enables the construction of momentum-accelerated algorithms that optimize functions defined on manifolds. Selected applications will be described, namely a generic improvement of Transformers and a low-dimensional approximation of high-dimensional optimal transport distances. Then I will turn the optimization dynamics into an algorithm that samples from probability distributions on Lie groups. If time permits, the performance of this sampler will also be quantified, without assuming log-concavity or its common relaxations. Finally, I will describe how this sampler leads to a structurally pleasant diffusion generative model: given training data that follow any latent statistical distribution on a Lie group, users can generate more data exactly on the same manifold, following the same distribution. Applications, for example to quantum data, will be briefly mentioned.
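
To make the optimization-to-sampling thread above concrete, the following is a minimal numpy sketch, not the speaker's actual algorithms: momentum gradient descent on the Lie group SO(3), where each update moves along the group via the exponential map, followed by a kinetic-Langevin-style variant that adds friction and noise to turn the same dynamics into a sampler. The objective f(R) = -tr(A^T R), the step sizes, and the Euler-type discretization are all illustrative assumptions made for this example.

# Illustrative sketch only: momentum optimization on SO(3), and the same
# dynamics with friction + noise as a crude kinetic-Langevin-type sampler.
import numpy as np

def hat(w):
    """Map a vector in R^3 to the corresponding skew matrix in so(3)."""
    return np.array([[0.0, -w[2], w[1]],
                     [w[2], 0.0, -w[0]],
                     [-w[1], w[0], 0.0]])

def vee(W):
    """Inverse of hat: extract the vector from a skew matrix."""
    return np.array([W[2, 1], W[0, 2], W[1, 0]])

def expm_so3(w):
    """Rodrigues formula: exponential map from so(3) to SO(3)."""
    th = np.linalg.norm(w)
    W = hat(w)
    if th < 1e-12:
        return np.eye(3) + W  # first-order expansion near the identity
    return (np.eye(3) + (np.sin(th) / th) * W
            + ((1.0 - np.cos(th)) / th**2) * (W @ W))

# Assumed objective: f(R) = -tr(A^T R), minimized at R = A.
# Its left-trivialized Riemannian gradient is vee(A^T R - R^T A).
A = expm_so3(np.array([0.3, -1.1, 0.7]))

def grad_f(R):
    M = A.T @ R
    return vee(M - M.T)

def optimize(R, steps=200, lr=0.1, mu=0.9):
    """Heavy-ball-style momentum descent; iterates stay exactly on SO(3)
    because each step is a group multiplication by an exponential."""
    v = np.zeros(3)
    for _ in range(steps):
        v = mu * v - lr * grad_f(R)
        R = R @ expm_so3(v)
    return R

def sample_step(R, v, h=0.05, gamma=1.0, beta=4.0, rng=np.random):
    """One Euler-type step of a (hedged) kinetic-Langevin sampler on SO(3):
    the optimizer's momentum dynamics plus friction and Gaussian noise,
    informally targeting a density proportional to exp(-beta * f(R))."""
    noise = np.sqrt(2.0 * gamma * h / beta) * rng.standard_normal(3)
    v = v - h * gamma * v - h * grad_f(R) + noise
    R = R @ expm_so3(h * v)
    return R, v

if __name__ == "__main__":
    R0 = expm_so3(np.array([1.2, 0.4, -0.8]))
    R_opt = optimize(R0)
    print("optimization error:", np.linalg.norm(R_opt - A))  # should be small

    R, v = R0, np.zeros(3)
    for _ in range(2000):
        R, v = sample_step(R, v)
    print("orthogonality defect:", np.linalg.norm(R.T @ R - np.eye(3)))  # ~0

Note the design choice this toy example shares with the abstract: because every update multiplies by the exponential of a Lie algebra element, the iterates remain exactly on the manifold, the same structural property the talk highlights for its sampler and generative model.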