Exploring High Dimensions in Dynamical Sampling: Flattening the Scaling Curve

Yifan Chen, UCLA
11/5/2025, 11:10AM-12:00PM, in 939 Evans (in person) and via https://berkeley.zoom.us/j/98667278310

Dynamical sampling of probability distributions based on models or data (i.e., generative modeling) is a central task in scientific computing and machine learning. I will present recent work on understanding and improving such algorithms in high-dimensional settings. This includes a novel "delocalization of bias" phenomenon in Langevin dynamics, where biased methods can achieve dimension-free scaling for low-dimensional marginals while unbiased methods cannot; this finding is motivated by molecular dynamics simulations. I will also briefly discuss a new unbiased affine-invariant Hamiltonian sampler that outperforms popular samplers from the emcee package in high dimensions. Finally, I will introduce a design of optimal Lipschitz energy for measure transport in generative modeling, an alternative to the optimal kinetic energy used in optimal transport, which leads to numerical performance that is robust with respect to resolution. These examples demonstrate how dimensional scaling may be flattened, enabling efficient stochastic algorithms for high-dimensional sampling and generative modeling in relevant scientific applications.
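
For readers unfamiliar with the biased/unbiased distinction the abstract refers to, the following minimal Python sketch (illustrative only, not the speaker's method) contrasts unadjusted Langevin dynamics (ULA), whose finite step size introduces a bias in the sampled distribution, with its Metropolis-adjusted counterpart (MALA), which removes that bias by an accept/reject step. The standard Gaussian target, step size, and all names are assumptions chosen for illustration; the code tracks a one-dimensional marginal, the quantity for which the talk's dimension-free scaling result is stated.

    import numpy as np

    def grad_log_density(x):
        # log pi(x) = -|x|^2 / 2 for a standard Gaussian target (illustrative choice)
        return -x

    def ula_step(x, h, rng):
        # Euler-Maruyama discretization of Langevin dynamics: biased at finite step h
        return x + h * grad_log_density(x) + np.sqrt(2 * h) * rng.standard_normal(x.shape)

    def mala_step(x, h, rng):
        # Metropolis accept/reject on top of the ULA proposal removes the bias
        y = ula_step(x, h, rng)
        def log_q(a, b):  # log proposal density of moving from b to a
            return -np.sum((a - b - h * grad_log_density(b)) ** 2) / (4 * h)
        log_alpha = (-0.5 * np.sum(y**2) + 0.5 * np.sum(x**2)
                     + log_q(x, y) - log_q(y, x))
        return y if np.log(rng.uniform()) < log_alpha else x

    rng = np.random.default_rng(0)
    d, h, n = 100, 0.05, 5000   # dimension, step size, number of steps (illustrative)
    x_ula, x_mala = np.zeros(d), np.zeros(d)
    first_coord = []            # track a low-dimensional (here 1D) marginal
    for _ in range(n):
        x_ula = ula_step(x_ula, h, rng)
        x_mala = mala_step(x_mala, h, rng)
        first_coord.append(x_ula[0])
    print("ULA marginal variance (target 1):", np.var(first_coord))

Under this toy setup, ULA's stationary marginal variance is slightly inflated (roughly 1/(1 - h/2) for the Gaussian target), which is the kind of step-size bias, measured on a low-dimensional marginal, that the "delocalization of bias" result concerns.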