Wasserstein Mirror Gradient Flows as the Limit of the Sinkhorn Algorithm presented by Nabarun Deb
Abstract: The Sinkhorn algorithm for computing optimal transport has gained immense prominence in recent years thanks to its scalability. In this talk, we study the sequence of marginals obtained from iterations of the Sinkhorn (or IPFP) algorithm and show that, under a natural time and regularization scaling, the marginals converge to an absolutely continuous curve on the Wasserstein space. This limit, which we call the Sinkhorn flow, is an example of a Wasserstein mirror gradient flow, a concept we introduce here inspired by the well-known Euclidean mirror gradient flows. In the case of Sinkhorn, the functional being optimized is the relative entropy with respect to one of the marginals, and the mirror function is half of the squared Wasserstein distance from the other marginal. Interestingly, the norm of the velocity field of this flow can be interpreted as the metric derivative with respect to the linearized optimal transport (LOT) distance. We provide examples showing that these flows can converge faster than usual gradient flows. We also construct a McKean-Vlasov SDE whose marginal distributions give rise to the same flow.
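For readers unfamiliar with the algorithm the talk builds on, here is a minimal sketch of the discrete Sinkhorn (IPFP) iteration, tracking the sequence of coupling marginals that the talk studies. The grid, cost, and regularization parameter below are illustrative choices, not taken from the talk.

```python
import numpy as np

def sinkhorn_marginals(mu, nu, C, eps, n_iter):
    """Run Sinkhorn/IPFP and return the intermediate first marginals.

    mu, nu : target marginals (1-D probability vectors)
    C      : cost matrix
    eps    : entropic regularization strength
    n_iter : number of Sinkhorn iterations
    """
    K = np.exp(-C / eps)                   # Gibbs kernel
    u = np.ones_like(mu)
    v = np.ones_like(nu)
    marginals = []
    for _ in range(n_iter):
        u = mu / (K @ v)                   # rescale to match first marginal
        v = nu / (K.T @ u)                 # rescale to match second marginal
        P = u[:, None] * K * v[None, :]    # current coupling diag(u) K diag(v)
        marginals.append(P.sum(axis=1))    # its first marginal
    return marginals

# Toy example on a 5-point grid with squared-distance cost
x = np.linspace(0.0, 1.0, 5)
C = (x[:, None] - x[None, :]) ** 2
mu = np.full(5, 0.2)                       # uniform first marginal
nu = np.array([0.1, 0.2, 0.4, 0.2, 0.1])  # peaked second marginal
ms = sinkhorn_marginals(mu, nu, C, eps=0.5, n_iter=300)
print(np.abs(ms[-1] - mu).max())           # deviation of first marginal from mu
```

After each half-step the coupling matches one marginal exactly and the other approximately; the sequence of these intermediate marginals, suitably rescaled in time and regularization, is what the talk shows converges to the Sinkhorn flow.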