Divergence, Gibbs measures, and entropic regularizations of optimal transport
Consider the Monge-Kantorovich problem of transporting one density to another on Euclidean space with a convex cost. This is an infinite-dimensional linear programming problem. Owing to various advantages, from computational feasibility to geometry, the problem is often regularized by an entropic penalization. Solutions of this entropically regularized problem are called Schrödinger bridges.
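The entropically regularized problem described above is commonly solved numerically by Sinkhorn's matrix-scaling iterations. The following is a minimal sketch on a discrete grid; the function name, parameter values, and test densities are illustrative choices, not taken from the paper:

```python
import numpy as np

def sinkhorn(mu, nu, C, eps=0.05, n_iter=2000):
    """Approximate the entropic OT plan: minimize <P, C> + eps * KL(P | mu x nu)
    over couplings P with marginals mu and nu, via alternating scaling."""
    K = np.exp(-C / eps)              # Gibbs kernel built from the cost
    u = np.ones_like(mu)
    for _ in range(n_iter):
        v = nu / (K.T @ u)            # match the target (column) marginal
        u = mu / (K @ v)              # match the source (row) marginal
    # Optimal plan in diagonal-scaling form: P = diag(u) K diag(v)
    return u[:, None] * K * v[None, :]

# Two discretized densities on [0, 1] and a quadratic cost
x = np.linspace(0.0, 1.0, 50)
mu = np.exp(-(x - 0.3) ** 2 / 0.01); mu /= mu.sum()
nu = np.exp(-(x - 0.7) ** 2 / 0.01); nu /= nu.sum()
C = (x[:, None] - x[None, :]) ** 2

P = sinkhorn(mu, nu, C)
# The row and column marginals of P recover mu and nu, respectively.
```

The scaling form diag(u) K diag(v) is exactly the discrete Gibbs-measure structure of the Schrödinger bridge: the plan is the kernel exp(-C/eps) reweighted by two potentials.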
We show that Schrödinger bridges are essentially Gibbs measures whose "energy" is given by a quantity called divergence. Divergence is a nonnegative quantity that generalizes complementary slackness in the dual Kantorovich problem, in analogy with finite-dimensional linear programming. This idea has many uses. For example, consider the difference between the entropically regularized cost and the actual cost of transport. Using higher-order large deviations, we show that this difference, properly scaled, is always given by the relative entropy of the target density with respect to a Riemannian volume measure that captures the local sensitivity of the Monge map.
In the special case of the quadratic Wasserstein transport, this relative entropy is exactly one half of the difference of entropies of the two densities. More surprisingly, we demonstrate that this difference of two entropies (plus the cost) is also the limit for the Dirichlet transport. The latter can be thought of as a multiplicative analog of the Wasserstein transport and is inspired by mathematical finance. The result hints at an underlying "gradient flow of entropy", in the sense of Jordan-Kinderlehrer-Otto, even when the cost function is not a metric.
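A symbolic restatement of the two limits above may help fix ideas. The notation, the sign conventions, and the entropy convention $\mathrm{Ent}(\rho)=\int \rho\log\rho\,dx$ are assumptions made here for illustration, not taken from the paper:

```latex
% General convex cost: scaled gap between the entropic cost C_eps
% and the unregularized transport cost C_0 (convention assumed):
\lim_{\varepsilon \to 0} \frac{1}{\varepsilon}
  \Big( C_\varepsilon(\mu,\nu) - C_0(\mu,\nu) \Big)
  \;=\; H\!\left( \nu \,\middle|\, \mathrm{vol} \right),
% where vol denotes the Riemannian volume measure encoding the
% local sensitivity of the Monge map.

% Quadratic (Wasserstein) case: the limit reduces to one half of the
% difference of entropies of the two densities (order of the
% difference assumed):
H\!\left( \nu \,\middle|\, \mathrm{vol} \right)
  \;=\; \tfrac{1}{2}\big( \mathrm{Ent}(\nu) - \mathrm{Ent}(\mu) \big).
```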