Differentiable Ranking and Sorting using Optimal Transport
In this paper, we propose differentiable proxy operators for ranking and sorting. To do so, we leverage the fact that sorting can be seen as a particular instance of the optimal transport (OT) problem between two univariate uniform probability measures, the first supported on the input values of interest, the second on a family of increasing target values (e.g. 1, 2, ..., n if the input array has n elements). Building upon this link, we propose generalized rank and sort operators by considering arbitrary target measures supported on m values, where m can be smaller than n. We recover differentiable algorithms by adding an entropic regularization to this generic OT problem and approximating its outputs using Sinkhorn iterations. To illustrate the versatility of these operators, we use the soft-rank operator to propose a new classification training loss that is a differentiable proxy of the 0/1 loss. Using the soft-sort operator, we propose a new differentiable loss for trimmed regression.
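As a concrete illustration of the construction sketched above, the following is a minimal NumPy sketch, not the paper's implementation: it casts sorting n input values as an entropic-regularized OT problem between a uniform measure on the (rescaled) inputs and a uniform measure on m increasing targets, runs Sinkhorn iterations, and reads off soft ranks and soft-sorted values from the resulting transport plan. The squared-difference cost, the [0, 1] target grid, the function name sinkhorn_soft_sort, and the values of epsilon and the iteration count are illustrative assumptions.

```python
# Minimal sketch (assumptions noted above): soft ranking/sorting via
# entropic-regularized OT between inputs and an increasing target grid.
import numpy as np

def sinkhorn_soft_sort(x, m=None, epsilon=0.05, n_iters=500):
    """Return (soft_ranks, soft_sorted) for a 1-D array x."""
    x = np.asarray(x, dtype=float)
    n = x.shape[0]
    m = n if m is None else m
    y = np.linspace(0.0, 1.0, m)            # increasing target values
    a = np.full(n, 1.0 / n)                 # uniform weights on inputs
    b = np.full(m, 1.0 / m)                 # uniform weights on targets

    # Squared-difference cost between rescaled inputs and targets.
    x_scaled = (x - x.min()) / (x.max() - x.min() + 1e-12)
    C = (x_scaled[:, None] - y[None, :]) ** 2

    # Sinkhorn iterations on the Gibbs kernel K = exp(-C / epsilon).
    K = np.exp(-C / epsilon)
    u, v = np.ones(n), np.ones(m)
    for _ in range(n_iters):
        u = a / (K @ v)
        v = b / (K.T @ u)
    P = u[:, None] * K * v[None, :]         # approximate transport plan

    # Each normalized row (n * P[i]) is a distribution over the m target
    # slots; its expected slot index, rescaled from [1, m] to [1, n],
    # gives the soft rank of input i.
    slot_index = np.arange(1.0, m + 1.0)
    soft_ranks = (n / m) * (n * (P @ slot_index))

    # Each normalized column is a distribution over inputs; its barycenter
    # gives the soft-sorted value assigned to that target slot.
    soft_sorted = (P.T @ x) / b
    return soft_ranks, soft_sorted

soft_ranks, soft_sorted = sinkhorn_soft_sort(np.array([0.3, -1.2, 2.5, 0.7]))
print(soft_ranks)    # approx. [2, 1, 4, 3]; sharper as epsilon -> 0
print(soft_sorted)   # approx. the sorted input for small epsilon
```

Because the soft ranks and soft-sorted values are smooth functions of the inputs (everything above is built from exponentials and matrix products), they can be differentiated through, which is what enables the 0/1-loss proxy and trimmed-regression loss mentioned in the abstract; taking m smaller than n corresponds to the generalized operators with a coarser target measure.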