Minimax Rate for Learning From Pairwise Comparisons in the BTL Model

Part of the Proceedings of the International Conference on Machine Learning (ICML 2020) pre-proceedings



Julien Hendrickx, Alex Olshevsky, Venkatesh Saligrama


We consider the problem of learning the qualities w_1, ..., w_n of a collection of items by performing noisy comparisons among them. We assume there is a fixed "comparison graph" and every neighboring pair of items is compared k times. We study the popular Bradley-Terry-Luce model, in which the probability that item i wins a comparison against item j equals w_i/(w_i + w_j). We are interested in how the expected error in estimating the vector w = (w_1, ..., w_n) behaves in the regime where the number of comparisons k is large.

Our contribution is a determination of the minimax rate, up to a constant factor. We show that this rate is achieved by a simple algorithm based on weighted least squares, with weights determined from the empirical outcomes of the comparisons. The algorithm can be implemented in time nearly linear in the total number of comparisons.
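To make the setup concrete, the following is a minimal sketch of a weighted-least-squares estimator for the BTL model. It is illustrative only, not the paper's exact algorithm: the comparison graph (here a complete graph on 6 items), the logit-based measurements, and the inverse-variance weights c_ij = k * p_hat * (1 - p_hat) are all assumptions chosen for the demo.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical instance: n items with true qualities w, a complete comparison
# graph, and k comparisons per edge (all choices are illustrative).
n, k = 6, 2000
w = rng.uniform(1.0, 3.0, size=n)
edges = [(i, j) for i in range(n) for j in range(i + 1, n)]

# Simulate BTL outcomes: item i beats item j with probability w_i / (w_i + w_j).
p_hat = {}
for i, j in edges:
    p = w[i] / (w[i] + w[j])
    p_hat[(i, j)] = rng.binomial(k, p) / k

# Each edge yields a noisy measurement y_ij = logit(p_hat_ij) ≈ log w_i - log w_j.
# Weight each measurement by an inverse-variance proxy c_ij = k * p_hat * (1 - p_hat);
# the paper's weights also come from empirical outcomes, but may differ in detail.
A = np.zeros((len(edges), n))
y = np.zeros(len(edges))
c = np.zeros(len(edges))
for m, (i, j) in enumerate(edges):
    ph = np.clip(p_hat[(i, j)], 1e-6, 1 - 1e-6)
    A[m, i], A[m, j] = 1.0, -1.0
    y[m] = np.log(ph / (1 - ph))
    c[m] = k * ph * (1 - ph)

# Solve the weighted normal equations. The pseudoinverse handles the null space:
# log-qualities are identifiable only up to an additive constant, so we center.
W = np.diag(c)
x = np.linalg.pinv(A.T @ W @ A) @ (A.T @ W @ y)
x -= x.mean()

w_hat = np.exp(x)
w_hat *= w.sum() / w_hat.sum()  # fix the scale to compare against the true w
print(np.max(np.abs(w_hat - w) / w))
```

With k = 2000 comparisons per edge the relative error is small; the sketch forms the dense normal equations for clarity, whereas a nearly-linear-time implementation would instead solve the sparse graph Laplacian system.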