Aggregation of Multiple Knockoffs

Part of the Proceedings of the International Conference on Machine Learning pre-proceedings (ICML 2020)




Authors

Tuan-Binh Nguyen, Jerome-Alexis Chevalier, Bertrand Thirion, Sylvain Arlot

Abstract

We develop an extension of the knockoff inference procedure introduced by Barber & Candes (2015). This new method, called Aggregation of Multiple Knockoffs (AKO), addresses the instability inherent to the random nature of knockoff-based inference. Specifically, AKO improves both stability and power compared with the original knockoff algorithm while still maintaining guarantees for false discovery rate control. We provide a new inference procedure, prove its core properties, and demonstrate its benefits in a set of experiments on synthetic and real datasets.
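As a rough illustration of the kind of aggregation the abstract describes, the sketch below combines p-values from several independent knockoff runs via quantile aggregation and then applies Benjamini-Hochberg selection at a target FDR level. This is a minimal sketch under stated assumptions, not the paper's exact procedure: the function names, the `gamma` parameter, and the use of plain BH are illustrative choices.

```python
import numpy as np

def quantile_aggregate(pvals_per_run, gamma=0.5):
    """Aggregate p-values across runs via the quantile method:
    p_agg = min(1, quantile_gamma(p_run / gamma)), computed per feature.
    `pvals_per_run` has shape (n_runs, n_features)."""
    pvals = np.asarray(pvals_per_run, dtype=float)
    q = np.quantile(pvals / gamma, gamma, axis=0)
    return np.minimum(1.0, q)

def bh_threshold(pvals, fdr=0.1):
    """Benjamini-Hochberg step-up procedure: select the largest k such
    that p_(k) <= fdr * k / m, and return a boolean selection mask."""
    pvals = np.asarray(pvals, dtype=float)
    m = len(pvals)
    order = np.argsort(pvals)
    below = pvals[order] <= fdr * np.arange(1, m + 1) / m
    selected = np.zeros(m, dtype=bool)
    if below.any():
        k = np.max(np.nonzero(below)[0])  # last index passing the threshold
        selected[order[: k + 1]] = True
    return selected

# Hypothetical usage: three knockoff runs over two features.
agg = quantile_aggregate([[0.01, 0.5], [0.02, 0.6], [0.03, 0.7]], gamma=0.5)
mask = bh_threshold(agg, fdr=0.1)
```

Aggregating across runs damps the run-to-run randomness of a single knockoff draw, which is the instability the abstract refers to.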