Near-Tight Margin-Based Generalization Bounds for Support Vector Machines

Part of Proceedings of the International Conference on Machine Learning pre-proceedings (ICML 2020)




Authors

Allan Grønlund, Lior Kamma, Kasper Green Larsen

Abstract

Support Vector Machines (SVMs) are among the most fundamental tools for binary classification.

In its simplest formulation, an SVM produces a hyperplane that separates two classes of data with the largest possible margin. The focus on maximizing the margin has been well motivated through numerous generalization bounds.

In this paper, we revisit and improve the classic margin-based generalization bounds. Furthermore, we complement our new generalization bound with a nearly matching lower bound, thus almost settling the generalization performance of SVMs in terms of margins.
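To make the max-margin notion in the abstract concrete, here is a minimal sketch (not from the paper) of fitting a linear SVM on a toy separable dataset with scikit-learn and reading off the geometric margin, which for a hard-margin SVM equals 1 / ||w||. The dataset and the large-C approximation of the hard-margin problem are illustrative assumptions.

```python
# Illustrative sketch: a linear SVM on linearly separable 2-D data.
# For the hard-margin SVM, support vectors satisfy y(w.x + b) = 1,
# so the geometric margin (distance to the closest points) is 1 / ||w||.
import numpy as np
from sklearn.svm import SVC

# Two separable point clouds: class -1 at x1 = 0, class +1 at x1 = 3.
X = np.array([[0.0, 0.0], [0.0, 1.0], [3.0, 0.0], [3.0, 1.0]])
y = np.array([-1, -1, 1, 1])

# A very large C approximates the hard-margin formulation.
clf = SVC(kernel="linear", C=1e6).fit(X, y)

w = clf.coef_[0]
margin = 1.0 / np.linalg.norm(w)
# The optimal separator is x1 = 1.5, so the margin is 1.5.
print(margin)
```

By symmetry the optimal hyperplane here is x1 = 1.5, and every training point sits at distance 1.5 from it, which is exactly the margin the maximization recovers.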