Laplacian Regularized Few-Shot Learning

Part of the Proceedings of the International Conference on Machine Learning (ICML 2020), pre-proceedings

Authors

Imtiaz Ziko, Jose Dolz, Eric Granger, Ismail Ben Ayed

Abstract

Few-shot learning attempts to generalize to unlabeled query samples of new classes, unseen during training, given only a few labeled examples of those classes. It has received substantial research interest recently, with a large body of work based on complex meta-learning strategies and architecture choices. We propose a Laplacian-regularization objective for few-shot tasks, which integrates two types of potentials: (1) unary potentials assigning query samples to the nearest class prototype, and (2) pairwise Laplacian potentials encouraging nearby query samples to have consistent predictions. We optimize a tight upper bound of a concave-convex relaxation of our objective, thereby guaranteeing convergence while computing independent updates for each query sample. Following the standard experimental setting for few-shot learning, our LaplacianShot technique outperforms state-of-the-art methods significantly, while using simple cross-entropy training on the base classes. In the 1-shot setting on the standard miniImageNet and tieredImageNet benchmarks, and on the recent meta-iNat benchmark, across various networks, LaplacianShot consistently provides a 3-4% improvement in accuracy over the best-performing state-of-the-art method.
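
To make the abstract's inference procedure concrete, below is a minimal NumPy sketch of a Laplacian-regularized transductive assignment loop in this style: squared-Euclidean unary potentials to class prototypes, a binary k-nearest-neighbor affinity over query samples, and iterative softmax-style bound updates that are independent per query sample. The function name `laplacian_shot` and the parameters `knn`, `lam`, and `n_iter` are illustrative assumptions, not the authors' reference implementation.

```python
import numpy as np

def laplacian_shot(query, prototypes, knn=3, lam=1.0, n_iter=20):
    """Sketch of Laplacian-regularized label assignment (assumed form,
    not the authors' reference code).

    query:      (N, D) array of query features
    prototypes: (C, D) array of class prototypes (e.g., support means)
    Returns:    (N,) array of predicted class indices
    """
    # Unary potentials: squared distance of each query to each prototype.
    u = ((query[:, None, :] - prototypes[None, :, :]) ** 2).sum(-1)  # (N, C)

    # Pairwise affinity: w[q, p] = 1 for the knn nearest query neighbors.
    d = ((query[:, None, :] - query[None, :, :]) ** 2).sum(-1)
    np.fill_diagonal(d, np.inf)
    w = np.zeros_like(d)
    rows = np.arange(len(query))[:, None]
    w[rows, np.argsort(d, axis=1)[:, :knn]] = 1.0

    # Initialize soft assignments from the unary term alone
    # (shifted for numerical stability).
    y = np.exp(-(u - u.min(1, keepdims=True)))
    y /= y.sum(1, keepdims=True)

    # Bound-optimization loop: each iteration performs an independent
    # softmax update per query, combining the unary cost with the
    # neighbor-averaged pairwise (Laplacian) term.
    for _ in range(n_iter):
        logits = -u + lam * (w @ y)
        y = np.exp(logits - logits.max(1, keepdims=True))
        y /= y.sum(1, keepdims=True)
    return y.argmax(1)
```

Because each update touches only one query sample's distribution given the previous iterate, the loop vectorizes across the whole query set, which is what makes this kind of transductive inference cheap at test time.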