Latent Bernoulli Autoencoder

Part of the Proceedings of the International Conference on Machine Learning (ICML 2020) pre-proceedings




Authors

Jiri Fajtl, Vasileios Argyriou, Dorothy Monekosso, Paolo Remagnino

Abstract

In this work, we ask whether it is possible to design and train an autoencoder in an end-to-end fashion to learn latent representations in a multivariate Bernoulli space and to achieve performance comparable with current state-of-the-art variational methods. Moreover, we investigate how to generate novel samples and perform smooth interpolation in the binary latent space. To meet this objective, we propose a simplified deterministic model with a straight-through estimator to learn the binary latents and show that it is competitive with the latest VAE methods. Furthermore, we propose a novel method based on random hyperplane rounding for sampling and smooth interpolation in the multivariate Bernoulli latent space. Although not a main objective, we demonstrate that our methods perform on par with or better than current state-of-the-art methods on the CelebA, CIFAR-10 and MNIST datasets. PyTorch code and trained models to reproduce the published results will be released with the camera-ready version.
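The two ingredients named in the abstract can be illustrated with a small sketch: a hard binarization whose backward pass a straight-through estimator would treat as the identity, and a randomized rounding that interpolates smoothly between two binary codes. All function names, sizes, and the specific rounding rule below are our own illustrative assumptions, not the authors' released code.

```python
import math
import random

random.seed(0)

def binarize_ste(logits):
    # Hard-threshold sigmoid activations to {0, 1}. In an autograd
    # framework, a straight-through estimator would treat this step as
    # the identity on the backward pass, passing gradients to `logits`.
    return [1.0 if 1.0 / (1.0 + math.exp(-z)) > 0.5 else 0.0 for z in logits]

def smooth_binary_interpolation(b0, b1, t, noise):
    # Map {0, 1} codes to {-1, +1}, interpolate linearly, and round each
    # coordinate against a shared random threshold in (-1, 1), so bits
    # flip gradually and consistently as t sweeps from 0 to 1. This is a
    # toy stand-in for the paper's hyperplane-rounding scheme.
    out = []
    for x0, x1, u in zip(b0, b1, noise):
        v = (1.0 - t) * (2.0 * x0 - 1.0) + t * (2.0 * x1 - 1.0)
        out.append(1.0 if v > u else 0.0)
    return out

# usage: a 5-step path between two binary codes
b0 = binarize_ste([random.gauss(0, 1) for _ in range(8)])
b1 = binarize_ste([random.gauss(0, 1) for _ in range(8)])
noise = [random.uniform(-0.999, 0.999) for _ in range(8)]
path = [smooth_binary_interpolation(b0, b1, k / 4.0, noise) for k in range(5)]
```

Because the thresholds are drawn once and reused for every step, each bit flips at most once along the path, so the endpoints reproduce the original codes exactly while intermediate codes change a few bits at a time.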